There are plenty of opportunities to collect customer and user data of any kind, but what are the consequences? Customers are becoming more and more transparent, as companies can trace every step they take. Shouldn't there be a taboo area for marketers, defining where they have no right to be? Is transparency sufficient to allow self-determination on the net? To clarify all these questions, we ultimately need digital ethics.

In 2010, for instance, Google CEO Eric Schmidt said: "We know where you are. We know where you were. We know more or less what you think." I don't know how you feel, but his statements freak me out a bit. At least Schmidt also gave advice on what to do if you don't like these facts. In a CNBC interview a year earlier he said: "If there is something that you do not want anyone to know, maybe you should not do it anyway."

Respecting privacy

Well, the French philosopher Michel Foucault called such a situation, in which people are aware of being under constant monitoring and observation, "panopticism".

The 'Panopticon' is a type of institutional building designed by the English philosopher and social theorist Jeremy Bentham in the late 18th century. The concept of the design is to allow a single guard to observe (-opticon) all (pan-) inmates of an institution without the inmates being able to tell whether or not they are being watched. Although it is physically impossible for the single guard to observe all cells at once, the fact that the inmates cannot know when they are being watched means that all inmates must act as though they are watched at all times, effectively controlling their own behavior constantly. Bentham himself described the Panopticon as "a new mode of obtaining power of mind over mind, in a quantity hitherto without example." Elsewhere, he described the Panopticon prison as "a mill for grinding rogues into honest ones."

"He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection," Foucault writes.

Those being observed discipline themselves constantly. Every individual knows that every step he takes is traceable, and hence he adapts his behavior to normative requirements.

A kind of nightmare scenario is painted here, one that surely does not correspond to reality. Or does it?

No, I am not starting a campaign against using data for marketing purposes. I just want to point out that there are solutions and ways for companies to deal responsibly with data, instead of risking the eventual loss of consumers' trust.

Surveys show that a lack of trust in how businesses process data, and the known risks of the Internet in general, do not discourage the majority of online users from disclosing personal information. The fact is that nobody can elude the Internet, with the result that the protection of privacy sometimes takes a back seat. It is simply too convenient to look up information, and too easy to shop online, not to mention the social possibilities of Facebook that nobody wants to miss.

Nourishing trust

Trust is becoming increasingly important, a kind of currency in times of electronic eavesdropping and the power of Google and Co. Hence, I don't have to remind you how quickly it can be destroyed and how tedious it is to rebuild, if that is possible at all.

Legal frameworks alone are generally not enough; rather, it is important to respect and accept users' privacy. Just keep in mind that not everything that is technically possible is appreciated by users, even if it complies with data protection laws. Less, but better, advertising definitely annoys users a bit less.

If trust has been put at risk, companies can only rebuild it by following their own rules, which means informing their users while being open, honest, and transparent.

Transparency is (not a) universal remedy

The important keyword here is "self-determination". A company that watches over its customers like "Big Brother" is acting in a way that is no longer up to date. Many companies have recognized this, and some already offer a cookie opt-out, even though they only work with anonymized data.

In general, a notice on the use of cookies is now seen more often on websites. So it all comes down, again, to clarifying what is done with users' data: where it goes and to whom it is available.
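In practice, a cookie opt-out like the one mentioned above can start with something as simple as checking for an opt-out cookie before any tracking code runs. The following sketch is purely illustrative: the cookie name `analytics_optout` and the helper function are assumptions for this example, not any particular vendor's API, and real consent frameworks are considerably more involved.

```typescript
// Minimal sketch of honoring a cookie-based tracking opt-out.
// The cookie name "analytics_optout" is a hypothetical convention.
function hasOptedOut(cookieHeader: string): boolean {
  return cookieHeader
    .split(";")                      // cookies arrive as "name=value" pairs
    .map((pair) => pair.trim())      // strip surrounding whitespace
    .some((pair) => pair === "analytics_optout=1");
}

// A site would consult this check before loading any analytics script,
// e.g.: if (!hasOptedOut(document.cookie)) { loadAnalytics(); }
```

The point of the design is that the check runs first and tracking is off by default unless the opt-out cookie is absent; a more respectful variant would invert this into an explicit opt-in.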

But perhaps education on the benefits of digital advertising would be useful too, such as personally tailored offers or, last but not least, an advertising-funded and therefore free internet?

The ethical standards of the analog world can't simply be adopted for the digital environment; hence, new social rules are needed. However, transparency alone isn't the magic bullet, as it is not a regulatory tool and is only important as a general principle of law.

We all know that in our digital world, transparency is often ineffective as a protection tool and would only work if users had a profound freedom of choice, as even legal requirements can't regulate everything. Companies need to know that there is a limit, an area that remains off-limits for them, whatever data they have managed to collect.

Coming back to the Panopticon I mentioned: it creates a consciousness of permanent visibility as a form of power, and building on Foucault, contemporary social critics often assert that technology has allowed panoptic structures to be deployed invisibly throughout society.

Surveillance by CCTV cameras in public spaces is an example of a technology that brings the gaze of a superior into the daily lives of the populace.

Or just think about the recent incident with Samsung's smart TVs, which made consumers question what such screens are really doing and are capable of, after reading the company's privacy policy, which warns users to "please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition."

Of course, Samsung's SmartTV came under fire, although the company reassures users that neither the TV's microphone nor the one in the remote is monitoring everything they say. Nevertheless, it seems that smart TV makers haven't finished tackling privacy concerns yet. That reminds me a bit of the debates more than a decade ago, when ISPs took the heat for keeping their cards close to their chest while constantly tracking users' activities. I wonder if we simply have to get used to the fact that the internet allows for a panoptic form of observation, and make the best use of it, as one global Internet community.

By Daniela La Marca