Information and services on the Internet are often offered as “free” in exchange for your personal information. This is how we all produce mountains of data every day. In doing so, we also trade a bit of privacy. Privacy is valuable. Not just for you as an individual, but for society as a whole. It is a prerequisite for freedom, self-development and a functioning democracy.
What is privacy?
How wonderful that you can share things in confidence with your family, friends and family doctor. They know best how to interpret your words and quirks. That context disappears in the digital world. In this piece I will explain why privacy is so important, what threatens it, and what you can do to protect it.
The term privacy is derived from the Latin “privatus,” meaning “separated from the rest.” It is a human right recognized by the United Nations. Sometimes people claim they have nothing to hide. This implies that a good person does not need privacy, and that if you are concerned about privacy you are, according to this logic, probably a bad person. Yet there are things you discuss with your family, best friend or doctor but not with your neighbor, co-worker or a casual passerby. Privacy is the freedom to choose what you share and with whom.
So privacy has nothing to do with hiding things. The statement “if you have nothing to hide…” is typically used as a fallacy precisely when a piece of your freedom of privacy is being taken away.
In the car, you sing along to the entire Top 40 at the top of your lungs, until you come to a stop at a traffic light and feel that someone is watching you. According to various social studies, if you think you are being watched you behave in a more obedient and conformist way, in other words, differently. Your actions no longer come from intrinsic motivation; instead you act in the way you think is socially desirable. With that idea as a starting point, the philosopher Jeremy Bentham designed a model for a prison in 1791: the “panopticon.” Bentham’s design features a circular structure with a tower in the center from which prisoners can be watched. Inmates cannot see whether anyone is in the watchtower. As a result, a prisoner cannot know whether he is actually being observed, but because he constantly faces the central tower he always feels watched. This keeps the prisoner motivated to continuously behave as if he were being watched; he is thus forced to conform and behave well.
Privacy and anonymity
Privacy is not just about you as an individual; it is also a prerequisite for a functioning democracy. There was a time when it was normal to argue that women should not be allowed to vote or that homosexuality was a disease. Anyone who contradicted this could meet with fierce resistance. Alternative ideas could first be developed in private spaces and only later disseminated.
Privacy allows alternative thinking to emerge, which benefits innovation and diversity. One might also wonder what journalism would look like if sources could no longer be protected. Anyone who says he has nothing to hide often directly disproves this with his actions. We all use passwords, locks and curtains, and sometimes we whisper. Not to hide criminal activity, but simply because we want to be ourselves in a familiar environment and control what we share.
Needing privacy is also not the same as wanting to be anonymous. People who want to be anonymous try to hide something, like their face or real name. People who need privacy just want to be able to choose what they share and with whom. At the pharmacy, you often see a white line on the floor to create some distance between those waiting and the person whose turn it is. People simply like having some privacy, and these are all people who have “nothing to hide” anyway.
If you felt that cameras were watching you everywhere, you used to be diagnosed with a personality disorder. Today, it is reality: we are tracked throughout the day. What do they actually know about us?
We begin with two well-known players.
What does a Facebook like say about your personality? A single like not much, but multiple likes quite a lot, according to a Cambridge University study (Kosinski, 2013). The study showed that personalities can be predicted from interactions on Facebook. The researchers developed an algorithm that, by analyzing just 10 Facebook likes, could predict a participant’s personality more accurately than a close colleague could. 150 likes were enough to predict a personality better than a parent or sibling could, and with 300 likes the algorithm outperformed a partner.
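The underlying idea, each like nudging an estimate of several personality traits, can be sketched with a toy model. The page names and weights below are made up purely for illustration; the actual study fitted regression models on millions of real like patterns, not a hand-made table:

```python
# Toy illustration of like-based personality prediction. The pages and
# weights are hypothetical; they are NOT the study's actual parameters.

LIKE_WEIGHTS = {
    "Hiking Club":    {"openness": 0.3, "extraversion": 0.1},
    "Poetry Daily":   {"openness": 0.5, "extraversion": -0.2},
    "Party Planning": {"openness": 0.0, "extraversion": 0.6},
}

def trait_scores(likes):
    """Sum the per-page weights: every extra like sharpens the estimate."""
    scores = {"openness": 0.0, "extraversion": 0.0}
    for page in likes:
        for trait, weight in LIKE_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return scores

print(trait_scores(["Hiking Club", "Poetry Daily"]))
```

Each additional like contributes another set of weights, which is why the estimate keeps improving from 10 to 150 to 300 likes.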
Even without a Facebook account, Facebook can collect your data. When any website adds a Facebook “like” button, certain data about that site’s visitors is tracked: Facebook receives their IP address and browser and OS fingerprint, among other things. Mark Zuckerberg promised he would never sell this data to third parties. By now we know that Mark Zuckerberg is not very good at keeping his promises.
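What such an embedded button receives is simply what every browser sends along with any request. A sketch, with hypothetical header values and an illustrative helper (this is not Facebook’s actual code):

```python
# What a third-party embed (such as a "like" button) receives with every
# page visit. Header values are hypothetical examples.

request_headers = {
    "X-Forwarded-For": "203.0.113.42",                          # visitor's IP
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # browser + OS
    "Referer": "https://example-news-site.test/article/123",    # page visited
    "Accept-Language": "nl-NL,nl;q=0.9,en;q=0.8",               # language
}

def visitor_profile(headers):
    """Collect the identifying fields a tracker gets 'for free'."""
    return {
        "ip": headers.get("X-Forwarded-For"),
        "browser_os": headers.get("User-Agent"),
        "visited_page": headers.get("Referer"),
        "language": headers.get("Accept-Language"),
    }

# The tracker learns which article you were reading, account or no account.
print(visitor_profile(request_headers)["visited_page"])
```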
Google, founded in 1998 by Larry Page and Sergey Brin, has been a subsidiary of Alphabet Inc. since 2015. The company became known for its search engine. Later, more and more free services were added, such as Gmail, YouTube and Google Drive. The company also developed Android, a free smartphone operating system that now runs on 85 percent of smartphones. In 2014, Google bought Nest, a company that makes smart thermostats, indoor and outdoor security cameras, and smoke and carbon monoxide detectors, among other things. Google’s smart speaker (or rather, smart microphone) has recently become available in the Netherlands. There are few places left where Google does not have a (physical) presence.
Google and Facebook’s main revenue model is selling ads. And Google, like Facebook, has proven that it breaks its promises easily. In 1998, for example, the founders promised not to show ads in search results. Google would also never merge data from different services. All these promises have since been broken, and yet we continue to use their services. I suspect this is because the promises are broken in small steps that stretch, but never quite break, our tolerance. In sales this is known as the salami tactic (The Correspondent).
“Free,” however, does not always mean you pay with your data. Sometimes nonprofit organizations offer their services for free because they generate income from donations, for example. One example is the messaging app Signal. Always look critically at a provider’s revenue model.
Your devices are getting smart
Not only Google and Facebook want to know everything about you. Data brokers like Acxiom (US) and Focum (NL) sell citizens’ data to banks, companies and organizations. When you link this data together, you get a detailed picture of how consumers live and where their interests lie. This data can come from anywhere, especially now that our home appliances are becoming “smart.”
Internet of Things (IoT)
Nowadays, more and more physical (household) devices are connected to the Internet and communicate with it without human intervention. We call this phenomenon the “Internet of Things.” The term is used primarily for devices that you would not normally expect to have an Internet connection and that can communicate independently of human action. This could be your thermostat or your toaster; there are even smart clothespins already. The Internet of Things makes your life easier: your thermostat knows what time you usually get home, pre-heats the house and signals the coffee maker to start brewing. When you get home, Google Home is waiting for you.
Google Home is a smart speaker that features a voice assistant. For example, you can ask it to play a music track, set an alarm clock, or turn on the lights.
All of these devices generate mountains of data. When the data from different sensors in these devices are combined, we can learn a lot about a person or household.
An IBM researcher discovered that a household had had meatballs for dinner by analyzing (at first glance unrelated) data such as energy consumption, carbon monoxide and carbon dioxide levels, and the temperature and humidity of the house during the day (Heath, 2014). This is a good example of how powerful linking data can be.
Unfortunately, it is not always clear to the user who gets their hands on this data. In the case of a smart thermostat, the energy supplier and the manufacturer of the smart meter may be able to see your data. How this data is then processed and used is also not always clear. Thus, your sensitive data may end up in the hands of third parties.
Choosing and sharing
In 2019, DDMA (the trade association for data-driven marketing) surveyed how the Dutch think about data and privacy. The survey shows that 58% of Dutch people are concerned about their privacy but often do not act on it; research institute TNO calls this phenomenon the privacy paradox. Young people in particular do not seem to care that personal data about them is being collected and used, and the study does not indicate whether this attitude changes later in life. In my own research on privacy, I am trying to understand why we share so much personal information on the Internet.
Sharing is not always intentional
When you visit a website, several companies obtain information about you. Most users never see this, but with a tracker blocker such as Disconnect or Privacy Badger you can make these snoopers visible.
Privacy as a medium of exchange
We also share information consciously. According to a market survey by Multiscope among 6,000 Dutch people, many are willing to share activity and health data with their health insurer in exchange for a lower health care premium. Thirty percent of respondents said they would be willing to share the data with their employer in exchange for a reward such as days off or a bonus, and 16 percent would be willing to share their data with a commercial organization in exchange for discounts on products.
Only 28% of Dutch people are aware of their rights regarding data collection and online privacy. In this respect, the privacy discussion resembles the climate change discussion. Despite this lack of knowledge, I expect consumers to become increasingly aware and critical of privacy in the future.
Trust is the factor consumers cite most often in deciding whether or not to share data. The reputation and popularity of a brand are becoming increasingly decisive in the data economy. Businesses now have the opportunity to develop a sustainable data culture. Building a reputation takes time; undermining one takes no time at all.
A good example of this is Lenovo. In 2014, the company shipped laptops with so-called adware pre-installed, which let advertisers purchase ad space from Lenovo. The technique used to do this (Superfish) turned out to be dangerously insecure: it meant a browser could no longer tell whether a website was truly secure, and a secure connection is essential when doing online banking, for example. Lenovo thereby recklessly endangered its users. This cost Lenovo not only $7.3 million, but also consumer confidence (PCMweb, 2015).
I believe that in the near future, privacy will increasingly be seen as a unique selling point. Apple has already made a strong commitment to this, and I am sure many companies will follow. NRC recently announced it would no longer offer programmatic advertising via open exchange, feeling that this way of advertising does not fit the trusted NRC news brand and its relationship of trust with readers and advertisers (NRC Media, 2018).
The Lock-in Effect
Awareness is already an important step toward privacy. Several alternatives exist in the area of social networking, yet only a few players dominate the market, such as Facebook, Instagram and WhatsApp (all owned by Facebook).
A new user is more likely to join a network that his friends are already members of, and for the same reason it is difficult for existing users to switch to another service. This is called the lock-in effect or network effect, and it makes it hard for a new or alternative provider to compete with tech behemoths such as Facebook or Google. A privacy-friendly alternative to WhatsApp, for example, is Signal.
When I ask whether people would consider switching to Signal for privacy reasons, I am often told that they “have nothing to hide anyway.” This kind of statement reminds me of cognitive dissonance. That theory assumes that people strive for harmony between what they believe, know and experience about a particular product. Such statements may reduce the dissonance, and thus the unpleasant feeling one might experience when consciously weighing the privacy one gives up against the convenience of a free product.
Your digital identity
Your data can be used to create a digital profile. This profile is then used to pitch products to you through online advertising or direct marketing; using profiles for targeted ads is called targeting.
Facebook has in fact already developed a powerful algorithm, “FBLearner Flow,” that can predict what you will do before you know it yourself (Biddle, 2018). With this development it may become possible in the future for products to be shipped to you before you have placed an order, so-called “predictive shipping.” For now this sounds futuristic, but we already find it completely normal for Google to finish our sentences, while we look surprised when a partner does so. The dark side of profiling is that your profile can contain false assumptions and biases, since profiling takes little to no account of important context. If an advertiser mistargets you, that is irritating at best. But in the future it could mean that a mortgage lender or insurance company rejects your application because of a “high risk profile.” Indeed, this is already happening (Walraven, 2018).
An insurer can request an assessment from a data company such as Focum, which uses your data to determine whether you fit a fraud pattern and whether you are creditworthy. Insurers often follow this advice. On the one hand this is positive because it prevents fraud, but our blind faith in algorithms and big data can also lead us to make decisions without considering important context.
Fiction or reality
Let us go one step further. The 2002 film Minority Report is set in 2054, where a Washington D.C. police special unit called “Precrime” arrests would-be killers before they can commit their murders. The police rely on the visions of three genetically modified psychics, called “precogs,” who can flawlessly predict the future.
CAS (Crime Anticipation System), however, is not fiction but a real data mining system developed to map crime patterns. This form of enforcement is called predictive policing: predicting a breach of the law before it actually occurs. Currently, CAS does not single out individuals but districts, divided into a grid of 125-by-125-meter squares. Nevertheless, an extension to the individual level is not inconceivable in the future.
There are more than 450 fixed ANPR cameras in the Netherlands (at the time of writing) that can check your license plate. If you drive your car to work every day, that is your “normal” pattern of behavior according to such a system. Stopping once in an industrial area to check your brakes could then be flagged as aberrant behavior. This inevitably causes people to behave differently: we become more obedient, just like the prisoners in Bentham’s panopticon, who feel constantly watched.
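The internals of CAS are not public, but the grid idea mentioned above, bucketing locations into 125-by-125-meter squares, can be sketched in a few lines (a hypothetical illustration, not the real system):

```python
# Sketch of bucketing locations into a grid of 125-by-125-meter squares,
# as CAS reportedly does. Coordinates are assumed to be in meters.

GRID_SIZE_M = 125

def grid_cell(x_m, y_m):
    """Map a position (in meters) to the grid square containing it."""
    return (int(x_m // GRID_SIZE_M), int(y_m // GRID_SIZE_M))

# Two sightings 50 meters apart fall in the same square...
assert grid_cell(1000, 2000) == grid_cell(1050, 2000)
# ...so predictions currently target the square, not the individual.
print(grid_cell(1000, 2000))  # → (8, 16)
```

Shrinking the squares, or keying the buckets to a license plate instead of a location, is exactly the kind of extension to the individual level that is not inconceivable.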
Rise of Trump
There are many theories about how Trump came to power. One is that Trump’s campaign team successfully used microtargeting. As mentioned, Cambridge University researcher Michal Kosinski showed that with a limited number of likes, people’s motivations can be understood better than friends or family members understand them. Targeted, individual psychological influence is a powerful tool. Cambridge Analytica (not to be confused with Cambridge University) used this science and, according to various sources, was guilty of manipulating several elections, all with unlawfully obtained (demographic) Facebook data. You can of course be skeptical about the effectiveness of this form of influence, but your data can and will be used to influence you.
The documentary The Facebook Dilemma tells, among other things, how “the Russians” used Facebook to spread fake messages designed to turn people against each other, with the goal of sowing division in America. In other countries this kind of fake news has even led to violence. In many countries Facebook is the main source of news, and the question is whether Facebook is willing to see and take its responsibility here. So far, Facebook has mostly ignored the alarm bells. About Privacy wants to stay far away from conspiracy theories, but by now we do see Facebook as an organization that scores extremely poorly on ethics. Facebook does not hesitate, for example, to make users involuntary participants in dubious experiments.
From connecting to controlling
Once upon a time, the purpose of the World Wide Web was to facilitate the exchange of information among scientists working together in CERN’s largely international projects: a great tool for connecting people. Google founders Brin and Page wanted nothing to do with ads in their first period after leaving academia, but they too realized that advertising was inevitable if they wanted to fulfill their sky-high ambitions. Advertising became the Internet’s revenue model, and there is nothing wrong with that in itself. With Facebook’s Cambridge Analytica data scandal, among other things, it has become clear that data can be used not only to target advertising but also to influence people on an individual level. Even if one doubts its effectiveness, the intent is evident.
The Chinese government has already taken the first steps with a social credit system it plans to introduce in 2020. The plan is to give every Chinese citizen a score based on their behavior. Among other things, this score affects applications for a loan and the chance of getting a good job. Your friendships and social interactions also affect the score, so some people will be shunned by others simply for having a low score. Critics argue that the Chinese government will use the system to further control the behavior of its citizens.
Privacy survival tips
Now that you have read this piece, hopefully you are aware of the importance of privacy.
Fortunately, there are plenty of ways to protect your data and privacy. Most Internet companies are data-hungry, but there are plenty of privacy-friendly alternatives.
– Look for privacy-friendly alternatives
WhatsApp -> Signal
Search engine -> DuckDuckGo
Mail -> Tutanota or ProtonMail
Browser -> Firefox
– Install an ad blocker
These prevent ads from being shown and block tracking cookies. A few examples:
uBlock Origin (browser extension)
Better Blocker (iOS)
– Change your password regularly
Use a different password for each site, or even better, a passphrase, and change it regularly. Some people write their passwords down in a notebook, but it is much more convenient and secure to use a password manager.
Password Manager -> Bitwarden (free)
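To illustrate the passphrase advice, here is a minimal diceware-style generator. The ten-word list is purely for demonstration; a real word list should contain thousands of entries:

```python
# Minimal diceware-style passphrase generator: join a few randomly
# chosen words. Word list is a toy example; use a large list in practice.

import secrets  # cryptographically secure randomness, unlike `random`

WORDS = ["correct", "horse", "battery", "staple", "tulip", "windmill",
         "bicycle", "canal", "stroopwafel", "dike"]

def passphrase(n_words=4):
    """Join n randomly chosen words, e.g. 'tulip-canal-dike-horse'."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())
```

A passphrase of several random words drawn from a large list is both easier to remember and harder to brute-force than a short “complex” password.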
– Delete your cookies regularly
– Think consciously and logically
Nothing comes for free; everyone understands that. When something is offered for free, you often pay with something other than money. Google makes beautiful things that I sometimes use myself, like YouTube. But when I use these kinds of free services, I am aware that I am paying with my personal information, and I would rather not trade it for some cute cat videos.
Perhaps you should think of your personal information as legal tender and ask yourself: how much am I worth?
Stephens-Davidowitz, S. (2018). Everybody Lies (Rev. ed.). London, England: Bloomsbury.
Martijn, M., & Tokmetzis, D. (2016). You do have something to hide (Rev. ed.). Amsterdam, Netherlands: The Correspondent.
Schep, T. (2016). Design my privacy (Rev. ed.). Amsterdam, Netherlands: BIS.
Tokmetzis, D. (2012). The Digital Shadow (Rev. ed.). Houten, Netherlands: Spectrum.
Walraven, J. (2018). The theft of the century: How we lost and can regain our privacy. Antwerp, Belgium: Van Halewyck.
Cialdini, R. (2009). Influence (5th ed.). Amsterdam, Netherlands: Academic Service.
Van Wijk, K., & Huijzer, D. (2011). The media explosion (4th ed.). Amsterdam, Netherlands: Academic Service.
Multiscope. (n.d.). Half of Dutch people willing to share data with insurer. Accessed October 22, 2018, from http://www.multiscope.nl/persberichten/helft-nederlanders-bereid-data-te-delen-met-verzekeraar.html
Kosinski, M. (2013, April 9). Private traits and attributes are predictable from digital records of human behavior. Accessed October 2, 2018, from http://www.pnas.org/content/110/15/5802.full
MT. (2017, June 8). Sharp negotiating #9: the salami tactic. Accessed November 5, 2018, from https://www.mt.nl/business/scherp-onderhandelen-9-salamitactiek/537785
Heath, N. (2014, June 28). I know what you ate last supper: What home sensors will reveal about your life. Accessed October 22, 2018, from https://www.techrepublic.com/blog/european-technology/i-know-what-you-ate-last-supper-what-home-sensors-will-reveal-about-your-life/
Marketingfacts. (n.d.). Nudging. Retrieved from https://www.marketingfacts.nl/berichten/nudging-onbewust-gedrag-bewust-beinvloeden
Keach, S. (2018, April 16). Paid version of Facebook? Here’s how much it could cost you each month. Accessed November 3, 2018, from https://www.thesun.co.uk/tech/6062210/facebook-paid-money-monthly-fee-how-much/
Shobhit, S. (2018, April 11). How Much Can Facebook Potentially Make from Selling Your Data? Accessed November 3, 2018, from https://www.investopedia.com/tech/how-much-can-facebook-potentially-make-selling-your-data/
Sherr, I. (2018, April 19). Facebook, Cambridge Analytica and data mining: What you need to know. Accessed September 21, 2018, from https://www.cnet.com/news/facebook-cambridge-analytica-data-mining-and-trump-what-you-need-to-know/
Greenfield, P. (2018, June 29). The Cambridge Analytica files: the story so far. Accessed October 2, 2018, from https://www.theguardian.com/news/2018/mar/26/the-cambridge-analytica-files-the-story-so-far
Albrecht, L. (2017, Oct. 17). How behavioral economics is being used against you. Accessed October 2, 2018, from https://www.marketwatch.com/story/nobel-prize-winning-economist-richard-thalers-nudge-theory-has-a-dark-side-too-2017-10-17
Security.nl. (2017, Feb. 1). Research: 17% of Dutch internet users use an ad blocker. Accessed October 15, 2018, from https://www.security.nl/posting/502155/
Big Data revolution at the expense of privacy citizens. (2017, Aug. 8). Accessed October 3, 2018, from https://www.graydon.nl/blog/big-data-revolutie-ten-koste-van-privacy-burgers
Schepers, A. (2015, July 7). The police of the future monitor every citizen non-stop. Accessed October 15, 2018, from https://decorrespondent.nl/3044/de-politie-van-de-toekomst-houdt-elke-burger-non-stop-in-de-gaten/279163005704-5df91b90
Biddle, S. (2018, April 13). Facebook Uses Artificial Intelligence to Predict Your Future Actions for Advertisers, Says Confidential Document. Accessed October 3, 2018, from https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/
PrivacyNews.com. (2018, Nov. 17). License plate registration. Accessed November 20, 2018, from
PCMweb. (2015, Jan. 14). What about Lenovo? 5 questions about the Superfish malware. Accessed December 18, 2018, from https://pcmweb.nl/artikelen/algemeen/hoe-zit-het-nou-met-lenovo-5-vragen-over-de-superfish-malware
NRC Media. (2018, Oct. 15). NRC no longer offers ads via open exchange. Accessed December 18, 2018, from https://www.nrcmedia.nl/nieuws/nrc-biedt-geen-advertenties-meer-aan-via-open-exchange/
McMullan, T. (2017, Feb. 21). What does the panopticon mean in the age of digital surveillance? Accessed December 1, 2018, from https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremy-bentham
Young, A. L., & Quan-Haase, A. (2013). Privacy protection strategies on Facebook. Information, Communication & Society, 16(4), 479-500. http://dx.doi.org/10.1080/1369118X.2013.777757
NRC. (2017, March 13). China will soon assess all citizens based on big data. Accessed October 16, 2018, from https://www.nrc.nl/nieuws/2017/03/13/big-brother-meets-big-data-7073575-a1550087
DDMA. (2016). DDMA Privacy Monitor 2016. Retrieved from https://ddma.nl/download/54652/
DDMA. (2018). DDMA Privacy Monitor 2018. Retrieved from https://ddma.nl/download/66878/
Frontline (PBS). The Facebook Dilemma [Documentary]. Retrieved from https://www.pbs.org/wgbh/frontline/film/facebook-dilemma/