JOURNAL 01 — 2019








Who the f*ck are the

“FUTURAE LOKA”?



The Latin word for ‘future’ is ‘Futurae’, and the Sanskrit word for ‘human race’ is ‘Loka’.
     This mark represents the mixed beauty of the heritage of the ‘Futurae Loka’ (a.k.a. Earthlings of mixed-heritage backgrounds).

And these are our stories.

I say stories and not articles because I think it is important for us to share the open-minded lessons that growing up, and moving through life, with the freedom to move between different cultures has given us.
     How the initial feeling that we are somehow lacking, that we don’t quite belong to our heritages, turns out not to be true; instead we have the power to create and forge our own new and inspiring identities.
     With tensions growing between people of all kinds, we should tell of the encouraging benefits of living a multicultural life, something many of us already do, yet fail to realise.




Mark





THE PRIVACY PARADOX AND WHY IT WILL PERSIST



Casually watching the loss of our privacy.
Unknown
by Jonathan Quaade
jonathanquaade.com
10 June 2019

Most people will tell you that they "care" about online privacy. 74 percent of Americans say it is very important to them that they control who can obtain their personal information (Rainie, 2016). I don't believe it. People say they care, but they don't act like they do, and this is a well-documented fact. We have a tendency towards privacy-compromising behaviour online, resulting in a dichotomy between privacy attitudes and actual behaviour (Acquisti, 2004; Barnes, 2006). There is plenty of discussion about regaining and retaining our privacy, but as individuals we do almost nothing. This discrepancy between our stated attitudes and our actual behaviour is called the privacy paradox (Barnes, 2006).

The privacy paradox applies to everything on the internet because every company in the world is collecting and using our personal information. And yes, it's every company. How many emails did you receive when the EU brought in the new GDPR rules? It wasn't just the multinational tech corporations; even the smaller companies whose websites or apps you visited or bought something from a decade ago messaged to tell you, "We're committed to managing and safeguarding the information you give us... Please opt in" (I know that all those emails went directly to the trash, btw). Everything you have ever done, liked and thought about is stored in some air-conditioned storage facility. Yet we continue to watch, scroll, tap, message and browse incessantly, without thought or care.

We consider data collection too narrowly and therefore disregard our privacy. “Who cares if Facebook knows I listened to Childish Gambino or went to Sainsbury’s? They don’t have my password or credit card details.” As long as our obviously sensitive data is ‘safe’, we dismiss the less significant data collection as inconsequential and trivial. Yet every time one of us receives a hyper-specific ad, we bemoan the terrifying reality of companies knowing too much about us. This shows that people are aware that “any personal information can become sensitive information” (TED Talks, 2013) and that all the stand-alone fragments of information are stored, analysed and actively used with astonishing precision. So why don’t we do anything to protect our privacy? I think there are four primary reasons why our behaviours and actions will not change and why the privacy paradox will persist.

First, I strongly believe it is because the collection of our personal information is invisible. We may understand that it happens, but we don’t see what we are handing over. Let's draw a comparison. If your waiter asked you to hand over all your contacts and your phone number, wear a location tracker and provide a transcript of all your online conversations in order to eat at a Wagamama restaurant, there is simply no way you would agree to it. Yet this is exactly what tech companies ask of us in order to use their services, and we don't bat an eye. It is all obscured behind our glossy screens. We are completely disconnected from the disclosure and the consequences of the information exchange, largely a fault of how these services are designed.

Second, there is a problem with how we are asked to allow data collection: it is done through terms of service and privacy policies. These agreements are a joke. They do not help users at all, and they are designed and written in a way that makes reading them impractical at best and extremely unlikely in practice. It would take the average person 201 hours a year to read the privacy policies of all the websites they visit (McDonald and Cranor, 2008). 🤦🏾‍♂️ So the obvious result is that we either don’t read them or simply ignore them. People may also just blindly trust them: if privacy policies were really that bad, wouldn't our governments step in and regulate them? But even when governments do regulate, it is always done in retrospect, and most companies are well aware of the psychological dynamics that prevent us from safeguarding our privacy effectively. “Various, sometimes subtle, factors can be used to activate or suppress privacy concerns, which in turn affect behaviour” (Acquisti, Brandimarte and Loewenstein, 2018). Unless users actually read the agreements, there is no way to guarantee they won't continue to be exploited.

Third, the human tendency to ignore whatever is hard to see, combined with the design surrounding terms of service, means that anyone with enough conviction can collect and exploit people’s information. We are indoctrinated to "tick here to agree" without a second thought. I conducted my own investigation into this phenomenon by asking a hundred individuals to fill out a physical form asking for their age, height, weight, nationality and so on, followed by ten personal questions. The questions were designed to obtain the kind of specific information Facebook can deduce about its users from their data, e.g. “how many times have you travelled in the last year?”, “how many selfies have you taken this month?” or “how many times have you called your parents this week?” Afterwards, each individual had to sign the form, attesting that their answers and personal information could be used in any way possible. Everyone signed it, and only one person read it. It poignantly demonstrated how little we consider the use and loss of our personal information.

Lastly, Acquisti, Brandimarte and Loewenstein (2018) also describe how our will to protect our privacy is overpowered by our deep-seated instinct to share. The internet has become the place to satisfy that instinct instantly. We have a need to be social, to share, to receive validation, and the internet makes it all so easy. The interactions we used to have at a more local level, between family, friends and community, have been replaced by tweets, posts and likes. Online sharing, and the validation it brings, is now so embedded in our culture that it trumps any fear of data theft, loss of privacy and misuse, so we happily hand over our information blindly, without consideration for the consequences. Tech companies exploit this, and all other institutions today benefit from it. Our banks, our travel agencies, our fashion companies all benefit from access to the massive reservoir of everyone's personal information. Our world has become a place of total visibility, transparency and accessibility, but because people benefit massively from it, privacy concerns are disregarded.

We are disconnected from the disclosure and the consequences of information exchange, the services we use are obscured, and we continue to share endlessly. We have unconsciously constructed a phenomenon in which the world’s hottest commodity, humans, with all their information and their identity, is publicly on display, to be viewed, scrutinised and bought or obtained as easily as getting milk from the supermarket. Just as the privacy paradox states, we express a desire to protect our personal information, but that desire bears no resemblance to the world we are creating. It should be in our best interest to change the status quo and regain some control. Maybe we could simply change human behaviour, but that is unlikely ever to happen. The system itself is broken, and keeps breaking, with huge new privacy controversies appearing so regularly that it feels as though, if our behaviour were ever going to change, it would have changed by now.

There is plenty of work to do, so we had better get started. But first I'm just going to buy this thing that Instagram showed me an ad for. I'll be back... as soon as I actually start to care about my privacy.



Bibliography [in order of appearance]

Rainie, L. (2016). The state of privacy in post-Snowden America. [online] Pew Research Center. Available at: http://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/ [Accessed 7 May 2018].

Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate gratification. In: Proceedings of the 5th ACM Conference on Electronic Commerce (EC '04), [online] pp. 21-29. Available at: https://dl.acm.org/citation.cfm?id=988777 [Accessed 8 Sep. 2018].

Barnes, S. (2006). A privacy paradox: Social networking in the United States. First Monday, [online] 11(9). Available at: http://firstmonday.org/article/view/1394/1312. [Accessed 3 Aug. 2018].

TED Talks (2013). Alessandro Acquisti: Why privacy matters. [video] Available at: https://www.ted.com/talks/alessandro_acquisti_why_privacy_matters [Accessed 12 May 2018].

McDonald, A. and Cranor, L. (2008). The Cost of Reading Privacy Policies. I/S: A Journal of Law and Policy for the Information Society, (2008 Privacy Year in Review issue), p.19.

Acquisti, A., Brandimarte, L. and Loewenstein, G. (2018). Privacy and human behavior in the age of information. Science, [online] 347(6221), pp. 509-514. Available at: https://pdfs.semanticscholar.org/a7c5/f9f224a556cbb7f35028094ac91739ecee8b.pdf?_ga=2.207081043.1544888707.1524409344-836998413.1522099863 [Accessed 12 May 2018].





