As Facebook’s Prometheus lay chained to a rock, the eagles picked and tore at his flesh to get at the tasty liver within, and society watched. They watched from their phones and tablets, the displays on their workout equipment and even from their company computers. Hours passed until Zuckerberg, close to death, was rejuvenated and his liver renewed for the hungry congressional birds to partake once again.
Data technology has profoundly impacted humanity. Information and knowledge now sit in the palm of the hand, and yet demand continues to grow. Over the past several decades, societal consumption of data, and of the information that results from it, has grown exponentially, and so too have the challenges inherent in that growth. The internet has ensured that people of all societies, nationalities, and races can collaborate on problems and arrive at answers and solutions faster and more efficiently than ever before. Personal data, like other forms of data, has been shared freely and has fueled an explosion of information that has essentially become a form of public domain. But is this what society intends for its data? Data-sharing technology has furthered innovation and creativity while exposing sensitive personal information, fueling privacy concerns that threaten to stifle future advancement. How does a progressive society limit an expanding infringement on personal privacy while being careful not to stifle creativity, innovation, and advancement?
Historically speaking, data consumption and generation have outpaced the ability to govern that progress, as well as the problems that progress presents. Once the domain of “hackers” intent on bypassing laws and controls for fun, data breaches have evolved into weaponized technological activities used by criminals and state-sponsored actors alike, forces eager to terrorize or cripple infrastructure, to profit monetarily, or both. In far more benign examples, the individual citizen’s personal data is used by companies to increase their bottom line. Breaches such as those seen at Target and Experian point to the vulnerability of large corporations, but more concerning are the data issues associated with the likes of Facebook, which speak to an unwillingness to take personal privacy seriously. Facebook and other social-media platforms are essentially “the acme of communication--better, even, than face-to-face conversations,” and yet they have the prospect of harming many people when data is misused (“The Medium Is the Messengers,” 2016). It is important to note that data and information are different but related. Simply stated, data are raw facts often used to measure or describe something, whereas information is the result of adding context to those facts. For instance, an individual’s Social Security number on its own is just a nine-digit number. One could imagine a nine-digit number and it would be no more sensitive than a name invented for a character in a novel. Add context to that number, however, such as the name and address of its owner, and there now exists a threat. This ability to gather and apply context to data has resulted from great leaps in data technology over several decades. Advances in databases, wireless data transmission, and mobile devices have ushered in a revolution in which access to endless knowledge has led to some unintended outcomes.
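The data-versus-information distinction above can be made concrete in a short sketch. The number, name, and address below are invented purely for illustration; the point is only that the same nine digits become sensitive once identifying context is attached:

```python
# The names and values below are invented; the logic only illustrates
# that raw data plus identifying context yields sensitive information.

raw_datum = "123456789"  # nine digits alone: data, not yet information

# Adding context turns the same digits into sensitive information.
record_with_context = {
    "ssn": raw_datum,
    "name": "Jane Doe",
    "address": "123 Main St, Anytown",
}

def is_sensitive(record):
    """A record becomes sensitive once identifying context joins the raw value."""
    identifying_keys = {"name", "address"}
    return "ssn" in record and bool(identifying_keys & record.keys())

print(is_sensitive({"ssn": raw_datum}))   # False: a bare number describes no one
print(is_sensitive(record_with_context))  # True: number plus identity is a threat
```

The sketch is deliberately naive, but it mirrors the essay’s claim: the threat lives not in the datum itself but in the join between the datum and its context.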
Technology-rich online environments cater to social desires and house a great deal of the personal data required to serve those who frequent them. In fact, “several authors have speculated that we are moving toward a society where digital technology becomes invisible and our physical and virtual realities are integrated,” which will make it even harder to know when data should be protected (Corcoran & Mooney, 2018). Technologies such as machine learning and artificial intelligence draw on vast amounts of sometimes sensitive data to predict behaviors, further blurring the line between physical and virtual realities. As noted in the Harvard Law Review, “The combination of extensive databases and cheap microprocessors has spawned an analytics industry that is changing the ways in which consumer products are marketed and priced” (Strahilevitz, 2013). These technologies can result in companies like Amazon suggesting products based on past purchases, or in social media platforms serving up political ads to influence the outcome of a campaign, as was the case with Facebook. It is therefore important that decisions about how and when this technology is used are made with the moral and philosophical consequences in mind.
The discovery that Facebook had allowed Cambridge Analytica to access and keep vast amounts of consumer data for the purpose of benefitting a U.S. presidential campaign is troubling on many fronts. For one, this was not a breach but a policy failure: Facebook neglected to ensure that the data it had willingly shared with Cambridge Analytica was properly disposed of once the agreement was terminated. What this failure highlights is a societal ignorance of the dangers of leaving personal data unprotected in the interest of moving ever forward, as if in a car where the passengers can “see where they want to go, but little besides” (Brown, Duguid, & Weinberger, 2017). One question that must be asked is: if something can be done, does that mean it should be done? Easy access to data technologies such as artificial intelligence and machine learning raises serious philosophical and moral questions about the pace at which data technology is evolving, and suggests that humanity is unwilling, or perhaps unable, to put the genie back in the bottle, although efforts to do so have certainly been made.
In 2016, for example, the European Union enacted the General Data Protection Regulation (GDPR), in which the rights of consumers were, for the first time, placed above the rights of the companies and organizations that benefit from their data. The fact that consumer privacy was “just a formality, boiling down to try to prepare an information sheet” that left consumers with no other choice “than accept the conditions…set by Data Controllers” meant that consumers had no real protection prior to the regulation (“Connected Cars,” 2019). GDPR and the California Consumer Privacy Act of 2018 (CCPA) are social and political responses to frequent data mishaps that illustrate a blatant disregard for the protection of consumer privacy. In an effort to curb the impact of data breaches and to require companies to demonstrate some semblance of data literacy, the regulations allow consumers to sue large and small organizations alike for the mishandling of personal data. By leveraging governments’ ability to act on behalf of their citizens, the regulations place the power of data ownership back into the hands of the true data owners. There is concern, however, that regulation could stifle the creativity and innovation that free access to data has enabled thus far. When considering the resources required to staff new departments to comply with these regulations, it is easy to imagine the possible impact on a business’s bottom line.
Society must grapple with tough questions such as these as it moves forward to innovate in the data space. Although there have been numerous negative events that highlight the inherent challenges of data sharing, there have also been many beneficial advancements. The key is balancing the ability to share data with the moral obligation of questioning whether personal data should be utilized for the benefit of companies at all.
Having brought a great deal of attention to the problems associated with using personal data to deliver more personalized experiences, Facebook illustrates the relationship between humans and technology. This is a complex relationship, with many examples both good and bad over the course of human history. Examining the complicated role data plays in the development of humanity shows that the use of data and information is not only necessary for progress but fraught with peril. Whether one explores the splitting of the atom or the development of fossil fuels, knowledge is created from information, and information is created by adding context to data. When people created technologies to take advantage of the vast amounts of data being generated, they opened up opportunities to exploit those technologies, and data became the technological fuel that drives insight. One could consider data a commodity worth more than oil, as it is the only resource in a company that is irreplaceable.
Facebook’s use of data to drive better insight and attract more people to its platform led to decisions based on greed that should have been challenged as the company matured. Historically speaking, corporations are not exactly morally responsible when left to their own devices, as the bottom line is often the first priority when making decisions. When examining data technologies such as Facebook’s platform and the algorithms used by companies like Amazon, this means the question of whether an action is morally responsible should always be considered first. Additionally, given the track record of certain corporations, anyone studying technology at the university level should be required to study the ethics of data and information use, including examples of both good and bad outcomes. There is a balance, however, in applying restrictions to technology that people are so closely connected with and rely upon. As Antoine Wright puts it in his article “A Lens of Humanity Through Our Technologies”: “Could it be that mobile devices, connectivity, computing, and [any technology], point to an amplification of humanity — or maybe simply show as some of our best-invented descriptions of what it means to be human?” (Wright, 2017).
While being careful to avoid the mistakes of the past, society must also avoid stifling creativity in the use of data technologies. Recognizing the morally corrupt decisions of history should not come at the cost of regulation and laws that suppress data’s ability to revolutionize the human experience. Regulation like GDPR protects consumers from the deeds of corporations left unchecked, but it could also produce bureaucracy that saps resources and creativity. The answer is personal accountability. Regulation like GDPR needs to be coupled with personal ownership of, and accountability for, data, so that society does not rely solely on government to police its digital activities. In other words, do not just agree to the terms and conditions without knowing what they mean.
Examining data technologies through a humanities lens illustrates just how important the creative use of technologies like Facebook has become in society. While some may argue that these technologies leave people with their faces plastered to their devices, others might argue that the world has become much more connected. As such, the eagerness to simply accept the legalese and move on is palpable. Facebook demonstrates a network example of the digital humanities in that its connection algorithm makes “associations [that] are recorded as pairs of nodes (i.e., people, places, things) and edges (i.e., relationships) and analyzed for patterns of connectedness, distance, and density” (Sula, 2018). The creativity involved in imagining new ways for people to connect, such as through their networks of friends and family, has led to frequent attempts to improve upon preceding offerings by creating new ways to link individuals with similar interests.
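The node-and-edge model Sula describes can be sketched in a few lines. The names and relationships below are invented; the sketch only shows how a social graph supports the measures the quotation names, distance (shortest path between two people) and density (the fraction of possible ties that actually exist):

```python
from collections import deque

# Invented people (nodes) and relationships (edges), per Sula's description.
edges = [("Ana", "Ben"), ("Ben", "Cy"), ("Cy", "Dee"), ("Ana", "Cy")]

# Build an undirected adjacency list from the edge pairs.
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def distance(start, goal):
    """Shortest path length (in edges) between two nodes, via breadth-first search."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for neighbor in graph[node] - seen:
            seen.add(neighbor)
            queue.append((neighbor, d + 1))
    return None  # no path: the two people are in disconnected components

n = len(graph)
density = 2 * len(edges) / (n * (n - 1))  # fraction of possible ties present

print(distance("Ana", "Dee"))  # 2: Ana -> Cy -> Dee
print(round(density, 2))       # 0.67: 4 of the 6 possible ties exist
```

A platform that computes such measures at scale can suggest “people you may know” by surfacing short-distance, not-yet-connected pairs, which is precisely the kind of creative linking described above.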
The outcome of this creativity, however, sometimes works against the technological advances that allowed it to exist in the first place. The original idea was to create a platform that brings people together, but the innovation that followed created a silo effect in which people are connected only to those with whom they most likely align. This becomes an even greater issue given the interactivity of platforms like Facebook. Whereas in the past society simply consumed the information it was presented, current technological advances dictate that “the dirty power of Facebook and Twitter is located in their potential to collapse the boundaries between information and interactivity” (Bury, 2018). What the Facebook data controversy demonstrates is that technological creativity generated only in the interest of growing the bottom line is creativity best left alone.
On the technology front, advancements that have made computers and communication devices ever smaller have enabled an explosion of education and information consumption. People have access to more knowledge today than at any point in history, and yet both the desire for more and the dangers posed are growing as well. When is it socially acceptable to take what belongs to someone else and use it for one’s own gain, regardless of the impact on that person? The love affair with technology that allows faster, more efficient consumption of data and information has undoubtedly come at a cost to culture. While “on the one hand, [people] like and desire the tech industry’s offerings, personally…on the other hand, [people] can’t always see the consequences of those offerings at a social scale” (Bogost, 2019). In the pursuit of making the world community smaller and more accessible, people have created silos of their own making, to the exclusion of those they deem undesirable to their point of view or way of life.
From a personal-data standpoint this has had an interesting effect. People willfully hand over personal data to the large corporations they trust while creating social media islands for themselves and those with whom they align, keeping everyone else out. This is especially true of news media and similar information outlets. There seems to be no middle ground when searching for news these days: one is either on the left or the right, and those who align with neither extreme are simply not invited. Social technologies such as Facebook thrive because people “love it when [they’re] bathed in what things [they] like to click on, and so the machine automatically feeds [them] the stuff that [they] like” (Berners-Lee, n.d.). Unfortunately, this also means that people live in what Tim Berners-Lee calls a “filter bubble” that limits the richness of an open and varied experience and hands power to the organizations that control the personal data generated within that bubble. While social media giants grow richer harvesting personal data, people grow culturally poorer in their ability to try new things or experience diverse ideas. When technology organizations use data to make decisions for the individual, the randomness and openness of human interaction is removed, replaced by a machine that knows them and feeds them only the things they like.
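The feedback loop Berners-Lee describes can be simulated in miniature. The categories and the click-count ranking rule below are invented for illustration; real feed-ranking systems are vastly more complex, but the self-reinforcing dynamic is the same:

```python
from collections import Counter

# A toy model of the "filter bubble" feedback loop: the feed ranks
# content by what the user clicked before, and each click sharpens
# the ranking. Categories and scoring rule are invented.
categories = ["politics-left", "politics-right", "science", "sports"]
clicks = Counter()

def rank_feed():
    """Rank categories by past clicks, ties broken alphabetically."""
    return sorted(categories, key=lambda c: (-clicks[c], c))

# Simulate a user who always clicks the top item the feed serves.
for _ in range(20):
    clicks[rank_feed()[0]] += 1

# After one arbitrary first choice, a single category absorbs every
# subsequent click: the machine "feeds them the stuff that they like."
print(clicks.most_common())
```

The first click is essentially arbitrary (here decided by the alphabetical tie-break), yet it determines everything that follows, which is exactly the loss of randomness and openness the paragraph above laments.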
Society must question, then, who Prometheus is in the data parable described earlier. Zuckerberg undoubtedly felt as though he were Prometheus during the pseudo-inquisition, as he only wanted to do something good for society in bringing people closer together. However, the technology his company created became infused with the power of corporate needs in the pursuit of greater profits. The technology required more data to make its decisions. It had to know more about the people it served, and therefore no thought was given to the tenets of the scientific method, at least not in the purest sense. Where were the questions? The ones that asked whether something should be done, or whether it would cause harm. Undoubtedly research was done, but to what end and to what extent? And the hypothesis? That was simply an obscure signpost on the highway to corporate growth, and humanity’s social interaction became the experiment. The scientific method has been bastardized to fulfill the needs of the greedy. The punishment for the never-ending quest for knowledge and understanding can be cruel, and fate can deliver that punishment with little to no empathy. Thus, society is Prometheus in this scenario, and personal data its fire.
Berners-Lee, T. (n.d.). A Magna Carta for the web. Retrieved from https://www.ted.com/talks/tim_berners_lee_a_magna_carta_for_the_web
Bogost, I. (2019, October 10). Technology Sabotaged Public Safety. Retrieved from The Atlantic: https://www.theatlantic.com/technology/archive/2019/10/how-technology-sabotaged-public-safety/599611/.
Brown, J. S., Duguid, P., & Weinberger, D. (2017). The Social Life of Information : Updated, with a New Preface. Boston: Harvard Business Review Press. Retrieved from https://search-ebscohost-com.ezproxy.snhu.edu/login.aspx?direct=true&db=nlebk&AN=1798528&site=eds-live&scope=site
Bury, R. (2018). Television viewing and fan practice in an era of multiple screens. In J. Burgess, A. Marwick, & T. Poell (Eds.), The SAGE handbook of social media (pp. 372-389). London: SAGE Publications Ltd. doi: 10.4135/9781473984066.n21
Connected cars under the GDPR. (2019). 2019 AEIT International Conference of Electrical and Electronic Technologies for Automotive (AEIT AUTOMOTIVE), 1. https://doi-org.ezproxy.snhu.edu/10.23919/EETA.2019.8804515
Corcoran, P., & Mooney, P. (2018). Digital earth. In B. Warf (Ed.), The SAGE encyclopedia of the internet (pp. 233-234). Thousand Oaks, CA: SAGE Publications, Inc. doi: 10.4135/9781473960367.n72
Strahilevitz, L. J. (2013). Toward a Positive Theory of Privacy Law. Harvard Law Review, 126(7), 2010–2042. Retrieved from https://search-ebscohost-com.ezproxy.snhu.edu/login.aspx?direct=true&db=bsu&AN=87598629&site=eds-live&scope=site
Sula, C. (2018). Digital humanities. In B. Warf (Ed.), The SAGE encyclopedia of the internet (pp. 235-240). Thousand Oaks, CA: SAGE Publications, Inc. doi: 10.4135/9781473960367.n73
The medium is the messengers; anthropology. (2016, Mar 05). The Economist, 418, 75-76. Retrieved from http://ezproxy.snhu.edu/login?qurl=https%3A%2F%2Fsearch.proquest.com%2Fdocview%2F1771720098%3Faccountid%3D3783
Wright, A. R. J. (2017, August 28). A Lens of Humanity Through Our Technologies. Retrieved September 27, 2019, from https://medium.com/the-mission/a-lens-of-humanity-through-our-technologies-a004b61f167a.