Algorithmic content creation and neoliberal hyper-capitalism

Writer and artist James Bridle recently posted an article on Medium in which he describes the strange and abusive world of children's YouTube. The article is worth reading in full, and I strongly encourage you to read it first and then come back here.

In short, what Bridle describes is the strange and often sick world of the postdigital condition, what I would call digital capitalism, where digital technologies are used to churn out content in order to get page views, to get ad clicks, to get money. This perhaps complicated-sounding scheme is in fact pervasive on the internet. Because we are not willing, or used, to paying for digital content, creators turn to ad sales. This, of course, is propelled, and made possible in the first place, by the platforms' invention of algorithmic, automated ad sales and the targeting of those ads through keywords. The number of ad views and clicks one has to generate to turn a profit is enormous, so creating quality content in this context is often not profitable.

So, in an age where access to digital technology is effortless, the solution (for some) is first to build machines that click these ads and follow your 'channel.' These bots are made to generate profit, but also to get platforms to notice your content and start promoting it. Even these measures are not adequate; in fact, platforms try, to some extent, to counter the automated ad clicking with another set of algorithms, creating an automated digital battleground. The next step for some content creators is to hand the whole creation process, or most of it, over to algorithms: mashing up digital characters with digital motion libraries and posting the result with a 'word salad' of algorithmically generated keywords so that it is more easily seen and found. This creates a bizarre world of machine-made content that borrows from popular content and remixes it over and over again, to be seen and clicked by people and, to a large extent, by machines.
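To make the economics concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the payout per thousand views, the target income) is an assumption chosen purely for illustration, not an actual platform rate; the point is only that at modest per-video viewership the upload volume required to make a living becomes enormous, which is exactly the pressure that pushes creators toward cheap, automated content.

```python
# Back-of-the-envelope sketch of the ad-revenue arithmetic described above.
# All numbers are illustrative assumptions, not actual platform figures.

ASSUMED_PAYOUT_PER_1000_VIEWS = 1.0     # assumed creator payout in USD per 1000 monetized views
ASSUMED_TARGET_MONTHLY_INCOME = 2000.0  # assumed income the creator is aiming for, in USD

views_needed = ASSUMED_TARGET_MONTHLY_INCOME / ASSUMED_PAYOUT_PER_1000_VIEWS * 1000
print(f"Monetized views needed per month: {views_needed:,.0f}")

def uploads_needed(average_views_per_video: float) -> float:
    """How many uploads it takes to reach the target at a given average viewership."""
    return views_needed / average_views_per_video

for views_per_video in (1_000, 10_000, 100_000):
    print(f"{views_per_video:>7} views/video -> {uploads_needed(views_per_video):,.1f} uploads per month")
```

Under these assumed numbers, a channel averaging a thousand views per video would need thousands of uploads a month, a volume no human production process can sustain without automation.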

As Bridle describes, some of the algorithmically created content is harmless and can even be seen to have some value. What is distressing are the edge cases: the dark, violent, abusive things that arise from the never-ending loop of algorithmically curated, analyzed and created content. As Bridle, along with, for example, Rushkoff (2010; 2013), points out, the internet, and digital technologies in general, have a tendency to amplify the periphery, the on and the off, the 1 and the 0. If we do not carefully think through our algorithms and build some sort of value system into them, they simply amplify whatever has the greatest impact. More often than not, that is something disturbing, something that piques our curiosity and catches our eye. Take Bridle's example of the Peppa Pig mash-ups, where a pirated version of Peppa is placed in a violent and abusive situation. In addition, the algorithmic mash-ups generate violence that happens almost in the background, in the unconscious, creating a disconcerting and stressful experience.
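This amplification dynamic can be sketched as a toy feedback loop. The snippet below is purely illustrative, my own simplification rather than any platform's actual recommender: items are shown in proportion to the engagement they have already received, and whatever grabs the most attention, however disturbing, tends to be recommended more, which earns it still more engagement.

```python
import random

# Toy feedback loop: the recommender has no notion of value, only of engagement.
# Purely illustrative, my own simplification; not any real platform's algorithm.
catalog = {
    "gentle nursery rhyme": 0.02,      # assumed chance a shown item gets clicked
    "branded cartoon episode": 0.05,
    "shocking pirated mash-up": 0.12,  # disturbing content tends to grab attention
}
engagement = {title: 1 for title in catalog}  # seed every item with one click

for _ in range(100_000):
    # Show items in proportion to the engagement they already have.
    titles = list(catalog)
    shown = random.choices(titles, weights=[engagement[t] for t in titles])[0]
    if random.random() < catalog[shown]:
        engagement[shown] += 1

print(engagement)  # the most attention-grabbing item typically ends up dominating the loop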

Children's YouTube videos are just one example of the problems inherent in our digital society. Similar things happen everywhere: just try googling anything, and with a little digging you will find yourself in a garbage ocean of algorithmically generated websites and other content. As a side note, as these algorithms get smarter with machine learning, some of the grossest examples may be eliminated, which might not actually be a good thing if it leaves us with content that is just as abusive but better made. Furthermore, machine learning introduces problems of its own, such as biases in the training data, which so far have turned out to be somewhat racist and misogynist.
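A minimal sketch of how that kind of bias gets inherited: the toy 'model' below does nothing more than repeat the most frequent association in its made-up training data, yet that is already enough to reproduce a gendered skew. The data and the model here are hypothetical and deliberately trivial; real systems are vastly more complex, but the mechanism is the same.

```python
from collections import Counter

# Minimal sketch of how a model inherits bias from its training data.
# The "training data" and the "model" are hypothetical and deliberately trivial.
training_captions = [
    "doctor he", "doctor he", "doctor he", "doctor she",
    "nurse she", "nurse she", "nurse she", "nurse he",
]
counts = Counter(tuple(caption.split()) for caption in training_captions)

def predict_pronoun(occupation: str) -> str:
    """'Predict' whichever pronoun co-occurred with the occupation most often."""
    return max(("he", "she"), key=lambda p: counts[(occupation, p)])

print(predict_pronoun("doctor"))  # -> "he"  (a skew in the data, not a fact about doctors)
print(predict_pronoun("nurse"))   # -> "she"
```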

I think that the problems within digital technologies can be categorized into two categories. (Actually, I think they can be categorized in a plethora of ways, but for the sake of argument I am just going to lump them into two.) The first is the biases inherent in digital technology itself. This refers to the aforementioned tendency of the digital to amplify the extremes. Rushkoff speaks of it as the basis of digitality, the duality of the system: you can only choose yes or no, one or zero (Rushkoff, 2010; 2013; 2016). Even though we can increase the resolution, so that one yes-or-no becomes dozens or millions of them, according to Rushkoff this tendency still accentuates the extremes.

Furthermore, the abstract nature of digitality distances us from reality. The main innovation of digital technology has been said to be the ability to program and reprogram the machine (Ceruzzi, 2012). Instead of building a machine for a specific task, we now have a universal device that can be programmed for almost any task. The problem, in this context, is that the programmed nature of digital technology introduces an abstraction, a meta-layer, which distances us from the task at hand, be it social correspondence or something else. This meta-layer makes digitality harder to grasp: when the digital is ubiquitous, digital technology is no longer something we go to or use, as in the days when we 'went on the internet'. Digitality surrounds us and is with us all the time, albeit abstract and hard to embody. Here, a mere understanding of the workings of the digital might not be enough; we need a way to grasp digitality. In my research I have, together with foresight researcher Dr. Mikko Dufva, proposed the concept of digi-grasping, a concept we can use to assess and talk about digitality in our physical world. I am not going to expand on the notion of digi-grasping here. What I do want to do is draw attention to the notion of embodied digitality: that we have to see that the way we use digital technology affects our physical world, and that the digital is not just cyberspace or virtual reality but a physical reality that affects the whole embodied being.

The second, and arguably (as argued, for example, by Evgeny Morozov (2014; 2015; 2016)) the more critical aspect, is the marriage between digital technology and current neoliberal hyper-capitalism, which seems to accelerate the problematics of the digital. Or, to put it another way: digital technology enables an unprecedented efficiency of capitalism. In short, because we, the collective we, do not own our data, pay nothing for the services, and happily use the ad-backed free services offered to us by a few monolithic platforms, we are in a situation where the digital world is managed by just a few corporations and shaped by their interest in creating profit for their shareholders. This, in turn, can of course be traced back to the absurdity of our current economic system, which has almost no relation to the actual world and is abstract in itself.

One way to start disassembling the problem might be to try to comprehend digitality and reclaim the surrounding world for ourselves. The ability to grasp the post-digital world might lead to more ethical and aesthetic standards for our future. Moreover, I hope we can start to break apart the current economic model and seek alternative ways to distribute wealth. In one sense, the digital itself is not the threat; the way we use it is. However, we also need the awareness to see where digital technology is appropriate and where it is not. Sometimes the best interface might not be digital. And sometimes it is. Furthermore, I feel we desperately need policies for the gathering of massive amounts of data, as well as an appreciation for digital content: in the current landscape we should see that in order to get quality tools or services we need to pay for them, from tax money or our own pockets, or to create and maintain them together. (This also alludes to the fact that even if the digital is abstract, it is also very material, consuming, and in many cases wasting, valuable natural resources. Maintaining a digital service is not resource-free.)


References:


Ceruzzi, P. E. (2012). Computing. MIT Press.

Morozov, E. (2014). To Save Everything, Click Here. PublicAffairs.

Morozov, E. (2015, February 1). Why cities need to fight Uber and give people a real transport choice. Retrieved May 10, 2017, from http://www.theguardian.com/commentisfree/2015/feb/01/cities-need-to-fight-uber-trasnsport-choice-evgeny-morozov

Morozov, E. (2016, December 4). Data populists must seize our information – for the benefit of us all. Retrieved February 1, 2017, from http://www.theguardian.com/commentisfree/2016/dec/04/data-populists-must-seize-information-for-benefit-of-all-evgeny-morozov

Rushkoff, D. (2010). Program Or Be Programmed. OR Books.

Rushkoff, D. (2013). Present Shock. Penguin.

Rushkoff, D. (2016, October). Team Human. Presented at teamhuman.fm.