How do algorithms work in the episode, and how could they work in a real-life situation like the one in Arkangel? Are they efficient, or does their application create problems?



To talk about Arkangel, we first have to introduce the concept of the algorithm and the problems that stand behind it.

Algorithms

An algorithm is a set of rules established to solve a task, and it is at the basis of any kind of computing (Manovich, 2002). It is the element that makes any software work. As Manovich says in The Language of New Media, algorithms are an example of transcoding (2002). Transcoding is the way in which digital technology translates our “human” culture into a new “digital” culture. An algorithm is therefore an example of this merging of technology into our culture, because it encapsulates the world according to its own logic. Once information has been encoded in digital form, algorithms process it like a digital brain that reasons according to its own logic.
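To make the definition concrete, here is a minimal, purely illustrative sketch in Python of what “a set of rules that solves a task” looks like; the task (finding the highest grade in a list) is invented for the example.

```python
# A minimal example of an algorithm: a fixed set of rules that turns
# an input (a list of grades) into an output (the highest grade).
def highest_grade(grades: list[int]) -> int:
    best = grades[0]
    for grade in grades[1:]:   # rule 1: look at each remaining grade in turn
        if grade > best:       # rule 2: keep whichever is larger
            best = grade
    return best

print(highest_grade([72, 95, 88]))  # 95
```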

In the episode Arkangel, there are many examples of algorithms at work. The parental control system used in the episode runs software that solves specific tasks. For instance, every time Sara, the little girl, sees something considered disturbing, the software blurs her vision so that she cannot make out the image. Moreover, every time Sara’s vital signs change, the software sounds an alarm to inform her mother. All of these actions are handled by an algorithm that receives data, processes the information, and decides what action the software has to take.
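A hedged sketch of how such a rule-based filter might work is shown below. The inputs, the “disturbance score”, and the thresholds are all invented for illustration; the episode never specifies how the system actually decides.

```python
# Illustrative sketch of an Arkangel-style parental-control loop.
# The disturbance score, heart-rate range, and thresholds are assumptions.

DISTURBANCE_THRESHOLD = 0.7        # assumed cutoff for "disturbing" content
NORMAL_HEART_RATE = (60, 110)      # assumed normal range in beats per minute

def process_moment(disturbance_score: float, heart_rate: int) -> dict:
    """Decide, for one moment, whether to blur Sara's vision and alert her mother."""
    actions = {"blur_vision": False, "alert_parent": False}

    # Rule 1: blur anything the content filter scores as disturbing.
    if disturbance_score >= DISTURBANCE_THRESHOLD:
        actions["blur_vision"] = True

    # Rule 2: alert the parent whenever vital signs leave the normal range,
    # with no notion of *why* they changed (fear, exercise, drugs...).
    low, high = NORMAL_HEART_RATE
    if not low <= heart_rate <= high:
        actions["alert_parent"] = True

    return actions

# A frightening scene with a racing heart triggers both rules.
print(process_moment(disturbance_score=0.9, heart_rate=130))
# {'blur_vision': True, 'alert_parent': True}
```

The point of the sketch is that both rules fire mechanically: the code has no way of knowing whether the situation it reacts to actually calls for intervention.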

Algorithms Applied to Reality

What is interesting to understand, however, is the application of algorithms in our reality. Are algorithms always successful in their task? Can we totally rely on this technology, or should we apply it only in specific situations? It is one thing when an algorithm has to solve a purely technical task by processing scientific data; it is quite another when the information it receives can be interpreted in different ways.

As we already said, a computer models any information as a data structure, which algorithms then use to solve a task. However, it is not easy to reduce our reality, made of human interactions, to data. Often, situations cannot be separated into neat categories, and there is rarely a single way of seeing an event. Not everything in real life is black or white, yes or no, 0 or 1. For this reason, it is not easy to let a machine decide among different choices that are part of a broader context.

Black Mirror’s message is, in part, exactly this: the use of technology can be problematic in our reality. The episode offers examples of these problems. One day, when Sara is still a child, she is left at home with her grandfather while her mother is at work. At some point, her grandfather has a heart attack. The parental control system judges the scene too disturbing for Sara and blurs her vision, preventing her from intervening in the situation.

Another example is when Sara smokes marijuana with her friends. The software does not distinguish between a voluntary situation and an involuntary one, so the system sounds the alarm informing her mother that her vital signs have changed.


Unfortunately, these examples are not limited to the episode. In our society, data and algorithms play an important part in the modern workplace. However, this is still a developing technology, and we should never forget the importance of having a human oversee the algorithm’s decisions.

Cathy O’Neil deals with this topic in her work “Weapons of Math Destruction” (2017). She gives the example of a school district where teachers were ranked by an algorithm that established which teachers were the best (O’Neil, 2017). One of these teachers, Sarah Wysocki, found herself with a low score, and when she tried to understand why, she discovered that the reason was a flaw in the algorithm (O’Neil, 2017). Her students had earned high scores at their previous schools because their teachers had inflated their grades, a practice used to make a school appear higher in the rankings (O’Neil, 2017). When these students arrived at Sarah’s school, their grades suddenly dropped (O’Neil, 2017). The algorithm that ranked the teachers concluded that this drop was Sarah’s fault and, therefore, that she had to be a bad teacher (O’Neil, 2017).
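The failure mode O’Neil describes can be reproduced with a toy model. The sketch below is a deliberate simplification: the real value-added models she discusses are far more complex, and the numbers are invented, but the misattribution works the same way.

```python
# Toy "value-added" teacher score: rate a teacher by the average change
# in her students' test scores over one year. (Invented numbers.)
def teacher_score(incoming_scores, outgoing_scores):
    deltas = [after - before for before, after in zip(incoming_scores, outgoing_scores)]
    return sum(deltas) / len(deltas)

# Students arrive with scores inflated by their previous school...
incoming = [92, 95, 90]
# ...and test at their real level after a year with the new teacher.
outgoing = [78, 80, 75]

print(round(teacher_score(incoming, outgoing), 1))  # -14.7
```

The model only sees the drop; it cannot see that the starting scores were inflated, so the blame lands on the new teacher.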

Moreover, there are some tasks that an algorithm can attempt but never solve completely. For example, an algorithm can suggest who might become our friend on Facebook, based on our interests, but that doesn’t mean we will actually become friends with that person in real life. The same reasoning applies to dating apps: an app like Tinder can suggest somebody who fits our tastes, but it doesn’t mean we will like them in reality.
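As a hedged illustration of why such suggestions fall short (this is not Facebook’s or Tinder’s actual algorithm, just a common similarity measure), matching on declared interests might look like the sketch below; a high overlap score says nothing about whether two people will actually get along.

```python
# Illustrative interest-based matching using Jaccard similarity:
# the fraction of interests two people share (0 = none, 1 = identical).
def interest_overlap(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b)

me = {"cinema", "hiking", "jazz"}
candidate = {"cinema", "jazz", "cooking"}

print(interest_overlap(me, candidate))  # 0.5 -> flagged as a promising match
```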

For all these reasons, we can argue that it is not always safe to rely on digital technology, even more so if we consider that there is no legal recourse or official authority to hold a company responsible for the actions of its algorithms (Pew Research Center, 2017). In an age when we allow our attention and our spending to be directed by algorithms, and when we rarely fully understand these systems, we should all be concerned about the vulnerability of choices made by a computer.

Works Cited

Pew Research Center. (2017, February 8). Experts on the Pros and Cons of Algorithms. Retrieved March 10, 2019, from http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/.

Manovich, L. (2002). The Language of New Media. Cambridge, MA: MIT Press (selected excerpts).

O'Neil, C. (2017). Weapons of math destruction: How big data increases inequality and threatens democracy. London: Penguin Books.