Selection of work

Work in progress

Interface(d) ideology

ABSTRACT Currently, much of our being in the world is mediated by computer interfaces. Cast into a society where information technology regulates the majority of our production processes, we resort to touchscreens, keyboards, scanners, etc. for the most basic of actions. Estranged from authentic social life and with online representations lurking on the cellphone in our pocket, we reach out to these same interfaces to get to know both others and ourselves. With this mediation, we experience a world --- one which we in certain cases take for 'the world' --- in an interface. Perhaps the earliest philosophical contemplation of interfaces comes from Heidegger, who classified the typewriter as 'an intermediate thing', not quite a machine, but 'its production [...] conditioned by machine technology' (Heidegger 1992). Indeed, the design of the interface confers a certain reality and reason; the interface presses its own logic onto our experiences and initiates the construction of a particular subject, conditioned into particular modes of production and into a particular understanding of the world around her.

In this essay we pay particular attention to the material presence of the interface. The interface embodies a surface tension between the artificial and the biological, between the logic of the machine expressed in flows of electrons and the logic of the user embodied in human flesh. Over time our interface-actions have shifted: writing became typing, typing became swiping. We examine how different ideologies materialize in interfaces and produce subjects by employing, inter alia, different levels and styles of anthropomorphism, interpellation, knowledge requirements, and user action. The interface establishes a mode of being in which it presents itself as a surface, while it is, in fact, a thing. We are currently at a stage where the design of the interface is predominantly shaped to give the illusion of a window: flat, thin and shiny. The interface is not set up to give the user knowledge of and power over what the machine is and does. Instead, it focuses on giving the illusion of being an 'invisible mediator' by reducing its physical and machine-like presence to a visual opening where we can touch and swipe our way to the other. This immediate appeal as something ready-to-hand conceals the mechanics of the machine, renders us blind to its inner workings and potentiality, and thus strips us of our autonomy towards it. The utopia of an easily usable machine is put to use to veil, under a thin flat surface, the actual dystopia in which the user's feelings and attention are hijacked, commodified, and nudged for the benefit of corporations. With current interfaces that establish a permanent interactive connection to the Web, we are no longer only the spectator; we become spectator and spectacle in one. Especially with the increased use of sensor readings, the interface is assimilating us. The interface does not merely mediate our access to the digital realm beyond; it interfaces us to the digital realm.

This paper is a work in progress and has turned out to be quite a Pandora's box. It is written with Ludo Gorzeman; so far we have presented it at the Theoretical traditions, new technologies conference in Paris in June 2018 and at the hacker conference TBD in Amsterdam in July 2018. We expect to finish the paper in March 2019.

Selection of publications

  • Escaping the Panopticon Over Time: Balancing the Right To Be Forgotten and Freedom of Expression in a Technological Architecture

  • Forgetting Bits and Pieces: An Exploration of the 'Right to Be Forgotten' as Implementation of 'Forgetting' in Online Memory Processes

  • Timing the Right to Be Forgotten: A Study into "Time" as a Factor in Deciding About Retention or Erasure of Data


Escaping the Panopticon Over Time

Balancing the Right To Be Forgotten and Freedom of Expression in a Technological Architecture
paper by P. Korenhof and L. Gorzeman

Paper abstract The 'right to be forgotten' has been labelled censorship and disastrous for the freedom of expression. In this paper, we explain that effecting the 'right to be forgotten' with regard to search results is 'censorship' at the level of information retrieval. We claim, however, that it is the least heavy-handed yet most effective means to achieve the minimum amount of censorship overall, while enabling people to evolve beyond their past opinions. We argue that applying the 'right to be forgotten' to search results is not a question of just 'censoring' search engines, but that, seen from a broader perspective, we - as a society - will inevitably have to deal with developments in information technologies and choose between three types of 'censorship': (1) censorship of original sources, that is, at the level of information storage; (2) censorship at the level of the initial encoding of that information; or (3) censorship at the level of information retrieval. These three levels at which 'censorship' can take place are the three basic elements of the memory process, whether biological, technological, or a hybrid employing mnemonic technologies. Applying censorship as a means of 'forgetting' in the collective hybrid memory of the Web enables us to counter - at least partially - the functioning of the Web as a 'Panopticon over Time'.

Read article

Forgetting Bits and Pieces

An Exploration of the 'Right to Be Forgotten' as Implementation of 'Forgetting' in Online Memory Processes
paper by P. Korenhof

Paper abstract Technology has changed, and is still changing, our internal and external memory processes. The World Wide Web (Web) can function as an external transactive memory and can store and provide access to personal information for a very long time. The "right to be forgotten or erasure" (R2BFE), article 17 of the proposed General Data Protection Regulation, aims at helping individuals to control the availability of online accessible personal information. This paper takes the term "forgetting" in the article's title seriously and reviews the manner in which the R2BFE implements "forgetting" in the transactive memory of the Web. Exploring the concept of forgetting in this context shows that there is a far broader range of options for implementing digital forgetting than the R2BFE offers today. The analysis shows where the R2BFE is insufficient and where, by applying too narrow a notion of forgetting, it risks affecting other interests at stake more than necessary. This paper suggests that the R2BFE could become a more successful implementation of "forgetting" in the online transactive memory if it were to draw more heavily on the mechanisms of human forgetting.


Read article

Timing the Right to Be Forgotten

A Study into "Time" as a Factor in Deciding About Retention or Erasure of Data
paper by P. Korenhof, J. Ausloos, I. Szekely, M. Ambrose, G. Sartor, and R. Leenes

Paper abstract The so-called "Right to Be Forgotten or Erasure" (RTBF), article 17 of the proposed General Data Protection Regulation, provides individuals with a means to oppose the often persistent digital memory of the Web. Because digital information technologies affect the accessibility of information over time, and time plays a fundamental role in biological forgetting, 'time' is a factor that should play a pivotal role in the RTBF. This chapter explores the roles that 'time' plays and could play in decisions regarding the retention or erasure of data. Two roles are identified: (1) 'time' as the marker of a discrete moment at which the grounds for retention no longer hold and 'forgetting' of the data should follow, and (2) 'time' as a factor in the balance of interests, adding or removing weight to the request to 'forget' personal information or to its opposing interests. The chapter elaborates on these two roles from different perspectives and highlights the importance, and the underdeveloped understanding, of the second role.


Read article