Fellows Wintersemester 2019/20

Anne Dippel

The Medium is the Messenger

The Medium is the Messenger brings together media-theoretical arguments with approaches from anthropology and science and technology studies. Based on ethnographic fieldwork among a research group at the Jülich Supercomputing Center, it investigates the propinquity of computer simulations to the construction of theories. The first part of this essay introduces the framework of our media-anthropological research on the event-based computer simulation of the famous quantum-mechanical double-slit experiment, which has always been at the center of epistemological considerations. Its description in this case takes a turn that has much to do with computers and particularly with messengers, and in which we can observe that media are anything but neutral entities. Taking the example of an event-based simulation of quantum mechanics, our qualitative fieldwork investigates how the simulation of a physical experiment in silico allows a parallel description that challenges established theoretical explanations of wave-particle dualism. Thus, we situate the ethnographic case study within the broader field of physics, exploring once more the scope of Ludwik Fleck’s concept of thought style in order to understand the contemporary field of physics with its diverse approaches to investigating and understanding nature.

The ethnography enters the observed research field via a thick description. Here we expound the fundamental convulsion of physical “cosmology” that was first made manifest and present through the empirical findings of early nuclear physics and was then cast in the theoretical form of quantum mechanics. By cosmology we do not mean the eponymous branch of physics concerned with the origin, development, and end of the universe; we mean the anthropologically construed notions of individuals, communities, and societies that strive to apprehend the fundamental constitution of the world in a symbolic and hierarchical fashion. In relation to this, we explicate the special uses to which computer simulations are put in our research field and the conception of the medial qualities of the computer that prevails there.

Traditionally, the laws of nature are expressed by differential equations describing temporal changes of physical quantities. In quantum physics, however, we face the puzzling co-existence of discontinuities, such as the quantum leap, and continuous phenomena, such as interference patterns. Theory deals with this by again using differential equations, yielding a continuous wave function, and then interpreting it as the probability for discrete events such as the click of a detector. The annoying consequence of this approach is that the event of the measurement does not appear in the theory; only the unobservable smooth behaviour in between does. This yields severe epistemic problems, such as wave-particle dualism or cats that are dead and alive simultaneously.
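
In standard notation (stated here only as an illustration of the textbook formalism the essay refers to), the continuous evolution and the discrete detection event are joined by the Schrödinger equation and the Born rule:

    % continuous, deterministic evolution of the wave function
    i\hbar \,\frac{\partial}{\partial t}\Psi(x,t) = \hat{H}\,\Psi(x,t)

    % Born rule: the smooth wave function only yields a probability density
    % for the discrete event of a detector click at position x
    P(x,t) = |\Psi(x,t)|^{2}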

Our research shows how a new methodology appears that turns physics upside down: simulating single events in a computer does not suffer from those epistemic problems but gives a realistic description with an even higher accuracy. It removes the problematic division of the world into a classical and a quantum version, along with many other aporetic consequences of "old" quantum theory. But it infringes on traditional theoretical foundations such as the so-called First Principles, for instance the Schrödinger equation from which all phenomena could be derived. This differential equation is replaced by rules implemented as simulation software. A universal, consistent, hierarchical system of idealisations is replaced by a data-driven algorithmic procedure.
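
To make the contrast concrete, the following is a minimal toy sketch of such a rule-based, event-by-event procedure. It is an illustration loosely inspired by published event-based approaches, not the Jülich group's software; all parameters and the simple adaptive update rule are assumptions. Each "messenger" is processed individually, produces at most one discrete detector click, and no wave function is ever computed; an interference-like pattern only builds up statistically over many events.

    # Toy event-by-event "double-slit" sketch (illustration only):
    # each messenger is a single discrete event; no wave function is computed.
    import math
    import random

    WAVELENGTH = 670e-9   # assumed light wavelength in metres
    SLIT_SEP   = 5e-4     # assumed slit separation
    SCREEN_D   = 1.0      # assumed slit-to-screen distance
    N_PIXELS   = 101      # detector pixels across the screen
    N_EVENTS   = 200_000  # number of individual messengers
    GAMMA      = 0.99     # memory parameter of each adaptive pixel

    pixel_x = [(k - N_PIXELS // 2) * 5e-5 for k in range(N_PIXELS)]
    memory  = [[0.0, 0.0] for _ in range(N_PIXELS)]  # internal state per pixel
    clicks  = [0] * N_PIXELS

    for _ in range(N_EVENTS):
        slit  = random.choice((-0.5, 0.5)) * SLIT_SEP    # one slit per event
        k     = random.randrange(N_PIXELS)               # one pixel per event
        path  = math.hypot(SCREEN_D, pixel_x[k] - slit)  # path length slit -> pixel
        phase = 2 * math.pi * (path / WAVELENGTH)        # phase carried by the messenger
        # the pixel updates its internal memory with the incoming message ...
        memory[k][0] = GAMMA * memory[k][0] + (1 - GAMMA) * math.cos(phase)
        memory[k][1] = GAMMA * memory[k][1] + (1 - GAMMA) * math.sin(phase)
        # ... and either registers a single click for this event or does not
        if memory[k][0] ** 2 + memory[k][1] ** 2 > random.random():
            clicks[k] += 1

    for k in range(0, N_PIXELS, 10):                     # crude text "histogram"
        print(f"{pixel_x[k] * 1e3:7.2f} mm  {'#' * (clicks[k] // 50)}")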

Computer simulations appear, in the end, as new types of theory, adding to the existing approaches. We reflect on this phenomenon as both a crisis and an enrichment of current physics culture in particular and of current epistemologies in general, producing an excess of rationality and explanatory power on the one hand, and unveiling on the other hand how fashion and fringe are made on the basis of technological factualities.

In the end we ask whether and how we can reassure ourselves about truthfulness without the measurement as confirmation of universal First Principles. Is a certain kind of “Turing Test” the solution? If humans are unable to distinguish computer output from laboratory data, does trust in the truth-telling capacities of simulations grow? Or do they appear as dangerous (un)black-boxes, willing tools of their commanders?

Rupert Gaderer

»The Society if« - Computers, Simulation, Future Knowledge

Engaging with the future draws its appeal from exploring realms of the uncertain, the insecure, and the unknown. It is carried by a hypothetical index of (non-)knowledge that is intrinsically linked to the epistemological conditions of computer simulations and their time regimes. Computer simulations arrange virtual lifeworlds in which scenarios of the possible are represented and tested. The project starts from the observation that computer simulations give rise to places at which the future has already arrived. On this premise, the research project addresses the relationship between future knowledge and simulation in three ways:

First, it deals with computer simulations as media that close down the future, for instance in ›simulatory tactics‹ in social media applications. Here, the coupling of the technological and the social has in recent years increasingly been used to compute future behaviour and has become relevant for the regulation of modern societies of control.

Second, the research project examines computer simulations as media that open up the future: starting from the thesis that a socio-political turn ›back to the future‹ is currently emerging, it considers the role of computer simulations in the context of efforts to rehabilitate utopian thinking in the form of ›real utopias‹.

Third, the project is concerned with computer simulations as narrations of the future: on the one hand, the question arises to what extent computer simulations, as time-based media, specify narrative structures. On the other hand, this concerns media cultures of computer simulation in science fiction. It is striking that literature, film, and video games in particular engage intensively with the mimesis of digital simulations of the future, both by problematizing and anticipating possible developments of algorithmic prediction and by (quite literally) ›playing through‹ alternative worlds.

Project leads: PD Dr. Rupert Gaderer and Prof. Dr. Sebastian Vehlken

Peter Krapp

The Distraction Economy of Models, Simulations, and Games

As Senior Fellow at the federally funded Institute for Advanced Study on Media Cultures of Computer Simulation, or MECS, at Lüneburg University in Germany, I will pursue during my research leave the completion of a current book project on models, games, and simulations, rooted in part in my teaching (in Informatics and in Humanities) on game studies and the history of simulations. The industry study of simulation technologies in gaming covers console games, mobile games, retro games, and virtual worlds; each chapter focuses on a point where quantitative data yield qualitative evaluations: reviews and awards in relation to sales patterns for console game development, in-game trade and governance in online gaming, the role of advertising in mobile games, the pivotal but constrained role of music in retro game aesthetics, and data mining in think tank simulations. Investigating these game industry trends pivots on a critical vocabulary for the history of simulations and virtual worlds. As advances in computer simulation have been turned to the ends of entertainment software, the project investigates historical formations in games, based on a critical framework for modeling and simulation. From war gaming to early digital computing and from flight simulators and radar screens to the affordances of immersive graphic user interfaces and control devices, the modalities of human-computer interaction distinguish simulation after 1945 from broader connotations of modeling. Against this background, one can trace a critical history of computer games as art objects, cultural artifacts, and gateways to alternate realities.

Lasquety-Reyes

Austrian Economics Model

This project attempts to create an agent-based model (ABM) computer simulation of Austrian economics. Austrian economics is a specific school of thought in economics founded by Carl Menger (1840–1921) and developed by eminent representatives such as Ludwig von Mises (1881–1973) and the Nobel laureate Friedrich Hayek (1899–1992). It differs from other economic schools of thought in its emphasis on methodological individualism (a focus on the purposeful actions of individuals), subjectivism (the value of goods is based on the subjective desires of human beings), and viewing the market as a complex and dynamic process. This contrasts with mainstream approaches to economics that employ a set of idealized mathematical equations to represent and predict the behavior of a whole economy.

The design of the Austrian Economics Model (AEM) is inspired by the Sugarscape model of Epstein and Axtell (1996). The Sugarscape model was the first large-scale agent-based model. Starting with agents that move around a grid world and collect “sugar” in order to survive, Epstein and Axtell were able to add ever more complex phenomena such as trade and wealth distribution, war, evolution, and the transmission of diseases. The AEM likewise begins with agents that move around a grid world to collect natural resources. From these natural resources, agents produce more complex goods, trade with each other, and choose different strategies to achieve their economic goals. The purpose of the AEM is not to make predictions about the current economic situation, but to create believable artificial economies that capture important aspects of real-life economic behavior, especially through the lens of the Austrian school.
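
For orientation, here is a minimal toy sketch of the kind of grid world described above. It is a generic Sugarscape-style loop written purely as an assumption for illustration, not the AEM's actual code: agents move to the richest neighboring cell, harvest the resource, pay a metabolic cost, and resources regrow each step.

    # Minimal Sugarscape-style grid world (illustrative sketch only).
    import random

    SIZE, N_AGENTS, STEPS = 20, 30, 50

    sugar  = [[random.randint(0, 4) for _ in range(SIZE)] for _ in range(SIZE)]
    agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE),
               "wealth": 5, "metabolism": 1} for _ in range(N_AGENTS)]

    def neighbours(x, y):
        return [((x + dx) % SIZE, (y + dy) % SIZE)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

    for _ in range(STEPS):
        for a in [a for a in agents if a["wealth"] > 0]:
            # move to the neighboring cell with the most sugar
            a["x"], a["y"] = max(neighbours(a["x"], a["y"]),
                                 key=lambda c: sugar[c[0]][c[1]])
            a["wealth"] += sugar[a["x"]][a["y"]] - a["metabolism"]  # harvest, eat
            sugar[a["x"]][a["y"]] = 0
        # resources slowly grow back, capped at the initial maximum
        for row in sugar:
            for i, v in enumerate(row):
                row[i] = min(v + 1, 4)

    alive = [a for a in agents if a["wealth"] > 0]
    print(f"surviving agents: {len(alive)}, "
          f"mean wealth: {sum(a['wealth'] for a in alive) / max(len(alive), 1):.1f}")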

Since the financial crisis of 2008, there has been increased interest in both ABM and Austrian economics. However, there have only been two previous attempts to simulate Austrian economics: the first by Don Lavoie and the “Agorics Project” at George Mason University in the 1990s, the second by Hendrik Hagedorn in 2015. Our project differs from these previous attempts in its use of a BDI (Belief-Desire-Intention) cognitive architecture (Rao 1995). The minds of agents are structured according to what they know/believe, what they want/desire, and the plans that they employ/intend in order to achieve their desires. I argue that such a cognitive architecture allows us to capture the intrinsic subjectivism of Austrian economics. Among cognitive architectures, BDI is also one of the most successful and popular because it captures how human beings think they think (Norling 2004, Balke and Gilbert 2014). In the AEM, the BDI framework dictates a wide range of actions that an agent can take, from negotiating prices with a seller to creating a brand new product.
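
A minimal sketch of the BDI cycle, stated only as an assumption for illustration (the class name, rules, and priorities below are invented, not the AEM's actual design): the agent updates its beliefs from observations, deliberates over its desires, and commits to an intention, i.e. a plan it then acts on.

    # Hedged sketch of a BDI deliberation cycle (invented names and rules).
    from dataclasses import dataclass, field

    @dataclass
    class BDIAgent:
        beliefs: dict = field(default_factory=dict)   # what the agent believes
        desires: list = field(default_factory=list)   # what it wants, by priority
        intention: str = ""                           # the plan it commits to

        def perceive(self, observation: dict) -> None:
            self.beliefs.update(observation)          # revise beliefs

        def deliberate(self) -> None:
            # commit to the highest-priority desire the beliefs make feasible
            for desire in self.desires:
                if desire == "eat" and self.beliefs.get("food", 0) > 0:
                    self.intention = "consume_food"
                    return
                if desire == "trade" and self.beliefs.get("offers"):
                    self.intention = "negotiate_price"
                    return
            self.intention = "gather_resources"       # fallback plan

        def act(self) -> str:
            return self.intention

    agent = BDIAgent(desires=["eat", "trade"])
    agent.perceive({"food": 0, "offers": ["wood for sugar"]})
    agent.deliberate()
    print(agent.act())    # -> "negotiate_price"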

The AEM is developed using Godot, a free and open-source game engine, while additional data analysis is done with Python. For more information about the project, please visit www.austrianeconomicsmodel.com.

Oliver Leistert

Blockchains. Self-Validation as a Simulation of Trust

With the advent of the first blockchain through bitcoin roughly ten years ago, an innovation in computing appeared that was to make waves: self-validating, decentralized ledgers. Since then, research on consensus protocols in particular has advanced. These protocols lay down the rules by which network nodes "agree" on how the chain of data blocks is continued.
Self-validation means that the result of the validation process is accepted, as synchronously as possible, by all nodes attached to the blockchain network as the next entry in the chronicle of events. This programmed, automatic acceptance of a truth, namely the continuation of the chain, relieves procedures of trust about what is correct elsewhere: booking transactions and the programmed contracts built on top of them (smart contracts) stand for this outsourcing of trust to protocols and cryptography. A simulation of trust is carried out at the level of the machine, one capable of partially replacing the trust that was previously needed at other points of the same process. Or is it?
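
The principle can be illustrated with a deliberately minimal sketch (a simplification assumed for illustration; real consensus protocols such as proof of work are far more involved): every node can recompute the hash links itself, so acceptance of the next entry follows from the code rather than from trust in any counterpart.

    # Minimal hash-chain sketch: "trust" is replaced by a check every node can run.
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        payload = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def append_block(chain: list, transactions: list) -> None:
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"index": len(chain), "prev_hash": prev, "tx": transactions})

    def validate(chain: list) -> bool:
        # self-validation: recompute every link instead of trusting anyone
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    ledger = []
    append_block(ledger, ["Alice pays Bob 5"])
    append_block(ledger, ["Bob pays Carol 2"])
    print(validate(ledger))                    # True
    ledger[0]["tx"] = ["Alice pays Bob 500"]   # tamper with the recorded history
    print(validate(ledger))                    # False: the chain no longer validates
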
Trust is a category of social interaction that is hard to pin down. As a "broad-based phenomenon", it makes civil dealings with one another possible. To regard cryptographic operations externalized to machines as carriers of the trust-building operations that constitute society is therefore a big step. The equation trust = mathematical proof seems enormously reductionist.

Lukas Mairhofer

Ways of observing. Detectors and the epistemic community

Detectors enhance our senses and thus redefine the epistemic community. Mediating between the observers and the observed, they can be accounted for as part of the investigated object as well as part of the investigating subject. I will investigate the dual function of computer simulations for detectors in contemporary physics experiments.

Detectors are linked to computer simulations on at least two levels. On the one hand, computer simulation is part of almost all experiments in current fundamental research: in setting them up, crucial parts of the experiment are often first simulated using finite element methods. Later, the simulated behaviour of the detector becomes part of the data evaluation. In this way, computer simulations have an important epistemological function, adding a distinct element to the process of scientific cognition. The model is implemented by the experimental practice in which we apply it to reality. But at the same time we need to reverse this step and translate the apparatus performing the implementation into a theoretical model.
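
As a generic, hedged illustration of how a simulated detector response can enter the data evaluation (not the pipeline of any specific experiment; the resolution and efficiency values are assumed):

    # Generic sketch: fold a known "true" signal through a simulated detector
    # response (Gaussian energy smearing plus a flat detection efficiency).
    import random
    from typing import Optional

    random.seed(1)
    RESOLUTION = 0.05   # assumed relative energy resolution
    EFFICIENCY = 0.8    # assumed probability that an event is recorded at all

    def simulate_detector(true_energy: float) -> Optional[float]:
        """Return the measured energy, or None if the detector misses the event."""
        if random.random() > EFFICIENCY:
            return None
        return random.gauss(true_energy, RESOLUTION * true_energy)

    true_events = [10.0] * 10_000                      # monochromatic "truth"
    measured = [e for e in map(simulate_detector, true_events) if e is not None]

    # the simulated response is what lets us relate the measured distribution
    # back to the underlying physical quantity during data evaluation
    print(f"recorded {len(measured)} of {len(true_events)} events")
    print(f"mean measured energy: {sum(measured) / len(measured):.2f}")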

On the other hand, simulation is part of the detection process itself: the results of experiments are usually obtained by fitting models to the data, with or without free parameters. In modern particle physics, simulations enter the measurement process on an even more fundamental level. Detecting new particles requires identifying a signal against a background that is orders of magnitude larger than the signal itself. These measurements only become possible because the background is not primarily given as white noise but consists of events that are already understood. Monte Carlo simulations of those events make sense of the noise and allow it to be subtracted.
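
The logic of such a background subtraction can be sketched generically (made-up numbers, no relation to any actual analysis): the measured histogram contains a small signal on top of a large but well-understood background, and an independently simulated background sample is subtracted bin by bin.

    # Sketch of Monte Carlo background subtraction with invented numbers.
    import random

    random.seed(2)
    BINS, LO, HI = 20, 0.0, 10.0

    def fill_histogram(values):
        hist = [0] * BINS
        for v in values:
            if LO <= v < HI:
                hist[int((v - LO) / (HI - LO) * BINS)] += 1
        return hist

    # "measured" data: a huge flat background plus a small peak at 5
    background = [random.uniform(LO, HI) for _ in range(100_000)]
    signal     = [random.gauss(5.0, 0.3) for _ in range(1_000)]
    data_hist  = fill_histogram(background + signal)

    # independent Monte Carlo sample of the already-understood background
    mc_hist = fill_histogram(random.uniform(LO, HI) for _ in range(100_000))

    subtracted = [d - m for d, m in zip(data_hist, mc_hist)]
    peak_bin   = max(range(BINS), key=lambda i: subtracted[i])
    print(f"signal recovered near x = {LO + (peak_bin + 0.5) * (HI - LO) / BINS:.2f}")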

Martin Woesler

Society 5.0 in China

Chinese society is strongly controlled: there is almost total surveillance, big data and algorithmic analysis, almost no privacy or data protection, and disciplinary sanctions. The characteristics of its media go beyond Niklas Luhmann’s Media Epoch 4.0 (1997) and may therefore be categorized as “5.0”. Luhmann described the machine in terms of surface and depth. In China, we are inside the machine; communication moves from men to machines.

The individual is controlled digitally and externally by a Social Credit System (SCS, to be realized largely by 2020). According to Warnke (2019), the SCS can partly be described as a protocol (Galloway 2004): it technically requires adherence to the rules, and non-adherence results in non-participation (a gamification logic). Since the SCS is a creative and flexible combination of different data sources (which are not always available and may contradict each other), it can also partly be described as a platform and a stack (Bratton 2015).

This research project builds on and continues the earlier one from the summer term of 2019, which concentrated on the Chinese Social Credit System. The continuation asks: how has the Social Credit System been implemented by 2020? And, more generally: how far is the individual controllable? Behaviourism answers: largely (Skinner 1974); totalitarian ideologies try to control thoughts by various means (see the “blank sheet” by Mao, 1958). The neoliberal Facebook knows the individual better than it knows itself and manipulates it. The SCS receives positive feedback due to brainwashing, the unfree survey setting, and the happiness of the simple-minded (with outbreaks of critique and violence).

The Chinese individual is educated and guided from preschool until after retirement, with 10 percent of school classes, university courses, and on-the-job training devoted to ideological indoctrination – starting from any leadership position – and reinforced by personal tutors, psychological pressure, and group dynamics. To go to university, one has to serve in the military first; military camps are located close to university campuses.

China is developing a Digital System for Society Management (DSSM, to be realized largely by 2025): algorithms take over decisions, which stands in the Chinese tradition of meritocratic and legalist ideas. The human factor is replaced by learning algorithms: from the rule of men to the rule of law – however, the party always comes first. The Chinese understanding is that the planned economy failed because of the human factor (a mentality of plan fulfilment, sugarcoated figures); China’s is the third attempt to realize socialism through digitalization, after Cybersyn and TRAN failed.

The SCS also contains social components, such as encouraging people to visit their parents and enhancing societal credibility.

While the transaction costs of this society are high (ca. 7% of the annual budget is spent on internal security), it is still economically more successful than (neo)liberal societies. The main resource for the future information economy is data. How far will neoliberalism use the totalitarian data and technologies and thereby support the system? How far will Western societies adapt to the Chinese model?

To control each process in reality, it is copied into a simulated reality. Predictive scenarios include avatars and group reactions. Chinese writers envision a future of endless technological progress.

Mario Schulze

Data Films. On the Digital Reanalysis of Analytical Flow Films

Using a concrete example from fluid dynamics research, I want to pursue, during my fellowship at MECS, the question of what the mathematization and computerization of scientific images promises and what drives it. From the 1920s onwards, Ludwig Prandtl developed cinematographic methods for studying flows in water channels. Within a research project on scientific film at the ZHdK, I am investigating, together with my colleague Sarine Waltenspül, the history of one of Prandtl’s films: from the first attempts at its production, through its use as evidence on Prandtl’s conference travels around the world, to its use as an instructional film under National Socialism and later in the USA in the context of the Space Race with the Soviets (Fig. 1). At MECS, I am concerned with the most recent episodes of this “film biography”. In 2007, a digitized version of the film was re-evaluated by scientists at the DLR (Deutsches Zentrum für Luft- und Raumfahrt, the German Aerospace Center) using particle image velocimetry (PIV), and again in 2019, with our collaboration, using particle tracking velocimetry (PTV) (Figs. 2 & 3).

PIV and PTV are algorithmic methods for visualizing flows and for measuring the direction and velocity of individual particles in fluids; developed since the 1980s, they are standard today. Besides the numerical evaluation, the PIV and PTV analysis of the films highlights in colour the changes in velocity discernible in the film material. The computer-based evaluation of the analog material produces a palimpsest of colour-coded quantitative data on the black-and-white ground of the original analog film. With this palimpsest, Willert and Kompenhans won the Best Movie Award for their evaluation at the Flow Visualization congress in Daegu, Korea.
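
The core algorithmic step of PIV can be sketched in a few lines (a generic illustration with synthetic data, not the DLR software): the displacement of the particle pattern between two frames is read off from the peak of their cross-correlation.

    # Minimal PIV-style sketch: estimate particle displacement between two frames
    # from the peak of their FFT-based cross-correlation (synthetic test data).
    import numpy as np

    rng = np.random.default_rng(0)

    # frame A: random bright "particles" on a dark background
    frame_a = np.zeros((64, 64))
    frame_a[rng.integers(0, 64, 40), rng.integers(0, 64, 40)] = 1.0

    # frame B: the same particle field shifted by a known displacement
    true_shift = (3, 5)                                # (rows, cols)
    frame_b = np.roll(frame_a, true_shift, axis=(0, 1))

    # cross-correlation via FFT; the peak position encodes the displacement
    corr = np.fft.ifft2(np.fft.fft2(frame_a).conj() * np.fft.fft2(frame_b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = tuple((p + 32) % 64 - 32 for p in peak)    # wrap to signed shift

    print("estimated displacement:", shift)            # -> (3, 5)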

An unsteady, that is, time-dependent flow, as it is simulated and visualized analogically in the water channel, is characterized by its singularity and unpredictability. Optical recording promises to capture that singularity and make it repeatable. Algorithmic evaluation, in turn, promises the mathematization and calculability of the singular and thus the possibility of a computer-based simulation of the flow. The evaluation of C1 by Willert and Kompenhans ultimately advocates archiving historical scientific images in the hope of extracting data from them in the future. I want to use my time at MECS to take this episode of the re-evaluation of an analog scientific film as an example for asking how computer-based evaluation, data visualization, and analog as well as digital simulation are connected.

At MECS, I am also organizing, together with Hannah Zindel and Sarine Waltenspül, the workshop “Windkanäle. Wissen, Politik und Ästhetik bewegter Luft” (Wind Tunnels. Knowledge, Politics, and Aesthetics of Moving Air), which will take place on 7 and 8 November, and curating the accompanying exhibition “Filme des Windes” (Films of the Wind) in the Kunstraum at Leuphana.