Research Project

The Medium is the Messenger

Anne Dippel in Collaboration with Martin Warnke

The Medium is the Messenger brings together media-theoretical arguments with approaches from anthropology and science and technology studies. Based on ethnographic fieldwork among a research group at the Jülich Supercomputing Center, it investigates the propinquity of computer simulations to the construction of theories. The first part of this essay introduces the framework of our media-anthropological research on the event-based computer simulation of the famous quantum-mechanical double-slit experiment, which has always stood at the center of epistemological considerations. Its description in this case takes a turn that has much to do with computers and, in particular, with messengers, and through which we can observe that media are anything but neutral entities. Taking the example of an event-based simulation of quantum mechanics, our qualitative fieldwork investigates how the simulation of a physical experiment in silico allows a parallel description that challenges established theoretical explanations of wave-particle dualism. We thus situate the ethnographic case study within the broader field of physics, exploring once more the scope of Ludwik Fleck’s concept of thought style in order to understand the contemporary field of physics with its diverse approaches to investigating and understanding nature.

The ethnography enters the observed research field via a thick description. Here we expound the fundamental convulsion of physical “cosmology” that was first made manifest and present through the empirical findings of early nuclear physics and was then cast in the theoretical form of quantum mechanics. By cosmology we do not mean the eponymous branch of physics concerned with the origin, development, and end of the universe; we mean the anthropologically construed notions of individuals, communities, and societies that strive to apprehend the fundamental constitution of the world in a symbolic and hierarchical fashion. In relation to this we explicate the special uses to which computer simulations are put in our research field and the conception of the medial qualities of the computer that prevails there.

Traditionally, the laws of nature are expressed by differential equations describing temporal changes of physical quantities. In quantum physics, however, we face the puzzling coexistence of discontinuities, such as the quantum leap, with continuous phenomena, such as interference patterns. Theory deals with this by again using differential equations, yielding a continuous wave function that is then interpreted as the probability for discrete events such as the click of a detector. The annoying consequence of this approach is that the event of the measurement does not appear in the theory; only the unobservable smooth behaviour in between does. This yields severe epistemic problems, such as wave-particle dualism or cats that are dead and alive simultaneously.
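To make the formalism alluded to above concrete, the standard textbook formulation (not specific to the Jülich case) pairs the continuous Schrödinger equation with the Born rule, which reinterprets the smooth wave function as a probability density for discrete detection events:

i\hbar \frac{\partial}{\partial t}\,\psi(x,t) = \hat{H}\,\psi(x,t), \qquad p(x,t) = |\psi(x,t)|^{2}

Everything the equation itself governs is continuous; the discrete “click” enters only through the probabilistic reading of |\psi|^{2}.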

Our research shows how a new methodology appears that turns physics upside down: simulating single events in a computer does not suffer from those epistemic problems but gives a realistic description with even higher accuracy. It removes the problematic division of the world into a classical and a quantum version, along with many other aporetic consequences of "old" quantum theory. But it infringes on traditional theoretical foundations such as the so-called First Principles, for example the Schrödinger equation, from which all phenomena could be derived. This differential equation is replaced by rules implemented as simulation software. A universal, consistent, hierarchical system of idealisations is replaced by a data-driven algorithmic procedure.
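As a purely illustrative sketch of what "simulating single events" can mean, and emphatically not the event-based algorithm used in the field we studied (which generates the pattern from local rules without invoking a wave function), the following Python fragment accumulates individual detector clicks whose positions are drawn from the conventional two-slit intensity. The familiar fringes emerge only as the statistics of many discrete events; all parameter values are assumptions chosen for illustration.

import numpy as np

# Illustrative sketch only: single detection "events" accumulated on a screen.
# NOTE: positions are sampled here from the standard wave-based two-slit
# intensity (an assumption for illustration); the event-based simulations
# discussed in the text derive the pattern without a wave function.

rng = np.random.default_rng(seed=0)

wavelength = 5e-7        # 500 nm (assumed value)
slit_separation = 1e-4   # 0.1 mm (assumed value)
screen_distance = 1.0    # 1 m (assumed value)

def intensity(x):
    """Standard two-slit interference intensity at screen position x (0..1)."""
    phase = np.pi * slit_separation * x / (wavelength * screen_distance)
    return np.cos(phase) ** 2

def detect_events(n_events, x_max=0.02):
    """Draw single detection events by rejection sampling from the intensity."""
    events = []
    while len(events) < n_events:
        x = rng.uniform(-x_max, x_max)
        if rng.uniform() < intensity(x):
            events.append(x)   # one "click" of the detector
    return np.array(events)

if __name__ == "__main__":
    clicks = detect_events(10_000)
    counts, edges = np.histogram(clicks, bins=100)
    # The histogram of discrete clicks approximates the continuous fringes.
    print(counts)

The point of the sketch is only the shift in perspective it makes visible: the "data" are the individual clicks, and the continuous interference pattern is a derived, statistical object.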

Computer simulations appear, in the end, as a new type of theory, adding to the existing approaches. We reflect on this phenomenon as both a crisis and an enrichment of current physics culture in particular and of current epistemologies in general, producing an excess of rationality and explanatory power on the one hand, and unveiling, on the other, how fashion and fringe are made on the basis of technological factualities.

In the end we ask whether and how we can reassure ourselves about truthfulness without measurement as the confirmation of universal First Principles. Is a certain kind of “Turing Test” the solution? If humans are unable to distinguish computer output from laboratory data, does trust in the truth-telling capacities of simulations grow? Or do simulations appear as dangerous (un)black-boxes, willing tools of their commanders?