Events
Forthcoming events:
Analogue Experimentation (International Workshop)
16-17 July 2018
Programme:
Monday, 16th July
10.00 - 11.00 - Silke Weinfurtner (Nottingham) - TBA
11.00 - 12.00 - Ulrich Schneider (Cambridge) - Ultracold atoms in optical lattice – a quantum simulator for condensed matter physics
12.00 - 13.30 - Lunch
13.30 - 14.30 - Maxime Jacquet (Vienna) - The Hawking effect in dispersive media
14.30 - 15.30 - Dominik Hangleiter (FU Berlin) - Prospectus to a philosophy of analogue quantum simulation.
15.30 - 16.00 - Break
16.00 - 17.00 - Pete Evans (Queensland) - What exactly is quantum emulation? A survey and a proposal
17.00 - 18.00 - Radin Dardashti (Wuppertal) - Putting Analogue Experiments on the Methodological Map
19.00 - Conference Dinner at No.4 Clifton Village
Tuesday, 17th July
10.00 - 11.00 - Jacques Carolan (MIT) - Beating classical computers with photons
11.00 - 12.00 - Anthony Laing (Bristol) - Simulating molecular quantum dynamics with photons
12.00 - 13.30 - Lunch
13.30 - 14.30 - Michael Cuffaro (Western Ontario) - Classical Simulations of Quantum Correlations
14.30 - 15.30 - Lena Zuchowski (Bristol) - Vertical, horizontal and diagonal modelling
15.30 - 16.00 - Break
16.00 - 17.00 - Eric Winsberg (USF) - What unifies simulation? Computer simulation and analog experiment
Abstracts:
*******
Jacques Carolan (MIT)
Beating classical computers with photons
Quantum computers promise to solve certain problems that are forever intractable for classical computers. The first such devices are likely to tackle bespoke problems suited to their particular physical capabilities. Photons in integrated circuits offer unique opportunities for quantum information processing. In this talk we give an overview of the field and introduce a recently developed near-term route towards demonstrating quantum supremacy: boson sampling. We describe experimental efforts and discuss the question of verification.
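As a rough illustration of why boson sampling resists classical computation (a minimal sketch, not taken from the talk; the interferometer, mode labels and helper functions below are hypothetical), each collision-free output probability is given by the permanent of a submatrix of the interferometer unitary, a quantity with no known efficient classical algorithm:

```python
# Illustrative sketch only: brute-force boson-sampling output probability.
# The permanent is computed naively (n! terms), which is exactly why large
# instances are believed to be classically intractable.
import itertools
import numpy as np

def permanent(A):
    """Permanent of a square matrix via an explicit sum over permutations."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def output_probability(U, in_modes, out_modes):
    """Probability of a collision-free outcome: |Perm(U_ST)|^2."""
    sub = U[np.ix_(out_modes, in_modes)]
    return abs(permanent(sub)) ** 2

# Example: 3 photons injected into modes 0, 1, 2 of a random 6-mode interferometer.
rng = np.random.default_rng(seed=1)
X = rng.normal(size=(6, 6)) + 1j * rng.normal(size=(6, 6))
U, _ = np.linalg.qr(X)  # a (roughly Haar-random) unitary
print(output_probability(U, in_modes=[0, 1, 2], out_modes=[1, 3, 5]))
```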
*******
Lena Zuchowski (Bristol)
Vertical, horizontal and diagonal modelling
I will argue that there exists more methodological pluralism in scientific modelling than is currently recognised. In particular, I will explore two kinds of model construction and use that have so far not been given much attention in philosophical debates: 'horizontal' and 'diagonal' modelling. Horizontal and diagonal model construction is not tied to a particular target system as closely as traditional 'vertical' model construction. Nevertheless, I maintain that horizontal and diagonal modelling are widespread in many branches of science and that a comprehensive account of scientific modelling should provide a comparative analysis of all three kinds of modelling.
*******
Radin Dardashti (Wuppertal)
Putting Analogue Experiments on the Methodological Map
Analogue experiments have received a lot of attention among physicists in recent years. Some have even argued for the advantages of an experimental facility for analogue experiments to follow the experiments at the LHC at CERN once they are completed. For that purpose, it is crucial to understand the similarities and differences between analogue experiments and conventional experiments, simulations, computer simulations and even thought experiments. In this talk I aim to contrast analogue experiments with these other scientific methodologies, building upon and complementing previous work by Hangleiter, Carolan and Thébault.
*******
Dominik Hangleiter (FU Berlin)
Prospectus to a philosophy of analogue quantum simulation.
Recent years have witnessed an immense increase in experimental control over individual quantum systems. Along the way, it has become a central goal of modern quantum physics to exploit this control both to increase our understanding of the physics underlying interacting quantum systems and to achieve computational tasks that lie beyond the reach of classical computers. However, given the plethora of uses of the term 'analogue quantum simulation', it is neither clear what exactly a quantum simulation is, nor what its precise epistemic goal is in various contexts. In part, this is because analogue simulation bears similarities to various other types of scientific activity, including experimentation, modelling, and computation.
In this talk, we will address this question by first sketching the notion of an analogue quantum simulator and then situating it on the methodological map of modern science. To this end we will distinguish between simulation and emulation. Drawing on the recent philosophical literature on understanding, we will then argue that while simulation aims at achieving computational tasks and at how-possibly understanding of physical phenomena, emulation aims at how-actually understanding of phenomena. By isolating and assessing the goals of analogue simulation and emulation we hope to establish a concise analytical framework that will serve as a prospectus to a philosophy of analogue quantum simulation. We expect that our framework will be useful both to working scientists and to philosophers of science interested in cutting-edge scientific practice.
*****
Eric Winsberg (USF)
What unifies simulation? Computer simulation and analog experiment.
Computer simulation is often thought of as simply a tool for overcoming the problem of analytically unsolvable theoretical equations. And yet, there is also a natural point of view (one that’s more or less incorporated into the name) according to which computer simulations are a special case of simulation more generally. This is the view according to which computer simulation and analog experiment (or what we often call analog simulation) together have more in common with each other than either one does with ordinary material experiments. This talk will explore the latter point of view and discuss some natural objections.
*****
Maxime Jacquet (Vienna)
The Hawking effect in dispersive media
The Hawking effect of spontaneous emission at the horizon of black holes is one of the outstanding predictions of quantum field theory. However, because of its ultra-low temperature, observing it in the astrophysical context is practically inconceivable. Fortunately, it is possible to create event horizons for waves in media, which makes the observation of this quantum emission feasible. Although the initial proof of this analogy between wave motion on curved spacetimes and the kinematics of waves in media was derived without accounting for dispersion, it has since been realised that dispersion is actually key to enabling the experimental creation of analogue horizons. Here, we show in which regimes of dispersion the Hawking effect may really be observed in an optical-analogue scheme. We consider the limits and epistemology of the analogy to the astrophysical system thus drawn. We propose an ontological shift that acknowledges the necessity for experiments to operate in dispersion regimes in which the analogy cannot be mathematically derived. This will help bridge the theoretical and experimental realms.
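For orientation, the scale of the "ultra-low temperature" mentioned above can be made explicit with the standard Hawking formula (a textbook result, not part of the abstract):

```latex
T_{\mathrm{H}} = \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}}
\;\approx\; 6 \times 10^{-8}\,\mathrm{K}\,\frac{M_{\odot}}{M}
```

For any astrophysical black hole this lies far below the 2.7 K cosmic microwave background, which is why laboratory analogues are pursued instead.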
*****
Anthony Laing (Bristol)
Simulating molecular quantum dynamics with photons
Advances in femtochemistry have demonstrated unprecedented levels of control over molecular dynamics with shaped laser pulses and machine learning algorithms. Quantum states of light for quantum control of molecular dynamics open up new possibilities for molecular engineering. However, modelling these quantum dynamics is intractable for classical computational techniques and remains one of the central challenges of quantum chemistry. In this talk I will discuss a new methodology for analogue simulation of molecular quantum dynamics with photonics, in which molecular vibrational modes are mapped to the optical modes of photonic chips and vibrational excitations are mapped to photons. I will discuss recent experimental demonstrations of simulating dynamics for a diverse range of molecules, exploring energy transport, decoherence, dissipation and anharmonic potentials, as well as a quantum algorithm that searches for states that maximise a dissociation pathway of ammonia.
*****
Ulrich Schneider (Cambridge)
Ultracold atoms in optical lattice – a quantum simulator for condensed matter physics
The physics of interacting many-body systems presents one of the most challenging problems in quantum physics due to the exponential scaling of Hilbert space with particle number. While this scaling underlies the large potential of future quantum computers, its necessary flip side is that exact classical simulations of quantum systems hit a hard wall already at particle numbers as small as 20-40.
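To make this wall concrete (an illustrative back-of-the-envelope estimate, not part of the abstract): a system of N spin-1/2 particles requires storing 2^N complex amplitudes, so

```latex
\dim \mathcal{H} = 2^{N}, \qquad
2^{40} \approx 1.1 \times 10^{12} \ \text{amplitudes}
\;\approx\; 2^{44}\,\text{bytes} \approx 16\,\text{TiB}
```

at double-precision complex storage (16 bytes per amplitude), before any dynamics has even been computed.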
At the same time, many emergent properties of interacting many-body problems, such as magnetism or superconductivity, form central pillars of condensed matter physics and are of high technological relevance. Despite our thorough knowledge of the relevant microscopic laws, namely electromagnetism and quantum mechanics, it often remains unclear which effective interactions and emerging phenomena govern the collective behaviour of a given system. Our theoretical understanding can ultimately only advance by comparisons with experiments, since exact simulations are not possible. Unfortunately, however, real condensed matter systems are to varying degrees plagued by unwanted disorder and other imperfections.
Hence the idea of quantum simulators: use one quantum system to learn something about another. During the last fifteen years, ultracold atoms in optical lattices have emerged as a powerful model system for studying the many-body physics of interacting particles in periodic potentials. They have been established as versatile and powerful quantum simulators for studying collective phenomena, as they provide a flexible and clean test bed in which various important model Hamiltonians from condensed matter physics can be faithfully implemented.
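A canonical example of such a faithfully implementable model (standard in the cold-atom literature, not spelled out in the abstract) is the Bose-Hubbard Hamiltonian realised by bosonic atoms in an optical lattice:

```latex
\hat{H} = -J \sum_{\langle i,j \rangle} \left( \hat{b}^{\dagger}_{i} \hat{b}_{j} + \mathrm{h.c.} \right)
+ \frac{U}{2} \sum_{i} \hat{n}_{i}\left(\hat{n}_{i} - 1\right)
- \mu \sum_{i} \hat{n}_{i}
```

Here the tunnelling amplitude J is set by the lattice depth and the on-site interaction U is experimentally tunable, which is what makes the implementation "clean" in the sense described above.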
*****
Michael Cuffaro (Western Ontario)
Classical Simulations of Quantum Correlations
I argue that the kinds of locally causal models ruled out by a given quantum no-go theorem depend, in part, on the context of inquiry. For implicit in our judgements regarding which locally causal models should be ruled out by a no-go theorem in a given context is a set of additional 'plausibility' constraints associated with that context. This has interesting implications. In particular, in the traditional foundational literature, the 'all-or-nothing' GHZ equality is usually considered a more powerful refutation of local causality than Bell's, for while GHZ can be shown to be violated using a single quantum experiment, the violation of statistical inequalities like Bell's requires repeated experiments to demonstrate. But the situation changes once we leave the foundational context. In the computational context, where we are concerned not with alternatives to quantum theory but with what we are capable of building with the aim of classically reproducing quantum statistics, the GHZ theorem loses its force, while statistical theorems like Bell's do not. In the context of analogue simulation, I will argue that this circumstance is of interest in that it represents a case where the disanalogies between a source system and its target do not have to do primarily with the systems themselves, i.e. with their fundamental physical characteristics, but rather with us and our purposes in carrying out a given investigation.
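For readers who want the contrast in symbols (standard textbook statements, not part of the abstract): Bell-type tests bound a statistical quantity estimated over many runs, whereas GHZ yields a definite prediction in each single run.

```latex
% CHSH (statistical):
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad
|S| \le 2 \ \text{(locally causal)}, \quad |S| \le 2\sqrt{2} \ \text{(quantum)}

% GHZ (all-or-nothing): for |\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}(|000\rangle + |111\rangle),
\langle \sigma_x \sigma_x \sigma_x \rangle = +1 \ \text{(quantum, with certainty in each run)}
```

whereas any local-hidden-variable assignment reproducing the quantum correlations ⟨σ_x σ_y σ_y⟩ = ⟨σ_y σ_x σ_y⟩ = ⟨σ_y σ_y σ_x⟩ = -1 forces the product of the three σ_x outcomes to be -1.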
*****
Pete Evans (Queensland)
What exactly is quantum emulation? A survey and a proposal
A quantum simulation consists of a controllable source quantum system designed to probe dynamical features of a target quantum system otherwise inaccessible to classical simulation. Quantum simulation is a rather broad umbrella term for a range of experimental practices; most notably, quantum simulation comes in both digital and analogue manifestations. The term 'quantum emulation', however, is occasionally employed to describe some subset of simulation practices in a variety of different contexts, and there is very little agreement on the correct use of this term, nor on its explicit differentiation from 'simulation' (if there is any). In a recent unpublished manuscript, Hangleiter, Carolan and Thébault propose that the difference between simulation and emulation is that the former is employed for probing abstract target phenomena and the latter for probing concrete target phenomena. In this talk I survey a range of proposals for distinguishing simulation from emulation (including their relation to the digital/analogue distinction) and assess the proposal of Hangleiter et al. in light of this survey. I then propose my own distinction, which essentially aligns with the proposal of Hangleiter et al. but provides an arguably more palatable justification of the distinction: emulation strikes a balance between freedom and control.
Previous events:
Causal Horizons in Physics
11 January 2017
Dave Sloan (University of Oxford): Through the Big Bang
26 April 2017
Abstract: I will show how the intrinsic definition of observables in relativity through dynamical similarity (known as Shape Dynamics) leads to the continuation of Einstein's equations classically through the big bang singularity in simple cosmological scenarios. By appealing to general principles I argue that this is a generic feature, and that the singularity can be viewed as an artifact of the redundant description imposed by absolute length scales.
I will then lay out some other welcome features of intrinsic relational systems, and discuss the broader questions raised by a theory of physics that is independent of physical dimensions such as mass and length.
Henrique Gomes (Perimeter Institute): New Vistas from the Many-Instant Landscape
11 October 2017
Abstract: Quantum gravity has many conceptual problems. Amongst the most well-known is the "Problem of Time": gravitational observables are global in time, while we would really like to obtain probabilities for processes taking us from an observable at one time to another, later one. Tackling these questions using relationalism will be the preferred strategy in this talk. The relationalist approach leads us to shed much redundant information and enables us to identify a reduced configuration space as the arena on which physics unfolds, a goal still beyond our reach in general relativity. Moreover, basing our ontology on this space has far-reaching consequences. One is that it suggests a natural interpretation of quantum mechanics: a form of 'Many-Worlds' which I have called Many-Instant Bayesianism. Another is that the gravitational reduced configuration space has a rich, highly asymmetric structure which singles out preferred, non-singular and homogeneous initial conditions for a wave-function of the universe; this structure is yet to be explored.
Patricia Palacios (Munich/Salzburg): Stock Market Crashes as Critical Phenomena? Explanation, Idealization and Universality in Econophysics.
18 October 2017
We study the Johansen-Ledoit-Sornette (JLS) model of financial market crashes. On our view, the JLS model is a curious case from the perspective of the recent philosophy of science literature, as it is naturally construed as a "minimal model" in the sense of Batterman and Rice (2014) that nonetheless provides a causal explanation of market crashes, in the sense of Woodward's interventionist account.
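For reference, the JLS model fits log-prices in the run-up to a crash with a log-periodic power law of roughly the following form (the standard expression from the econophysics literature, supplied here for orientation rather than taken from the abstract):

```latex
\ln \mathbb{E}[p(t)] = A + B\,(t_c - t)^{m}
+ C\,(t_c - t)^{m} \cos\!\big(\omega \ln(t_c - t) - \phi\big)
```

where t_c is the critical (crash) time and ω the log-periodic angular frequency.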