Members of the Rudolf Peierls Centre for Theoretical Physics host a morning of Theoretical Physics roughly three times a year on a Saturday morning. The mornings consist of three talks pitched to explain an area of our research to an audience familiar with physics at about the second-year undergradu…
Holography explains why black hole horizons have thermodynamic and hydrodynamic properties, and inspires researchers to revisit the foundations and explore the limits of relativistic hydrodynamics. Since the work of Bekenstein, Hawking and others in the early 1970s, it has been known that the laws of black hole mechanics are closely related, if not identical, to the laws of thermodynamics. A natural question to ask, then, is whether this analogy or correspondence extends beyond the equilibrium state. The affirmative answer, given by various authors during the 1980s and 90s, became known as the "black hole membrane paradigm". It was shown that black hole horizons can be viewed as being endowed with fluid-like properties such as viscosity, thermal conductivity and so on, whose values remained mysterious. The development of holography 15-20 years ago clarified many of these issues and led to a quantitative correspondence between the Navier-Stokes and Einstein equations. It became possible to study long-standing problems such as thermalization and turbulence by re-casting them in the dual gravity language. We review these developments, focusing in particular on the issue of the "unreasonable effectiveness" of the hydrodynamic description of strongly interacting quantum systems. Final remarks: Prof Julia Yeomans FRS, Head of the Rudolf Peierls Centre for Theoretical Physics.
Can we apply hydrodynamics to systems with extensively many conservation laws?
What is hydrodynamics, and why does it apply over 20 orders of magnitude in energy and length? Welcome by Prof Julia Yeomans FRS, Head of the Rudolf Peierls Centre for Theoretical Physics. 'Why Hydrodynamics?', presented by Prof Steve Simon.
'Will strings be the theory of everything?', presented by Prof Luis Fernando Alday.
Prof March-Russell explains our latest understanding of black holes, some of the most mysterious objects in the Universe.
A pressing question in our quest to understand the Universe is how to unify quantum mechanics and gravity, the very small and the very large.
Modern table-top experiments can engineer physical systems that are deep in the quantum mechanical regime. These cutting-edge instruments provide new insights into fundamental physics, and a pathway to future devices that will harness the power of quantum mechanics. They typically require complex operations to prepare and control the quantum state, involving time-dependent sequences of magnetic, electric and laser fields. This presents experimental physicists with an overwhelming number of tunable parameters, which may be subject to uncertainty or fluctuations. In this talk, Dr Elliott Bentine will discuss how recent experiments have exploited machine-learning techniques, both to optimize the operation of these devices and to interpret the data they produce.
Professor Andre Lukas will discuss how string theorists have started to use methods from data science - particularly machine learning - to analyse the vast landscape of string data.
Professor Ard Louis gives a basic introduction to deep learning for physicists and addresses a few questions such as: Is the hype around deep learning justified, or are we about to hit some fundamental limitations? In less than ten years, machine learning techniques based on deep neural networks have moved from relative obscurity to centre stage in the AI industry. Large firms such as Google and Facebook are pouring billions into research and development of these new technologies. The use of deep learning in physics is also growing exponentially. Can physics help us understand why deep learning works so well? And conversely: How can deep learning provide new insight into the world around us?
Ian Shipsey gives an update on the department and introduces the next three talks on 'AI in Physics'.
In this talk, Subir Sarkar will explain how deflagration supernovae have been used to infer that the Hubble expansion rate is accelerating, and critically assess whether the acceleration is real and due to 'dark energy'.
In this talk, Philipp Podsiadlowski will explain how this energy (sometimes) creates a visible fireball, before going on to explain the role of supernovae in the production of the heaviest elements in the periodic table.
In this talk, James Binney will outline the physics that leads to the prodigious release of energy in core-collapse and deflagration supernovae.
To study the Higgs boson at the LHC we also need to understand how highly energetic quarks and gluons interact, among themselves and with the Higgs. These interactions are described by quantum field theory, a beautiful mathematical framework that combines quantum mechanics with Einstein's theory of special relativity. In recent years, our understanding of quantum field theory has progressed significantly, allowing us to develop a new generation of accurate theoretical predictions for key LHC reactions. In this talk, I will highlight some of the ideas behind this progress, and illustrate how they are being applied to investigate the Higgs sector at the LHC.
We learn about the Higgs Boson and its interactions at the LHC by examining the debris produced by colliding protons head-on at unprecedented high energies. However, we know from our theory of strong interactions - quantum chromodynamics (QCD) - that protons themselves are highly complex bound states of more fundamental 'quarks', held together by the force carriers of QCD, the 'gluons'. The question is then: how do we go from the collision of these complicated protons to a theoretical prediction that we can use to test the properties of the Higgs boson itself? In this talk, I will discuss what we know about the proton, and how we apply this to LHC collisions and our understanding of the Higgs sector.
Over the past two years, CERN's Large Hadron Collider (LHC) has started to directly probe a qualitatively new class of interactions, associated with the Higgs boson. These interactions, called Yukawa interactions, are unlike any other interaction that we have probed at the quantum level before. In particular, unlike the electromagnetic, weak and strong forces, they have an interaction strength that does not come in multiples of some underlying unit charge. Yukawa interactions are believed to be of fundamental importance: they are hypothesised, for example, to be responsible for the stability of the proton, and hence for the universe and life as we know it.
The coding theorem from algorithmic information theory (AIT) - which should be much more widely taught in Physics! - suggests that many processes in nature may be highly biased towards simple outputs. Here simple means highly compressible, or more formally, outputs with relatively lower Kolmogorov complexity. I will explore applications to biological evolution, where the coding theorem implies an exponential bias towards outcomes with higher symmetry, and to deep learning neural networks, where the coding theorem predicts an Occam's-razor-like bias that may explain why these highly overparameterised systems work so well.
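A toy illustration of this bias (my own sketch, not from the abstract): Kolmogorov complexity is uncomputable, but the length of a string after compression with a standard compressor such as zlib gives a computable upper bound, and it cleanly separates regular (high-symmetry) outputs from random-looking ones.

```python
import random
import zlib

def complexity(s: str) -> int:
    """Computable upper bound on Kolmogorov complexity: zlib-compressed length."""
    return len(zlib.compress(s.encode()))

simple = "01" * 500                                      # highly regular 1000-bit string
rng = random.Random(0)                                   # fixed seed for reproducibility
messy = "".join(rng.choice("01") for _ in range(1000))   # incompressible-looking string

print(complexity(simple), complexity(messy))
```

The regular string compresses to a far shorter description than the random-looking one of the same length, which is the sense in which a compressible output counts as 'simple'.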
Active systems, from cells and bacteria to flocks of birds, harvest chemical energy which they use to move and to control the complex processes needed for life. A goal of biophysicists is to construct new physical theories to understand these living systems, which operate far from equilibrium. Topological defects are key to the behaviour of certain dense active systems and, surprisingly, there is increasing evidence that they may play a role in the biological functioning of bacterial and epithelial cells.
Ian Shipsey delivers the welcome speech for the Saturday Mornings of Theoretical Physics.
Siddharth Parameswaran, Associate Professor, Physics Department. The usual picture of entropy in statistical mechanics is that it quantifies our degree of ignorance about a system. Recent advances in cooling and trapping atoms allow the preparation of quantum systems with many interacting particles isolated from any external environment. Textbook discussions of entropy, which invoke the presence of a "large" environment that brings the system to thermal equilibrium at a fixed temperature, cannot apply to such systems. Sid Parameswaran will explain how the "entropy" of subsystems of such isolated quantum systems arises from quantum entanglement between different parts of the system, and how their approach to thermal equilibrium is best described as the 'scrambling' of quantum information as it is transferred to non-local degrees of freedom.
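A small numerical experiment (my illustration, with arbitrarily chosen sizes) makes this concrete: the reduced density matrix of a small subsystem of a random pure state is very nearly maximally mixed, so its entanglement entropy sits close to the maximal (infinite-temperature) value, even though the full system is in a definite pure state.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 4                      # 10 qubits in total, subsystem A holds 4 of them
dim_a, dim_b = 2**m, 2**(n - m)

# A random pure state of the full system
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

# Schmidt decomposition across the A|B cut gives the entanglement spectrum
s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
p = s**2                          # eigenvalues of the reduced density matrix of A
entropy = -np.sum(p * np.log(p))

print(entropy, m * np.log(2))     # very close to the maximal value 4 ln 2
```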
John Chalker, Head of Theoretical Physics, gives a talk on entropy. Thermodynamics and statistical mechanics give us two alternative ways of thinking about entropy: in terms of heat flow, or in terms of the number of micro-states available to a system. John Chalker will describe a physical setting to illustrate each of these. By applying thermodynamics in a realm far beyond its origins, we can use the notion of an ideal heat engine to find the temperature of a black hole. And by applying combinatorial mathematics to hydrogen bonding, we can find the entropy of ice.
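Both examples fit in a line or two (a sketch of the standard results, not the talk's own derivation). Hawking's temperature for a black hole of mass M follows from treating the horizon with ideal heat-engine thermodynamics, and Pauling's classic counting argument gives the residual entropy of ice: each of the N water molecules has 6 of the 2^4 = 16 proton arrangements compatible with the ice rules, so

```latex
T_{\mathrm{BH}} = \frac{\hbar c^{3}}{8\pi G M k_{B}},
\qquad
W \approx \left(\tfrac{3}{2}\right)^{\!N},
\quad
S = k_{B}\ln W \approx N k_{B}\ln\tfrac{3}{2}
  \approx 3.4\ \mathrm{J\,mol^{-1}\,K^{-1}} \text{ per mole},
```

in good agreement with the measured residual entropy of ice.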
Alexander Schekochihin, Professor of Theoretical Physics, gives a talk on entropy. When dealing with physical systems that contain many degrees of freedom, a researcher's most consequential realisation is of the enormous amount of detailed information about them that she does not have, and has no hope of obtaining. It turns out that this vast ignorance is not a curse but a blessing: by admitting ignorance and constructing a systematic way of making fair predictions about the system that rely only on the information that one has and on nothing else, one can get surprisingly far in describing the natural world. In an approach anticipated by Boltzmann and Gibbs and given mathematical foundation by Shannon, entropy emerges as a mathematical measure of our uncertainty about large systems and, paradoxically, a way to describe their likely behaviour, and even, some argue, the ultimate fate of the Universe. Alex Schekochihin will admit ignorance and attempt to impart some knowledge.
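Shannon's measure can be stated in a few lines of code (an illustration of the standard definition, not material from the talk): among all probability assignments over n outcomes, entropy is maximised by the uniform distribution, the one that encodes nothing beyond the number of possibilities.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum p ln p (in nats), ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

uniform = np.full(6, 1 / 6)                            # a fair die: total ignorance
loaded = np.array([0.5, 0.1, 0.1, 0.1, 0.1, 0.1])      # partial knowledge

print(shannon_entropy(uniform))   # ln 6, the maximum possible for 6 outcomes
print(shannon_entropy(loaded))    # strictly lower: information reduces entropy
```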
This talk reviews the developments in quantum information processing.
This talk reviews testing and developing ideas in quantum computing using laser-manipulated trapped ions.
This talk explains how qubits are used to represent numbers in a way that permits 'quantum-mechanical parallel' computing. We show how this can be used to achieve fast factorisation of large numbers, and hence the breaking of current codes. We end by explaining how entangled pairs of particles can be used to provide an alternative and entirely secure cryptographic system.
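The number-theoretic skeleton of the fast-factorisation argument can be sketched classically (a sketch only; the point of the quantum computer is to perform the order-finding step exponentially faster than the brute-force loop below):

```python
from math import gcd

def factor_via_order(N: int, a: int):
    """Classical sketch of the core of Shor's algorithm: find the order r of
    a modulo N by brute force, then read factors off gcd(a^(r/2) +- 1, N).
    A quantum computer replaces only the (exponentially slow) order-finding."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2:                      # odd order: this choice of a fails, pick another
        return None
    h = pow(a, r // 2, N)
    return gcd(h - 1, N), gcd(h + 1, N)

print(factor_via_order(15, 7))     # recovers the factors 3 and 5
```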
This talk reviews how to deal with quantum systems that are coupled to the outside world, as in reality all systems are. We first introduce density operators and explain how quantum states give rise to them. We then turn to measures of entanglement that can be computed from a density operator, and show that entanglement grows with time. Finally, we show how the interaction with the environment gives rise to the phenomenon of decoherence.
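A concrete example (my own, using a Bell pair): tracing out one half of a maximally entangled two-qubit state leaves a density operator that is maximally mixed, with von Neumann entropy ln 2. This is exactly the sense in which entanglement with an inaccessible environment looks like classical uncertainty.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) of two qubits A and B
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())             # density operator of the pure state

# Partial trace over qubit B: reshape indices to (a, b, a', b') and trace b = b'
rho_a = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

eigs = np.linalg.eigvalsh(rho_a)
entropy = -np.sum(eigs * np.log(eigs))      # von Neumann entanglement entropy

print(rho_a)         # maximally mixed: the identity divided by 2
print(entropy)       # ln 2: one 'ebit' of entanglement
```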
This talk reviews the modern formulation of the basic ideas of quantum mechanics. We start by explaining what quantum amplitudes are, how they lead to the idea of a quantum state and how these states evolve in time. We then discuss what happens when a measurement is made before describing correlated ('entangled') systems. Applying these ideas to two-state systems ('qubits') we point out that the complexity of computing the evolution of an N qubit system grows like exp(N).
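The exp(N) scaling is easy to exhibit directly (an illustrative sketch; the helper `apply_gate_to_qubit` is my own, not from the talk): an N-qubit state needs 2**N complex amplitudes, and even a single layer of one-qubit gates touches all of them.

```python
import numpy as np

def apply_gate_to_qubit(state, gate, k, n):
    """Apply a 2x2 gate to qubit k of an n-qubit state vector of length 2**n."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [k]))   # act on axis k
    psi = np.moveaxis(psi, 0, k)                     # restore qubit ordering
    return psi.reshape(-1)

n = 10
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.zeros(2**n)
state[0] = 1.0                                       # |00...0>

for k in range(n):                                   # one layer of Hadamards:
    state = apply_gate_to_qubit(state, hadamard, k, n)

print(state.size)    # 2**10 = 1024 amplitudes already at N = 10
print(state[0])      # uniform superposition: every amplitude equals 2**(-n/2)
```

Doubling N squares the storage and work, which is why classical simulation of quantum evolution becomes hopeless long before N reaches the sizes of interest.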
Dr Ralph Schoenrich will talk about the chemical-evolution side. Spiral density-wave patterns redistribute stars throughout the entire system, making it impossible to know a star's origin from its kinematics alone. However, stars are more than just points in phase space: every star is labelled with the elemental abundances of the gas cloud from which it was formed. Over the last few years a number of observational campaigns have started to measure these labels for millions of stars in our own Galaxy's disc. Ralph Schoenrich will describe how chemodynamical models are being used to piece together the evolution of our Galactic environment from presolar times to the present.
Dr John Magorrian will talk about the dynamics of galaxy discs. In galaxy discs it is energetically favourable for angular momentum to move outwards and mass to move inwards. This transport is effected by spiral arms, but what causes them? Simple linear response calculations demonstrate that even the smallest perturbation is amplified manyfold, while the differential rotation of the disc means that the response is stretched out into a spiral-like pattern. John Magorrian will introduce the notion of the disc as a resonant cavity, within which spiral density perturbations rattle back and forth.
Prof James Binney FRS will talk about stellar systems: a new state of matter. The long range of gravity means that many concepts from undergraduate statistical mechanics do not apply: energy is not extensive; there is no microcanonical or canonical ensemble. Stars and dark matter particles have long mean free paths, which means that to a very good approximation their motion is determined by the mean-field gravitational potential. James Binney will identify a hierarchy of timescales, explaining how the Boltzmann equation for the full 6N-dimensional many-particle phase-space distribution function can be reduced to an evolutionary equation for a function of a mere 3 variables that is governed by the resonances among the particles' orbital frequencies.
In this talk, Siddharth Parameswaran discusses how a topological approach to 2D systems reveals that they can indeed become superfluid, leading to surprising and beautiful universal results whose implications continue to resonate today. Superfluids spontaneously break a continuous symmetry linked to the conservation of particle number in a many-body system. Standard lore holds that such symmetries must remain unbroken at any temperature above absolute zero in a two-dimensional material, such as a thin sheet or film, apparently precluding superfluidity in such systems.
Stephen Blundell reviews a theory of superconductivity that was developed in Oxford in the 1930s by Fritz London. The idea is that under certain conditions quantum coherent effects can become manifest on a large scale. In superconductivity, this idea can be put to use in applications such as magnetic resonance imaging, in which a living human patient is inserted inside a quantum coherent wave function. He will explain how coherent effects can be measured in real superconductors.
John Chalker discusses how the laws of quantum mechanics lead us from the microscopic world to macroscopic phenomena. The notion that atoms of a given isotope are indistinguishable has profound consequences in the quantum world. For liquids made of identical bosons, indistinguishability forces the particles into a quantum condensate at low temperature, where they all dance in perfect synchrony. Treated gently, such a condensate has no viscosity: once it is set in motion, say around a circular pipe, flow will persist indefinitely (so long as the fluid is kept sufficiently cold!).
John March-Russell gives a talk about gravitational wave signals of stringy physics, a ‘soundscape’ connected to the landscape of string vacua.
Subir Sarkar reviews the detection of the ‘chirp’ signal from a pair of merging massive black holes by the Laser Interferometer Gravitational-Wave Observatory, as well as subsequent experimental developments.
James Binney gives a talk about the mathematics that describes gravitational waves.
Fasten Your Seat Belts: Turbulent Flows in Nature. Turbulence is ubiquitous in nature, and it often causes us headaches both literal and metaphorical. From unpredictable weather to the mixing of milk in our coffee, Michael Barnes will talk about how turbulence arises and our ongoing struggle to control it.
Ramin Golestanian will introduce you to life at low Reynolds number and ask how microorganisms can swim, navigate, and coordinate their activities. You will discover how left-right symmetry is first broken in a developing embryo, and investigate the medically important question of how mucus is shifted in our lungs and what happens when things go wrong.