Tag archive: Quantum mechanics

99 years of Black Holes: from Astronomy to Quantum Gravity

This meeting will bring together the black-hole community (from quantum to supermassive) as well as the gravitational-wave community. The main topics of the conference will cluster around black holes, gravitational waves, and the future of general relativity and quantum physics.
Continue reading 99 years of Black Holes: from Astronomy to Quantum Gravity

Validating the uncertainty principle

CREDIT: P. Busch/York

Nearly 90 years after Werner Heisenberg pioneered his uncertainty principle, a group of researchers from three countries has provided substantial new insight into this fundamental tenet of quantum physics with the first rigorous formulation supporting the uncertainty principle as Heisenberg envisioned it.
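
For context, the relation most textbooks attach to Heisenberg's name is the preparation uncertainty relation (due to Kennard and Robertson), which bounds the spreads of position and momentum in any quantum state:

\[ \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}. \]

What the "Measurement uncertainty relations" paper linked below addresses is a different, long-debated statement: a rigorous bound on the error and disturbance of joint approximate measurements, which is closer to the thought experiments Heisenberg originally described than the preparation relation above.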

More at AIP: Proving Uncertainty: New Insight into Old Problem

arXiv: Measurement uncertainty relations

The ‘liquid’ nature of spacetime in quantum gravity models

An illustration of the liquid spacetime concept. Credit: Jason Ralston/Flickr
If spacetime were a liquid, it would have an extremely low viscosity, like that of a "superfluid". A collaboration between the International School for Advanced Studies (SISSA) in Trieste and the Ludwig Maximilian University of Munich has shown how the "atoms" making up the fluid of spacetime should behave, according to some models of quantum gravity. The considerations put forward in this work place very tight constraints on the occurrence of effects tied to this possible "fluid" nature of spacetime, showing that it is possible to discriminate among the quantum gravity models developed so far with the aim of going beyond General Relativity.

Continue reading The ‘liquid’ nature of spacetime in quantum gravity models

Testing different aspects of the twin paradox using a physical system

Imagine two twins, identical except that one of them owns a spaceship. He decides to travel to a distant star, say a few tens of light-years away, while the other twin stays on Earth. Travelling at roughly 75-85% of the speed of light, the first twin reaches the star and returns home, to find his sibling decidedly older than himself. This phenomenon, known as the twin paradox, is due to time dilation as described by the theory of special relativity. Indeed, Albert Einstein predicted that clocks subject to different accelerations measure time differently. However strange it may seem, time dilation has been verified many times in the laboratory and is continually accounted for in GPS systems.
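
To put numbers on the quoted speeds, here is the standard special-relativistic time-dilation factor (a textbook formula, not specific to this experiment): a clock moving at speed v ticks slower by the Lorentz factor

\[ \gamma \;=\; \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \gamma\big|_{v=0.8c} \;=\; \frac{1}{\sqrt{1-0.64}} \;\approx\; 1.67, \]

so for every year that passes on the travelling twin's clock, roughly 1.67 years pass for the twin on Earth (in the idealised limit where the acceleration phases are brief).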

GPS provides your position by timing very precisely the signals emitted by satellites, and to this end it needs to take into account the time dilation due to the different accelerations of the satellites. While GPS is one of the most precise systems we have, it can locate your smartphone only to within a few metres. The precision could be improved by using the most precise clocks we know of on Earth, known as quantum clocks because they are governed by the laws of quantum mechanics. There are plans, funded by space agencies, to launch such clocks into orbit. It is natural to expect that a GPS built from quantum clocks would also need to take relativistic effects into account.
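
For a sense of the size of these corrections (standard textbook values, not taken from the article): special relativity makes a GPS satellite clock run slow by about 7 microseconds per day, while the weaker gravity at orbital altitude makes it run fast by about 45 microseconds per day,

\[ \Delta t_{\rm SR} \approx -7\ \mu\text{s/day}, \qquad \Delta t_{\rm GR} \approx +45\ \mu\text{s/day}, \qquad \Delta t_{\rm net} \approx +38\ \mu\text{s/day}. \]

Left uncorrected, 38 μs/day of clock error would translate into a positioning error growing at roughly c × 38 μs ≈ 11 km per day.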

However, we do not fully understand how to combine quantum mechanics and relativity. The inability to unify the two theories remains one of the biggest challenges of modern science.

Predictions made in the 1970s said that there is a physical phenomenon that is both quantum and relativistic, called the Dynamical Casimir Effect. But it wasn’t until 2011 that an experimental setup could be developed to test the prediction. Here is what theory predicted: if light is trapped between mirrors that move at velocities close to the speed of light, then the system will generate more light than it started with. Even if initially there is no light between the mirrors, just vacuum, light shows up because the moving mirror turns the quantum vacuum into particles. This is supposed to happen because the vacuum, at the quantum level, is like a sea of pairs of particles that are constantly emitting and absorbing light. They do this at incredible speeds, but if the mirror moves that fast too, some of these particles are reflected by the mirror before disappearing, and can be observed.

Setting up such a system proved difficult. In 2011, the difficulty was circumvented in an experiment conducted by Per Delsing at Chalmers University of Technology in Sweden. In this case the mirrors were different: they were magnetic fields inside a Superconducting Quantum Interference Device (SQUID), but they behaved exactly like mirrors, making light bounce back and forth. Unlike physical mirrors, these magnetic fields could be moved at incredible speeds.

Einstein used to think of clocks as light going back and forth between mirrors: time can be inferred from the distance between the mirrors divided by the speed of light, which remains constant no matter what. But he never thought about particles being created by motion, a prediction made many years after his death.

In a recent work, with colleagues at the University of Nottingham, Chalmers University and University of Warsaw, we have taken inspiration from the 2011 experiment.

We propose using a similar setup to test, with a physical system, aspects of the twin paradox that haven’t been tested so far.

Although it won’t involve human twins, the possibility of achieving enormous speeds and accelerations allows the observation of time dilation over a very short distance. Also, all previous experiments that have tested the theory have involved atomic clocks, which are “point-clocks”: what measures time in these clocks is confined to a tiny point in space. Our experiment would instead use something that has finite length. This is important because, along with time, Einstein’s theory predicts that the length of the object changes too. We believe our experiment would test that aspect of the theory for the first time. We have found that particle creation by motion, which was observed in 2011, has an effect on the difference in time between the clock that is moving and the one that is static. Using this setup, while we can reconfirm that time dilation occurs, the more interesting application would be to help build better quantum clocks, through a better understanding of the interplay between quantum and relativistic effects.

The Conversation: How to test the twin paradox without using a spaceship

arXiv: The twin paradox with macroscopic clocks in superconducting circuits

Trying to fix the black hole information paradox

Physicists have been debating the claims recently advanced by Stephen Hawking about black holes (post). For decades now, researchers have been trying to unravel the mystery surrounding these fascinating "monsters of the sky", whose gravitational pull is so intense that not even light can escape. Now Professor Chris Adami, of Michigan State University, has decided to throw himself into the fray, so to speak, to understand more and attempt to solve the riddle.

The debate about the behavior of black holes, which has been ongoing since 1975, was reignited when Hawking posted a paper online on Jan. 22, 2014, stating that event horizons, the invisible boundaries of black holes, do not exist. Hawking, considered to be the foremost expert on black holes, has over the years revised his theory and continues to work on understanding these cosmic puzzles. One of the many perplexities is a decades-old debate about what happens, in black holes, to information: matter or energy and their characteristics at the atomic and subatomic level. “In 1975, Hawking discovered that black holes aren’t all black. They actually radiate a featureless glow, now called Hawking radiation”, Adami said. “In his original theory, Hawking stated that the radiation slowly consumes the black hole and it eventually evaporates and disappears, concluding that information and anything that enters the black hole would be irretrievably lost”.
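
For reference (a standard result, not part of Adami's paper): the temperature of that featureless glow for a non-rotating black hole of mass M is

\[ T_H \;=\; \frac{\hbar c^3}{8\pi G M k_B} \;\approx\; 6\times 10^{-8}\,\text{K}\,\left(\frac{M_\odot}{M}\right), \]

so a solar-mass black hole glows at a few tens of nanokelvin, far colder than the cosmic microwave background; evaporation can only win once the surroundings are colder than the hole itself.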

But this theory created a fundamental problem, dubbed the information paradox. Now Adami believes he’s solved it.

“According to the laws of quantum physics, information can’t disappear”, Adami said. “A loss of information would imply that the Universe itself would suddenly become unpredictable every time the black hole swallows a particle. That is just inconceivable. No law of physics that we know allows this to happen”. So if the black hole sucks in information with its intense gravitational pull, then later disappears entirely, information and all, how can the laws of quantum physics be preserved?

The solution, Adami says, is that the information is contained in the stimulated emission of radiation, which must accompany the Hawking radiation, the glow that makes a black hole not so black. Stimulated emission makes the black hole glow with the information that it swallowed.

“Stimulated emission is the physical process behind LASERs (Light Amplification by Stimulated Emission of Radiation). Basically, it works like a copy machine: you throw something into the machine, and two identical somethings come out. If you throw information at a black hole, just before it is swallowed, the black hole first makes a copy that is left outside. This copying mechanism was discovered by Albert Einstein in 1917, and without it, physics cannot be consistent”, Adami said.

Do others agree with Adami’s theory that stimulated emission is the missing piece that solves the information paradox? According to Paul Davies, cosmologist, astrobiologist and theoretical physicist at Arizona State University, “In my view Chris Adami has correctly identified the solution to the so-called black hole information paradox. Ironically, it has been hiding in plain sight for years. Hawking’s famous black hole radiation is an example of so-called spontaneous emission of radiation, but it is only part of the story. There must also be the possibility of stimulated emission, the process that puts the S in LASER”.

With so many researchers trying to fix Hawking’s theory, why did it take so long if it was hiding in plain sight? “While a few people did realize that the stimulated emission effect was missing in Hawking’s calculation, they could not resolve the paradox without a deep understanding of quantum communication theory”, Adami said. Quantum communication theory was designed to understand how information interacts with quantum systems, and Adami was one of the pioneers of quantum information theory back in the ’90s. Trying to solve this information paradox has kept Adami awake many nights, as demonstrated by his thick notebooks filled with 10 years of mathematical calculations.

So where does this leave us, according to Adami? “Stephen Hawking’s wonderful theory is now complete in my opinion. The hole in the black hole theory is plugged, and I can now sleep at night”, he said. Adami may now sleep well at night, but his theory is sure to keep other physicists up trying to confirm whether he has actually solved the mystery.
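
A compact way to see why stimulated emission must accompany spontaneous emission (standard quantum optics going back to Einstein's 1917 coefficients, offered here only as a heuristic for the black-hole case): if a field mode already contains n photons, the probabilities of emission into, and absorption from, that mode scale as

\[ P_{\rm emit} \;\propto\; n + 1, \qquad P_{\rm absorb} \;\propto\; n, \]

where the “1” is the spontaneous part (Hawking's featureless glow) and the n is the stimulated part, which is correlated with what came in and so, in Adami's argument, can carry a copy of it back out.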

Michigan State University: Plugging the hole in Hawking’s black hole theory
arXiv: Classical information transmission capacity of quantum black holes

MSU Professor Chris Adami believes he has found the solution to a long-standing problem with Stephen Hawking’s black hole theory in a groundbreaking new study recently published in the journal Classical and Quantum Gravity. Photo by G. L. Kohuth

Alan Guth comments on the BICEP2 results

The excitement generated in recent days by the announcement from the researchers of the Harvard CMB Group about the BICEP2 experiment is still alive: the detection, in the cosmic microwave background, of a signal associated with the passage of primordial gravitational waves, strong indirect evidence for cosmic inflation (post). The inflationary model was first proposed in the early 1980s by Alan Guth, today the Victor F. Weisskopf Professor of Physics at MIT, who comments below on the scientific significance of the BICEP2 data.

Q: Can you explain the theory of cosmic inflation that you first put forth in 1980?

A: I usually describe inflation as a theory of the “bang” of the Big Bang: It describes the propulsion mechanism that drove the universe into the period of tremendous expansion that we call the Big Bang. In its original form, the Big Bang theory never was a theory of the bang. It said nothing about what banged, why it banged, or what happened before it banged. The original Big Bang theory was really a theory of the aftermath of the bang. The universe was already hot and dense, and already expanding at a fantastic rate. The theory described how the universe was cooled by the expansion, and how the expansion was slowed by the attractive force of gravity.

Inflation proposes that the expansion of the universe was driven by a repulsive form of gravity. According to Newton, gravity is a purely attractive force, but this changed with Einstein and the discovery of general relativity. General relativity describes gravity as a distortion of spacetime, and allows for the possibility of repulsive gravity. Modern particle theories strongly suggest that at very high energies, there should exist forms of matter that create repulsive gravity. Inflation, in turn, proposes that at least a very small patch of the early universe was filled with this repulsive-gravity material. The initial patch could have been incredibly small, perhaps as small as 10⁻²⁴ centimeter, about 100 billion times smaller than a single proton. The small patch would then start to exponentially expand under the influence of the repulsive gravity, doubling in size approximately every 10⁻³⁷ second. To successfully describe our visible universe, the region would need to undergo at least 80 doublings, increasing its size to about 1 centimeter. It could have undergone significantly more doublings, but at least this number is needed.

During the period of exponential expansion, any ordinary material would thin out, with the density diminishing to almost nothing. The behavior in this case, however, is very different: The repulsive-gravity material actually maintains a constant density as it expands, no matter how much it expands! While this appears to be a blatant violation of the principle of the conservation of energy, it is actually perfectly consistent. This loophole hinges on a peculiar feature of gravity: The energy of a gravitational field is negative. As the patch expands at constant density, more and more energy, in the form of matter, is created. But at the same time, more and more negative energy appears in the form of the gravitational field that is filling the region. The total energy remains constant, as it must, and therefore remains very small. It is possible that the total energy of the entire universe is exactly zero, with the positive energy of matter completely canceled by the negative energy of gravity. I often say that the universe is the ultimate free lunch, since it actually requires no energy to produce a universe.

At some point the inflation ends because the repulsive-gravity material becomes metastable. The repulsive-gravity material decays into ordinary particles, producing a very hot soup of particles that form the starting point of the conventional Big Bang. At this point the repulsive gravity turns off, but the region continues to expand in a coasting pattern for billions of years to come. Thus, inflation is a prequel to the era that cosmologists call the Big Bang, although it of course occurred after the origin of the universe, which is often also called the Big Bang.
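
As a quick consistency check on Guth's numbers (simple arithmetic, not from the interview): 80 doublings give a stretch factor of

\[ 2^{80} \;\approx\; 1.2 \times 10^{24}, \qquad 10^{-24}\,\text{cm} \times 2^{80} \;\approx\; 1.2\,\text{cm}, \]

and at one doubling every 10⁻³⁷ second the whole episode takes only about 80 × 10⁻³⁷ s ≈ 10⁻³⁵ s.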

Q: What is the new result announced this week, and how does it provide critical support for your theory?

A: The stretching effect caused by the fantastic expansion of inflation tends to smooth things out, which is great for cosmology, because an ordinary explosion would presumably have left the universe very splotchy and irregular. The early universe, as we can see from the afterglow of the cosmic microwave background (CMB) radiation, was incredibly uniform, with a mass density that was constant to about one part in 100,000. The tiny nonuniformities that did exist were then amplified by gravity: In places where the mass density was slightly higher than average, a stronger-than-average gravitational field was created, which pulled in still more matter, creating a yet stronger gravitational field. But to have structure form at all, there needed to be small nonuniformities at the end of inflation.

In inflationary models, these nonuniformities (which later produce stars, galaxies, and all the structure of the universe) are attributed to quantum theory. Quantum field theory implies that, on very short distance scales, everything is in a state of constant agitation. If we observed empty space with a hypothetical, and powerful, magnifying glass, we would see the electric and magnetic fields undergoing wild oscillations, with even electrons and positrons popping out of the vacuum and then rapidly disappearing. The effect of inflation, with its fantastic expansion, is to stretch these quantum fluctuations to macroscopic proportions. The temperature nonuniformities in the cosmic microwave background were first measured in 1992 by the COBE satellite, and have since been measured with greater and greater precision by a long and spectacular series of ground-based, balloon-based, and satellite experiments. They have agreed very well with the predictions of inflation. These results, however, have not generally been seen as proof of inflation, in part because it is not clear that inflation is the only possible way that these fluctuations could have been produced.

The stretching effect of inflation, however, also acts on the geometry of space itself, which according to general relativity is flexible. Space can be compressed, stretched, or even twisted. The geometry of space also fluctuates on small scales, due to the physics of quantum theory, and inflation also stretches these fluctuations, producing gravity waves in the early universe.

The new result, by John Kovac and the BICEP2 collaboration, is a measurement of these gravity waves, at a very high level of confidence. They do not see the gravity waves directly; instead, they have constructed a very detailed map of the polarization of the CMB in a patch of the sky. They have observed a swirling pattern in the polarization (called “B modes”) that can be created only by gravity waves in the early universe, or by the gravitational lensing effect of matter in the late universe. But the primordial gravity waves can be separated, because they tend to be on larger angular scales, so the BICEP2 team has decisively isolated their contribution. This is the first time that even a hint of these primordial gravity waves has been detected, and it is also the first time that any quantum properties of gravity have been directly observed.

Q: How would you describe the significance of these new findings, and your reaction to them?

A: The significance of these new findings is enormous. First of all, they help tremendously in confirming the picture of inflation. As far as we know, there is nothing other than inflation that can produce these gravity waves. Second, it tells us a lot about the details of inflation that we did not already know. In particular, it determines the energy density of the universe at the time of inflation, which is something that previously had a wide range of possibilities. By determining the energy density of the universe at the time of inflation, the new result also tells us a lot about which detailed versions of inflation are still viable, and which are no longer viable. The current result is not by itself conclusive, but it points in the direction of the very simplest inflationary models that can be constructed.

Finally, and perhaps most importantly, the new result is not the final story, but is more like the opening of a new window. Now that these B modes have been found, the BICEP2 collaboration and many other groups will continue to study them. They provide a new tool to study the behavior of the early universe, including the process of inflation.

When I (and others) started working on the effect of quantum fluctuations in the early 1980s, I never thought that anybody would ever be able to measure these effects. To me it was really just a game, to see if my colleagues and I could agree on what the fluctuations would theoretically look like. So I am just astounded by the progress that astronomers have made in measuring these minute effects, and particularly by the new result of the BICEP2 team. Like all experimental results, we should wait for it to be confirmed by other groups before taking it as truth, but the group seems to have been very careful, and the result is very clean, so I think it is very likely that it will hold up.

Courtesy MIT: 3 Questions: Alan Guth on new insights into the ‘Big Bang’

The potential of string theory as an elegant unified description of physics

In August 1984, two physicists arrived at a formula that opened a new window onto the understanding of string theory. Last December, Michael Green of the University of Cambridge and John Schwarz of the California Institute of Technology were awarded the 2014 Fundamental Physics Prize, one of the "Breakthrough Prizes" covering the physical and biological sciences. The citation for the prize, worth 3 million dollars, reads "for opening new perspectives on quantum gravity and the unification of forces".

Green and Schwarz are known for their pioneering work in string theory, which postulates that the fundamental constituents of the Universe are tiny vibrating strings. Different types of elementary particles arise in this theory as different vibrational harmonics (or ‘notes’). The scope of string theory has broadened over the past few years: it is currently being applied to a far wider field than the one for which it was first devised, taking its researchers in unexpected directions.

Although the term ‘string theory’ was not coined until 1971, it had its genesis in a 1968 paper by the Italian physicist Gabriele Veneziano, published when Green was a research student in Cambridge. Green was quickly impressed by its potential and began working seriously on it in the early 1970s. As he explains in the accompanying film, he stuck with string theory during a period when it was overshadowed by other developments in elementary particle physics. As a result of a chance meeting at the CERN accelerator laboratory in Switzerland in the summer of 1979, Green (then a researcher at Queen Mary, London) began to work on string theory with Schwarz. Green says that the relative absence of interest in string theory during the 1970s and early 1980s was actually helpful: it allowed him and a small number of colleagues to focus on their research well away from the limelight. “Initially we were not sure that the theory would be consistent, but as we understood it better we became more and more convinced that the theory had something valuable to say about the fundamental particles and their forces”, he says.

In August 1984 the two researchers, while working at the Aspen Center for Physics in Colorado, famously understood how string theory avoids certain inconsistencies (known as ‘anomalies’) that plague more conventional theories in which the fundamental particles are points rather than strings. This convinced other researchers of the potential of string theory as an elegant unified description of fundamental physics. “Suddenly our world changed – and we were called on to give lectures and attend meetings and workshops”, remembers Green.

String theory was back on track as a construct that offered a compelling explanation for the fundamental building blocks of the Universe: many researchers shifted the focus of their work into this newly promising field and, as a result of this upturn in interest, developments in string theory began to take new and unexpected directions. Ideas formulated in the past few years indicate that string theory has an overarching mathematical structure that may be useful for understanding a much wider variety of problems in theoretical physics than those the theory was originally supposed to explain, including problems in condensed matter, superconductivity, plasma physics and the physics of fluids.

Green is a passionate believer in the exchange of ideas, and he values immensely his interaction with the latest generation of researchers tackling some of the knottiest problems in particle physics and associated fields. “The best ideas come from the young people entering the field and we need to make sure we continue to attract them into research. It is particularly evident that at present we fail to encourage sufficient numbers of young women to think about careers in physics”, he says.
“Scientific research is by its nature competitive and there are, of course, professional jealousies – but there’s also a strong tradition of collaboration in theoretical physics and advances in the subject feel like a communal activity.” In 2009 Green was appointed Lucasian Professor of Mathematics at Cambridge. It comes with a legacy that Green describes as daunting: his immediate predecessor was Professor Stephen Hawking and in its 350-year history the chair has been held by a series of formidable names in the history of mathematical sciences.

The challenges of pushing forward the boundaries in a field that demands thinking in not three dimensions but as many as 11 are tremendous. The explanation of the basic building blocks of nature as different harmonics of a string is only a small part of string theory, and is the feature that is easiest to put across to the general public as it is relatively straightforward to visualise.

“Far harder to articulate in words are concepts to do with explaining how time and space might emerge from the theory”, says Green. “Sometimes you hit a problem that you just can’t get out of your head and carry round with you wherever you are. It’s almost a cliché that it’s often when you’re relaxing that a solution will spontaneously present itself”. Like his colleagues, Green is motivated by wonderment at the world and the excitement of being part of a close community grappling with fundamental questions. He is often asked to justify the cost of research that can seem so remote from everyday life, and that cannot be tested in any conventional sense. In response he gives the example of the way in which quantum mechanics has revolutionised the way in which many of us live. In terms of developments that may come from advances in string theory, he says: “We can’t predict what the eventual outcomes of our research will be. But, if we are successful, they will certainly be huge and in the meantime, string theory provides a constant stream of unexpected surprises.”

Michael Green will be giving a lecture, ‘The pointless Universe’, as part of Cambridge Science Festival on Thursday 13 March, 5pm-6pm, at Lady Mitchell Hall, Sidgwick Site, Cambridge. The event is free but requires pre-booking.

University of Cambridge: Strings that surprise: how a theory scaled up

Testing Bell’s theorem with distant quasars

Artist's impression of the accretion disc around the black hole in the galaxy Mrk 231. The outflow of radiation is shown above the disc (in blue), but it is not what is seen from Earth. Also visible is a very narrow, localised jet, which was known before the Gemini observations. Credit: Gemini Observatory/AURA, Lynette Cook

Researchers at MIT have published a paper proposing an experiment that could settle a 50-year-old theorem, known as Bell's theorem, which, if violated, would imply that our Universe is structured not according to the laws of classical physics but according to the less tangible and deeply probabilistic laws of quantum mechanics.

Such a quantum view would allow for seemingly counterintuitive phenomena such as entanglement, in which the measurement of one particle instantly affects another, even if those entangled particles are at opposite ends of the Universe. Among other things, entanglement, a quantum feature Albert Einstein skeptically referred to as “spooky action at a distance”, seems to suggest that entangled particles can affect each other instantly, faster than the speed of light. In 1964, physicist John Bell took on this seeming disparity between classical physics and quantum mechanics. He stated that if the Universe is based on classical physics, the measurement of one entangled particle should not affect the measurement of the other, a hypothesis, known as locality, under which there is a limit to how correlated two particles can be. Bell devised a mathematical formula expressing locality, and presented scenarios that violated this formula, instead following the predictions of quantum mechanics. Since then, physicists have tested Bell’s theorem by measuring the properties of entangled quantum particles in the laboratory.

Essentially all of these experiments have shown that such particles are correlated more strongly than would be expected under the laws of classical physics, findings that support quantum mechanics.
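
The form of Bell's theorem most often tested in the laboratory (a standard result, not specific to the MIT proposal) is the CHSH inequality. With detector settings a, a′ on one side and b, b′ on the other, and E the measured correlation, any local hidden-variable theory requires

\[ S \;=\; E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \;\le\; 2, \]

while quantum mechanics predicts values up to |S| = 2√2 ≈ 2.83 for suitably entangled particles, in line with what the experiments described above find.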

However, scientists have also identified several major loopholes in Bell’s theorem. These suggest that while the outcomes of such experiments may appear to support the predictions of quantum mechanics, they may actually reflect unknown “hidden variables” that give the illusion of a quantum outcome but can still be explained in classical terms. Though two major loopholes have since been closed, a third remains; physicists refer to it as “setting independence”, or more provocatively, “free will”. This loophole proposes that a particle detector’s settings may “conspire” with events in the shared causal past of the detectors themselves to determine which properties of the particle to measure, a scenario that, however far-fetched, implies that a physicist running the experiment does not have complete free will in choosing each detector’s setting. Such a scenario would result in biased measurements, suggesting that two particles are correlated more than they actually are, and giving more weight to quantum mechanics than classical physics.

“It sounds creepy, but people realized that’s a logical possibility that hasn’t been closed yet”, says MIT’s David Kaiser, the Germeshausen Professor of the History of Science and senior lecturer in the Department of Physics. “Before we make the leap to say the equations of quantum theory tell us the world is inescapably crazy and bizarre, have we closed every conceivable logical loophole, even if they may not seem plausible in the world we know today?” Now Kaiser, along with MIT postdoc Andrew Friedman and Jason Gallicchio of the University of Chicago, has proposed an experiment to close this third loophole by determining a particle detector’s settings using some of the oldest light in the Universe: distant quasars, or galactic nuclei, which formed billions of years ago.

The idea, essentially, is that if two quasars on opposite sides of the sky are sufficiently distant from each other, they would have been out of causal contact since the Big Bang some 14 billion years ago, with no possible means of any third party communicating with both of them since the beginning of the Universe, an ideal scenario for determining each particle detector’s settings.

As Kaiser explains it, an experiment would go something like this: a laboratory setup would consist of a particle generator, such as a radioactive atom, that spits out pairs of entangled particles. One detector measures a property of particle A, while another detector does the same for particle B. A split second after the particles are generated, but just before the detectors are set, scientists would use telescopic observations of distant quasars to determine which properties each detector will measure of a respective particle. In other words, quasar A determines the settings to detect particle A, and quasar B sets the detector for particle B.

The researchers reason that since each detector’s setting is determined by sources that have had no communication or shared history since the beginning of the Universe, it would be virtually impossible for these detectors to “conspire” with anything in their shared past to give a biased measurement; the experimental setup could therefore close the “free will” loophole. If, after multiple measurements with this experimental setup, scientists found that the measurements of the particles were correlated more than predicted by the laws of classical physics, Kaiser says, then the Universe as we see it must be based instead on quantum mechanics. “I think it’s fair to say this [loophole] is the final frontier, logically speaking, that stands between this enormously impressive accumulated experimental evidence and the interpretation of that evidence saying the world is governed by quantum mechanics”, Kaiser says.

Now that the researchers have put forth an experimental approach, they hope that others will perform actual experiments, using observations of distant quasars. Physicist Michael Hall says that while the idea of using light from distant sources like quasars is not a new one, the group’s paper provides the first detailed analysis of how such an experiment could be carried out in practice, using current technology. “It is therefore a big step to closing the loophole once and for all”, says Hall, a research fellow in the Centre for Quantum Dynamics at Griffith University in Australia. “I am sure there will be strong interest in conducting such an experiment, which combines cosmic distances with microscopic quantum effects, and most likely involving an unusual collaboration between quantum physicists and astronomers”.

“At first, we didn’t know if our setup would require constellations of futuristic space satellites, or 1,000-meter telescopes on the dark side of the Moon”, Friedman says. “So we were naturally delighted when we discovered, much to our surprise, that our experiment was both feasible in the real world with present technology, and interesting enough to our experimentalist collaborators who actually want to make it happen in the next few years”. Adds Kaiser, “We’ve said, ‘Let’s go for broke, let’s use the history of the cosmos since the Big Bang, darn it.’ And it is very exciting that it’s actually feasible”.

MIT: Closing the ‘free will’ loophole

arXiv: Testing Bell's Inequality with Cosmic Photons: Closing the Setting-Independence Loophole

A Planck star instead of singularity inside black holes

It is an idea proposed by two astrophysicists, Carlo Rovelli and Francesca Vidotto, who suggest in a paper that an object known as a Planck star could exist at the centre of black holes, a proposal that would eliminate the concept of the singularity and allow information to re-emerge somewhere in the space of our Universe.

The current thinking regarding black holes is that they have two very simple parts, an event horizon and a singularity. Because a probe cannot be sent inside a black hole to see what is truly going on, researchers have to rely on theories. The singularity theory suffers from what has come to be known as the “information paradox”: black holes appear to destroy information, which the rules of quantum mechanics forbid. This paradox has left deep-thinking physicists such as Stephen Hawking uneasy, so much so that he and others have begun offering alternatives or amendments to existing theories. In this new effort, a pair of physicists suggest the idea of a Planck star. The idea of a Planck star has its origins in an amendment to the Big Bang theory: this other idea holds that when the inevitable Big Crunch comes, instead of forming a singularity, something just a little more tangible will result, something on the Planck scale. And when that happens, a bounce will occur, causing the Universe to expand again, and then to collapse again, and so on, forever back and forth.
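
For scale (standard definitions, not from the paper): the Planck length and Planck density are built from the constants ħ, G and c,

\[ \ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-35}\,\text{m}, \qquad \rho_P = \frac{c^5}{\hbar G^2} \approx 5\times 10^{96}\,\text{kg/m}^3, \]

so "something on the Planck scale" means matter compressed to roughly this density: enormously dense, but still finite, unlike the infinite density of a classical singularity.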

Rovelli and Vidotto wonder why this couldn’t be the case with black holes as well: instead of a singularity at the centre, there could be a Planck structure, a star, which would allow general relativity to come back into play.

If this were the case, then a black hole could slowly lose mass over time due to Hawking radiation; as the black hole contracted, the Planck star inside would grow bigger as information was absorbed. Eventually, the star would meet the event horizon and the black hole would dematerialize in an instant, as all the information it had ever sucked in was cast out into the Universe. This new idea by Rovelli and Vidotto will undoubtedly undergo close scrutiny in the astrophysics community, likely culminating in debate between those who find the idea of a Planck star an answer to the information paradox and those who find the entire idea implausible.

arXiv: Planck stars

Hawking’s grey holes are made of an ‘apparent’ horizon

In recent days, the media have gone wild reporting Stephen Hawking's provocative proposal that black holes "do not exist" (post). This has prompted a whole series of comments, some of which have turned into "satirical discussions" aimed at the pronouncements that famous scientists often make. Science is, as is often suggested, a little different from religion, where the clergy are always waiting for the "big news". So what is the physical meaning of this claim by one of the giants of modern physics? Do we have to rewrite the textbooks? To answer the question we need to take a step back and properly understand the concept of black holes, so as to get to the original problem that led the English scientist to reconsider the nature of the event horizon surrounding them.

A classical black hole

In 1915, Einstein derived the equations of general relativity, revolutionising our view of gravity. While Einstein struggled with his equations, the German physicist Karl Schwarzschild was able to use them to determine the gravitational field outside a spherical distribution of mass. But Schwarzschild's conclusions were rather frightening: they predicted that objects could completely collapse, with mass crashing down to a central “singularity”, surrounded by a gravitational field from which even light cannot escape. For any black hole, the delineation between light escaping and being trapped is a well-defined surface, the event horizon, separating our Universe from the mysteries close to the black hole. With this, the notion of the “classical” black hole was born, governed purely by the equations of general relativity. But while we know general relativity governs the force of gravity, the early 20th century saw a revolution in the understanding of the other fundamental forces, describing them in exquisite detail in terms of quantum mechanics.
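
Schwarzschild's solution gives that surface a definite size (a textbook result, included here for concreteness): a non-rotating mass M has an event horizon at the Schwarzschild radius

\[ r_s \;=\; \frac{2GM}{c^2} \;\approx\; 3\,\text{km} \times \frac{M}{M_\odot}, \]

so the Sun would have to be squeezed inside a sphere roughly 3 km in radius to become a black hole.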

A quantum leap

But the problem is that general relativity and quantum mechanics just don’t play well together. Simply put, the equations of quantum mechanics can’t describe gravity, whereas general relativity can only handle gravity. To talk about them both in situations where gravity is strong and quantum mechanics cannot be ignored, the best we can do at the moment is sticky-tape the equations together; until we have a unified theory of gravity and the other forces, that is as far as we can go. Stephen Hawking undertook one of the most famous attempts at this in the early 1970s. He wondered about what was happening at the event horizon in terms of quantum mechanics, where empty space is a seething mass of particles popping in and out of existence. At the horizon, this process separates particles, with some sucked into the central singularity, while their partners escape into space. What Hawking showed, through a jerry-rigged version of gravity and quantum mechanics, is that black holes leak radiation into space, slowly sucking energy from their gravitational core, and that, given enough time, black holes evaporate completely into radiation. When quantum mechanics is thrown into the mix, the notion of a “classical black hole” is dead.
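
How long "given enough time" is can be estimated from Hawking's result (an order-of-magnitude textbook estimate, ignoring the details of which particle species are emitted): the evaporation time of a black hole of mass M scales as

\[ t_{\rm evap} \;\sim\; \frac{5120\,\pi G^2 M^3}{\hbar c^4} \;\approx\; 10^{67}\,\text{yr} \times \left(\frac{M}{M_\odot}\right)^{3}, \]

vastly longer than the present age of the Universe (about 1.4 × 10¹⁰ years) for any astrophysical black hole.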

Teapots and black holes

There is, however, a bigger problem in including quantum mechanics in the study of gravity, and that problem is information. Quantum mechanics cares intensely about information: it worries about the detailed make-up of an object like a teapot, how many protons and electrons there are and where they sit; it cares about the fact that a teapot is a teapot, a particular arrangement of electrons and protons, which is different from something else, like a light beam or a sofa. When the teapot is thrown into a black hole, it is completely destroyed, first smashed into a million pieces, then atomised, and then the atoms ripped into their constituent parts, before being absorbed into the central singularity. But the radiation that Hawking predicted being emitted from black holes doesn’t contain any information about what fell in; no matter how well you examine the radiation, you can’t tell if it was a teapot, a fridge or a small iguana called Colin that met their demise.

Pushing boundaries

It must be remembered that we are now pushing the boundaries of modern physics and, as we do not have a single mathematical framework where gravity and quantum mechanics play nicely together, we have to worry a little about how we have glued the two pieces together. In 2012, the problem was revisited by US physicist Joseph Polchinski. He examined the production of Hawking radiation near the event horizon of a black hole, watching how pairs of particles born from the quantum vacuum separate, with one lost irretrievably into the hole while the other flies off into free space. With a little mathematical trickery, Polchinski asked the question: “What if the information of the infalling particle is not lost into the hole, but is somehow imprinted on the escaping radiation?” Like the breaking of atomic bonds, this reassignment of information proves to be very energetic, surrounding a black hole with a “firewall” through which infalling particles have to pass. As the name suggests, such a firewall will roast Colin the iguana to a crisp. But at least information is not lost. While this presents a possible solution, many are bothered by its consequences: if a firewall exists, then Colin will notice a rapid increase in temperature, and he will know he is at the event horizon. This goes against one of the key tenets of general relativity, namely that an infalling observer should happily sail through the event horizon without noticing that it is there.

Back to Hawking

This is where Hawking’s recent paper comes in, suggesting that when you further stir quantum mechanics into general relativity, the seething mass of the vacuum prevents the formation of a crisp, well-defined event horizon, replacing it with a more ephemeral “apparent horizon”. This apparent horizon does the job of an event horizon, trapping matter and radiation within the black hole, but the trapping is only temporary: eventually the matter and radiation are released, carrying their stored information with them. As black holes no longer need to leak information slowly back into space, but can release it in a final burst when they have fully evaporated, there is no need for a firewall, and an infalling observer will again have a roast-free ride into the black hole.

Are black holes no more?

To astronomers, the mess of fundamental physics at the event horizon has little to do with the immense gravitational fields produced by these mass sinks at the cores of galaxies, powering some of the most energetic processes in the Universe. Astrophysical black holes still happily exist.

What Hawking is saying is that, with quantum mechanics included, the notion of a black hole as governed purely by the equations of general relativity, the “classical black hole”, does not exist, and the event horizon, the boundary between escape and no-escape, is more complex than we previously thought.

But we’ve had inklings of this for more than 40 years since his original work on the issue. In reality, the headlines should not be “black holes don’t exist” but “black holes are more complicated than we thought, but we are not going to really know how complicated until gravity and quantum mechanics try to get along”.

After all, Hawking is just a man

But one last vexing question: is Hawking right? Science is often compared to religion, with practitioners awaiting pronouncements from on high, all falling into line with the latest dogma. But that’s not the way science works, and it is important to remember that, while Hawking is clearly very smart, to quote the immortal Tammy Wynette in Stand By Your Man, “after all, he’s just a man”, and just because he says something does not make it so. Hawking’s proposed solution is clever, but the debate on the true nature of black holes will continue to rage. Black holes will keep changing their spots, and their properties will become more and more head-scratchingly weird, but this is the way that science works, and that’s what makes it wonderful.

The Conversation: Grey is the new black hole: is Stephen Hawking right?