Tuesday, May 14, 2019

Quantum Mechanics brought up to date

kw: book reviews, nonfiction, quantum mechanics, overviews

I've just finished reading Beyond Weird: Why Everything You Thought You Knew About Quantum Physics is Different, by Philip Ball. Prior to reading it, every book I've read that "explained" quantum mechanics has used the knowledge and language of the 1940's. Philip Ball brings things up to date, and it is about time someone did so!! The field of quantum (we can leave off the various modifying nouns for the nonce) has been the hottest of hot topics in physics for more than a century, so incredible numbers of experiments have been performed, and astronomical numbers of hypotheses have been proposed and defended and attacked and re-defended in tons and tons of published articles. So folks have learned a thing or two.

To cut to the chase: Werner Heisenberg got things rolling with his Uncertainty Principle: you can't specify both the position and the momentum of any object with accuracy more precise than a tiny "delta" related to Planck's constant (ℏ); the work earned him a Nobel Prize. Albert Einstein showed that light is quantized in his publication on the photoelectric effect, for which he received his Nobel Prize. In general terms, Heisenberg and Einstein agreed that a moving particle had a genuine position and momentum, but that no tool for determining these quantities could operate without changing them.
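The book doesn't print the principle in symbols at this point, so for the record, the modern textbook form (the Kennard bound, a slight sharpening of Heisenberg's original argument) is:

Δx · Δp ≥ ℏ/2, where ℏ = h/2π ≈ 1.055 × 10⁻³⁴ joule-seconds.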

We can summarize the way this is explained in the Feynman Lectures on Physics thus:
A beam of electrons directed through two holes will produce an interference pattern on a screen. The electrons are behaving like waves. You want to determine which electron goes through which hole. Electrons can be 'probed' by shining short wavelength light on them, and recording where the scattered light came from. When you turn on the light beam, the interference pattern disappears, and is replaced by a smooth diffraction curve, as though there were one hole, not two. Weakening the beam doesn't help. So you try using a longer wavelength, then longer and longer wavelengths. Finally, you find a wavelength that will scatter off the electrons, but allows the interference pattern to form. However, the wavelength of the light you are now using is greater than the spacing between the holes, and you cannot tell which hole any electron goes through.
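To make the Feynman picture concrete, here is a minimal Python sketch (my own, not from the book or the Lectures) of the ideal two-slit fringe pattern; the hole spacing d and the wavelength are made-up illustrative values:

```python
import numpy as np

# Ideal two-slit interference (far field), ignoring the single-slit envelope.
# Illustrative values only: 'd' (hole spacing) and 'lam' (wavelength) are made up.
lam = 500e-9          # wavelength, 500 nm
d = 2e-6              # spacing between the two holes, 2 micrometers
theta = np.linspace(-0.5, 0.5, 1001)   # viewing angle in radians

# Phase difference between the two paths, and the resulting intensity.
phase = np.pi * d * np.sin(theta) / lam
intensity = np.cos(phase) ** 2          # normalized: 1 at the central fringe

# Bright fringes occur where d*sin(theta) = m*lambda.
m_max = int(d / lam)   # largest fringe order visible within +/- 90 degrees
print(f"Fringe orders visible: m = -{m_max} .. +{m_max}")
```

Turning on the "which hole?" light amounts to killing that cosine-squared term, leaving only the smooth single-hole curve.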
Niels Bohr thought differently. I have not determined whether he focused on the ubiquitous presence of diffraction patterns in focused beams (of light, electrons, or whatever; I'll come back to this). Whatever the case, he concluded that observing an electron's position or its direction of motion didn't just reveal that quantity, but created it. His published material shows that he dwelt on the matter of measurement, holding that no quantum theory could be admitted that said anything more than what measurements could say: "There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that it is the task of physics to find out how nature is. Physics concerns what we can say about nature." (p.73) He went beyond this, stating that nothing really happens until it is observed.

This led Erwin Schrödinger to make fun of Bohr with his Cat story: Put a cat in a box, along with a device that has a 50% chance of killing the cat via poison gas in the next hour. Just before opening the box, can you say whether the cat is alive or dead? He asked if Bohr would say, "Both," until the box is opened. Most likely, Bohr would say that the wave function, which Schrödinger had proposed for calculating probabilities of quantum events, would "collapse" only when the box was opened, and only then would you observe either a dead cat or a living cat (which you would whisk out of the box, closing it quickly in case the mechanism were to go off just then and kill you both).

Bohr was a bully. He didn't just promote what became the first of many versions of his Copenhagen Interpretation, he evangelized it, in a totally obtrusive way. He used no violence (of the fisticuffs variety, anyway). He'd just talk you to death. It was like water torture, a torture to which many of that generation of physicists eventually submitted. Still, fewer than half the physicists of any generation, then or since, really believe it. Murray Gell-Mann, for one, thought Bohr simply brainwashed people.

The thing is, there is a bit of support for this view, though it calls into question the definition of "observer". And I confess, when I first heard of the Cat story, I asked, "Isn't the cat an observer?" Anyway, anyone who has done technical photography, and especially astronomical photography, knows of Diffraction. The smaller the hole (the camera diaphragm, for instance) through which a beam of light passes, the more that beam spreads out. It isn't much, in most cases, because visible light has such short wavelengths, between 400 and 700 billionths of a meter (nanometers or nm). However, the pixels in a modern digital camera are really small and closely spaced. In my Nikon D3200 camera, for example, there are about 24 million cells in a sensor that measures 12mm x 18mm, spaced 0.003mm apart. That is 3,000 nm. For technical photography I'd prefer for the light to spread out no more than the distance from cell to cell, so that "every pixel counts". For a lens of focal length 100mm, that means the half-angle α must be less than arcsin(.003/200) = 0.00086°; that is, half the 0.003mm pixel pitch divided by the 100mm focal length. For such calculations it is common to use a wavelength of 500 nm. Setting sin α = λ/D for the first diffraction minimum, we find that an aperture of 33.3mm, or larger, is needed to get full utility from the camera's sensor. For a 100mm lens, that implies an f/stop of f/3. Make the aperture any smaller, and the picture gets successively fuzzier. If you have an SLR and can control the aperture, try taking a picture outdoors at f/22 or f/32 (many will allow that). It will be pretty fuzzy.
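Here is the same arithmetic as a short Python sketch (my own numbers from above, using the single-slit approximation sin α = λ/D, without the 1.22 factor often quoted for circular apertures):

```python
import math

# Reproduce the diffraction-limited aperture estimate from the text.
pixel_pitch = 0.003e-3   # 0.003 mm cell spacing, in meters
focal_len = 100e-3       # 100 mm lens, in meters
lam = 500e-9             # 500 nm, the conventional "green" wavelength

# Half-angle: half the pixel pitch, subtended at the focal plane.
alpha = math.asin((pixel_pitch / 2) / focal_len)   # radians
print(f"half-angle = {math.degrees(alpha):.5f} degrees")   # ~0.00086

# Single-slit estimate: first diffraction minimum at sin(alpha) = lam / D.
D = lam / math.sin(alpha)
print(f"required aperture = {D * 1000:.1f} mm")            # ~33.3 mm
print(f"f-stop = f/{focal_len / D:.1f}")                   # ~f/3
```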

This is why astronomers like telescopes with a wide aperture. Not just because they are efficient "light buckets", but because the bigger the hole the light goes through, the less it spreads out, and the sharper an image you can obtain. Of course, a telescope with a focal ratio of f/3 or less is hard to build and expensive. But for a large instrument with a long focal length, those little pixels are very small "on the sky", allowing you to see finer detail in distant galaxies.
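As a rough illustration (my own numbers, this time using the Rayleigh criterion for a circular aperture; the 8-meter focal length in the last step is a made-up example):

```python
import math

lam = 500e-9   # 500 nm again

# Rayleigh criterion for a circular aperture: theta ~ 1.22 * lam / D.
for D in (0.0333, 1.0):   # the camera's 33.3mm aperture vs. a 1-meter telescope
    theta = 1.22 * lam / D                   # radians
    arcsec = math.degrees(theta) * 3600      # convert to arcseconds
    print(f"D = {D*1000:6.1f} mm -> sharpest detail ~ {arcsec:.3f} arcsec")

# Pixel scale "on the sky" for a long-focus instrument: pitch / focal length.
# Made-up example: the same 0.003mm pixels behind an 8 m focal length.
pixel_scale = math.degrees(0.003e-3 / 8.0) * 3600
print(f"pixel scale = {pixel_scale:.3f} arcsec/pixel")
```

The bigger aperture wins by a factor of thirty, which is the whole point.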

Now, if there is no sensor at the focus of the telescope, does the diffraction still occur? Here I attempt a détente between Bohr and Heisenberg. Heisenberg would say that the aperture, no matter its size, is "disturbing" the light beam from some distant object, spreading it out. The effect is never zero, no matter how big the aperture. This implies that the whole universe affects what happens to every bit of light, every photon, as it makes its way from source to wherever it is "going". But, whether we are watching what happens or not, it must still be happening. Bohr would have to admit that the "observer" is effectively the aperture, and by extension, that the universe itself is an observer, or is constituted of observers. Effectively, everything is an "observer" of everything!

On page 84, the author writes, "…the idea that the quantum measurement problem is a matter of 'disturbing' what is measured is exactly what the Copenhagen Interpretation denies." (Author's emphasis) I think this example shows that such a position is untenable. For my part, I think the Copenhagen Interpretation, as stated by Bohr and most of his followers, is simply silly. Photons go where they go and do what they do. So does anything else in motion. It is amazing that the environment, all of it, has an effect on where they go. But diffraction experiments show that it is so: Everything disturbs everything. However, for most of the universe out there, the disturbance doesn't seem to amount to much.

One quibble I have with the book is that it lacks a table of contents. The 19 chapters each open with just a two-page spread bearing a snappy title. In the chapter titled "The everyday world is what quantum becomes at human scales" (the 11th), the environment is brought in, along with the matter of "decoherence". Prior chapters have discussed all the things we find "weird", such as "entanglement" (for example, two particles that carry equal but opposite values of some characteristic such as polarization, even to the ends of the universe if they don't bump into anything). They get us ready for this chapter, the key chapter of the book.

Entanglement, just mentioned above, is one kind of Coherence between quantum entities. A laser beam and a Bose-Einstein condensate are, or express, coherent states among numerous entities. Coherence is thought to be fragile. It is actually quite robust, and even infectious. Particles that interact with their environment spread their quantum states around. The problem is, any instrument we might use to measure such quantum states is part of the environment, and so partakes of that state, becoming unable to detect it. That is what is meant by Decoherence. It expresses our inability to keep a pair of quanta, for example, in a given entangled state, because they "want" to spread it around. The longer we want them to stay in coherence, the more it will cost. However, it is this phenomenon of decoherence that leads directly to the human-scale, everyday behavior of objects. The author concludes that the entire universe "observes" everything that goes on, unless we take great pains to isolate something from the environment so we can measure it. It was the error of Bohr and others, in not recognizing the influence of the multitude of non-conscious "observers" known as the universe, that led to the silliness of the Copenhagen Interpretation.
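Decoherence has a standard textbook caricature, which I can't resist sketching (my own toy model, not the author's; the decay rate is made up): the off-diagonal terms of a qubit's density matrix, the "quantumness", fade as the environment interacts with it, while the ordinary probabilities survive untouched.

```python
import numpy as np

# Toy dephasing model: a qubit starting in the superposition (|0> + |1>)/sqrt(2).
# Its density matrix has off-diagonal "coherence" terms equal to 0.5.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)

gamma = 1.0   # made-up coupling rate to the environment, per unit time
for t in (0.0, 0.5, 2.0):
    # Environment interaction damps only the off-diagonal elements.
    damp = np.exp(-gamma * t)
    rho_t = rho.copy()
    rho_t[0, 1] *= damp
    rho_t[1, 0] *= damp
    print(f"t = {t}: coherence = {rho_t[0, 1].real:.3f}, "
          f"P(0) = {rho_t[0, 0].real:.1f}, P(1) = {rho_t[1, 1].real:.1f}")
```

The probabilities stay 50/50 forever; what dies is the ability to show interference, which is exactly the transition from quantum weirdness to everyday behavior.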

Or, perhaps I ought to be more charitable to Niels Bohr. Maybe he was right, that things only happen when they are observed by a conscious entity. But diffraction shows that every photon of whatever wavelength that passes through the universe, and every electron, neutrino, and so on, is an observer, and produces the universe that we see, in which most quantum phenomena require costly apparatus to observe and maintain (except diffraction!). If the universe required conscious observers, without which things could not happen, that would imply that God made the universe only after there were observers to keep it functioning! And that's funny. It may even be true! The Bible, in the book of Job (38:4-7), mentions multitudes of angels that observed the "foundation of the Earth". The Copenhagen Interpretation agrees with Job.

A late chapter discusses quantum computing, and the way it is being over-hyped (so what else is new?). Near the end of the discussion, I read that all the ways of making a quantum computer discovered so far are special-purpose. One can search, one can encode or decode, and so forth. It appears that, at the moment at least, no general-purpose quantum computer can be produced that would be analogous, in its breadth of function, to our general-purpose digital computers. So don't sell your stock in Intel or Apple just yet!

I was much refreshed by this book. The author's point of view is still somewhat "Copenhagenish", but that's OK with me. If decoherence is what he says it is, then it really is true that what we see at our scale of a meter or two is just the consequence of quanta doing what they do, in all their multitudes, and spreading their characteristics about quite promiscuously, so that the universe just keeps keeping on, as it has since the Big Bang, at the very least.
