When I was about ten, I was disappointed in a picture I'd taken. I had been too far from the person I was "shooting", so he looked like no more than a couple of dots. Having recently learned about enlargements, I suggested getting the middle of the picture enlarged. My father remarked that the photo shop charged a lot for enlargements. Then I suggested putting it under my microscope and taking another picture, then getting that printed; I'd already been setting up a clumsy rig with a tripod holding Dad's camera at the eyepiece and taking photos of the cells in thin-sliced carrots and leaves. He said I could try, but it would be very blurry, then explained about the grain in the print and in the negative. I looked, and sure enough, even at 25X the film grain made the picture look like it was printed on sand.
The next year he and I made a small telescope (I still use it), and I learned about diffraction and the magnification limit of an optical system. I realized that, even if the film and print grain were a hundred times smaller, and even if the optics of the camera were flawless, diffraction would limit how much I could enlarge the final image.
This is an illustration of the Rayleigh criterion for resolving star images in a telescope. I downloaded it from the Angular Resolution article in Wikipedia. The upper section shows that the Airy Disks of the two stars are fully separated. The Airy Disk is everything inside the first dark ring (first null). The lowest section shows serious overlap, and the middle section shows the Rayleigh criterion, at which point the first null of one Airy Disk passes through the center of the other. This is the accepted resolution limit of a telescope system, or indeed, any optical system, including the eye.
What causes this pattern? It results from the diffraction of light from a distant point source (or several) as it passes through a circular aperture. Just by the way, if you should get the notion to make a telescope with a rectangular aperture, under high magnification you'll get a diffraction pattern more like this:
Such diffraction patterns, I realized one day, are a visible manifestation of quantum-mechanical effects. If you could solve the Schrödinger Wave Equation for this system, the squared magnitude of its solution would look like this image. In the SWE, the solution is a complex-valued amplitude that represents probabilities; the squared magnitude of that amplitude at any point gives the intensity of, for example, a beam of light or electrons, as it is spread through space by diffraction. One characteristic of the SWE is that, while there will frequently be numerous nulls, or zeroes, in the solution, there is no greatest angle or maximum distance beyond which its solution is always zero. This is why even huge telescopes such as the 10 m diameter Keck telescopes in Hawaii still have a diffraction pattern once all other aberrations are accounted for (the atmosphere is a much bigger light scatterer "down here", though).
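To put numbers on the Rayleigh criterion mentioned above, here is a minimal Python sketch (my own illustration, not from the book) of the formula θ ≈ 1.22 λ/D; the 60 mm aperture for a small homemade telescope is an assumed example value.

```python
import math

def rayleigh_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

WAVELENGTH = 550e-9  # yellow-green light, 550 nm

# Assumed example apertures: a small homemade telescope and a Keck mirror
for name, aperture_m in [("60 mm homemade telescope", 0.060),
                         ("10 m Keck mirror", 10.0)]:
    print(f"{name}: {rayleigh_limit_arcsec(WAVELENGTH, aperture_m):.3f} arcsec")
```

The small scope comes out near 2.3 arcseconds, the Keck-sized mirror near 0.014; no amount of extra magnification sharpens the image beyond that.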
So, think of it. The yellow-green light that our eyes are most sensitive to has a wavelength of 0.55 µm, or 550 nm. That's pretty small, about 1/1800 mm. And, even if we are comfortable with photons, the minimal packets of light, we think of them as having a similar "size". But diffraction patterns show us that a photon can somehow "sense" the entire aperture as it "chooses" by how much to change its direction of travel. A certain experiment that has been done with both photons and electrons proves it:
- Set up a very, very light-tight box with a dimmable light source at one end, a sheet with a hole in it about midway, and either a sheet of film or an array of sensitive detectors (e.g. a digital camera sensor) at the opposite end.
- Let's assume the light source is accompanied by a lens system that makes a uniform beam larger in diameter than the hole in the sheet.
- Set the "brightness" of the light source such that there will very seldom be more than one photon inside the box at any one time. That's pretty dim!
- A 550 nm photon has an energy of 2.254 eV.
- A 1 mW yellow-green laser set to that wavelength (you can do that with dye lasers) emits 2.77 quadrillion photons per second.
- Light traverses a 1-meter box in about 3 ns.
- The 1 mW laser thus emits 8.3 million photons in those 3 ns.
- Thus you must dim the beam by a factor of more than 8 million. That is 23 f/stops, or an ND of 6.9. Two pieces of #9 welding glass are about right.
- Close the box, turn on the light, and wait about 3 hours.
- Develop or download the resulting image. It will have the same diffraction pattern as if you'd left off the filters and shot a picture in 1/1000 sec.
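For the curious, the arithmetic in the steps above can be checked with a few lines of Python. This is just a back-of-the-envelope sketch; using the exact 3.34 ns transit time instead of the rounded 3 ns gives about 9 million photons in flight rather than 8.3 million, the same order either way.

```python
import math

H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

wavelength = 550e-9                 # 550 nm yellow-green light
photon_energy = H * C / wavelength  # energy per photon, in joules
print(f"photon energy: {photon_energy / EV:.3f} eV")       # ~2.254 eV

power = 1e-3                        # 1 mW laser
photons_per_sec = power / photon_energy
print(f"photon rate: {photons_per_sec:.3g} per second")    # ~2.77e15

transit = 1.0 / C                   # time to cross a 1 m box, ~3.3 ns
photons_in_box = photons_per_sec * transit
print(f"photons in flight at once: {photons_in_box:.3g}")  # ~9 million

# Dimming factor so that there is usually at most one photon in flight:
print(f"f-stops of dimming: {math.log2(photons_in_box):.1f}")  # ~23
print(f"neutral density: {math.log10(photons_in_box):.1f}")    # ~7
```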
The experiment has been done many times, usually using a two-slit setup. Either way, it shows that both a photon and an electron somehow "self-interfere" as they are influenced by everything along the way from emitter to "final resting place."
All the above serves to get my mind in gear to write about The Quantum Moment: How Planck, Bohr, Einstein, and Heisenberg Taught Us to Love Uncertainty, by Robert P. Crease and Alfred Scharff Goldhaber. The authors, professors at Stony Brook University, aim to demonstrate that "quantum stuff" keeps things from either collapsing or flying apart; that we owe our lives to it. Dr. Goldhaber, in particular, draws upon classroom experience, for he teaches a course that uses optics to introduce quantum mechanics.
The book is filled with mini-histories and mini-biographies of the "physics greats" of a century ago, who wrestled with phenomena revealing that Newtonian mechanics is not up to the task of explaining all the little stuff that underlies our everyday experience. Optical diffraction is just one such phenomenon. If there were no diffraction, you could put a really powerful eyepiece on an ordinary pair of binoculars and see to the end of the universe...if your eyes were sensitive to really, really dim light (telescopes are big mainly to collect more light; high resolution is also good, but is secondary in many cases).
Einstein imagined riding a beam of light from emitter to absorber. Nowhere have I read an explanation that, from the photon's point of view, nothing happens at all. The special theory of relativity, with its Lorentz length contraction and time dilation, applies only to non-photons, in particular to particles with mass. If you take Lorentz contraction and time dilation to their limits at v = c, the photon travels no distance at all, and does so in zero time. So there is nothing to experience! From a photon's point of view, the entire universe has zero size and time has no meaning; the big bang may as well never have happened!
What if we step back a tiny bit, and imagine the neutrinos that arrived in 1987, heralding the core collapse of an immense star in the Large Magellanic Cloud, Supernova 1987A (SN 1987A)? I haven't read any analysis of their apparent velocity, but it must have been only the tiniest whisker slower than c. Neutrinos do have some mass, perhaps a few billionths of the mass of an electron, so they tend to have near-c velocities. It is likely that the "clock" of those neutrinos registered only a few minutes during their journey of about 168,000 light-years, and that the contracted distance amounted to just a few light-minutes, some tens of millions of kilometers. Now, that is relativistic.
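A quick sketch of the arithmetic, with an assumed neutrino rest mass of 0.5 meV and the roughly 10 MeV energies typical of supernova neutrinos (both illustrative assumptions, since the actual masses are only bounded, not known):

```python
SECONDS_PER_YEAR = 3.156e7
LIGHT_MINUTES_PER_LY = 5.26e5

distance_ly = 168_000  # distance to SN 1987A in the LMC, in light-years
energy_ev = 10e6       # assumed neutrino energy: 10 MeV, typical for a supernova burst
mass_ev = 5e-4         # assumed rest mass: 0.5 meV (illustrative; real masses are only bounded)

gamma = energy_ev / mass_ev  # Lorentz factor, E / (m * c^2)
proper_time_min = distance_ly * SECONDS_PER_YEAR / gamma / 60
contracted_lmin = distance_ly * LIGHT_MINUTES_PER_LY / gamma

print(f"Lorentz factor: {gamma:.2g}")                          # ~2e10
print(f"neutrino's own clock: {proper_time_min:.1f} minutes")  # ~4 minutes
print(f"contracted distance: {contracted_lmin:.1f} light-minutes")
```

Note that for a particle moving at essentially c, the contracted distance in light-minutes always matches the proper time in minutes, which is why the two answers agree.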
What did Einstein and Planck and Heisenberg do that got everyone (among physicists) all in a dither for the first half of the Twentieth Century? First, Planck applied a minimum limit to the "packets" of energy radiating from a heated object, in order to combine two competing, and incompatible, mathematical models of "black body radiation" into a single formula. Einstein later showed a simpler derivation of that formula. But at first, physicists just thought of it all as a mathematical trick. In between, Einstein had described a good theory of the photoelectric effect, which seemed to require that light come in finite packets, which we now call photons.
Photons are usually small in terms of the energy they convey. As mentioned above, the yellow-green color seen at 550 nm wavelength is carried by photons with an energy of 2.254 eV (electron-volts). An eV is about a sixth of a billionth of a billionth of a joule (1.6×10⁻¹⁹ J), and one watt is one joule per second. But molecules are also small, and the energies that underlie their structure are similarly small. UVB radiation from the sun, at just half the wavelength, and thus twice the energy, of "550 nm yellow-green", breaks chemical bonds in your skin, causing damage that can lead to cancer. So use sunscreen! (Half of 550 nm is 275 nm, just past the short end of the UVB band, with a photon energy near 4.5 eV; more than enough to knock a carbon-carbon bond for a loop.)
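To check that claim with the photon-energy formula E = hc/λ: a typical carbon-carbon single bond holds about 3.6 eV (around 347 kJ/mol, a textbook average I'm assuming here).

```python
HC_EV_NM = 1239.84  # h*c in eV*nm, so E[eV] = 1239.84 / wavelength[nm]
CC_BOND_EV = 3.6    # typical C-C single bond, ~347 kJ/mol (assumed textbook average)

for nm in (550, 275):
    energy = HC_EV_NM / nm
    verdict = "breaks" if energy > CC_BOND_EV else "spares"
    print(f"{nm} nm photon: {energy:.3f} eV ({verdict} a C-C bond)")
```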
Book after book is filled with the stories of the founders and discoverers of quantum physics. This book puts it all into a context that the authors call the Quantum Moment. They use the word "moment" the way a historian uses "era". From 1687 until 1927, the Newtonian Moment dominated about 240 years of physics discovery. Once a critical mass of physicists had to accept that quantum phenomena were real, not just mathematical tricks, the Quantum Moment arrived. The story of the epic battle between Bohr, who formulated the Copenhagen Interpretation, and Einstein, whose work stimulated Bohr and others but who then recoiled from where it led, is told here with more feeling and clarity than in any other account I've read.
Scientists have an emotional bond with their science. For many of them, it is their church, which they defend as keenly as any ardent fundamentalist Christian defends his church's theology. In the Newtonian Moment, phenomena whose initial state could be perfectly described were thought to be perfectly predictable. The math might be gnarly, but it could, in principle, be done. Quantum theory, and then quantum mechanics, blow by blow cracked this notion open and showed it to be a fantasy.
This is not just the problem of imperfect knowledge, rounding errors, or the need to simplify your equations to make them solvable. Heisenberg's Uncertainty Principle is not just a description of the way a measurement apparatus "kicks" a particle when you are measuring its location or velocity. What is Uncertain is not your measurement, but the actual location and velocity of the particle itself, at least according to Bohr. One implication of this with more recent application is the "no-cloning" theorem, which makes certain applications of quantum computing impossible. However, it also makes it very possible to create unbreakable cryptographic codes, which has the governments of the world (or their equivalents of our NSA and CIA) all aquiver.
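To put a number on it, here is a minimal sketch of the limit Δx·Δp ≥ ħ/2, applied to an electron confined to roughly one atom's width (the 0.1 nm figure is my illustrative choice, not the book's):

```python
HBAR = 1.055e-34        # reduced Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron mass, kg

delta_x = 1e-10  # assumed confinement: ~0.1 nm, about one atom's width
delta_p = HBAR / (2 * delta_x)  # minimum momentum uncertainty, kg*m/s
delta_v = delta_p / M_ELECTRON  # corresponding velocity uncertainty

print(f"minimum velocity uncertainty: {delta_v:.3g} m/s")  # ~5.8e5 m/s
```

Pinning the electron down to one atom's width forces its velocity to be uncertain by over half a million meters per second; this is a property of the particle, not of the measuring gear.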
Then there's the cat. The authors give us the luscious details of Schrödinger's Cat satire, which he proposed as a slap against the notion of an "observer". Bohr and others needed some instruction from optics: every quantum particle is sensitive to, very literally, everything in the universe. All at once, and with no apparent limitation set by c. Heck, half the time, the cat is the only observer that matters. The other half, the cat is dead, and it ceases to matter to him. But, the authors point out, the air in the box is an "observer": the exchanges of oxygen, water, and carbon dioxide around a breathing cat are quite different from those near a dead one. So all we can say from outside the box with the cat in it is that we can't decide the status of the cat without looking inside. We just need to remember that the term "observer" is very squishy.
I recall reading that even a pitched baseball has a "wavelength", according to the deBroglie formula. It is really tiny; in fact, only a handful of times larger than the Planck length of about 1.6×10⁻³⁵ m. That means the deBroglie wavelength of a jet aircraft is much, much smaller than the Planck limit, which is why "real world" phenomena are easily treated as continuous for practical matters.
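A quick sketch of the deBroglie relation λ = h/(mv); the baseball and jet figures (0.145 kg at 40 m/s, 180 tonnes at 250 m/s) are rough assumed values:

```python
H = 6.626e-34            # Planck's constant, J*s
PLANCK_LENGTH = 1.6e-35  # meters

# Rough assumed values: a 90 mph fastball and a cruising airliner
for name, mass_kg, speed_ms in [("pitched baseball", 0.145, 40.0),
                                ("jet aircraft", 180_000.0, 250.0)]:
    wavelength = H / (mass_kg * speed_ms)  # deBroglie wavelength, meters
    print(f"{name}: {wavelength:.2g} m, "
          f"{wavelength / PLANCK_LENGTH:.2g} Planck lengths")
```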
But the Cat, and the Uncertainty limit, show that the boundary between quantum and "classical" worlds is hard to pin down. Since that boundary is the core of the Copenhagen Interpretation, the Interpretation is seen as weak at best, and in the eyes of some physicists, simply wrong. But there is no well-attested competing theory.
We must remember that the theories and mathematics of quantum "stuff" describe lots of "what" and a little bit of "how". They tell us nothing about "why". We don't know why there is a Pauli Exclusion Principle, by which two electrons, and only two, can coexist in an atomic "s" shell, and then only if they have opposite spins (and that "spin" is oddly different from the way a top spins). But we do know that, if it were not so, atoms would collapse in a blast of brightness, almost immediately, and the universe would collapse back into a reverse of the big bang, all at once and everywhere.
One scientist's work is not mentioned in this book, probably because he wasn't directly involved in the quantum revolution. But his work is pertinent in another way. Kurt Gödel formulated his Incompleteness Theorems in 1931, early in the Quantum Moment. Together, they show that no consistent mathematical system rich enough to include arithmetic can "solve" every problem that can be stated using its postulates, nor be used to establish its own consistency. For example, there are rather simple polynomials that can be formulated using algebra, but can only be solved using complex numbers. Even weirder, if you know only algebra: the simple equation X² = 1 has two answers (1 and -1), but we tend to think that Xⁿ = -1 has only the answer -1 when n is odd, and no real answer at all when n is even. But with complex numbers, when n = 3, for example, there are three answers, two of them involving an "imaginary" part.
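A quick check using Python's standard cmath module, writing the three cube roots of -1 as e^(iπ(2k+1)/3):

```python
import cmath

# The three cube roots of -1 are exp(i*pi*(2k+1)/3) for k = 0, 1, 2
for k in range(3):
    root = cmath.exp(1j * cmath.pi * (2 * k + 1) / 3)
    print(f"{root:.3f} cubed -> {root ** 3:.3f}")  # each one cubes back to -1
```

The roots come out as 0.5 + 0.866i, -1, and 0.5 - 0.866i, and cubing each returns -1 (up to rounding).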
At present, then, science has three boundaries to infinite exploration:
- Heisenberg Uncertainty: You can't know everything to infinite precision.
- Schrödinger Undecidability: You can't predict quantum phenomena on a particle-by-particle basis. Even if you could escape the Uncertainty Principle, you couldn't do anything of great use with the results (which would fill all the computers in the known universe, just describing a helium atom to sufficient precision).
- Gödel Incompleteness: You can't solve most of the questions being asked in the framework of quantum mechanics, not now, not ever, using the methods of quantum mechanics. QM appears to be the most Gödelian of mathematical systems, in that it asks so few questions that can be answered!
For scientists who grew up in the Newtonian Moment, it is like finding out that their church has no roof, and the rain and raccoons are getting in and taking over the place. No wonder Einstein was upset! We are in the Quantum Moment, nearly 90 years into it, and it may be another century or two before a new Moment supersedes it. Get used to it.