Friday, September 30, 2011

It doesn't take much of a molecule

kw: drugs, laws

The concept usually called the Therapeutic Window, the gap between an effective dose and a harmful one, is relevant to recent news. First, there is the issue of a number of designer drugs collectively called "bath salts". They are hallucinogens with this characteristic: the amount you need to get "high" is rather close to an overdose. They have a rather narrow Window. The range between overdosing, where psychotic symptoms can become permanent, and a fatal overdose is, fortunately, a little wider.
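For the technically minded, here is the idea in a few lines of Python. Caveat: the doses are invented purely for illustration; they are not real pharmacology for any of the drugs mentioned.

    # Toy illustration of a therapeutic window; all doses are hypothetical.
    # One common measure is the therapeutic index: the dose that harms half
    # of subjects (TD50) divided by the dose that works for half (ED50).
    def therapeutic_index(td50_mg, ed50_mg):
        """Toxic-dose-for-half over effective-dose-for-half; bigger is safer."""
        return td50_mg / ed50_mg

    # Made-up numbers, purely for illustration:
    print(therapeutic_index(td50_mg=1000.0, ed50_mg=10.0))  # 100.0: a wide window
    print(therapeutic_index(td50_mg=15.0,   ed50_mg=10.0))  # 1.5: "high" sits next door to overdose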

This is one of the "bath salts" molecules. Strange that such a small molecule can have such profound effects. Of course, one of the smallest poisonous molecules, HCN, or hydrogen cyanide, is fatal in very small amounts.

One fortunate circumstance: little Delaware is likely to outlaw "bath salts", perhaps as early as today. The world would be a better place if marijuana were legalized and the sale of "bath salts" hallucinogens were made a felony. Pot has a very wide therapeutic window. Whether used for pain management or recreationally, it is hard to kill yourself with it.

Then there is this stuff. Propofol, the molecule that killed Michael Jackson. In yesterday's court debate, the defense claimed that the doctor was weaning the singer off the drug. Strangely, he had just ordered another three gallons of it!

There is no way it ought to be used as a sleep aid. It is used for a quick knockout when a patient is being prepped for surgery. Its effects only last for ten minutes, by which time safer long-term anesthetics have been administered.

Basically, if you can't sleep without an injection of something, you've got huge problems. I suspect the King of Pop was using stimulants to sustain his frenetic performances, to the extent that he couldn't sleep without something equally powerful to counteract them.

Finally, heads up on book reports: I am a quarter of the way through a 990-page novel, so it'll be a while before I get it done and review it.

Wednesday, September 28, 2011

Global warming might save us

kw: observations, global warming, sunspots, ice ages

This graph, from Wikimedia Commons, is one of the most striking compilations of long-term observations to be had. The blue line charts more than 260 years of sunspot observations gathered since daily observation of the sun began in 1749. The red symbols add less regular observations that stretch back to the time of Galileo.
The most prominent feature is the regular 11-year solar cycle, followed by the significant variations in the height of the solar maximum with each cycle. The Maunder Minimum, a roughly 70-year stretch (about 1645-1715) in which several solar cycles passed with nearly no sunspots, coincides with the first, and coldest, phase of the Little Ice Age, which ran from about 1650 to 1880.

There is no exact correspondence between sunspot activity and climatic heating or cooling. Solar Cycle 4, the beginning of the Dalton Minimum, lags a little behind the significant cooling episode that caused the Continental Army under George Washington such suffering during the Revolutionary War. Many of our weather proverbs relate to late LIA conditions in New England. In particular, the old saw about the groundhog's shadow on Feb 2, which took root in Pennsylvania in the mid-1800's, is quite a bit out of date. Clear conditions in midwinter no longer foretell a coming cold spell nearly so accurately. We are now in the midst of, or perhaps near the end of, the Modern Maximum, which began with Cycle 18 about the time I was born.

The next image, also from Wikimedia Commons, shows recent cycles in more detail, plus predictions for Cycle 24, which began in 2010 rather than 2007 as predicted, and for Cycle 25, which is sheer speculation at this point.
The prediction for Cycle 24 is lower now. Cycle 23 peaked at around 135, while the prediction in the image below, from nwra.com, shows a peak of about 90. That matches the range of sunspot maxima during the early 1900's, and it was upon those cycles that predictions of a cooling episode were made in the 1960s, before the Modern Maximum became clear. It is not stated where the very low prediction for Cycle 25 comes from, but if Cycle 24 peaks at 90 or lower, and 25 is similar, we'll have significant cooling by the year 2030.

One feature of this chart deserves discussion. The curves go below zero! This is because sunspot number was correlated with f0F2 several decades ago, and now "official" sunspot number is taken indirectly. The f0F2 parameter refers to the critical frequency of reflection for the F2 layer in the ionosphere, measured by HF radar. Ham operators occasionally hear loud clicks in various HF bands (3-30 MHz) around sunrise, midday and sunset, caused by the radar measurements, which take only a few seconds. Since the correlation is not perfect, some measurements of critical frequency "figure out" to a sunspot number below zero. I have not located actual sunspot counts for recent years with which to compare this graph.

A lot is riding on the actual magnitude of Cycle 24. It could fool us; there is a lot we still don't know about solar dynamics. We have hardly a clue as to the cause of the Maunder Minimum. What we do know is that, although sunspots are cooler than the rest of the solar disk, the rest of the disk runs a little hotter on average, so that a strong cycle like Cycle 19 might deliver roughly 0.1% more sunlight (a watt or two per square meter) to Earth than occurs during a solar minimum such as 2008 or 1996. Averaged out over twenty years or so, the reduced sunlight from a couple of weak cycles can make for stronger ENSO episodes (El Niño-Southern Oscillation) and deeper winters. A single year does not show such trends clearly, but decadal averages can. We'll have to wait and see.
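For those who like to see the arithmetic, here is a back-of-envelope sketch in Python. The solar constant and albedo are round textbook figures, not measurements tied to any particular cycle.

    # Back-of-envelope: what a ~0.1% dip in solar output does to the
    # global energy budget.  Round numbers, nothing cycle-specific.
    S      = 1366.0   # solar constant, W/m^2 (approximate)
    dip    = 0.001    # ~0.1% variation between solar max and min
    albedo = 0.3      # fraction of sunlight reflected straight back

    delta_tsi     = S * dip                       # change at the top of the atmosphere
    delta_forcing = delta_tsi * (1 - albedo) / 4  # averaged over the whole sphere
    print(delta_tsi, delta_forcing)               # ~1.37 and ~0.24 W/m^2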

In the meantime, let us consider the Medieval Climate Optimum, the period of 500-600 years before the LIA, during which global temperature was probably at least 2-3°C warmer than the decade of 2001-2010. Crops flourished and it was a time of prosperity. Global warming could drive us not just to that point, but beyond it. However, if we have a couple of weak solar cycles, the warming will likely be largely offset. Whether this is seen as coincidence or divine providence, it appears at first blush that we're going to have a postponement of the full effect of global warming, giving us time to increase the efficiency with which we use energy, perhaps even moving our transportation habits back to cars with Model A performance (they'd go 50-60 mph or 80-100 kph, but with 40 HP, took a while to get there). That's what it will take to approach 100 mpg (160 km per gallon).

The other side of the coin is that, without global warming, we could have another series of winters such as those of the 1780's. Imagine postponing your Easter Egg hunt until the snow melts, or wearing your flannel underwear from September to April or May (add six months to these if you're in South America or Australasia). I like cold weather, but it has its limits. I have lived in South Dakota, and having SD conditions in PA would take some getting used to, such as putting our water main deeper underground where it couldn't freeze. A little global warming might just be our best friend!

Tuesday, September 27, 2011

Skin deep and little else

kw: book reviews, nonfiction, taxidermy, biographies

Quick quiz to start things off: You have a choice of artifacts to purchase and take home, perhaps to display in your den. One is a mounted pair of fox kits in an attitude of play, the other is a jackalope. Assuming prices are similar (which is unlikely), which would you rather have? Based on sales figures just at Wall Drug in South Dakota, which sells 1,200 mounted jackalopes each year, the odds are about a thousand to one in favor of that horned hare! The last time I was at Wall Drug, I was told the most popular version of the jackalope has antelope antlers rather than the young deer antlers shown here, plus pheasant wings.

That's a lot of jackrabbits that had to die to satisfy a whimsy. But tell me, is the mounted pair of fox kits any less whimsical? It is more "authentic", because it depicts something real. But this is not a hunting trophy, like that stag's head on the wall of the neighborhood sports hunter. I sure hope the kits weren't killed on purpose just to make this display. But museum hunters still kill many animals just for the purpose of mounting them for display in dioramas or other museum exhibits. Even the tiny local Natural History Museum has a couple dozen rather large taxidermized animals (lion, wildebeest and similar critters) mounted and on display.

The Authentic Animal: Inside the Odd and Obsessive World of Taxidermy by Dave Madden is part biography and part natural history, the natural history of that interesting human variant, taxidermists. The biographical part concerns Carl Akeley, who has been called the father of modern taxidermy, even though nearly every technique he promoted had been invented by others, sometimes a century prior. The author devotes about half the book's text to biographical material and the other half to his own rather obsessive (as he admits) pursuit of things taxidermic. He interviewed a number of them, and attended several competitive exhibits, where judges examine every hair or feather, every skin fold, of a mount for perfection of technique, as well as the overall composition and arrangement.

At one such show, Madden was admiring a lovely wood duck on a pedestal, when suddenly it moved! A man standing nearby said, "He's a good duck." Quiet and unflappable, this living duck happened to be well practiced in stillness, a necessary skill for all prey animals. At a taxidermy show, this pet duck was remarkable, being the only living animal present.

You know the old joke: If you hear "clippity clop" hoofbeats outside your window, you don't automatically think "Zebra!" Well, unless you are on safari in the Okavango Delta. There, it is the horse that is unusual.

Carl Akeley became something of a legend in his own time, in part by killing a leopard bare-handed once he ran out of ammunition (he missed it at least five times). It attacked him, going for the throat, and he got his right arm up and into its jaws, then punched it to death, breaking ribs and finally puncturing the lungs. As you can see, this was a young animal, a juvenile. A stronger adult animal would have been able to bite through the arm bones and finish Akeley off rather quickly.

This was on Akeley's first visit to Africa, after he became an established taxidermist at the Field Museum in Chicago in about 1896. He visited Africa five times, dying there of dysentery in 1926.

As the story goes, when he was a boy, the pet bird of a family friend died, and young Clarence (who preferred Carl in later life), learning from books, skinned it and mounted the skin in a pretty lifelike pose. It is not clear how much the friend appreciated the gesture.

As a young man, Carl worked at Ward's, a commercial taxidermy house that catered to rich clients. It was a bit of a sweat shop, and Carl's preference for getting the details right made him slower than the others. It led to his firing after a few years. Yet that same eye for detail, and a few innovations in preparing lighter-weight manikins, led to museum jobs and growing fame.

But I find most interesting the author's travels in the world of taxidermists. He never tried to mount an animal skin himself. He studied those who do, learning what makes them tick. Most are hunters, but as time passes, most taxidermists turn their skills to mounting others' kills, either as favors or as a business, or to win awards. At one time, taxidermists made their own manikins, carved from soft wood or foam, or molded in hollow plaster or fiberglass. Now you can buy a dozen sizes of prepared manikin for the animal of your choice, in most cases. You can get a deer manikin of your chosen size with four neck positions, from erect to full sneak (almost feeding position). With luck, the skin can be mounted forthwith. One may need to do a bit of carving here and padding there. The artistry that remains is all in the plumping of lips and arrangement of eyelids about glass eyes, and a few other details here and there.

It kinda reminds me of the way ham radio has changed since I was licensed in the 1980s. You could at that time still buy parts or kits and make a radio, or even scrounge at a surplus store and really go "homebrew". A very few hams still build, but most are "appliance operators". When you can buy a transmitter for $250 that is better than anything you are likely to build, why go to the effort? Most hobbies are the same, and both hobby and commercial taxidermy are almost wholly dependent on the vendors. Basically, you start with a dead animal, skin it, then use one of several commercial flensing machines to strip inner flesh from the pelt, leaving, for a squirrel-to-deer-sized animal, no more than a millimeter or two thickness. After that, you use various softening compounds and adhesives, and commercial arsenical or insecticidal soap, a commercial manikin, glass eyes chosen from quite an impressive selection, and perhaps even a photo-mural diorama out of the catalog. It's getting harder to find areas in which to display one's artistry.

There is a long chapter on why we kill animals, anyway, besides the need to eat some of them. Bragging rights is one big motivator (and I find the impulse understandable but nauseating). I suppose I must, grudgingly, admit the museum's need to educate the public with properly-mounted specimens of animals most people would never see. But programs like PBS's Nature, and series like Wild Kingdom elsewhere, let people see the animals in motion, albeit in heavily-edited footage that doesn't show much of the boring stuff (it takes a lion or leopard a couple days to sleep off a good meal).

There is also a chapter about human taxidermy and similar efforts. There was only one real taxidermist of humans, Edward Gein, a person who was clearly deranged; most of his makings were fetish objects. In a house filled with skulls he had lampshades and chairs of tanned human skin, and lots and lots of "unmentionables". At least Gein didn't make fresh kills. He was a grave robber. Kind of like an archetypal Ygor, only a dozen points crazier. But in the same chapter the author discusses a few exhibits of preserved humans, other than mummies, in which the skin is missing. The most celebrated is Body Worlds, which I went to see when it showed in Philadelphia a few years ago. You can't call it taxidermy, as "derm" means "skin." Maybe the ones that just show veins and arteries outlining an entire body are "taxiveny". Many of the rest of the plastinated bodies show musculature. I found it eerily fascinating. Once was enough. It seems we have an inborn, powerful aversion to the taxidermy of humans.

What of the impulse to preserve an animal's figure in this way? Given that fewer than a third of us have the impulse to collect (making collecting, as Madden writes, deviant behavior), why collect mounted skins? Is it to honor the animal, or to connect with our own animal nature? As with other endeavors, the reasons are as diverse as height or the depth of one's voice. I like natural history museums, though I spend most of my time there with the rocks, some with the shells, some with bones, and almost none with the mounted "specimens". I am not comfortable in a home "decorated" with mounts. A mounted fish or moose head in a restaurant doesn't put me off my feed, but doesn't interest me, either. I suppose there are those whose favorite dining room is filled with mounts. For this subject, more than most, "to each his own" is full of meaning.

Monday, September 26, 2011

Physics surprise - or is it?

kw: science, physics

In the CNET article Physics Shocker - Neutrinos clocked faster than light, work is reported that measured the speed of neutrinos which traveled 732 kilometers through solid rock, from CERN in Switzerland to the Gran Sasso laboratory in Italy, and arrived 61 nanoseconds "too soon". That is, they arrived sooner than the time it would take light to cover the same distance in a vacuum, which is supposed to be the shortest travel time possible. By the way, the caption to the article's illustration mentions "neutrons", which is an error.
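To get a feel for the size of the claim, a couple of lines of arithmetic (Python here) put the 61 nanoseconds in proportion:

    # How big is the claimed effect?  732 km at c, versus arriving 61 ns early.
    c = 299_792_458.0              # speed of light, m/s
    light_time = 732_000.0 / c     # light needs ~2.44 milliseconds for 732 km
    excess = 61e-9 / light_time    # the 61 ns head start, as a fraction
    print(light_time * 1e3, "ms;", excess)   # ~2.44 ms; ~2.5e-5 (0.0025% over c)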

This is no surprise to me, because of a well-known property of x-rays: they have a refractive index less than one in most media. Refractive index is the ratio of the velocity of a phenomenon in vacuum to its velocity through a medium. Thus, one variety of crown glass has a refractive index of 1.51 for a certain wavelength of visible light; this implies its velocity through the glass is 1/1.51 times c, or 0.662c. The refractive index of an x-ray of moderate energy (30 keV, wavelength 0.0413 nm) in water is 0.99999974, which implies that the x-ray's phase speed is 1.00000026c. I say "phase speed" rather than velocity because it is one of two ways wave speed is measured for a wave packet (the two are equal in a non-dispersive medium); refractive indexes are formally defined as the ratio of phase speeds.
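Anyone who wants to check these figures can do so in a few lines of Python; the constants are just the ones quoted above.

    # Verify the figures quoted above: phase speed from refractive index,
    # and wavelength from photon energy.
    n_glass = 1.51          # crown glass, visible light
    n_water = 0.99999974    # water at ~30 keV

    print(1 / n_glass)      # 0.6623...: light crawls at 0.662c in the glass
    print(1 / n_water)      # 1.00000026: the x-ray's phase speed exceeds c

    # lambda(nm) = hc/E, roughly 1239.84 eV*nm / E(eV)
    print(1239.84 / 30_000.0)   # 0.0413 nm, as stated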

Try this sometime on a windless day: throw a small stone into a pond, then watch the ripples carefully. The expanding ring of ripples, typically about ten wavelets, will show a kind of rolling effect; new wavelets appear at the trailing edge, travel through the thickness of the ring, and die out at its forward edge. These wavelets are moving at the phase speed, while the slower speed of the ring is called the group speed. Standard physics analysis indicates that x-ray photons travel at the group speed, but have an internal structure that exhibits phase speed "rolling", and this is what interacts with the medium to refract it at an angle we measure to determine the refractive index.
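For the mathematically inclined, the two speeds fall right out of the textbook dispersion relation for deep-water gravity waves. A little Python sketch (gravity waves only; the tiniest capillary ripples actually behave the other way around):

    # Deep-water gravity waves: omega^2 = g*k, so the phase speed sqrt(g/k)
    # is exactly twice the group speed.  Crests outrun their own ring,
    # appearing at the rear and dying at the front, as described above.
    import math

    g = 9.81  # m/s^2
    for wavelength in (0.05, 0.2, 1.0):  # meters
        k = 2 * math.pi / wavelength
        v_phase = math.sqrt(g / k)
        v_group = v_phase / 2.0          # d(omega)/dk for omega = sqrt(g*k)
        print(wavelength, round(v_phase, 3), round(v_group, 3))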

This leads to an interesting property of x-rays: total external reflection. Anyone who has taken an elementary physics course will know about total internal reflection: beyond the critical angle, all the light moving from the medium toward vacuum (or air) is reflected back inside. Thus, when you are under water and look toward the surface, there is a circle in which you can see what is above the surface, and outside that circle, the water's surface appears as a mirror reflecting things below the surface. With x-rays it is just the opposite. At very shallow angles all the x-rays that hit a surface, such as of polished aluminum, are reflected outward with no losses (aluminum's refractive index for x-rays is in the range of 0.99999). This behavior was exploited to make the grazing-incidence focusing mirrors of the Chandra X-ray Observatory, which gathers x-rays from an area roughly a meter in diameter to a spot a micron wide, producing images with resolution similar to a telescope with an 8-inch (20 cm) aperture.
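The grazing angles involved are tiny. For an index just below one, n = 1 - δ, the critical grazing angle is about the square root of 2δ radians; here is that arithmetic in Python, using the index values quoted above:

    # Critical grazing angle for total external reflection:
    # theta_c ~ sqrt(2*delta), where the refractive index is n = 1 - delta.
    import math

    for name, n in [("aluminum, n ~ 0.99999", 0.99999),
                    ("water at 30 keV",       0.99999974)]:
        delta = 1.0 - n
        theta_c = math.sqrt(2.0 * delta)   # radians (small-angle approximation)
        print(name, round(math.degrees(theta_c), 4), "degrees")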

An x-ray lens was invented that exploits a different result of this sub-unity refractive index. A rod of aluminum or beryllium can be drilled with small holes that cross the central axis at various angles. Each hole does a little focusing in one plane, and by having various angles (three is common), a diverging beam of x-rays can be focused into a parallel or even converging beam. We are talking very small angles here. The effect makes it possible to focus x-rays, but has limited utility because even in Al or Be, x-rays are gradually absorbed, so the thickness of material needed for elaborate focusing would use up most of the beam's energy.

The experiment in Italy is surprising because it was set up to measure the group speed of the neutrinos. If further investigations support this result, it will be necessary to repeat them with the neutrinos moving through vacuum over similar distances, such as between the CERN accelerator complex and the ISS. Very careful timing (and careful synchronization of clocks to account for relativistic effects at orbital velocities near 8 km/s) will be needed to ascertain whether this is a refractive index situation or a property of the neutrinos that is independent of the medium. Maybe they'll turn out to be the fabled tachyons! (I kinda doubt it, but fun to dream…)

Saturday, September 24, 2011

My favorite paradox

kw: musings, logic

One of the famous stories of mathematics is about the "Hardy-Ramanujan Number" 1729. G.H. Hardy related:
I remember once going to see him when he was ill at Putney. I had ridden in taxi cab number 1729 and remarked that the number seemed to me rather a dull one, and that I hoped it was not an unfavorable omen. "No," he replied, "it is a very interesting number; it is the smallest number expressible as the sum of two cubes in two different ways."
Of course, when Srinivasa Ramanujan said "number" he meant a natural number, not an integer, or he'd have said "integer". This is because, as his notebooks show, 91 is the sum of two different cubes in two different ways, but one of the numbers being cubed is -5. Careful with his words, he was.
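Checking him is easy nowadays; a brute-force Python search confirms both numbers:

    # Smallest number that is a sum of two cubes in two ways: first with
    # natural numbers only (1729), then allowing negatives (91).
    from collections import defaultdict

    def first_double(lo, hi, limit):
        ways = defaultdict(set)
        for a in range(lo, hi):
            for b in range(a, hi):       # a <= b avoids counting (b, a) twice
                s = a**3 + b**3
                if 0 < s <= limit:
                    ways[s].add((a, b))
        return min(n for n, w in ways.items() if len(w) >= 2)

    print(first_double(1, 20, 2000))     # 1729 = 1^3 + 12^3 = 9^3 + 10^3
    print(first_double(-10, 20, 2000))   # 91 = 6^3 + (-5)^3 = 3^3 + 4^3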

Suppose you wanted to make a list of interesting numbers (positive only, of course). You could start out with 1, the first number; 2, the first even number and the only even prime; 3, the smallest odd prime; 4, the smallest perfect square other than 1; 5, the number of digits on a primate hand, and the number that, when multiplied by any odd number, replicates itself as the last digit of the result; and so forth.

Sooner or later you might come to a number about which you know nothing "interesting", and you can't find anything about it. Have you found the first "uninteresting number"? How Interesting! Does it go on the list, or not?

By the way, these days you'd have to go far to find a number about which nothing has been written. Wikipedia includes thousands of pages about number after number. That first "unidentified flying number" probably has at least five digits.

Friday, September 23, 2011

Nonlinearity and Heterodyning and the Sea

kw: analysis, nonlinearity, signal mixing

A fundamental concept related to yesterday's book review is nonlinearity, which leads to unexpected effects, and not just in giant oceanic waves. I first need to define what a mathematician means by linearity and nonlinearity. A linear system obeys superposition: scale the input and the output scales by the same factor, and the response to a sum of inputs is the sum of the individual responses. Such a system can be described by differential equations in which the terms being combined are independent, the variables can be separated, and there are no feedback effects. In a nonlinear system, there are frequently feedback effects, at least one term is recursive (i.e. it depends on the solution variable), and the terms are not all independent.
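The acid test for linearity is superposition: feed the system a weighted sum of two inputs and you must get the same weighted sum of the two outputs. A quick numeric probe in Python, with plain amplification standing in for a linear system and squaring for a nonlinear one:

    # Superposition test: L(a*x + b*y) == a*L(x) + b*L(y) for a linear L.
    import numpy as np

    x, y = np.random.randn(5), np.random.randn(5)
    a, b = 2.0, -3.0

    amplify = lambda s: 5.0 * s   # linear
    square  = lambda s: s ** 2    # nonlinear

    for name, L in (("amplify", amplify), ("square", square)):
        holds = np.allclose(L(a*x + b*y), a*L(x) + b*L(y))
        print(name, "obeys superposition:", holds)   # True, then False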

The simplest example is the frequency mixing that makes an AM radio work. The illustration below, from the Amplitude Modulation entry at TheFreeDictionary.com, shows how the radio signal is prepared.


There are a number of ways to combine the low-frequency signal (typically voice or music) with the high-frequency carrier (the radio wave), but the easiest to understand is a voltage-controlled amplifier (VCA). The audio signal is offset by a fixed voltage selected so that the combination will not reverse polarity (go below zero), and this biased audio signal is used to control the VCA, which is amplifying the radio carrier. The result is a modulated carrier.

The upper part of the image shows a pure radio signal, which has a fixed frequency. What changes once it is modified by modulation? If the modulating frequency is also a fixed frequency, the result is that there are four frequencies in the output of the VCA: the two input frequencies and two more which are their sum and their difference. Thus, if the carrier is at a frequency of 1 MHz (1000 on the AM radio dial), and the modulating signal is at a frequency of 200 Hz (near A-flat below middle C), the VCA's output will contain four frequencies: 200 Hz, 999,800 Hz, 1,000,000 Hz (1 MHz), and 1,000,200 Hz. (An ideal multiplier would pass no audio through at all; real mixers leak a little of it.) The rest of the radio transmitter contains a high-pass filter to eliminate the 200 Hz and a power amplifier to boost the other three frequencies for broadcast. These three frequencies together produce the modulated waveform shown at the bottom.
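Here is the mixing arithmetic in a short Python (numpy) sketch, scaled down to the friendlier numbers used in the experiment below: carrier at 256, modulation at 8. (Being an ideal multiplier, the sketch shows nothing at the audio frequency itself.)

    # AM mixing: multiply the biased audio by the carrier, then look at
    # the spectrum.  With 1 Hz bins, the bin index is the frequency.
    import numpy as np

    fs = n = 4096
    t = np.arange(n) / fs
    carrier = np.cos(2 * np.pi * 256 * t)
    audio   = 0.5 * np.cos(2 * np.pi * 8 * t)   # modulation index 0.5

    vca_out = (1.0 + audio) * carrier
    spec = np.abs(np.fft.rfft(vca_out)) / (n / 2)

    for f in (8, 248, 256, 264):
        print(f, round(spec[f], 3))   # 0.0, then 0.25, 1.0, 0.25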

Here is a key point. Modulation is a nonlinear effect. It has created frequencies that did not exist before the signal and carrier were "mixed", as a radio engineer would call it. Now, if the power amplifier in the radio transmitter is perfectly linear in its action, both mathematically and in the common sense of producing a perfect multiplication of the input signal, the transmitted signal will just have those exact three frequencies. But if there is any distortion in the amplified signal, extra frequencies are created. Distortion is a further nonlinear effect.

To produce the next image, I created a carrier with a frequency of 256 and a modulating signal with a frequency of 8, and mixed them. The red dashed line in the image below shows the result: A strong peak at 256 and peaks one-quarter of its size at 248 and 264. This image is a Fourier Analysis of the modulated signal. The blue line has something extra; it shows the effect of significant distortion in the power amplifier. You'll probably need to click on the image to see all the detail. (Better yet, right-click it and choose "Open in new window", so you can move the larger image to one side and refer to it easily.)


The frequencies to either side of the carrier are called sidebands. They carry the added "stuff" needed to accomplish modulation of the carrier. However, in the distorted signal, there are more frequencies shown, because the distortion makes the whole system more complex. And this is not all. Pulling back to see a wider range of frequencies, we see that there are significant output levels at 3x the carrier frequency, near 768:


There are actually small peaks every 8 all along, but most are so small they don't appear at this scale. (You can now close the extra window and click on this image or open it in another window to see more detail.) Radio engineers go to a lot of trouble to make the power amplifier as perfect as possible, but they also include a low-pass filter to remove clusters of spikes at 3x, 5x, and so forth that any remaining distortion might introduce.

This example shows that nonlinear effects can produce energy at frequencies far removed from the carrier. Strong distortion can put significant energy into these higher frequencies. Purposeful distortion, such as using a diode to rectify the carrier, produces a frequency comb, a long series of higher and higher frequencies, evenly spaced but reducing in power.
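To watch the comb appear, take the same modulated signal and distort it on purpose, here by clipping the peaks, a crude stand-in for an imperfect power amplifier:

    # Distortion as a frequency factory: clip the AM signal and new
    # clusters appear around 3 x 256 = 768, sidebands included.
    import numpy as np

    fs = n = 4096
    t = np.arange(n) / fs
    am = (1.0 + 0.5 * np.cos(2*np.pi*8*t)) * np.cos(2*np.pi*256*t)

    clipped = np.clip(am, -1.0, 1.0)   # symmetric clipping: odd-order distortion
    spec = np.abs(np.fft.rfft(clipped)) / (n / 2)

    for f in (248, 256, 264, 760, 768, 776):
        print(f, round(spec[f], 4))    # note the nonzero energy near 768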

It is a long way, conceptually, from an AM radio to waves in the ocean, and we have to invert our thinking a little bit. Higher frequency for electromagnetic radiation means higher energy in a quantum sense: higher energy per photon. Light is at a frequency hundreds of millions of times as high as AM radio, and photons of light can knock electrons loose from a metallic surface with energies of a few electron volts, in the photoelectric effect. Go to a frequency a million times higher yet and you have gamma radiation, which is deadly. Each gamma photon is a multi-million-volt jolt.

In the ocean, surface waves work just the opposite. The shortest waves, those with the highest frequencies, have very little energy. The more energetic a wave is, the lower its frequency and the larger its wavelength in the open ocean. Furthermore, the winds that produce most waves are not nearly as steady as the sources of radio signals. At the shore, watch the waves, and time the interval between them. You'll usually find that the time from wave to wave varies by 10-20% of the average time between waves, and that some waves are larger than others. Have you ever been taken by surprise when a large wave washes much higher up the shore than you were expecting, and you got wet before you were ready?

Waves produced by storms at sea travel in sets, and each set has a characteristic wavelength and range of wavelengths. Far from a storm, any particular set seems to be quite uniform, but in the middle of the storm, there are several sets coming from different directions because the storm is usually a few miles across, and the wind blows from varying directions. Because of nonlinear effects at the water-air boundary, the range of wavelength, and thus wave energy, gets much greater when waves mix. It is analogous to radio modulation.

The next image shows a rather complex picture. The blue bars show a range of energies on an arbitrary scale for a single set of wave trains. I used a normal (Gaussian) random generator with a mean of six and a standard deviation of one (for the mathematician out there: this was not a sum of 12 RAN functions, but a genuine random variable generator).

The red bars show the intersection of two independent wave trains of the same mean and S.D. Notice that the distribution of red bars is a bit wider. The green bars show three random trains mixed together. The highest values reached by green are more than twice the highest value for the blue.


The other feature to notice is that, as you go from blue to red to green, the peak of the distribution gets slightly smaller. This illustrates how the mixing of wave trains can produce waves larger than the largest waves of a single wave train. And this is with only a small amount of nonlinearity introduced. The variation of salinity and temperature with depth adds more nonlinearity to the system, and is likely to act a little like the shore does to swells approaching land: focus and raise the waves even far out at sea, making the rogue waves even more roguish! In fact, it may act as a source of distortion, and possibly produce giant waves 3x or 5x, or more, of the median wave height.
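For the curious, here is a simplified Python version of the kind of experiment described above (not my exact procedure): superpose a few independent wave trains with random phases and nearby frequencies, and track the biggest crest. Even before any nonlinearity is added, combining trains widens the extremes; the nonlinear terms push things further still.

    # Superpose k wave trains with random phases and nearby frequencies,
    # then record the biggest crest over a long window.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 200.0, 20_000)

    def biggest_crest(k):
        sea = np.zeros_like(t)
        for _ in range(k):
            f  = rng.normal(1.0, 0.1)        # frequencies in arbitrary units
            ph = rng.uniform(0.0, 2 * np.pi)
            sea += np.cos(2 * np.pi * f * t + ph)
        return sea.max()

    for k in (1, 2, 3):
        peaks = [biggest_crest(k) for _ in range(200)]
        print(k, round(float(np.mean(peaks)), 2), round(float(np.max(peaks)), 2))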

Well, that's a lot more thinking than I usually indulge in this early in the morning. I hope it is illustrative and illuminating.

Thursday, September 22, 2011

Those aren't waves, they are mountains on the move

kw: book reviews, nonfiction, oceanography, waves

You've probably seen this picture before, of a surfer on a 50-foot wave known as Jaws. What isn't shown is the jet ski that towed him into position at high speed, and is waiting to rescue him when (not if) he wipes out. Somewhere in the 40-foot range, waves arrive too fast to be caught by swimming or paddling, even on a stand-up board using a paddle. Jet ski towing, first promoted by Laird Hamilton and others, is the only way to catch the really big ones.

I don't pay a great deal of attention to surfing. I'm a mediocre body-surfer, nothing over twelve feet. So I took the chance to catch up on my education by reading The Wave: In Pursuit of the Rogues, Freaks, and Giants of the Ocean by Susan Casey. It is a rather eclectic book, because she investigated all the world's largest waves, not just surfing waves, but also mid-ocean rogues and tsunamis.

Nonetheless, more than half the chapters in the book relate to big surfing waves and the men (and one woman) who obsessively pursue them. One chapter reports on a mix, where a salvage operator flew some surfers and their jet skis into the Agulhas Current off South Africa to catch some 100-foot rogue waves. Trouble is, big oceanic rogues are not nearly as steep-faced as breakers coming up on the beach. The guys got some rides, but it was rather anticlimactic.

Of course, when you are on the bridge of a big ship, facing a wave even higher than your viewpoint, such waves are mighty intimidating. This wave is probably in the 80-foot range. A few ships have encountered waves over 100 feet and managed to return to port, or have been salvaged. A large container ship or two is lost every week, and the size of the waves that do them in is never recorded; such ships typically vanish without a trace. A visit by the author to Lloyd's of London verified such figures.

For decades, oceanographers contended that oceanic waves of 100 feet and larger were impossible. That was before photographic evidence was gathered, showing that waves three and four times as large as the seas around them could pop up. It was also before radar-bearing satellites began routinely measuring wave heights of 100 feet and more in the heart of storms.

The secret to this is nonlinear dynamics, a common natural phenomenon which is behind the "mixing" that allows an AM or FM radio to work. Scientists use "linear" mathematics to model natural systems whenever they can, because linear systems can be solved mathematically. Nonlinear systems are typically impossible to solve analytically, or even to calculate accurately without lots of costly supercomputer time. But nature is full of nonlinear effects.

The air-water interface introduces nonlinear effects. I have seen a simple experiment using a wave tank. When waves with two different wavelengths, coming from different angles, pass through one another, you get wave trains with four wavelengths. One of the extra trains has a frequency that is the sum of frequencies going in, and the other one's frequency is their difference. Out on the open ocean, a storm produces a variety of wave trains, called swells. These can mix in unexpected ways. As Ms Casey writes, sometimes one plus one equals seventeen. That is a bit exaggerated. In reality, energy from several wave trains can "mix" nonlinearly to produce a sudden surge, or train of surges, that can rise to several times the height of the swells around it. That is a rogue wave. Researchers discussed in the book have been able to create small versions of nonlinear rogues in wave tanks.

Surf breakers and tsunamis behave differently from oceanic rogue waves, because they interact with the sea bottom. When a swell approaches shallowing water, it begins to "feel the bottom" when the water depth is comparable to the wavelength, and it typically begins to grow in height as it slows down once the depth is half the wavelength. By the time a wave is in water shallower than its height, it rises fast and then either tumbles down or breaks over, depending on how steep the shore is. Of course, the shore is seldom a smooth plane. Curvature of all kinds can either dissipate or focus wave energy. Famous wave locales such as Jaws have curved bottoms that focus waves that are already large because they arise from powerful oceanic storms that created large, powerful long-wavelength swells.

The longer a swell's wavelength, the more energetic it is, and the faster it moves. The longest swells result from sub-ocean-floor earthquakes and ocean floor landslides. These can have wavelengths of tens to hundreds of miles, and travel at speeds exceeding 600 mph (960 kph). On a steeply dipping shore with some focusing, they rear up into tsunamis that can rise 100 feet or more and travel far inland.
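The speed comes straight from the shallow-water formula, c = sqrt(g × depth); a wave that long treats even the deep ocean as "shallow". A few lines of Python (the depths are round numbers, not survey data):

    # Shallow-water wave speed: c = sqrt(g * depth).
    import math

    g = 9.8  # m/s^2
    for depth_m in (7000, 4000, 1000, 100):
        c = math.sqrt(g * depth_m)             # m/s
        print(depth_m, round(c, 1), "m/s =", round(c * 2.237), "mph")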

The biggest verified wave, 1,740 feet (530 m), occurred in Lituya Bay, Alaska on July 9, 1958. This was a splash wave caused by an enormous land-and-ice fall from one of the extra-steep mountainsides that flank the landward end of the bay. Lituya Bay is shaped like a double funnel, with Cenotaph Island smack in the center. It is tailor-made to amplify any wave that arises within it. Aerial photos show that no trees grow within about 2,000 feet (610 m), vertically, of the shoreline. The 1,740-foot monster is not the biggest that has ever occurred, just the biggest one witnessed and, somehow, lived through. If a surfer managed to catch up with a wave this size, he would soon find himself smacked into the mountainside at a speed in excess of 100 mph. There really are limits to what can be surfed!

But the author, and the book, return again and again to Maui, the home of Jaws, Egypt and Hookipa, which among them just about corner the market on large waves that can be surfed with some chance of living through it. Big-wave surfers share one characteristic with professional hockey players: an oft-repaired body and long familiarity with emergency rooms. The author's hero, Laird Hamilton, stated that he quit counting at 1,000 stitches. Their obsession sends one to an early grave every year or two. In a late chapter, the author relates how Laird Hamilton asked her to ride along on his jet ski while he pulled a fellow surfer into position at Jaws. Instead of swinging away right after, he then surfed the wave on the jet ski with her aboard. Scared out of her mind, she was immediately ready to repeat the experience: "Again? How about ten more times!" Adrenaline is quite addicting.

Wednesday, September 21, 2011

Barnum was more right than he knew

kw: astronomy, photographs, comets, pseudoscience

Here is how the popular cult comet Elenin looked more than three weeks ago, on the evening of August 27, as photographed in Australia by Michael Mattiazzo. The image accompanied this Sky and Telescope article.

Why should this interest me? I usually only care for naked-eye comets, and there has not been one since Hale-Bopp about 15 years ago.

There is a subtle feature of this image: the nucleus of the comet appears elongated, where it was pointlike just a day or two earlier. The comet is breaking up. Yet just this morning (here is why I am interested), I was up extra-early, listening to Coast-to-Coast AM with George Noory, and he and a guest were talking about this comet as some kind of celestial messenger. I wonder if they even knew the comet is already in "rest in pieces" condition.

On those occasions that I am up before 5:30 AM I listen to Coast-to-Coast AM purely for its entertainment value. It is great fun to listen to. I have yet to hear anything that I'd call scientifically verifiable. I simply file what I hear as further evidence of the bottomless fund of human credulity.

Tuesday, September 20, 2011

Two mice and counting

kw: local events, animals, pests

A week ago I caught a mouse in a glue trap, in the family room. It is a common place for them to hide once they get in through the garage. In past years, at most one mouse got in annually. Sometimes, instead, one will stay in the garage and build a nest there, which we find quite a bit later.

This evening, there was a ruckus in the kitchen, and we saw our cat had trapped another mouse. It got loose long enough to run down the basement stairs, where it hid among some shoes and sandals piled on the lower landing. I went down and encouraged the cat, but it was clear she didn't know quite what to do. A cat that gets raised by a good mouser for a mother learns to dispatch them with a bite through the neck.

After watching the hide-and-seek for a little while, I noted where the mouse was, under the toe of a sandal. I punched down on the sandal, stunning the mouse, which I popped into a plastic bag and then into the freezer. Come trash day, mouse, bag and all, will be moved to the trash and disposed of.

It is a bit of a shame to kill a one-ounce mouse, but I've tried letting them alone in the past, many years ago, and they gradually take over. They are like Tribbles, only slower. They really do need to be exterminated once they get inside. A pity.

Monday, September 19, 2011

Our most intimate inmates

kw: book reviews, nonfiction, viruses, evolution

Either physician and researcher Frank Ryan and the colleagues he interviewed while writing Virolution are on the brilliant forefront of evolutionary discoveries, or they are stark, staring mad. I tend to believe the former is true. But consider what the book claims:
  • Viruses are primarily beneficial, even necessary for our existence, and that of all life.
  • Viruses that reside in our genome mediate the development of our bodily organs.
  • While "vertebrate DNA" makes up 1.5% of the total human genome, ready-to-activate viruses make up 6-7%, and partial viruses make up another 35-40% (more are being determined all the time).
  • These partial viruses are experts at moving about, and such "jumping genes" are huge agents of genetic change, greatly increasing the variation that natural selection depends upon to produce new species.
  • Viral symbiosis and mutualism are responsible for our continued health and longevity.
  • Virus infections and diseases are an unfortunate side effect of the ages-long interplay of viral and animal (and plant) DNA.
I don't know about you, but I find some of these ideas rather unsettling. I remember predicting, nearly twenty years ago, that within a few generations, every person still living on Earth would be either immune to or tolerant of HIV infection. It seems Dr. Ryan would agree, and moreover, he would further predict that HIV-1 and -2 would become integrated into our genome, as at least 98,000 other retroviruses have done in past ages, going back to the first cells. And that figure is actually a tip-of-the-iceberg amount; each virus has inserted copies of itself numerous times. The number of complete virus genomes and genome fragments in the DNA of every creature is several million.

Well, I hope I got all that correct. The book is fascinating reading. It points to HIV and to a virus that is currently decimating koalas as examples of early stages in the integration of a new virus into the genome. This ruthless culling made me recall something else I read in an article many years ago.

I don't recall author or title, but the premise was this: Viruses descend from a toolbox of small, initially non-living DNA "machines", created to facilitate regulation of DNA in multi-cellular creatures. They became the original "Frankenstein monsters", having attained great powers to modify DNA and create copies of themselves. Achieving a kind of quasi-life, they did what life does, and began to reproduce selfishly. They have become the prototype of the "gray goo" that some researchers fear will result if we produce self-reproducing nanomachines. The reason we have not become a world of gray goo, AKA virus fodder, is that, in self-defense, early control mechanisms evolved just quickly enough into a more robust and active immune system. This virus-versus-immune-system arms race has now gone on for about two billion years.

Whichever way viruses arose, Frank Ryan's claim is that they are primarily symbiotic with us and with all plants, animals and fungi. In a late chapter in the book, he outlines epigenetics, the subject of a book I reviewed two weeks ago. I saw no obvious connection between epigenetics and virology, but if I understood right, the various mechanisms of DNA control that we lump under epigenetics also activate and deactivate retroviral genomes that are so intimately involved in our development from a fertilized ovum to a grown adult, and throughout our lives.

Dr. Ryan's interest is not only academic. He is a physician, with a doctor's practicality. Pathological cases and other problems help researchers figure out the difference between things working right and working wrong, or not working at all. Diseases highlight areas that need to be understood. Pathologies that were once thought to be this or that "bad gene" are now often shown to be problems of development, of epigenetic mistakes, or of a DNA-virus interaction gone wrong. Once they are better understood, therapies that attack the proper cause can be developed. Because epigenetics is so variable from person to person, because of our differing experiences, this will inevitably lead to very personalized medicine, almost the way my eyeglasses will only work with my eyes, and you need your own pair with different parameters, if you need any at all. If future DNA-HERV-epigenetic medicine can be done at acceptable cost, the possibilities are breathtaking.

The book is written at a bit higher level than many popularizations, but I didn't find the reading itself to be difficult. The concepts, however, are so mind-blowing that I'll have to set the book aside a while and re-read it later to be sure my impressions are in any way accurate.

Friday, September 16, 2011

Viruses R Us

kw: medicine, viruses, embryology, symbiosis

I'll just get this out of the way before I even finish the book. I find the idea behind this image rather unsettling. This shows a portion of the syncytium (pronounced sin-sigh-tee-um), the multinuclear layer, effectively a single giant cell, that forms the boundary between fetal blood and maternal blood in the mammalian placenta. It is the reddish layer surrounding the purplish blobs, which are folds of placental tissue.

What you don't see in this light microscope image are the viruses that induce the syncytium to form. Animal tissue doesn't "know" how to form a syncytium, or any multinuclear cell. Its formation is mediated by viruses called HERVs, for Human Endogenous RetroViruses. The retrovirus most of us have heard about is HIV, the cause of AIDS. It is related to HERVs. Our DNA is host to many, many related retrovirus genomes, and certain ones are expressed and work together with "our" DNA at many stages of our life, including setting up the placenta that makes most mammalian pregnancies work.

I'm reading a book on evolutionary virology, which I'll review more fully in a few days. Meantime, I could not get this image out of my mind once I saw it at the author's website. Many of our tissues and organs develop with the help of symbiotic viruses. I never knew viruses could be symbiotic! Not only that, they may be the dominant partner!!

This is one more demotion of our vaunted humanity. First, we were at the center of the Universe. Copernicus and Galileo moved Earth to "third rock from the Sun". Then, we were the peak of creation. Darwin, Wallace and others showed we're smart apes, but apes all the same. In recent decades it has become clear that 9/10 of the living cells in our bodies are bacteria, although each of "our" cells weighs hundreds of times what a bacterium does. Now I read that, while "vertebrate DNA" makes up only 1.5% of our total genome, various total and partial virus sequences make up 45%, or 30 times as much. I am starting to think that if you took away everything that is not "human" from us, we would be nearly weightless shells, ready to collapse under our own negligible mass.

I'm tempted to write more about the book now, but I suspect the author has more surprises waiting in the last few chapters, so I must simply say, "Stay tuned."

Thursday, September 15, 2011

A fun reference for mnemophiles

kw: book reviews, nonfiction, words, mnemonics, references

While I have been known to read the dictionary or an encyclopedia for enjoyment, I have not yet read all of Judy Parkinson's book. It is a reference book, titled i before e (except after c): old-school ways to remember stuff, and yes, the title is entirely uncapitalized.

Every language has mnemonics used to recall sundry facts. My wife knows a number of haiku and other short Japanese poems she learned, including one for remembering the years of the Asian Zodiac (2011 is the year of the rabbit), and there are two ways they remember their 72-character phonetic alphabet. One is rhythmic and starts "aa ii uu ee oo ka ki ku ke ko". The other, titled "Iro Wa", and actually spelled i ro ha, is a Buddhist poem about the transitoriness of life, which uses each character exactly once. Of course, they have about 7,500 other characters derived from Chinese that are not phonetic, which explains why 10-12 years of Japanese education are required before one can read a newspaper with ease.

In English, we have, for month duration, "30 days hath September...", the "I before E" rule for non-Germanic words (which means there are hundreds of exceptions, such as "weird" and "seismic"), and for astronomy buffs, "O Be A Fine Girl, Kiss Me" to remember star classes in order: OBAFGKM. The addition of classes R, N and S lengthened the phrase with "...right now, sweetheart!"

The book's sixteen chapters cover all kinds of things, so rather than be comprehensive here, I'll keep it around and offer tidbits from time to time. Today's is this: Chapter 2 is on spelling words that are most commonly misspelled (or misspelt if you are really old-school). Most of the chapter consists of 75 rules that are almost as hard to remember as the spelling of the words. For example: "A Rat In The House May Eat The Ice Cream" is for ARITHMETIC. Slightly easier is, for distinguishing "capital" from "capitol", "PAris is the capitAl of FrAnce; there is a CapitOl building in WashingtOn." The silliness of memorizing the phrase fixes the two in your mind. My own bottom line is that we just need to remember the hard ones, look them up if needed, and when typing on the computer, trust your spell checker, at least a little: some might be homonyms, but that is a tidbit for another day.

Tuesday, September 13, 2011

Watching Watson win

kw: artificial intelligence, supercomputers, popular culture

I missed the original airing of the Jeopardy episodes featuring the Watson supercomputer. Now it is rerun season, so last evening I got to see the first round, in which Watson fought to a draw with one of the trivia champs. I'll have to miss tonight's finale, but I know the outcome.

I hope the two gentlemen who lost to Watson don't feel too bad. They were beaten not just by a fast computer, but by a huge team of programmers and experts who had pre-loaded Watson's data banks with things like whole encyclopedias, all of Bartlett's Quotations, and the like. Were Watson's lookup and correlation routines even faster, the men would have been outgunned in every instance except the two for which Watson answered incorrectly.

I'm glad they started the show with a small tour of the hardware behind the avatar. Watson is physically just as large as the mainframe computers I was using in the 1960s, and is accompanied by enough air conditioning equipment to keep a whole block of houses cool. Now let's speculate a little. In the mid-1960s, the reigning "supercomputer" was the CDC 6600; a dozen years later (1976), the first Cray-1 ran the first 100-MFlop calculation. A desktop computer my son and I built last year coasts along in the 100-300 MFlop range. So, thirty-four or -five years in the future, assuming the databases and algorithms are available economically, can we expect a desktop Watson? Possibly.

I have two questions: (1) When will a machine be able to appreciate a subtle joke? and (2) When will a machine no larger than a grapefruit be able to navigate the world with the ease of a typical house cat? A partial answer to both requires us to imagine that a true artificial intelligence needs a body with a density of sensors equivalent to the endowment of a natural animal: a machine that can not just see and hear but smell, taste, and feel the breeze in its hair, and that has senses of balance, proprioception, and at least the 20+ other senses Aristotle didn't include among the Big Five.

Monday, September 12, 2011

Listening too closely

kw: book reviews, nonfiction, linguistics, psychology

There is a proverb that good liars give lots of details, but the best liars don't. There is, however, no proverb telling us that liars very seldom use the words I, me or my. Yet it is true. Lying is hard work. It doesn't pay to be introspective when your every effort must be directed to confabulation.

More interestingly, in America's "classless" society, we still estimate social ranking. It takes some work, but listen to two people talking together. One will use "I" words (I, me or my) much more than the other, who will instead use many more "us" words (we, us or our). Guess which one is dominant (stay tuned)?

Psychology Professor James W. Pennebaker, who likes to be called "Jamie", will be the first to tell you that catching such cues from active conversation is very difficult. He calls such words "stealth words" and "function words". In his book The Secret Life of Pronouns: What Our Words Say About Us, he notes that we are attuned to listen for content words, words that reveal the subject or object of what we are hearing. Pronouns, articles (a, an and the), and prepositions, for example, just slip right by us. When we read, they slip by just as readily. Have you noticed, for example, that prior to this sentence I have not written any "I" words except as examples? That was hard: I usually write this blog in a self-reflective mood.

To make more accurate measures of the use of stealth words, and to avoid wear and tear on his graduate students, Jamie and his colleagues have developed a number of computer methods for analyzing text by counting the percent use of as many as 80 families of words. The most revealing of these are pronouns and other small words, the little words that glue our sentences together.
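A toy version of that counting is easy to sketch in Python. This is just the idea, not the research group's actual software, and the word families here are abbreviated samples:

    # Count percent use of a few "stealth word" families in a text.
    import re

    FAMILIES = {
        "I-words":  {"i", "me", "my", "mine"},
        "we-words": {"we", "us", "our", "ours"},
        "articles": {"a", "an", "the"},
    }

    def stealth_profile(text):
        words = re.findall(r"[a-z']+", text.lower())
        return {name: 100.0 * sum(w in fam for w in words) / len(words)
                for name, fam in FAMILIES.items()}

    sample = "We took our results to the lab, and I checked my notes again."
    print(stealth_profile(sample))   # ~15% I-words, ~15% we-words, ~8% articles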

We often joke that people who have been married a long time start to resemble one another. In Dr. Pennebaker's research, he and his colleagues have found that they are even more likely to sound alike. In fact, we all tend to pick up the speaking style of those we spend time with, and the more we like someone, the more we will speak like them. We are also likely to pick up the speech patterns and accent of anyone we consider dominant (except for that I-we thing). Until recently, my supervisor was an Englishman, and people could always tell when I'd spent my monthly one-on-one review with him. It would take me half the day to shed the British accent. This was true of everyone except one fellow from India, whose accent was anglified already.

The web site SecretLifeOfPronouns.com contains several interesting exercises. One of them compares two pieces of text for similar patterns in the use of stealth words, and no fair using two pieces of your own material. I entered two 100-word extracts from an e-mail exchange with one of my colleagues, a young woman who has a grade-school boy. The comparison revealed a correspondence of 87%, which is just above the average of 84% for people who are "friendly acquaintances".

Another exercise has you spend five minutes typing about a picture of a water bottle. The subject is so boring, I found it hard to keep going after about two minutes! But I persevered, producing 178 words (I can't resist calculating that this comes to 35.6 wpm). The analysis was as follows:
  Visual Dimension                           You    Average
  Words on Label (Verbal Thinking)           1.12     1.74
  Colors and Text (Visual Sensitivity)       1.12     3.74
  Bottle Contents (Functional Thinking)      0.00     1.67
  The Bottle Itself (Tactile Sensitivity)    0.56     2.91
  Light and Shadow (Contextual Thinking)     0.00     0.79
That does not please me much. The results make it seem I wasn't thinking much at all! Since these are percents, I suppose had I used the word "water" somewhere (I didn't), I'd have had a 0.56 score for Functional Thinking. Of the 178 words that I typed, the filtering program was only "interested" in five of them. Perhaps this is a computer's revenge. I have often called my profession of Information Science "the art of lying to a computer and making it believe me."

If you work for a corporation, do you think of it as your family? When you speak of the company (if you ever do), do you call your workgroup "us" or "them"? Try writing an essay about your work. Then count the instances of "us" words and "them" words. If the latter predominates, perhaps you need to update your résumé. And by the way, when you talk to your boss, you are most likely to use lots of "I" words. The political uses of "we" are found in the speech of dominant people.

I find it a bit unsettling that there are so many things that a computer can winkle out of my patterns of speech. Perhaps this is the next direction that Toastmasters type clubs can go: diction training, teaching us how to write a better college entrance essay (use more big words and lots of articles, and reduce "I" word use), how to get along with your spouse better (or at least sound like you do!), and even how to craft more convincing "little white lies" (leave the bigger ones to the experts like Bill Clinton). Who would have guessed that such a fun book of ten chapters could be written about the way we use the smallest words?

Sunday, September 11, 2011

911: The first ten years

kw: memorials, history

This is how the Ground Zero site looked just three months ago. Crews have been working around the clock to get ready for today's dedication of the 911 Memorial, consisting of the waterfall pools that occupy the footprints of the two fallen towers, the names of those who died on bronze panels around the pools, the mini-forest that was planted all around the site, and a museum containing historical exhibits.


Then, let us not forget the new building going up to the north, which is already 82 stories high, intended to top out at over 100 stories and more than 1,700 feet. That is half a kilometer, folks! According to a recent PBS special about it, the building is being built to withstand an attack by a whole fleet of jumbo jets. In this image, we can see only its top.

Here is how the finished memorial is intended to look, according to a model by Squared Design Lab. There are also renderings of the museum being built and other features.


What have we learned after ten years? Not a great deal. Within a couple of months after the 2001 attacks, U.S. forces invaded Afghanistan, where it was thought the Al Qaeda perpetrators were being harbored. Just incidentally, they ousted the Taliban, though it is still not clear whether they will stay ousted. The grindingly evil advances of "political correctness" over the past 30-40 years have made the U.S., as a nation, too squeamish to look an enemy in the eye and put a bullet there. The debacle in Iraq is even more pitiful. Yes, some of our heroes finally killed Osama Bin Laden this May, but they had to sneak around to do it. Anybody who was dumb enough to apologize to Pakistan after that is no friend of mine. The Pakistanis ought to consider themselves lucky we didn't commit a third invasion.

Now, that said, there is one huge step the U.S. could take to ramp down the world's resentment and reduce some of the motivation for terrorism (we'll never remove it all). We have troops and bases and posts in more than half the nations of the world, for various reasons. I have not been able to find out how many troops that comes to, but it is probably somewhere in the range of 100,000 to 300,000. Bring them home! Give each of them $100,000 in seed money to start a small business, because there aren't jobs for them here; at 200,000 troops, that is a sum in the $20 billion range. Enough of them will succeed and begin to hire people to make a big dent in our unemployment level. In fact, it would be more effective, at roughly 1/20th the cost, than the so-called "American Jobs Act", which I understand is not yet on paper, meaning it does not yet exist.

To those who think the above paragraph contradicts the prior one: I believe in being ready to utterly defeat a foe, but I do not wish to live in his closet awaiting the chance.

More on the "we haven't learned much" front. Some folks are making this argument about who "really" perpetrated the 911 attacks: "Either our government knew about it and did nothing to stop it, or they knew and were involved, or they were the biggest buffoons ever." I fall firmly in the "they were huge buffoons" camp. But the statement poses only three of a number of options; it is a false choice, and it reveals a lack of imagination on the part of the conspiracy buffs. As usual, life is not so stark, not so black and white. Reality is far more banal. Only one of the twenty intended hijackers was kept out, because only one airport official had guts enough to buck the PC mavens and turn back an Arab who, he said, had death in his eyes. The nineteen who did board were able to prepare, down to flight training for the pilots, because we have multiple intelligence agencies that treat each other worse than they treat the nation's enemies; each had a piece of the puzzle, but there was no way they were going to share it with one another. And things haven't gotten any better on that front! The buffoons are still in charge, and our buffoon-in-chief thinks our biggest priority is to spend half a trillion dollars we do not have, as wastefully as the "stimulus" money of two years ago. Most of that went down a rathole with no discernible effect.

OK, I'd better stop there. This is what I want you to know. The buffoons are still in charge.

Saturday, September 10, 2011

Better than the bistro

kw: recipes, food

From time to time we buy a frozen pizza. Tonight the store didn't have the "Supreme" by DiGiorno, which we like best, so we bought the "Three Meat" one and decided to improvise. We sliced and slivered half an onion and two small bell peppers from the garden, and added them along with slices cut from a single Roma tomato. Then we added more Mozzarella cheese. The cooking time was the same, though the liquid from the tomato slices made it a little soft in the middle. The taste? Scrumptious.

Thursday, September 08, 2011

Will the Jobs Act see the light of day?

kw: politics, finance

Just about two hours ago U.S. President Obama finished his speech to a joint session of Congress promoting his American Jobs Act. In contrast to most of his prior speeches, in which he had the measured demeanor of a professor, he preached passionately, as if from a pulpit, to a very mixed audience. A few of his points were received with standing applause from all present. Much more often, the Democrats applauded while the Republicans sat, stony-faced.

Basically, the President wants nearly $450 Billion to stimulate the economy in several ways: construction and re-construction of infrastructure, refurbishing about 35,000 public school buildings, and funding large tax cuts for small businesses that hire new employees. He claims this will be paid for with further deficit reduction; at one juncture he stated he will charge the deficit-reduction panel to add the above amount to the sum they are already supposed to identify.

I am half tempted to say, let him give it a try. The problem is, the deficit-reduction efforts are inadequate: current legislation has "reduced" $9 Trillion of deficits over the next ten years to $8 Trillion (based on some shaky assumptions), and the aforesaid panel is tasked with finding another $1.5 Trillion, now to become $2 Trillion. That is quite a reach, and there will still be an unprecedented level of deficit spending. My analysis: the Act sounds like it has a number of good ideas, but there really won't be the money to make it work.

I can't hazard a guess whether the Act will be passed by Congress, but I am pretty sure it will be amended beyond recognition by both houses, then face an acrimonious conference committee if both houses pass some version of it.

I took a look at history. Franklin Roosevelt didn't quite invent federal deficits, but he was the first to oversee a push of total debt past 40% of GDP; during World War II it quickly reached 100% of GDP. The post-war boom produced surpluses that dropped the debt into more manageable territory, but it is approaching 100% again. Anybody remember Ross Perot? During his 1992 presidential campaign he claimed that a genuine budget freeze would keep the Federal budget locked at about $1 Trillion. Instead, it grew by a factor of nearly four, and the annual deficit now runs well over $1 Trillion. Even adjusting for inflation, President Obama has overseen a huge increase in Federal spending in just two years.

If there is a genuine shifting of the deficit to "cover" this Act, maybe it will fly. Otherwise, we simply can't afford it. It is like trying to remodel your house by borrowing more money when your loan is already upside down.

Wednesday, September 07, 2011

A writer's writer

kw: observations, opinion, writing, writers

I wish to comment further on Globish by Robert McCrum, which I reviewed yesterday. Reading the book was an experience not far removed from reading a good novel. I was not just learning, but was transported elsewhere.

Though the book is filled with evidence of great erudition, it is not a dry, scholarly tome. I have had to read plenty of learned works, the kind that induce slumber. Most share some common defects, all of which McCrum has adroitly avoided. He uses endnotes, but in a style that does not leave the text riddled with superscripted 1, 2, 3; the notes themselves are succinct and seldom exceed two lines. His references to published work are smoothly done, avoiding author (date) citations in favor of a more personal and personable narrative. And he avoids footnotes, because if something is worth saying, it is worth saying as you go and being done with it; the same goes for long endnotes.

I was reminded of the best of nineteenth century writing (a century which also produced some of the least readable books ever printed). Reading ought to be a joy, and give evidence that the author enjoyed the writing as much as you enjoy the reading. A well-written book makes me feel as if I were snuggled with the author in an overstuffed chair as we read together. I do hope Mr. McCrum would not find such a sentiment creepy.

Tuesday, September 06, 2011

Unwinding Babel

kw: book reviews, nonfiction, language, english language, globalism

About half the members of the church I attend are Chinese, both "mainlanders" and Taiwanese. Though they all speak Mandarin, it is not everyone's first language; at a church lunch I might hear four or five Chinese dialects in use. The rest of the congregation consists of those, like me, whose first language is English. Church meetings and business sessions are conducted in English. However, when speaking to the group, I unconsciously shift to a simpler English, adjusted to match everyone's vocabulary more closely.

At work, some of my colleagues are Indian, American citizens born in India. Several other colleagues with whom I sometimes interact are in India, at the other end of a video conference link or Skype call. I don't need to adjust my language much with any of them, but I have noticed signs of some of them digging out English terms to express concepts that, among themselves, they have a local word for. At least four Indian language groups are represented among them, so even though they also all speak Hindi as a second or third language, their English facility is better, and that is how business is carried out, in Indian-accented English.

Almost no one in my wife's family speaks English, but some of her childhood friends do. With them, we can speak ordinary American English; they learned it from American servicemen and their spouses. But their children and grandchildren all speak "Japanglish" (more recently termed Janglish) among themselves, and can effortlessly switch to a lightly-accented English if they need to talk to me (not that they much care to).

Robert McCrum would tell me these are varieties of Globish, the new world language. The title of his new book (just a year out) is Globish: How the English Language Became the World's Language. His thesis is simple, that a few accidents of history, and the recent immense spread of the World Wide Web starting as an English language phenomenon, catapulted this word-gobbling, culture-flattening German/French hybrid tongue into global hegemony. Though twice as many earthlings speak Mandarin as a first language, more than half of all of us, four billion, speak English as a second or first language.

Simple though the thesis may be, the process was sinuous as a burnt python. Starting with the confluence of invader languages and native Celtic in the 5th Century, Anglo-Saxon, or Old English, quickly developed as the primary tongue of Britain. The breakup of the Roman Empire had resulted in the rapid eclipse of Latin as anything but a scholar's language. A few generations of Norman rule starting in 1066 AD nearly doubled the vocabulary as the British subjects of Norman overlords were required to learn new words for everything, but kept their old language for converse among themselves. A similar phenomenon continues wherever there is a servant class; just go to the Jamaican section of Baltimore, where you'll hear little English but plenty of Patois being spoken, though they speak English clearly enough when at work. In Norman England, the Anglo-Saxon of the servant class was sufficiently robust to survive and become Middle English, even though a nearly total overlay of "upper class" French words had been added to the mix.

King Henry V was the first to use (Middle) English instead of Latin or Norman French in all his documents. Not for nothing did Shakespeare put a rousing English rallying cry in his mouth as he prepared his troops for battle on St. Crispin's Day (what King Harry actually said is, of course, not known, except that it was in English). In this, Henry was, perhaps consciously, recapitulating Alfred the Great, who used the vernacular hundreds of years earlier to rally his people, including by commissioning the Anglo-Saxon Chronicle. As Alfred's influence catapulted Anglo-Saxon back into a national tongue, so Henry did for Middle English at a crucial time.

We still retain memorials of the transition of Middle English into Early Modern English—Chaucer, Shakespeare and the Geneva Bible—and then Modern English, with the Authorized Version (King James Bible) on the cusp of the transition. Though spelling conventions have changed, making KJV English hard to read in a first or second edition, the spoken word has changed less (barring 400 years of neologisms), so that you'd be able to understand William Shakespeare well enough, were he to pay us a visit to discuss his life and work.

What is it that makes English such a robust language? In a word, flexibility. English is the great borrower. It exemplifies the entrepreneur's ethic: Why create what you can freely appropriate? While the French are struggling to find "honest" French expressions for the one-quarter of modern Franglish that was borrowed from English and other languages, English has grown to have a vocabulary of more than a million words, as collected by the Oxford English Dictionary. It requires only 10,000 words or so to become a fluent speaker of most modern languages. Any English-speaking fourth grader has a vocabulary exceeding 30,000 words, and the college-educated population knows 50,000 to 80,000 words. About a fifth of these make up the Anglo-Saxon/Norman core that fills our daily conversation, and many of the rest are borrowings of various vintages.

A most fortunate circumstance for the spread of English worldwide was England's loss of the American colonies in the 1776-1814 period. England turned to worldwide Empire building, and within a century, Great Britain and her empire girdled the globe and acculturated more than a billion people to her language and customs. Then two world wars came along, in which the Anglophone powers, the U.S. and England, became the closest of collaborators. One result was the breakup of the British Empire into the British Commonwealth plus a number of English-speaking former colonies. Another was the Marshall Plan, which has morphed into half the world's nations hosting clusters of American troops, whose English language spreads from every American base outward throughout those nations (this last sentence is my addition to McCrum's otherwise comprehensive historical review).

Just at such a juncture, technology intervened to produce a class of perfect electronic servants. When I first obtained a desktop computer in 1981, I knew that 90% of all keystrokes on PCs went to word processing. PCs replaced typesetting equipment almost overnight; they replaced typewriters shortly thereafter, once printers got cheap (though my IBM ProPrinter, at $600, doesn't seem cheap now, it was quite a bit less than the $7,000 laser printers of the time). The early 1990s saw the World Wide Web sweep over the Internet (the former ARPAnet) and turn it into a global library. Now it is becoming "the Cloud", and if you don't read or write English, you can get free translation to and from the language you prefer at Babelfish or Google Translate or several other venues. They are getting less clumsy all the time (BTW, when I correspond in French, I use GT to check my word choices and grammar).

The two most populous nations on Earth are promoting English even more than the U.S. and England. The Indians, of course, are becoming the go-to folks for technical service no matter where you happen to be. And their accents are getting better; India has a bigger accent-reduction industry than Hollywood. The Chinese are teaching English almost universally, so even though Mandarin is spoken by a billion and a half people, you can now add half of them to those who can use fluent English, or rather Globish, for commerce.

The English language escaped England two hundred years ago, and now it is escaping America also. That is a good thing. Language flows both ways, and Globish is now the tongue with the greatest ability to gain new words for new concepts. As a result, English/Globish is the only language that really needs a thesaurus. It will never be a concordant language. Today's situation was summarized by McCrum's colleague Alan Rusbridger of The Guardian thus:
  1. There is no such thing as Abroad.
  2. Most of our readers are 'foreign'.
  3. They expect us to inform them about their own countries.
  4. Their decisions will affect us.
  5. No economy is an island.
  6. 'They' will want to come here.
  7. It matters in London what they teach in Lahore.
  8. The environment is global.
  9. Technology is global.
  10. Their own media won't do this: but we will!
(Note on p 290.) This is a journalist's view, but with minor tweaking it represents the environment in which all commerce finds itself. For the coming decade or two, native English speakers will have it a bit easier than most, but the spread of Globish continues to level the playing field. After all, what matters most is having something to say, and the world is full of people with plenty to say, and they are getting better at saying it to everyone, everywhere.

Monday, September 05, 2011

A holiday walk and hurricane aftermath

kw: observations, zoos, hurricanes, photographs

Today we went visiting in Delaware. We saw the Brandywine Zoo for the first time since we moved to the south Philadelphia area in 1995. They have a new Siberian Tiger to replace the pair of tigers they had, which have since died. The tiger is young yet, and rather slender. I could not get my camera to focus beyond the screen wire, so I had to wait until the tiger came up to the screen to get a clear photo.

As you can see, it is a rather depressing zoo compared to ones with larger enclosures and less wire mesh. It is very small. We all saw everything in less than an hour, and we were taking our time. However, the price is right, $4 for seniors.

We spent about another hour walking along Brandywine Creek to the west. It was evident that there had been flooding from the hurricane a week ago: the debris band in the middle of the image below was about nine feet above the present water line. The boys in the picture were a couple of youngsters fishing; seven or eight people were fishing along the stretch of creek we walked. I hope nobody was taking their catch home to eat. The B'wine is too polluted for that.

Altogether, it made for a nice break in midafternoon.

Saturday, September 03, 2011

Lamarck wasn't entirely wrong

kw: book reviews, nonfiction, genetics, epigenetics

Many years ago I read an article about the results of some genetic programming experiments. Genetic programming is a computer-based design method modeled on natural selection. The article was about the design of a low-pass electronic filter using electrical components such as resistors, capacitors and inductors (coils). Starting with a basic filter design having a few components, a program would make large numbers of small changes (like changing the value of a resistance), a few medium changes (such as moving a wire connection), and rare larger changes (say, adding a capacitor between two points), to generate a few hundred or a few thousand modified designs. Then a test program would calculate the performance of each and rate them. One or a few of the modified designs would perform best, and the process would be repeated from those. After many such generations, the experimenters built some of the filters to see how they performed.
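The article's code is long gone, but the loop it describes is easy to sketch. Here is a toy version in Python: the mutation mix mirrors the many-small/few-medium/rare-large scheme, while a stand-in fitness function takes the place of the circuit simulator.

```python
import random

# Toy generate-test-select loop. A "design" is just a list of component
# values; score() stands in for the simulator that rated each filter.
def mutate(design):
    child = design[:]
    r = random.random()
    if r < 0.80:    # small change: tweak one component's value slightly
        i = random.randrange(len(child))
        child[i] *= random.uniform(0.9, 1.1)
    elif r < 0.95:  # medium change: replace one component's value outright
        i = random.randrange(len(child))
        child[i] = random.uniform(0.1, 10.0)
    else:           # rare large change: add a new component
        child.append(random.uniform(0.1, 10.0))
    return child

def score(design):
    # Stand-in fitness: how close the component values sum to a target.
    return -abs(sum(design) - 42.0)

best = [1.0, 2.0, 3.0]
for generation in range(200):
    candidates = [mutate(best) for _ in range(1000)] + [best]
    best = max(candidates, key=score)

print(round(sum(best), 3))  # converges toward the target, 42.0
```

Even this toy version shows the key property: the experimenter specifies only what counts as good performance, never how to achieve it, which is exactly why the results can be so surprising.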

The process resulted in a number of surprises, but the one of interest here was a design with only a few components, and three of them comprised a resonant circuit "off to the side" of the main filter elements. When the filter was built it performed spectacularly well, but they could not figure out what the "off to the side" circuit was doing. They tried eliminating it, and the performance was very poor. There was nothing in electronics theory that could help them understand that odd circuit, but it was clearly essential to the total function of the filter. Somehow it regulated the overall operation.

With simple examples such as this to go by, it is no surprise that natural selection, operating for billions of years, has produced for every living thing a genome replete with surprises. Even bacteria, seemingly simple as they are, are the products of almost four billion years of evolution. Far from being a simple tank full of chemicals that interact in random ways, a bacterium is a sophisticated, surprisingly complex chemical machine. But we don't see bacteria in everyday life. We do see larger things: plants, animals, and fungi in particular. These creatures, made of more complex eukaryotic cells, have biochemistry that is much more involved. The membranes within each cell, which segregate the nucleus and a number of "organelles", set up an environment that resembles a colony of thousands of bacteria more than a single cell. At the center of all this we find the nucleus with its chromosomes, seemingly the seat of the cell's genetically controlled "executive office". Not so fast.

In his book Epigenetics: The Ultimate Mystery of Inheritance Richard C. Francis opens to us a different view, one in which this "executive" is a subject of cellular process, rather than a dictator. In his words, "…the executive function resides at the cellular level and the genes function more like cellular resources" (p xiii). The whole cell self-regulates, with the genes acting as a storehouse and consultant wrapped together.

The old view of "one gene, one protein" is on one hand 90% correct, and on the other wholly outmoded. Some genes code only for components of the ribosomes, which are largely composed of RNA. Some code for "microRNAs" that turn back and glomp onto certain messenger RNAs, slowing their translation or even destroying some of them. Other sections, not specifically called genes, regulate the expression of other genes. And there are cellular processes, principally methylation, that pervasively interfere with gene expression; they are largely responsible for silencing most of the genes in any given cell, so that a liver cell only does liver cell stuff and a muscle cell busies itself with making things move, not with making hormones or trying to digest food.

Such processes are called epigenetic, meaning "upon" or "beside" the genetic. I think of them like the odd circuit that helped the filter work properly but didn't seem to be a direct part of it. Epigenetics, then, refers to the collection of processes that are not directly under the control of genes and their proteins, but instead operate upon them, creating a feedback mechanism. Feedback isn't so mysterious; it is what regulates all our technological devices. My dishwasher runs itself because of a feedback mechanism. Feedback controls the rate at which your heart beats: certain stimuli cause the controlled rate to increase or decrease. If it were free-running, it would be very unstable and prone to stopping without warning.
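Feedback really is that simple. A toy loop (my own illustration, with arbitrary numbers) makes the point: a controlled rate is nudged back toward a set point at every step, rather than running free.

```python
# Negative feedback in miniature: the correction is proportional to the
# deviation from the set point, so the rate settles instead of drifting.
set_point = 70.0  # the target, e.g. a resting heart rate
rate = 100.0      # the current rate, starting well above target
gain = 0.3        # strength of the corrective response

for step in range(1, 21):
    error = rate - set_point
    rate -= gain * error  # push back against the deviation
    if step % 5 == 0:
        print(f"step {step:2d}: rate = {rate:.2f}")
# step  5: rate = 75.04 ... step 20: rate = 70.02
# Set gain to zero (no feedback) and the rate never recovers.
```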

Where my title comes in is that some changes in cell function that occur due to epigenetics are sufficiently permanent that they are inherited and affect the next generation. The simplest example is a child born to a mother who was subject to high levels of chronic stress during pregnancy. The stress hormones in her blood affect the development of the baby, who may be born earlier than usual, or weigh less, and is likely to experience certain maladies in later life. Even more, in what is called the Grandmother Effect, if this child is a girl, her own children will be more prone to the same maladies: the epigenetic changes she experienced while developing in her mother's womb affect her own egg cells and are passed on to her children. So be nice to your pregnant sister!

I don't pretend to understand methylation, but it takes place all the time, causing CH3- (methyl) groups to be attached all over the DNA. Usually, methyl attachments slow down gene expression, but in some cases they speed a gene up. A sufficient number of them will silence a gene altogether. The amount of methylation and its targets are at least partly controlled by the environment, and sometimes also by the choices an animal makes, or its emotional responses to the results of its choices. In this regard, Lamarck's contention that acquired characteristics are inherited was at least a tiny bit correct: Some of what happens during our life can be passed on to our offspring.

Now, most epigenetic changes are wiped clean when germ cells are produced, a kind of "epigenetic erasing". But not all. In my reading outside the book, I found that about a hundred things, such as certain forms of color blindness and a number of disease syndromes, are now linked to epigenetics.

The author states that this book is only a small window into the subject. It is vast, and requires much more study than just the "genome/proteome" studies we thought would result from DNA sequencing. For example, a particular "gene" (we need a new word now) doesn't directly specify a protein, but a proto-protein. Even the early stage of mRNA generation involves an intermediate editing step or two before the proto-protein is produced. Then enzymes set to work on the proto-protein, removing bits of it or rearranging it, and some methylation of the protein may occur as well, until it is ready, as the final working protein, for whatever task it was prepared. Thus the gene is not the unit of protein generation, but a template for some working parts from which a protein is produced; a lot of other cellular machinery is involved.

Just as the nuts, bolts, wires and so forth in a hardware store can be assembled into many different devices, so I am coming to look on the products of some genes as standardized parts that can be cobbled together into quite a variety of products. That is how millions of proteins can be produced from only about 22,000 genes in our DNA. It is the entire mechanism, as complex and unwieldy as it is, that has been the subject, and the product, of gigayears of evolution by natural selection. The genes are an important part of it, but to say that only the genes evolve is like saying all houses are the same except for the wiring. The genes have indeed evolved; the cell and its effects on how the genes work have also evolved. That is my take-away message from this fascinating book.