Sunday, September 27, 2015

Why books can be read in comfort

kw: book reviews, nonfiction, proofreading, copy editing, memoirs

As much as I occasionally lampoon an egregious typographical error, or a book that seems filled with them, I truly appreciate the careful copy editing that goes into the production of nearly everything we see in print, and books in particular. Copy editing is more than proofreading, more than the ferreting-out of errors by the author, the typesetter, or another editor. It embodies the skills needed to ensure that errors that detract are omitted or corrected, but that usages the author intended, for whatever reason, are faithfully retained, even if some might think them erroneous.

By this I mean much more than the unenviable job of Mark Twain's copy editor, making sure the dialect-ridden text of Huckleberry Finn was as Twain intended it. Not only novels employ "variations" on English usage for effect. Essayists, for example, whose texts require clarity, might employ word order or punctuation in ways that do not exactly fit a journal's preferred rules of style. I've had a couple of battles with copy editors, particularly those in England: one peeve I have is that they want to move every adverb to a standard location in a verb phrase. I might write, "…they were desperately seeking to find…" and have the proof come back, "…they were seeking desperately to find…". Such usage is a hangover from Norman French. It has largely been abandoned in the American language, but is clung to by many copy editors of journals published in England. Then there is the serial comma. Do you prefer to write, "In grammar school I learned reading, writing, and arithmetic", or "…I learned reading, writing and arithmetic"? The former example uses the serial comma, and the latter example leaves it out. There are strong proponents of both usages, just as there are several opinions about the way I placed the question mark in the prior sentence.

Mary Norris has been a copy editor—and worn a few other hats—at The New Yorker since 1978. Her book Between You & Me: Confessions of a Comma Queen drags the somewhat secretive vocation of the copy editor into the daylight for us all to enjoy. She broke into the field when she pointed out an error in something James Thurber had written on his office wall. He was delighted.

Writing and punctuation styles change over time. I learned to use many commas in my sentences, having been taught to "Write for someone reading aloud; show where to breathe." A copy editor set me straight about more modern usage in about 1974, and I've gradually learned to use about one-third as many commas as before. Peruse a few pages of an issue of The New Yorker from about 15 years ago, and you'll see more commas than you might in your daily newspaper. Ms Norris writes of the colorful persons found among the warrens of The New Yorker's offices, none more distinctive than Lu Burke, whose "comma shaker" was famous. It reminded her colleagues to make stylishness subservient to clarity, and not to dogmatically expunge every comma for some doctrinal reason. (Image found at a CMOS review).

Punctuation marks and the foibles of their usage seem to fill about half the book. Chapters treat of hyphens, the other three kinds of dashes (en, em, and long: – — ―), apostrophes, and semicolons as compared to colons and other designators of an author's thought changing direction or focus. In the chapter about dashes, she tells us of Emily Dickinson, who used dashes for nearly everything. A careful student of her handwritten papers could probably find six or seven lengths of dash, and it is quite likely that Dickinson had something quite definite in mind when producing any of them. And there is the question of using spaces around a dash, or not, or whether it is proper to follow a dash with a comma or other bit of punctuation (nearly never in Norris's view). –I just went back and deleted a comma after the word "never"; I still have certain instincts from the 1960's.

And what of the other half of the book? The title illustrates a pet peeve of hers, that people who might usually say, "between you and me," which is proper, tend to say, "between you and I," which is not, if they think they are speaking with someone who has a better education. Somehow, the proper usage takes on a common tinge in their mind, and is therefore suspect, as though "common usage" might be frowned upon by a person of excessive education. Just in case you were wondering, it isn't. Some common usages are certainly incorrect, but most are quite correct. And language changes over time. Today's common usages that are thought to be errors will become standard over a generation or two.

If you can find an edition of Shakespeare that retains his original orthography, you'll find it hard to read. Go back another 400-500 years, and "Old English" is really quite incomprehensible:
Fæder ure þu þe eart on heofonum;
Si þin nama gehalgod.
The letter þ is the Thorn, pronounced as "th"; its companion is the Eth (ð). The Thorn, in its later scribal forms, is the source of the "y" used in faux-colonial signs such as "Ye Olde Curiosity Shoppe", where "Ye" is to be pronounced "the", with the "th" voiced. Have you figured out the two lines above? Here they are circa 1769:
Our Father, which art in heaven;
Hallowed be Thy name.
That ought to be more familiar. The punctuation of the Old English version follows the 1769 editing of the King James text of 1611. If the Anglo-Saxons who wrote the prayer down punctuated it at all, it is likely they used a dash or a comma. If you are familiar with the King James Bible in print today, it is the 1769 revision, not the 1611 version, which is almost as unreadable as Anglo-Saxon to most modern readers. Even the orthography of 1769 is looked upon by today's younger set as a nearly foreign language.

Proofreading and copy editing are a conservative enterprise. Readers are most comfortable with the kind of writing they grew up with, if not in content, at least in form. So most authors write in a style not far removed from that of their formative years, and are quite OK with a copy editor who ensures that the same style is adhered to. But some authors experiment with new forms and have new ideas and want them expressed just so. Emily Dickinson without her dashes would seem enervated; they give a breathless rush to her verse. Ms Norris uses an example handwritten by Jackie Kennedy, complete with dashes among its run-on sentences. You simply get a more intimate feel from it as compared to something shoehorned into the straitjacket of "correct usage".

So words, though their treatment takes up but half the book, are the meat, the nourishment of the mind, and the punctuation marks the bones and joints. There is even a chapter on "curse words", particularly the "f-bomb", and on a competition among certain writers at The New Yorker to see how many they could fit on a page (and say something halfway useful in the process). I was reminded of a Mythbusters episode from a few years back, in which they tested the emotional impact on the speaker of cursing loudly to alleviate pain, compared to shouting more innocuous strings of words such as "kittens, raspberries, elephants!" and so forth. Cussing worked better. There really is some utility to it!

Without saying it directly, Ms Norris confesses to a certain level of OCD. She devotes half a chapter to her love of soft #1 pencils, and her inability to achieve comfort with anything harder, such as the ubiquitous #2. She often can't enjoy something when her eye/mind keep tripping over errors. Other times she'll find it entertaining to see a large printed sign that reads "Hunters's Rest", and wonder whether the sign maker was working with a family named "Hunters", or simply covering all the bases of possessive usage.

As you might expect, the writing style is excellent, easing a reader's enjoyment of her insight, wit, and humor. It is quite enjoyable to peek behind the scenes to see that, at least at The New Yorker, a substantial series of editors and readers awaits an author's prose, to ensure that what the magazine prints is, firstly, exactly what the author intended, and secondly, as error-free as is humanly possible. The author's website for the book: www.commaqueen.net.

Wednesday, September 23, 2015

Speculation Unbound!

kw: book reviews, nonfiction, scientific miscellany

What would happen to the Earth if the Sun suddenly switched off? Randall Munroe answers that question beginning on page 248 of What If?: Serious Scientific Answers to Absurd Hypothetical Questions. Munroe created the webcomic xkcd (at xkcd.com), which includes a What If? section, in which he answers questions of all kinds sent in by readers of the web site or, more recently, the book.

I can't believe I didn't stumble across this sooner. It is a step beyond the "Fermi Questions", so beloved of the young victims of Science Olympiad. Answering the really absurd questions requires a skill akin to that of Enrico Fermi, who was famous for taking on a query with no more than a pencil and the back of an envelope. He is also remembered for his method of measuring the yield of the Trinity atomic bomb test. While others did whatever they were doing in their trench a mile or so from Ground Zero, he was seen busily tearing a sheet of notebook paper into small bits. A second or two after the blast was triggered, just before the shock wave hit, he tossed the handful of confetti as high as he could. After the shock hit, and it was deemed safe to exit the trench, he walked around, mapping the outline of the scattering of paper bits, did a calculation or two, and announced how many kilotons the yield had been.

So what would happen to us if the Sun switched off? Randall's take on it is mainly positive. He catalogs nine consequences, including "no need to force your children to wear sunscreen" and "better astronomy" with a quieter (and soon, nonexistent) atmosphere. Of course, his tenth consequence? "We would all freeze and die."

Interestingly, there are two ways to look at the Sun switching off. He chose to work with an immediate cessation of all energy flow from the Sun. One could also consider a sudden cessation of the fusion reactions powering the Sun. That leads to a more drawn-out scenario, because it would take a long time, hundreds of thousands of years, for the outer layers of the Sun to dim appreciably. It would take several tens of millions of years for the Sun to cool to invisibility. Perhaps that would give us the motivation to really ramp up the space program!

There was an interesting short story I read a couple decades ago, in which a young man (or so he seemed) walked into a reporter's office and informed him that the reason so few neutrinos were coming from the Sun was that Jehovah had left the place in a huff a couple thousand years ago, and being a thrifty sort, had turned off the fusion furnace. He said he was the newly-assigned deity and asked the reporter to run a provocative, cagey story that "perhaps" scientists would find a more "normal" level of neutrino activity from the Sun, starting in a few days, and to give no reason other than "informed by someone in the know". He intended to re-start the Sun. Sure enough, a week later the neutrino level rose to what the scientists had calculated it "ought to be". Of course, this was during the period that "neutrino oscillation" was being theorized; it is now the accepted reason that solar neutrino activity is observed to be about 30% of what was originally expected.

So, what kinds of questions get asked? Things like, "How many laser pointers do you have to point at the Moon so that we could see it?" or, "How much Force power did Yoda produce (when lifting the X-wing from the swamp)?" There are also several short sections in which questions are listed but not explicitly answered; they are in the "Weird (and Worrying)" category: "What is the total nutritional value of a human body?" or, "Is there sound in space? (There isn't, right?)"

Actually, that last question has an answer (so does the first: the same as a pig of the same weight). Yes, there is sound in space. Sound requires a medium in which to travel. Although the gas density in "outer space" is very low, it is never zero, anywhere. But the frequency of sound that is transmitted with little loss needs to be low enough that the wavelength is longer than the mean free path of the gas molecules as they bounce off one another. So the sounds in space that travel any useful distance have very low frequencies. For example, in "interplanetary space", the average gas molecule travels a few meters before encountering another. The speed of sound is different at low pressure, but not by a great amount, so we can still use 300 m/s for rough calculations, and we find that a wavelength of 10 m corresponds to about 30 Hz. The trouble is, the volume would be very low, because so little gas is carrying the sound, but a sensitive microphone could detect low hum-type sounds "out there". In interstellar space, the pressure is lower, perhaps a thousand times lower, meaning that the mean free path is a thousand times as long, and frequencies higher than 0.03 Hz would not travel far. So the sounds in interstellar space would be at very low frequencies indeed. But they are there.
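The arithmetic is easy to reproduce. Here is a minimal sketch in Python, using the rough figures above (a 300 m/s sound speed and the two illustrative mean free paths); the only physical assumption is the one stated in the paragraph, that a wave carries well only when its wavelength exceeds the mean free path.

    # Rough upper bound on sound frequency in very thin gas.
    # Assumption (from the paragraph above): wavelength must exceed the mean free path.
    SOUND_SPEED = 300.0  # m/s, the round figure used above

    def max_frequency(mean_free_path_m):
        """Highest frequency whose wavelength still exceeds the mean free path."""
        return SOUND_SPEED / mean_free_path_m

    print(max_frequency(10.0))      # ~30 Hz for a 10 m mean free path
    print(max_frequency(10_000.0))  # ~0.03 Hz for a path 1,000 times longer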

Rather than go on about things like using a Gatling Gun to propel a car (watch out, anyone behind!), I suggest you read the book, and check out the web site. Randall Munroe is an entertaining writer and, with a background in robotics, a deft hand at off-the-cuff mathematics (and a stable of helpful scientists' phone numbers in his Rolodex, no doubt). You'll love it.

Monday, September 21, 2015

Producing a depauperate Earth

kw: book reviews, nonfiction, extinction

I collected butterflies and other insects as a child. For a couple of years, when we lived in Utah, I mainly collected locusts, the ones with colorful wings. There were many different wing color patterns. Now, fifty years later, I find that both butterflies and colorful locusts (when I visit Utah) are quite a bit scarcer. Where I live now, in the suburbs southwest of Philadelphia, I have seen more butterflies than I did for a long, long time. But nothing matches those young years in Utah and Ohio. I also recall, during high school years in Sandusky, Ohio, recording morning bird song. I wish I still had the tapes! The "morning chorus" that began half an hour before sunrise in the Spring was a rich symphony. I could recognize the calls of 6 or 8 kinds of birds, and heard several calls I didn't know, every time. There is a pretty good morning chorus here, these days, but again, it pales by comparison with what I recall. Two to three kinds of bird calls are the usual fare.

This is not just me, remembering some "golden age" that never existed. Things are dying out, lots of them. I have been hearing about a "sixth extinction" for some years now. This is the title of a sobering, well-researched book by Elizabeth Kolbert, The Sixth Extinction: An Unnatural History.

What is the "normal" rate of species extinction? To jump in with the conclusion, it is probably close to one species yearly. It is closely allied to the normal rate of species turnover. That is, "extinction" can mean one of two things. Firstly, one species changes to another under the pressure of environmental change. When the two species are things like shellfish that leave good fossils, what a geologist would notice is that in rocks of a certain age, only shells of "type 1" are found, and in the next layer, only those of "type 2". An ecologist might notice that certain "recent fossils" are not found among living shellfish. I saw an example of this in Bear Lake, Idaho 60 years ago, snail shells in abundance, but a ranger told us they no longer lived in the lake; there was a different species now.

In geologic terms, the time span across a couple of millimeters of sedimentary rock might be a million years, so the speciation event could be quite gradual as seen from a human perspective. Observations of animals under selection pressure indicate that one may be replaced by another in much less than a million years: 50 to 100 years is sometimes sufficient. Many, many animal species live out their lives within one year, so this represents 50-100 generations. Longer-lived creatures are a different matter. Horses, for example, can reproduce as early as two years, but have a fertile lifetime of about ten years, sometimes more. So a "horse generation" is probably about 6-8 years. A human generation is commonly thought of as 25 years, though in very early times it was probably closer to 20. Anyway, species transformation (I dislike the popular conception of "mutation") can occur in hundreds or thousands of years, to perhaps tens of thousands of years. This is synchronous extinction.

The second kind of extinction is that a species dies out when the environment changes too rapidly for it to adapt, and it is no longer suited to it. It may or may not be replaced, in ecological terms, by an unrelated (or more distantly related) species, which may have evolved about that time, or maybe not. This is asynchronous extinction. Depending on the kind of animal, a species that makes fossils is seen to last between one and ten million years, though some that we call "living fossils" are found to have lasted for tens or hundreds of millions of years. Though I wonder if a coelacanth living today could actually breed with one somehow brought to the present from 300 million years ago; perhaps there have been a hundred synchronous extinctions along the line, as the animal changed in profound ways that did not materially affect what its fossil form would be.

"Mass Extinction" refers to the sudden disappearance of many species over a shorter period of time. A mass extinction is thought to happen because of a great and widespread change in environmental conditions. These are, of necessity, asynchronous extinctions. One thing that can utterly transform the environment worldwide, at least for a time, is the fall of an asteroid a few miles wide. An asteroid impact eliminated the dinosaurs (those that hadn't become birds already), in what is called the end-Cretaceous extinction event. There have been five major mass extinction events in the last 500 million years, and a few dozen lesser ones. Each of the "Big 5" drove at least half of all species out of existence, pretty much overnight. They mark the boundaries between geologic ages. The lesser ones were of less significance only in comparative terms, and also mark the boundaries of geologic ages or significant geologic periods.

The major periods of geologic time are called:
Cambrian
Ordovician
Silurian
Devonian
Mississippian + Pennsylvanian in America, Carboniferous elsewhere
Permian
Triassic, Jurassic, and Cretaceous (together making up the Mesozoic Era)
Paleogene and Neogene (together, the old "Tertiary")
Quaternary
The Big 5 ended the Ordovician, Devonian, Permian, Triassic and Cretaceous. Other named ages and periods also ended with lesser mass extinctions.

The trouble with geologically sudden events is that, on a human scale, they may not appear sudden at all. While the dinosaur-killing asteroid changed all of Earth's environments in at most a few days, the other mass extinctions seem to have taken more time, in the range of years to centuries, and perhaps tens of millennia. On a scale that considers a thousand-year transformation as "sudden", something that takes only a century is lightning-fast.

That is what we see happening. Synchronous species turnover, and most cases of asynchronous extinction, make up the background rate. Roughly speaking, if species last on average a couple million years, and there are about a million species, then some species or other will go extinct every year or every second year. That's a ballpark estimate of background extinction: one per year, or half that. If the greatest of the Big 5, the Permian catastrophe, took 1,000 years to occur, and nearly a million species were wiped out, that is a rate 1,000 to 2,000 times greater than the background.
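The ballpark is easy to check. A minimal sketch in Python, using only the round numbers above (about a million species, an average species lifespan of one to two million years, and a Permian event compressed into roughly 1,000 years):

    # Back-of-envelope extinction rates, using the round numbers in the text.
    n_species = 1_000_000             # rough count of species
    species_lifespan_yr = 2_000_000   # "a couple million years"

    background_rate = n_species / species_lifespan_yr  # species lost per year
    permian_rate = 1_000_000 / 1_000                   # a million species over ~1,000 years

    print(background_rate)                  # 0.5 per year (1.0 if the lifespan is 1 million years)
    print(permian_rate / background_rate)   # about 2,000 times the background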

The chapters of The Sixth Extinction each focus on one species, as an example of a group of related species that is greatly reduced or already extinct. Most of the declines are known or strongly suspected to be due to human influence. The first example is the Panamanian golden frog (actually a toad), which is probably already extinct in the wild. It represents amphibians in general, which are vanishing at a stunning rate. Of 6,200 species of amphibian (frogs, toads, newts, salamanders, and a couple of similar odd critters), about 1,800, or nearly 30%, are declining rapidly in number, and at least 440, or 7%, are likely to become extinct within very few years. Just among amphibians, the extinction rate is about 100 times the background rate for all species!

One example, the ammonites, cannot possibly be due to human influence (unless you are a strict young-Earth creationist). These spiral-shelled critters actually survived the biggest mass extinction, the one at the end of the Permian, 251 million years ago, but were wiped out later by the end-Cretaceous event, the one that famously ended the "age of reptiles" but let a few mammals and birds (small, feathered dinosaurs) sneak through and repopulate Earth. The chapter focuses on the consequences of an asteroid winter, and compares it with other possible causes of mass extinctions. It sets the stage for discussing the massive environmental changes we humans are bringing about. If we are indeed the major actor in the environmental shift called Global Warming, and I think we probably are, that is but one large-scale change in worldwide habitats that we have produced, and the one most likely to kill us along with so many other species.

One chapter dwells on rain forests ("jungles"), and the species-area relationship as determined by counting the number of species to be found in plots of forest of various sizes. In a portion of a large forest, the species count varies with area in a regular way, which is partly statistical. But in a dissected forest, with forested plots of various sizes surrounded by barren land or farms, there is a similar relationship, though it is steeper. The species of focus for the chapter, a small tree in the genus Alzatea, is not found at all in an isolated plot if its area is below a specific number of acres. When a large forest is broken up into isolated plots, at first, the S/A relationship follows that of the original forest. But over time, species are lost, most rapidly from the smallest plots, until a steeper S/A relationship develops. Sometimes, keeping plots from total isolation by having forested "highways" between them will preserve some species, but this is not true for all. It is as if some species die out if their members cannot get far enough from the forest boundary.
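Such species-area counts are usually summarized with a power law, S = c × A^z, where the exponent z comes out steeper for isolated fragments than for plots inside continuous forest. The functional form is the standard one used in this kind of study; the constants in the little Python sketch below are made up for illustration, not taken from the book.

    # Species-area power law, S = c * A**z; the constants here are illustrative only.
    def species_count(area_ha, c, z):
        """Expected species count for a plot of the given area, in hectares."""
        return c * area_ha ** z

    for area in (1, 10, 100):
        continuous = species_count(area, c=20.0, z=0.15)  # plots within intact forest
        fragment = species_count(area, c=8.0, z=0.30)     # isolated fragments: steeper slope,
                                                          # and the smallest plots lose the most
        print(area, round(continuous), round(fragment))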

Twelve chapters, twelve species plus a thirteenth, about Homo sapiens, about us. We are very likely to be the ultimate victims of the great mass extinction that we are carrying out. It is not known just how many species go extinct every year. I once read of an experiment with "tree fogging", in which researchers used insecticide fog throughout the canopy of an entire tree, and collected all the insects, particularly beetles, that fell onto sheets spread under the tree. Dozens of new species were found and described. Excitedly, they fogged a second tree. Many more new species were found. But they were sobered that most of the new species from tree #1 were not found at all on tree #2, even though they were within a hundred meters of each other. They concluded that many of those, perhaps 100 species, were endemic not just to that forest, but to that specific tree, and found nowhere else. They had made 100 species extinct in an afternoon! They canceled future experiments of that type.

It is hard to know everything that exists without going out and finding it. But when doing so destroys what you are trying to find, what good is that? I don't know how to find out how fast species are going extinct, but I think it very, very likely that this human-induced mass extinction is proceeding at a rate that exceeds that of the Permian event, the biggest of the Big 5, by a large margin. I do believe this needs to be more widely known.

Wednesday, September 16, 2015

Faster than the wind, and perhaps he saved your life

kw: book reviews, nonfiction, biographies, scientists, safety, rocket sled experiments

There is a name you need to know: John Paul Stapp. If you have been in a car accident, it is likely that you owe your life and health to him. That is, if you were wearing a seat belt.

Step back about 70 years. World War II had just ended, and a young physician was wondering why so many military pilots were dying, when they didn't have to. During that war, getting shot down was a death sentence in one of two ways: you died when the plane crashed, or you died trying to exit the plane. After the war, ejection seats were found to be, far too frequently, tickets to oblivion. Their design was based on, at best, random guesses about the amount of stress the human body could survive, and the forces the aircraft frame could handle.

Dr. Stapp set out to gather accurate and usable data. What he did and how he did it are detailed in the first half of Sonic Wind: The Story of John Paul Stapp and How a Renegade Doctor Became the Fastest Man on Earth, by Craig Ryan. The second half shows what he, and the country, did as a result.

Before the 1940s, a smattering of centrifuge experiments had established that, with training and with minimal support from a flight suit, a fighter pilot could avoid blacking out at accelerations of about 6 G's. The G is a one-gravity acceleration force. If you weigh 150 lbs (68 kg), that is the force a mattress must apply to hold you up. If you and the mattress are put in a centrifuge and spun so as to apply a 6 G acceleration, the centripetal force the mattress (and the frame holding it) must now apply to hold you is 900 lbs (408 kg). When your body weight is spread out by a mattress, if the area of your body against the mattress is about 5.4 sq ft (0.5 m²), you'll feel a pressure of about 28 lb/ft² or 136 kg/m². That comes to about 0.19 psi. Now, multiply that by six, and you'd feel almost 1.2 psi. If your normal blood pressure is 120/75 (what doctors currently recommend, but maybe yours is higher), that 120 mm translates into 2.3 psi, and the 75 mm into 1.5 psi. So you can see that sustained acceleration of 6 G's tends to draw the blood in your body towards the mattress. If you are sitting rather than lying down, it doesn't take long for an acceleration of 6 G's to pull the blood from your brain, and you black out.
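For anyone who wants to check the arithmetic, here it is as a minimal Python sketch; the body weight, contact area, and blood-pressure figures are the ones quoted above.

    # Contact pressure under a sustained G load, compared with blood pressure.
    PSI_PER_MMHG = 0.01934            # 1 mm Hg expressed in psi

    weight_lb = 150.0                 # body weight from the example
    contact_area_sqft = 5.4           # body area against the mattress
    pressure_1g_psi = weight_lb / contact_area_sqft / 144.0   # lb/ft^2 -> psi
    pressure_6g_psi = 6 * pressure_1g_psi

    print(round(pressure_1g_psi, 2), round(pressure_6g_psi, 2))       # ~0.19 and ~1.16 psi
    print(round(120 * PSI_PER_MMHG, 1), round(75 * PSI_PER_MMHG, 1))  # ~2.3 and ~1.5 psi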

At this point it is all about sustained G forces. It makes sense that you could survive larger forces if they occurred briefly and were rapidly abated. Somehow, a factor of three became dogma, so that a brief acceleration of 18 G was considered the threshold of death. Yet, common observations of people surviving falls call this into question. One of my brothers fell 20 feet out of a tree, landed on his back on the lawn, and had the breath knocked out of him. But he got up after a minute or so and was OK. Now, a grassy lawn is softer than concrete, but it doesn't have much give. The main thing keeping this from being an "instant stop" (physically impossible) was the flexibility of the body, which squishes out briefly. I calculate that my brother's body touched the ground going about 24 mph (39 kph) and stopped in a distance of about 4 inches. That works out to a stopping force of 60 G's. If instead we allow him a little more squish, perhaps the stopping distance was 6 inches, and he experienced 40 G's. Either number is a far cry from 18 G's.
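The fall works out the same way. A minimal sketch, using the figures in that paragraph (a 20-foot drop, stopped in 4 or 6 inches, ignoring air resistance and assuming constant deceleration):

    # Impact speed and average stopping G's for a fall.
    G = 9.81                    # m/s^2
    FT, IN = 0.3048, 0.0254     # meters per foot and per inch

    def fall_stats(drop_m, stop_m):
        impact_speed = (2 * G * drop_m) ** 0.5          # m/s at first contact
        avg_gs = impact_speed ** 2 / (2 * G * stop_m)   # same as drop distance / stop distance
        return round(impact_speed * 2.237, 1), round(avg_gs)   # mph, G's

    print(fall_stats(20 * FT, 4 * IN))   # roughly 24 mph and 60 G
    print(fall_stats(20 * FT, 6 * IN))   # roughly 24 mph and 40 G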

Over about a decade, Dr. Stapp used himself as the primary experimental subject (not the only one; he also used chimpanzees and, on rare occasions, another volunteer) in rocket sled experiments. The rockets would get the sled going at some high velocity, and a braking system would then stop it over a prescribed distance. Here are parameters that might describe a typical experiment (a quick arithmetic check follows the list):

  • Rocket acceleration: 4 G's
  • Burn time: 4.6 s
  • Burn distance: 410 m (1,340 ft)
  • Peak speed: 644 kph (400 mph)
  • Stop distance: 82 m (268 ft)
  • Stopping time: 0.9 sec
  • Average stop G's: 20
  • Peak stop G's: 30 (measured by camera)

Early experiments were conducted with the seat on the sled facing backward, so the subject was pressed into the seat by the stopping forces. Experiments were also conducted with the seat in various orientations, including "butt forwards", to determine the forces of an ejection seat's kick-off blast.

Later experiments were conducted with the seat facing forward, and the subject exposed first to the wind blast, and then to deceleration against the webbing holding him into the seat. Dr. Stapp used chimps to determine the edge of lethality, though it turned out that they are much, much tougher than humans, so getting the calibration right for human experiments was tricky. With humans (mostly himself), he gradually raised the G forces and observed his own feelings and had doctors note what injuries he sustained. Thus, as time went along, the design of the seat was improved to avoid points that exerted extra forces and were causing injury. Over time these design changes were implemented in pilot seats.

The final, most definitive experiment was conducted with a chase plane flying above the rocket sled, to observe and film it from above. The pilot was astounded when the sled outraced the plane, reaching a top speed of 639 mph (1,028 kph), or about Mach 0.9. This earned Stapp the title of "fastest man on earth" in a ground-bound vehicle. The title stood for about 30 years. During the deceleration, though, he sat forward-facing, getting the full wind blast, and being jammed against the seat restraints with a crushing 45 G's, peak, during a stop that lasted less than 1.5 seconds. He was a mess when he was helped out of the seat. His eyes looked like pools of blood; he was lucky they had stayed in his head. It took weeks for all his sight to return. He had several broken bones. Though he had the ambition to go 1,000 mph, or at least Mach 1 (about 715 mph; authorities vary), it was not to be. He had advanced through Captain and Major, and was now a Colonel; the Air Force command moved him to a more administrative role. His sled, named the "Sonic Wind", was retired.

What he did next is the subject of the second part of the book. Dr. Stapp had performed his experiments, often against opposition, on a shoestring. He had to scrounge and cadge for equipment and apply verbal tricks to get some semblance of permission. Such skills were even more necessary after about 1956. He had long lobbied and clamored to Air Force brass about safety, and the lack of it, in fighter aircraft and transports. One result of his nagging was that many transports in war zones had the seats for the troops facing backwards, so that the troops were much more likely to walk away from a crash. But even during his earlier experiments he was also lobbying for the use of seat belts in automobiles.

By 1956, about 36,000 Americans were dying every year in automobile crashes. The population was about half what it is today, so in proportion, there could now be 72,000 auto deaths yearly, but instead, there are about 33,000. It took Colonel Stapp and his allies another 14 years to bring about the changes, primarily in laws, that have, since about 1970, saved at least 800,000 lives. Over the last 17 years, some of the difference is also due to airbags, something Stapp heartily approved of; he died in 1999, the year after airbags were mandated.

During his "lobbying years", he fought resistance in both government and industry against mandatory seat belt installation and use. The auto manufacturers were a lot like the tobacco lobby of the same era, denying that their products' quality had anything to do with the deaths that were occurring. Fortunately, there were at least aftermarket seat belts available, and many members of the public didn't wait for Washington or anyone else. Over a decade's time, sufficient statistics were compiled that a growing number of lawmakers became convinced of the belts' value, and in 1968, factory-installed seat belts were required by law. I remember an ambulance EMT who said he'd never unbuckled a dead body.

I bought my first car in 1967, a 1964 VW beetle. A couple of years later I bought a set of aftermarket 3-point lap/shoulder belts and installed them. Fortunately, Europe had been ahead of the curve, and though the car didn't have belts already installed, it did have threaded mounting holes, so the installation was easy. I have used seat/shoulder belts ever since. But growing up, we did many road trips, hundreds of miles yearly, in a big station wagon with no belts, and a mattress in the "back-back" for us boys to nap on. We were lucky.

Since 1984, one after another of the U.S. states has passed laws requiring seat belt use. Compliance varies, but averages 85%. A disproportionate share of the roughly 33,000 highway fatalities in recent years has come from the 15% who don't wear seat belts. In spite of the air bag in most vehicles, they either get thrown around inside during a collision or are ejected. Driving in California with my brother several years ago, we saw an SUV hit the median barrier on the freeway, and the driver burst through the side window and landed on the highway almost in front of us, on his head. One of us (I don't recall who) said, "We just saw someone die."

Two things to remember about Colonel Dr. John Paul Stapp: He risked his life, incidentally becoming the fastest man on earth, to gather safety data; then he used those data and traffic statistics to practically crowbar the United States into becoming quite a bit safer as a place to drive or fly. Craig Ryan's exciting biography brings us the man and the stories, a portrait of someone to whom you just might owe your life.

Monday, September 07, 2015

Creators need armored underwear

kw: book reviews, nonfiction, creativity, sociology

Build a better mouse trap, and the world will beat a path to your door – with tar and feathers. More simply put, "No good deed goes unpunished."

In How to Fly a Horse: The Secret History of Creation, Invention, and Discovery, Kevin Ashton fully exposes and expounds the greatest hypocrisy we face: Nearly everyone loudly touts their allegiance to "innovation" and "creativity", but in the face of any truly new thing, they will hide, protest, or punish the "perpetrator". The book is filled with stories of what really happened to many inventors and innovators. Very few received anything like acceptance, at least during the first few decades after making their discoveries known. Ignaz Semmelweis showed that requiring doctors to wash their hands between patients nearly eliminated hospital-borne infections; for his trouble he was hounded out of the profession and died in an asylum. Joseph Lister, who later carried the antiseptic idea into surgery, met years of resistance of his own. Even today, there's only a 40% chance that your doctor washes up before examining you, unless you vociferously insist.

Why do we create? Why this continual drive for making something new? It is built into us. To be human is to create, or at least, to desire, yearn and long to create. We are defined by our tools. Humans are not the only tool-using animals, but we are the only animals that keep modifying and refining our tools, making more progress in tool development during one lifetime than chimpanzees or ravens have made in tens of thousands of years. In the opening chapter of the book, we read that, prior to about 50,000 years ago (70,000 according to other accounts I have read), innovations in toolmaking occurred over spans of thousands of years. Something changed in the human brain, and there followed an explosion of improvement and modification of human technology that continues to accelerate. Ashton's core thesis is that we are all creators, creation is common, and the notion that only geniuses can be creators is so false that he denies there is any true meaning to the word "genius". We must all create, or die. In a late chapter he points to stifled creativity as a feature in many kinds of addiction and criminality.

Yet, as Jesus said (Luke 5:39), "No one after drinking old wine wants the new, for he says, 'The old is good enough.'" No matter how much people may say they value creativity and innovation, their instinctive reaction to anything new is, "It was good enough for grandpa and it's good enough for me." (Apologies to the composer of "Gimme That Old Time Religion"). You might say, "Oh, how about Einstein? He did Relativity and all that stuff, and got a Nobel Prize, and everyone loves him." He had a couple of lucky breaks, and got four articles published in 1905, giving him the stature to generalize his discoveries about Relativity in 1915, but his work is roundly misunderstood by most people who have not done the work to understand it, and every year many "private researchers" try to get articles published that challenge some aspect of Relativity or Quantum Mechanics.

Not all innovators are "punished". The book opens with the story of a 12-year-old slave on a French island who discovered how to pollinate the Vanilla plant artificially. "Edmond's Gesture" is still used, by workers with small enough hands. Only in its native area can the plant be pollinated naturally by a small green bee that lives nowhere else. But there was no assurance that Edmond would get credit for his discovery. Only persistent protection and advocacy by his owner assured that the "Gesture" would be associated with Edmond and nobody else. Later in the book we read of the team that developed America's first operational jet fighter, in about five months! Only thanks to a particularly clear-headed leader and his "Show Me" principle was it even possible.

Before going on, I want to take partial issue with one portion of the second chapter, which purports to show there is no value to the notion of "incubation." Various studies are remarked upon, showing that giving people various creative tasks, and having them take a break of 15 or 30 minutes in the middle, did not improve the results, and usually hindered them. I am not sure that my experience falls under the title "incubation", but here is a practice upon which I built a career of nearly 40 years as a software developer:
When faced with a conundrum, I'd work at it for a day, and if I didn't see how to make it function properly, I would step back and think through every aspect of the problem, building a kind of flowchart in my mind. Then I would go home, do whatever needed doing there, and sleep on it. Some time between 3 and 6 AM I would awake with a critical idea, and that would directly solve, or lead to the solution of, the key problem with that part of the software. I called it, "throwing it over the wall to the right brain".
I think the studies of "incubation" did not give nearly enough time for incubation to work, nor were the problems to be solved sufficiently difficult. Half a day (including the overnight sleep) seems to be required, at least the way my own mind works. By the above practice I produced a great deal of software that nobody else had been able to write.

Oh, and just by the way, I was also a very unusual computer programmer in this regard: I always documented my work in internal "comment lines" that typically made up about 1/3 of the total bulk of the code. This saved a great deal of time when I needed to re-use some code later and had to remember how to hook it up. I once had a boss who gave me a file of subroutines (modules) that he had written, related to tracing contour lines on maps. I began reading through the FORTRAN code, and some things were very obscure, so I was having a hard time even figuring out how to pass data into and out of the modules. There was not a single comment line in many, many pages of FORTRAN code! I asked him why, and he responded, "Can't you read FORTRAN?" I said, "If I were your supervisor, you'd have just been fired." Of course I can read FORTRAN, but every programmer uses clever constructions that make sense during the writing and are very hard even for the original programmer to decode later. I decoded his code and was able to make good use of it, but it would have taken much less time had he been a thoughtful programmer. 'Nuff about that.

Right along with Ashton's thesis that we all create, he dwells much on the incremental nature of innovation. Remember Einstein? One of his huge innovations was Special Relativity, based on the constancy of the speed of light, which becomes an absolute limit to motion. It includes things like time dilation (a speeding traveler ages more slowly), relativistic mass increase (things get harder to push as velocity increases), and length contraction (faster yardsticks are shorter). All this from his thought experiments about riding a light beam or driving a very fast carriage with a lamp and clock aboard. But the points just listed are based on others' work, particularly that of Lorentz, who was building on work by others. Einstein had some key insights, it is true, but the power of his system was bringing all the parts together, and showing how they worked together. In a small way, I've had a similar experience. I have published a few articles in journals. The one of which I am most proud uses methods from astronomy and civil engineering, plus a 200-year-old technique first published by Leonhard Euler. How many recent scientific articles have you seen that cite literature from the 1700's?
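All three of those effects come from a single quantity, the Lorentz factor, gamma = 1/sqrt(1 - v^2/c^2). A minimal sketch follows; the speeds are arbitrary illustrations, nothing from Ashton's book.

    # Lorentz factor: the common ingredient in time dilation, mass increase,
    # and length contraction. The speeds below are arbitrary examples.
    C = 299_792_458.0   # speed of light, m/s

    def lorentz_gamma(v):
        return 1.0 / (1.0 - (v / C) ** 2) ** 0.5

    for fraction in (0.1, 0.5, 0.9, 0.99):
        g = lorentz_gamma(fraction * C)
        # A clock moving at this speed ticks 1/g as fast; a yardstick shrinks to 1/g of its length.
        print(f"{fraction:.2f} c -> gamma = {g:.3f}")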

The chapter "Chains of Consequence" trace out a few key innovations, including the chain of innovation leading to the aluminum soda can, which can be produced for a few cents (of the 25 cents you spend for a cheap brand of soda, or the dollar you spend for a Coca-Cola). He shows the origin of all the known ingredients in the syrup that is added to carbonated water to make that Coke. Later on, Ashton even digs into Isaac Newton's statement about standing on the shoulders of giants. Similar statements were made for generations before Newton. We find that there are no giants. Instead, we are standing atop a pyramid of people like us, who made one innovation after another after another.

This reminds me of a joke about engineers and mathematicians. An engineer is brought to a room in which he sees a saucepan containing water to his right and a stove, already lit, across the room in front of him. A piece of paper next to the pan says, "Heat the water". He picks up the pan and places it on the stove burner. Then he is taken to another room with the same setup, but the pan with the water is to his left. He picks it up and places it on the stove burner. Now a mathematician is taken to the first room, re-set to the original condition. He does exactly what the engineer did. Then, when taken to the second room, he picks up the pan, but sets it to his right, saying to the person who brought him, "I already solved that problem." Of course, this sounds silly and we snicker at mathematicians, but the mathematician's principle is how we actually innovate. Based on all the problems already solved, we find an unsolved problem, and add just a little to a known method that had almost solved it. You can get a PhD for doing that.

Politics rears its ugly head in a chapter on credit. According to Wikipedia, as of 2014 Nobel Prizes had been awarded to 817 men, 47 women, and 25 organizations. Only 15 women have won in the sciences, and two of them were Marie (twice) and Irène (once) Curie. Quite a number of male laureates won the prize for work not only performed, but invented, by women. This opens up quite a can of worms, so I'll leave it there, reported but not editorialized.

We do not see everything, but a most egregious example of not seeing because of not expecting is found in the saga of Helicobacter pylori, the organism that causes ulcers. Doctors looked at samples of ulcerated tissue from stomach and duodenal ulcers for more than 100 years, and didn't note the bacteria there. Some early "authority" had decreed the stomach too acidic to allow bacterial survival, let alone growth, so every single time bacteria were seen, they were ignored as "contamination". Not so the doctor who finally believed his eyes, and a colleague who was willing to "let the data talk." Ashton doesn't report this, but as I recall, Barry Marshall had to drink water laced with the bacterium, get an ulcer, then cure it with antibiotics, to convince even a few of his colleagues that H. pylori is the true cause of ulcers, not "stress". And it is now known that hundreds of species of bacteria inhabit our stomachs.

What we expect can determine what we see, and what we don't see. Many years ago I went with friends to a road cut in eastern Nevada, where they said we could collect trilobites. We stopped, got out of the car, and I looked at the weathered rock in the road cut. It looked like grainy limestone with a kind of salt-and-pepper texture. I said, "How do we find the trilobites?" One of the guys put his finger next to a black blob half an inch long and said, "Look closely". Suddenly I saw them. Thousands of them. Nearly every black blob on that hillside was a small trilobite! Having seen one, I had "eyes for them". Similarly, on the first fossil collecting trip that my wife and I brought our young son along—he was 5 years old—the leader started the day by showing us several specimens of shellfish that had been found there (it was an abandoned quarry). He pointed to one, saying, "This one is rare and hard to find." Our boy looked for a moment, then trotted off. He came back in 15 minutes or so with three of them. We joked that, of course it helped that his eyes were only 3 feet from the ground! But he was the only person to find any of that species that day.

Innovation is built into us. The hard part is getting over resistance to innovation. We need the new, but we fear it. My father calls it the Moses Principle: in his words, "It takes 40 years in the wilderness for all the old-timers to die off, before the next generation will embrace something new."

Tuesday, September 01, 2015

Who is really at sea, the character or the reader?

kw: book reviews, fiction, novels, animals, shipwrecks, mysticism

A friend gave me a paperback copy of Life of Pi by Yann Martel. This may be the most difficult book to review that I've encountered. It is described by the author in his introduction as a story to "make you believe in God." I would say, for anyone who thinks belief in God is a possible thing, it will make that possibility inevitable, and for anyone who thinks belief in God is either impossible or wrong, it will confirm that impossibility.

Near the end of the Revelation given to John, the last book in the New Testament, the angel's final words to John are:
Do not seal up the words of the prophecy of this scroll, because the time is near. Let the one who does wrong continue to do wrong; let the vile person continue to be vile; let the one who does right continue to do right; and let the holy person continue to be holy. (Rev 22:10b-11, NIV)
Earlier in the vision another angel had said, "There will be no more delay" (10:6), which the King James Version renders, "There will be time no longer." Together these passages show that once the end times truly arrive, it is too late to repent. Of course, the vile and the wrongdoers believe in neither end times nor in repentance.

Young Pi, the nickname for Piscine (French for swimming pool), leaves nothing to chance. Having an open heart and a keen desire to know God, he has accepted and diligently practices Christianity, Islam, and Hinduism. During the major section of the book, which tells the story of his 227 days at sea in a lifeboat in the company of a tiger, Pi credits his triple faith for his survival.

The core idea of the story is ambiguity, even irrationality. But "irrational" has two meanings. As a schoolboy, Piscine Molitor Patel, tired of hearing his name mispronounced as "pissing", begins a new school year by insisting that he is to be called Pi, writing on the blackboard of every classroom, "Pi ≈ 3.14". Pi (shown in formulas as π) is the most famous of the so-called irrational numbers; they are those quantities that cannot be expressed as a ratio of integers. In fact, π is the leading member of a special class of irrational numbers called "transcendental numbers": they are not the root of any ordinary algebraic equation, not the result of simple operations, but themselves form the basis for operations. They are sources, not productions. They transcend algebra.

The first number proven to be irrational is the square root of 2, shown as √2. It is the best-known member of the other class of irrational numbers, the algebraic numbers: those that are roots of such equations. Algebraic and Transcendental: these are the two kinds of irrational numbers. And the other meaning of "irrational"? A sort of synonym, even an evil twin, of "illogical".

The transcendental numbers intrigue me. Though only a few have actually been proven to be transcendental, they outnumber all other kinds of numbers. And so I recall Isaiah 45:15, "You are a God who has been hiding Himself." As metaphors, they are even more fascinating, because "transcendental" is a kind of good-angel twin of "divine". If in reading Life of Pi we believe his story throughout (at least, prior to the second half of Chapter 99), he becomes a kind of god to us. Or, at least, an avatar leading us to God. Mr. Pi Patel becomes transcendental. If, instead, we believe the second half of that chapter, seemingly fabricated on the spot to satisfy overly-rational shipping-disaster investigators, we fall from heaven to earth.

Perhaps this story cannot make absolutely everyone believe in God. But it makes clear to the reader that the choice of heaven or hell is ours to make, and we will surely attain our choice.