Tuesday, September 01, 2015

Who is really at sea, the character or the reader?

kw: book reviews, fiction, novels, animals, shipwrecks, mysticism

A friend gave me a paperback copy of Life of Pi by Yann Martel. This may be the most difficult book to review that I've encountered. It is described by the author in his introduction as a story to "make you believe in God." I would say, for anyone who thinks belief in God is a possible thing, it will make that possibility inevitable, and for anyone who thinks belief in God is either impossible or wrong, it will confirm that impossibility.

Near the end of the Revelation given to John, the last book in the New Testament, the angel's final words to John are:
Do not seal up the words of the prophecy of this scroll, because the time is near. Let the one who does wrong continue to do wrong; let the vile person continue to be vile; let the one who does right continue to do right; and let the holy person continue to be holy. (Rev 22:10b-11, NIV)
Earlier in the vision another angel had said, "There will be no more delay" (10:6), which the King James Version renders, "There will be time no longer." Together these passages show that once the end times truly arrive, it is too late to repent. Of course, the vile and the wrongdoers believe in neither end times nor in repentance.

Young Pi, the nickname for Piscine (French for "swimming pool"), leaves nothing to chance. Having an open heart and a keen desire to know God, he has accepted and diligently practices Christianity, Islam, and Hinduism. During the major section of the book, which tells the story of his 227 days at sea in a lifeboat in the company of a tiger, Pi credits his triple faith for his survival.

The core idea of the story is ambiguity, even irrationality. But "irrational" has two meanings. As a schoolboy, Piscine Molitor Patel, tired of hearing his name mispronounced as "pissing", begins a new school year by insisting that he is to be called Pi, writing on the blackboard of every classroom, "Pi ≈ 3.14". Pi (shown in formulas as π) is the most famous of the so-called irrational numbers, those quantities that cannot be expressed as a ratio of integers. In fact, π is the leading member of a special class of irrational numbers called "transcendental numbers", numbers that are the root of no algebraic equation: they are sources, not productions. They transcend algebra.

The first number proven to be irrational is the square root of 2, shown as √2. It is the first member of the other class of irrational numbers, the algebraic numbers, those that do solve algebraic equations. Algebraic and transcendental: these are the two kinds of irrational numbers. And the other meaning of "irrational"? A sort of synonym, even an evil twin, of "illogical".
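For the curious, the classic proof that √2 is irrational is only a few lines long; here is a sketch (standard textbook material, not anything from the novel):

```latex
% The classic argument, attributed to the Pythagoreans: assume
% $\sqrt{2}$ is rational and derive a contradiction.
Suppose $\sqrt{2} = p/q$ with $p/q$ in lowest terms. Then
\[
  p^2 = 2q^2 ,
\]
so $p^2$ is even, hence $p$ is even; write $p = 2r$. Substituting,
$4r^2 = 2q^2$, so $q^2 = 2r^2$ and $q$ is even as well, contradicting
``lowest terms.'' Hence no such $p/q$ exists. Note that $\sqrt{2}$ is
\emph{algebraic}: it solves $x^2 - 2 = 0$. A transcendental number such
as $\pi$ solves no polynomial equation with integer coefficients.
```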

The transcendental numbers intrigue me. Though only a few have been proven to be transcendental, they outnumber all other kinds of numbers. And so I recall Isaiah 45:15, "You are a God who has been hiding Himself." As metaphors, they are even more fascinating, because "transcendental" is a kind of good-angel twin of "divine". If in reading Life of Pi we believe his story throughout (at least, prior to the second half of Chapter 99), he becomes a kind of god to us. Or, at least, an avatar leading us to God. Mr. Pi Patel becomes transcendental. If, instead, we believe the second half of that chapter, seemingly fabricated on the spot to satisfy overly-rational shipping-disaster investigators, we fall from heaven to earth.

Perhaps this story cannot make absolutely everyone believe in God. But it makes clear to the reader that the choice of heaven or hell is ours to make, and we will surely attain our choice.

Friday, August 28, 2015

His cure is much worse than the disease

kw: book reviews, purported nonfiction, medicine, thumbs down

From the book's title, it was clear that it was anti-doctor. I just wasn't prepared for how much it was anti-doctor, anti-medicine, anti-just-about-everything except, of course, the products of his business. The Great American Health Hoax: The Surprising Truth About How Modern Medicine Keeps You Sick is by Raymond Francis, and as it turns out, he is the president of Beyond Health International. The book is a moderately subtle advertisement for the company's products. By the way, I compared his online catalog with other highly reputable brands. His products cost four times as much as anyone else's that I could find. Caveat emptor!

There is a proverb I heard a few times when I was young: "The devil will tell you the truth seven times to get you to believe one lie." That seems to be the strategy here. I suppose I ought to give Mr. Francis the benefit of the doubt in most cases; perhaps he really believes what he has written. But I'd like to examine three items where I find such a supposition rather incredible. All are found in the book's 11th chapter, "Death by Medicine".

First, concerning Ebola, and infectious disease in general. Ebola is a miserable illness, caused by a virus, that causes bleeding from many bodily tissues. If the body's immune response does not subdue the virus soon enough, the range of "leaky" tissues increases until the patient bleeds out from just about everywhere and dies of blood loss. With intensive interventions, developed by doctors using a combination of intuition and trial-and-error, some patients' lives can be saved. With no treatment, depending on the strain of virus, 50% to 90% of those who get it die from it.

The chapter's section on infectious disease, building on earlier attacks against antibiotics and vaccines, begins: "If antibiotics and vaccinations are inappropriate, how then do we deal with infectious disease? The answer is simple: keep your immunity strong. The 'bug' is not the problem." Throughout the book, the author has made dozens of recommendations for supporting health and a healthy immune system. Most of them are pretty standard fare, some are wackier, but all sooner or later circle back to this: we need appropriate nutritional supplementation. He scarcely mentions the products; he is more clever than that. But he so frequently writes of "pure ingredients" and suchlike that the message is clear: nobody but he has truly "pure" products. Then he has this to say of Ebola in particular on page 263:
The existing literature indicates that vitamin C, in sufficient quantities, has never failed to cure any virus infection…What appears to be happening to Ebola patients is that the body mounts an excessive immune response to the infection, producing a flood of inflammatory chemicals that massively damage every tissue in the body, causing the blood vessels to leak. These inflammatory chemicals use up vitamin C and cause the need for vitamin C to go sky high…this deficiency causes acute scurvy…
Ebola may seem like an exaggerated case of scurvy, but it is not. The "existing literature" statement is totally false. See this post in ScienceBlogs for a full debunking of the very small number of published pieces that make such claims. Most were by a doctor who claimed to cure polio with vitamin C. The post also debunks the notion that vitamin C has any measurable effect on Ebola. Every point of Mr. Francis's discussion is false. Period.

Second, let's back up to what he says about antibiotics. He calls antibiotics "one of the greatest medical blunders of history." He doesn't even bother to try to explain away the millions upon millions of lives saved and diseases cured by antibiotics. He just ignores them. He makes much of the well-known facts that antibiotics are being overused and misused, which has resulted in great problems with antibiotic resistance and with the destruction of gut bacteria. The sad history of antibiotic misuse provides plenty of fodder for the "seven truths". The string of lies that follows includes, "…there is no need for antibiotics at all", "Immune-enhancing nutrients [a list follows] will take care of most infections", and, "The right amount of vitamin C will stop almost any infection." Boy, I wish the things he claims were true! Sure would be nice!! But they are false. He closes with a recommendation for the Rife machine. Do you remember Dr. Andrew Weil? He is a great proponent of alternative treatments…but not this one. See his discussion of the machine here; it was quackery in 1932 and it is quackery today.

Here is my own take on antibiotics, because I lived through much of their history. The two main problems are resistance developed by bacteria and the collateral damage, which is the slaughter of many of the bacteria in your gut. Both are a problem mainly with oral antibiotics. Antibiotic pills were developed to make administration easier, with no need for a hypodermic injection. When I was a child, if we had an infection we got a penicillin shot. My brothers and I all hated the needle. But there was no other way to administer penicillin, although penicillin powder was pretty good for sprinkling on an open wound (but painful!). Oral antibiotics became a focus because parents didn't like screaming children or children who couldn't be dragged to the doctor's office. Sure, that's over-simplified, but it explains a lot. When an antibiotic is injected (and intravenous injection is the most effective, and most painful, way), little of the drug is excreted, so it doesn't get into the sewer and induce resistance in whatever bacteria encounter the sewage. Actually, no matter how you take an antibiotic, you can reduce resistance problems by storing and incinerating all the wastes your body produces during the time you take it, and for 3-4 days thereafter. Smelly, though. But IV antibiotics will not kill your gut flora. This isn't a total solution, but it points the way to some useful steps in the right direction.

OK, now to #3. Vaccines. The first thing I did after reading the book's introduction—which is full of red flags to any medically literate person—was check the index for "Vaccine". An extended quote from page 256 is warranted here:
Another of conventional medicine's historic fiascos [sic] is vaccination. Vaccines are ineffective and dangerous. Unlike other drugs, which undergo basic testing prior to approval and recommendation, vaccines do not have to be proven safe or effective before hitting the market. While there is no scientific evidence that immunizations prevent disease, there is plenty of evidence that they are not safe. No vaccine has ever been scientifically proven in double-blind, placebo-controlled studies to be effective; the existing evidence indicates that they are only marginally effective or not effective at all.
That paragraph contains five sentences. Each is false. In order:

  1. A fiasco? A very few vaccines were later found to be ineffective or to cause problems. Most are clearly effective, and they are the most effective public health tool we have to reduce the number of cases of the most deadly infectious diseases.
  2. Ineffective and dangerous? See prior statement. Also, let's see, how many children do you know who have had measles? Even one? I had it, as did my brothers, and every one of my schoolmates. Luckily, we were all of European extraction, and had immune systems that could deal with measles before we died an Ebola-like death! When Europeans first went to Hawaii, there soon followed a measles epidemic that killed most who caught it. The vaccine was introduced in 1963. In 1960, 380 Americans died of measles, among more than 400,000 who caught the disease. In 2007, there were only 43 cases of measles, and no deaths. Cases have been rising more recently, due to the efforts of anti-Vaxxers (including Mr. Francis, no doubt).
  3. Untested before marketing? Not true. This isn't just a lie, it is a huge lie. Maybe he's talking about some other country, in which the government doesn't mandate three-phase testing of all medications. It is dietary supplements that are untested and don't have to prove their effectiveness. Like his company's products.
  4. He repeats the "ineffective and unsafe" charges more vehemently. He is even more wrong. Some vaccines that were found over time to be unsafe to small numbers of patients have been withdrawn. But they had earlier passed safety and efficacy tests mandated by the US government. Some medicines need extra time in their tests, it is true, but predicting which ones isn't yet feasible.
  5. He pulls out all the stops with scientific language intended to muddle your mind. Here is a true statement: Every vaccine on the market has been subject to double-blind, placebo-controlled studies and has proven its effectiveness. The yearly "flu" shots have a measured effectiveness, so they can tell you that, for example, the vaccine being produced right now, for use during the fall of 2015 and winter of 2016, is about 65% effective at preventing infection by the strains of virus that are even now moving out of the tropics into North America; and for those who still get influenza, the case will nearly always be milder in a vaccinated person than in one who is not vaccinated.

I went to a private school for grades 1-3. We lived in Salt Lake City, Utah at the time. During exactly those years, Dr. Jonas Salk conducted a double-blind, placebo-controlled study of his Polio vaccines (there are 3 of them). I was one of his 50,000 experimental subjects. My name was in the newspaper in 1955 (the names filled a few pages), in the article announcing the spectacular success of the vaccine. If this vaccine were ineffective, there'd be hundreds of thousands of "iron lung" machines in use today in America.

The anti-vaccine drumbeat really rankles me. Do you really think the disease is safer than the medicine? People scream about some of the vaccines using a mercury chemical as a preservative. The amount of mercury in a vaccine shot is tiny. Do you know the earliest somewhat effective remedy for syphilis? Metallic mercury, about 2cc, injected right into your hip through a large needle (it won't go through a small one). But fewer and fewer vaccines continue to use mercury compounds; better preservatives have been developed. That's how science works. You develop something that works most of the time, but causes a few people some problems. So you keep on developing, to reduce both the numbers who don't get help, and the numbers who have problems. Neither of these numbers will ever reach zero. But it sure beats having the disease!

The diseases I had as a kid: measles, mumps, chicken pox.

The diseases my mother had as a kid: those plus scarlet fever and Rubella (it was called German Measles at the time).

Diseases I avoided because of vaccination: Whooping cough (pertussis), diphtheria, tetanus (which killed the brother of Henry David Thoreau among many others of that era). I could add Polio, but I'd actually had that in my first year, when it is rather mild; the older you are when you get polio, the more severe it is. This wasn't known in 1952 when I became a "polio lab rat". More recently, I have had the shingles vaccine, which is about 2/3 effective, I am told. My mother had shingles. I'd risk a great many side effects to avoid that! I've also had the Pneumovax injection; the pneumococcal bacteria it combats cause the pneumonia once called "the old man's friend", because immobile patients in nursing homes were its typical victims. And, since age 67, I've begun getting a yearly flu shot. It's a whole lot better than even ten years ago.

In this case I say, "Don't buy this book." I'm glad I read a library copy. Don't read it; I did already so you don't have to. The guy has products to sell that are very similar to other products but cost much, much more. He is either a cynical, manipulating charlatan, or is bamboozled himself.

Monday, August 24, 2015

Cataloging cryptosleuths

kw: book reviews, nonfiction, monsters, investigations

We love our monsters (most of us). Whatever the nearby mystery, it has a large following, and a few of the more charismatic legends, such as Nessie and UFO's, have a worldwide following. Writer Tea Krulos set out to learn about the whole range of monsters/demons/ghosts/UFO's and the people who hunt them. Doing so, he followed along with a group known as PIM (Paranormal Investigators of Milwaukee) on a number of such hunts, and as he relates in the last chapter, got a little more than he'd bargained for in one instance. He also tagged along with or interviewed members of other groups and researched the field among scientists, skeptics, and believers alike. The result is his book Monster Hunters: On the Trail with Ghost Hunters, Bigfooters, Ufologists, and Other Paranormal Investigators.

Other than "the boogieman", the only popular monsters I recall from childhood are the Loch Ness Monster ("Nessie") and the Abominable Snowman (Yeti). I find that each represents a category. I don't know if the book's span is all-inclusive, but it seems so:

  • Cryptozoology, or the study of "cryptids", elusive creatures that include
    • Bigfoot, Sasquatch, Yeti, the Skunk Ape and other man-apes.
    • Two kinds of Chupacabra (sometimes written Chupacabras, following a Spanish idiom), whose name means "goat sucker". One variety is thought to be responsible for animals found drained of blood. The other looks like an extremely mange-afflicted dog.
    • Lake monsters such as Nessie, "Champ" in Lake Champlain, and other large snake- or dinosaur-like swimming animals.
    • Werewolves, but seldom of the shape-changing kind (no Vampires are mentioned, though, and Vampire Bats are all too real, if rather small).
  • Demonology, including ghost hunting and activities of professional exorcists.
  • Chimeras, such as MothMan. Nobody seems to hunt Griffins or the Basilisk these days.
  • UFO's, and not all ufologists are convinced they are ET's.

The second point above, demonology, is a tricky one. The Roman Catholic Church employs an official Exorcist in each of the 50 U.S. states, and others stationed in provinces around the world. They take demon possession very seriously. Krulos interviewed one of them for his chapter on demon possession, and was given a reasoned account; also a scathing commentary on the flashy exorcist he describes in detail, a certain "reverend", whose activities I disdain at least as much as the Catholic exorcist does. A certain Bible verse about "making merchandise of the word of God" comes to mind.

A point to ponder: If the demons of the Bible actually exist, they are quite capable of impersonating the "ghosts" of the dear departed, which is the view the Bible takes. Their abilities and activities would explain all manner of occult manifestations, although the great majority of "spiritualist" activities are easily shown to be the work of charlatans. Some well respected theologians consider that demons are also behind the UFO activities that are not hoaxes or misunderstandings of natural phenomena.

Cryptozoology in particular dwells on the line between fact and fantasy. From time to time a new, largish species is discovered that nobody knew was there. A famous one is the Coelacanth, a rare fish that was thought to be extinct until one was caught off South Africa in 1938. I remember as a boy the reports of the first Okapi, a kind of smallish zebra/giraffe. Finding a new mammal is rare. Though new species of many kinds are found every year, they are nearly all small or even tiny, and reclusive. Pretty much everything that isn't reclusive has already been discovered and described in journals. But some cryptozoologists do try to take a very scientific approach, and may study science so as to do it better. Maybe someday a living Yeti will show up in some indisputable way.

Tea Krulos became a kind of meta-hunter, hunting the hunters and bringing us glimpses of their lives. Not many are mild-mannered sorts, though a very few seem to be. It generally takes an outsize personality to go from being "interested" in various monsters or ghosts, or even fascinated with them, to being an active hunter, in an organization like PIM or on one's own. But considering the reputed powers of their prey, it seems safest to hunt in groups.

Monday, August 17, 2015

Even more digital caution

kw: book reviews, nonfiction, sociology, computers, computer revolution

A Canadian Indian chief was talking with a visiting geologist, who was describing his work and the chances of mining in that area. At one point the chief said, "The first time white men came to Canada, they shot all the big game and hauled away the meat. The second time white men came to Canada, they trapped all the small game and hauled away the furs. The third time white men came to Canada, they cut down all the big trees and hauled them away to make lumber. The fourth time white men came to Canada, they cut down all the small trees and hauled them away to make paper. Now they are coming for the rocks!"

Observing the sweep of human history, I get a similar feeling, or perhaps it is the kind of building dread embodied in a Vaudeville routine that began, "Slowly I turned. Step by Step…"

  1. At various times mainly between 15,000 and 5,000 years ago, the Agricultural Revolution made possible a great increase in the human population and the size and density of settlements-towns-cities. In some ways life was better, but there also arose supervisors and nobles and kings, epidemics of cholera and TB, and harder and longer work for nearly all. Those few foraging cultures that remain seem to have an easier time of it, at the cost of not being able to accumulate more goods than they can carry on their person or drag with a travois (foragers don't build roads, so the wheel is no use to them).
  2. Beginning about 200 years ago in England, and spreading to about a third of humanity so far, the Industrial Revolution made possible a great increase in productivity of goods manufacture and travel and literature/literacy/education. There also arose sweatshops (still almost universal outside Europe and the USA) and all the associated ills detailed in Upton Sinclair's "muckraking" books.
  3. Though Charles Babbage and Ada Lovelace and others made a false start at developing computing machinery in the "real steam" era that is adulated in "steampunk" fiction, true general-purpose computers began a bit more than 60 years ago with electronic computation, first using vacuum tubes, then transistors, then small-scale integrated circuits, and now "chips" about the size of postage stamps that can do a few billion operations per second.
  4. The first Blackberry, the 850, appeared just 16 years ago; it was the precursor of the "smart phone", which really took off once touch screens became economical to produce. The level of computing power needed to run these devices also powers, on a larger scale, all the "big data" processes that have addicted so many of us to a pocket device that pretty much runs our lives, and informs advertisers and government staffers alike about everything happening to nearly everybody.
This environment of ubiquitous computing is what the word "digital" means in the title of Andrew V. Edwards's new book Digital is Destroying Everything: What the Tech Giants Won't Tell You about How Robots, Big Data, and Algorithms are Radically Remaking Your Future. That's not quite the longest title I've ever seen, but it is the longest this year.

Digital processes are not all that new. They date to the invention of counting numbers thousands of years ago. The world has always been divided into things you can count and stuff you can't count. You can't say "I'll have one water with lunch and two waters with dinner", except if you are talking about bottles of filtered water. But water itself is treated as a continuous substance. Apples or sheep, on the other hand, are unitary. It is quite legitimate to have one apple with lunch and two apples with dinner. Or even two "fruits" if you intend to have one apple and one pear. Though most "unitary" items are divisible, and cutting an apple in half to share with a friend is OK, half a table or half a chair or half an automobile is not so useful. So if we want to do the work involved, it is possible to count exactly how many automobiles (that run) are in the city of Houston, or how many oranges are on a particular tree in Florida. But water? or even apple juice? You can't "count" it unless you containerize it, and then you can count quarts or gallons or whatever.

That's a long way to say that the human race has actually become comfortable working with some stuff that is analog and thus can be measured but not counted, and other stuff that comes in natural packets and can thus be counted, and is digital. Though there are super-microscopes that can see atoms, we still don't worry much about how many atoms of helium are in a particular balloon, or how many molecules of water are in a particular drinking glass. On the human scale, atoms (from the Greek atomos, meaning "can't be divided") just don't matter. For many, many purposes, analog is king. Most of us only care about exactitude when getting change from the cashier or balancing our checkbook. Or counting that there are indeed 12 eggs in that carton of a dozen.

All that is changing. In the four points above, I didn't pay much attention to the radical changes of occupation that accompanied each revolution. Midway through the Industrial Revolution, autos rapidly replaced carriages, and the proverbial "buggy whip makers" nearly all went out of business. Only one in ten thousand still remains, making the whips for funky carriages used for giving rides to tourists in historic Philadelphia or Williamsburg. My wife was once a Telex operator. That has long been superseded by at least three technologies in sequence, until they all fell to e-mail and now texting. CEO's text or e-mail almost everything except contracts that need signing, and even then, the signable PDF is taking over for paper contracts.

So what does Mr. Edwards wish us to beware? I could be cute and say, "All of it!", but that would do him poor justice. Digital with a capital D provides abundant conveniences. We just have to achieve some kind of balance, because convenience is not all there is to life. Before there was Digital there were already "couch potatoes", for whom convenience really was, if not everything in their life, at least it made up as much of it as they could manage. Before there was TV, there were already over-avid spectators, going back even before Roman times, when the Emperor's formula for a contented population was "bread and circuses". In between it was "beer and football (either kind)".

The book has 17 chapters that cover everything from the music industry (rapidly dying away), screen addiction (people who text the person sitting across the table at the eatery), the job market (or lack thereof), and retail (Amazon and eBay and their Mafia-esque ways of growing into monopolies), to the tension between using the Twitterverse to overcome authoritarian rule and its use by the authorities to track us all in real time (I knew there was a reason I've eschewed getting a Twitter account!).

The saddest chapter has the title, "Obsessive Compulsive: Digital is Destroying Our Will to Create Anything Not Digital." Having spent 40 years writing software, because I was better with code than I was in the lab (I majored in Chemistry, Physics and Geology), I well know the allure of, "Computer programs can do anything!". But early on I recognized that they can't. So I've given equal effort to analog pursuits: music performance (voice and several instruments), art (mobiles, the kind with wires and hanging things), and essay writing (generally speaking, half an essay is of no more use than half a chain saw). I wonder what I did right? So many people I know have no hobbies that don't fit on a 4-inch screen.

Now, some things we're better off without. Music aficionados who have a really good ear can tell what a great improvement CD's are over vinyl records. Of course, if they really, really like the harmonic emphasis created by older equipment, they use the CD player to drive a vacuum tube amplifier. So there is still a very tiny market for manufacturers of vacuum tubes! But most of us are happier without them (though I still own a ham transmitter with driver and transmitter tubes). Digital controls in aircraft and autos have steadily reduced traffic and air fatalities. Digital libraries, though they are putting pressure on brick-and-mortar libraries (still my favorite places: FYI, I don't own an e-reader), afford instant access to an increasing fraction of all human knowledge, and are the only way to index all of that.

I suspect if advertisers could not track us via the click rate on their banner ads, there would still be a larger market for paper "newspapers" and magazines. But those markets continue to shrink. Digital is also destroying education "as we know it", but I favor that to some extent. Different people learn in different ways: I learned FORTRAN II better in two days using a "programmed instruction" book, than if I'd sat for 12 weeks in some "Comp Sci" classroom trying to learn it from an instructor (Oh, yeah, Comp Sci didn't exist in 1968; I was among those who invented it). The Khan Academy caters to people like me…usually! But for some subjects, I do better with a talented instructor. I needed a really good one to learn Differential Equations while getting an Engineering degree. Two attempts with less talented teachers led me to drop those classes, so the "third time" really was the charm. But if Digital takes over the classroom for many subjects, I, for one, will not mourn the loss. Only the most talented teachers will remain as teachers. That is a good thing. The most talented today are moving toward massive online courses, which spread their expertise to a great many more students than could be taught just a decade or two ago.

For some things, we prefer less human interaction. I'm a typical "hunter" type when buying something. Once I know what I want, it is like, "go to forest (store), find prey (the shirt I've decided to buy), kill (buy) it, and go home." My best "shopping" trips last ten minutes. The last person I want to interact with is a store clerk. Of course, that's if I really know what I want. If I don't, and online research hasn't proven helpful, I really do want a knowledgeable store clerk's help. Then I shop differently: "Go to forest, find hunting mentor to lead me to where the best game is. Then kill the prey and take it home." I'm OK if such an event takes half an hour instead of ten minutes. But if the "mentor" is a dullard with little interest in being of genuine help, he/she'd better duck! Actually, no violence, I just find the supervisor, who's more likely to be a good "mentor".

The last couple of chapters get into "What do we do about it?", and I'll avoid stealing the author's thunder, except to say, the Hippies were right, that we ought to get out and smell the flowers more often. Do you ever take a walk and turn off your phone until you return home? Try it.

Friday, August 07, 2015

Our life in bits and bytes

kw: book reviews, nonfiction, algorithms, prediction, sociology

What would life be like if the atoms that make us up were just big enough to see, if we could witness directly how they slide, merge and separate? How complex could our life be if the sum total of our lives could be described by, say, 1,000 characteristics, or perhaps 100? How about 10?

Yet how quick we are to pigeonhole people according to one or two, or at most five, distinguishing items! What do most of us know about, for example, Yo-Yo Ma? Male, Chinese, famous musician (maybe you know he is a cellist), … anything else? How about that he is French-born, a Harvard graduate, and has earned 19 Grammys? That's six items, more than most people probably know about him.

To what extent do you think you could predict his tastes and buying habits from these six items? If another person shares these six characteristics, to what extent will he also share certain tastes in clothing or food or books to read? Some people wish us to think, "to a great extent". In The Formula: How Algorithms Solve All Our Problems and Create More by Luke Dormehl, some of the people he interviewed claim to do just that. (Maybe you've made a profile on a dating site that starts matching you up when you've entered no more than four or five items. And how fully have you completed your FaceBook profile?). But some go to quite an extreme in another direction, using "big data" to pry inside our skulls.

What kind of big data? All your searches on Google, Yahoo, Alta Vista, Bing, or whatever; every click, Twitter text, FaceBook, LinkedIn, blog post, or online chat. We create tons of data about our day-to-day, even moment-by-moment activities. There was recently an item on the noon radio news about a company that aggregates such data and sells "packages" to companies, who pay $1 million to $2 million on some periodic basis for it (that's all I remember; I was listening with half an ear while folding laundry). Why is all that data so valuable? Because businesses believe they can better predict which products will sell to what kind of people if they crunch it.

A few months ago a handle on a drawer broke. Naturally, the cabinet is decades old and nothing even remotely similar in style could be found at Home Depot or a decorator's salon. So of course I looked online for something with the right spacing of mounting holes, with an appearance that would be compatible with the cabinet, in a set of four, so the handles would all match. It took a few days. I bought a set I liked, online, and installed them. For the next several months, however, ads about cabinet door handles appeared everywhere I went online: Google, FaceBook, Amazon, eBay. They all knew I'd been looking for door hardware. None of them knew I was done looking! (Google, are you listening? Do, please, close the loop and collect purchase data also.)

What is The Formula? Luke Dormehl calls it an Algorithm. What is an algorithm? To anyone but a mathematician it is a Recipe or a Procedure. I used to have a book, which I used into unusability: How to Keep Your Volkswagen Alive: A Manual of Step-by-Step Procedures for the Compleat Idiot by John Muir and Richard Sealey. With its help I kept my 1966 Bug alive into Moon Unit territory. The "procedures" were recipes, or algorithms, for things like setting valve clearances, changing a wheel bearing, or overhauling an engine. In computer science, an algorithm is the detailed set of instructions that tells a computer, very, very exactly, what you want it to do.

Here is the kicker. A traditional algorithm is carried out in a procedural manner (pay no attention to the claims of non-procedural, object-oriented computer language gurus: at the root, a computer CPU carries out a series of procedural instructions), according to a "computer code" or "program", written in one or more formal languages. Some time ago I looked at the internal release notes for the Android OS used in many cell phones. That version, at least, released in 2009, had modules written in 40 computer languages. No matter how complex the program or program system, the instructions are written by a person, or perhaps by many persons, and no matter how many, their knowledge is finite. There are also time constraints, so the final product will be biased, firstly by the limitations of the programmer(s), secondly by tactical decisions of what to leave out for the sake of time or efficiency, and thirdly by simplifications or shortcuts this or that programmer made so that some operation's code was easier to write. They may also be biased by the inner prejudices of the programmer(s).

Another kicker: a kind of start-stop-start process has been going on around Neural Networks, which try to mimic the way our brains are wired. There are two kinds, hardware and software. Hardware neural nets are difficult to construct and more difficult to change, but they have much greater speed, yielding almost immediate results. Because people who can wire up such hardware are quite rare compared to people who can write computer software, hardware nets are also rare, and nearly all the research being done with them is being done using software simulations. "Machine learning" by neural nets can be carried out by either hard- or software nets, but I'll defer remarks on one significant difference for the moment.

A neural network created for a specific task—letter recognition in handwritten text, for example—is trained by providing two kinds of inputs. One is a series of target images to "view", perhaps in the form of GIF files or, with appropriate wiring, a camera directly attached. The other is the "meaning" that each target image is to have. A training set may have five exemplars of the lower-case "a", along with five indicators meaning "that is an a", five of "b" and their indicators, and so forth. The innards of the net somehow extract and store various characteristics of the training data set. Then it is "shown" an image to identify, and it will produce some kind of output, perhaps the ASCII code for the letter.
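To make that concrete, here is a deliberately tiny sketch of such a training loop in Python: a one-layer softmax classifier taught to label noisy synthetic "letter" images. All the sizes, prototypes, and the learning rate are invented for illustration; real handwriting recognition uses far larger nets and training sets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic training set: three "letters", each a noisy copy of a
# random 25-pixel prototype (standing in for scanned handwriting).
n_classes, n_pixels, n_train = 3, 25, 150
prototypes = rng.random((n_classes, n_pixels))
labels = rng.integers(0, n_classes, n_train)        # the "meaning" inputs
images = prototypes[labels] + 0.1 * rng.standard_normal((n_train, n_pixels))

# One-layer softmax classifier trained by gradient descent.
W = np.zeros((n_pixels, n_classes))
onehot = np.eye(n_classes)[labels]
for _ in range(200):
    scores = images @ W
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)
    W -= images.T @ (probs - onehot) / n_train      # learning-rate 1.0 step

# "Show" the trained net a fresh noisy exemplar of letter 1 and read its answer.
test = prototypes[1] + 0.1 * rng.standard_normal(n_pixels)
print("identified as class:", np.argmax(test @ W))  # expect: 1
```

A real net has hidden layers between input and output, which is exactly what makes its innards so opaque, as described next.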

The inner workings of neural nets are pretty opaque, and perhaps unknowable without extremely diligent enumeration of all the things happening at every connection inside. But at the root, in a software neural network there is a traditional algorithm that describes the ways that the network connections will interact, which ones will be for taking input or making output, which ones will store things worth "remembering", and so forth. This is one reason that software nets are rather slow, even on pretty fast hardware. The simulation program cannot produce the wholly parallel processing that a hardware net uses (brains use wholly parallel processing, and are hard-put at linear processing, the opposite of computer CPU's). If the net is small, with only a few dozen or a few hundred nodes, the node-by-node computations can be accomplished rapidly, but a net that can recognize faces, for example, has to be a lot bigger than that. It will be hundreds of times slower.

Now for the other significant difference. The computer running the simulation is digital, while a hardware network is analog. The first time I used a computer, I was quite impressed to see calculations with 7-8 digits of significance, and, if I used double precision, 15 digits. That sounds very precise, and for many uses, it is. Fifteen-digit precision means one can specify the size of something about the size of a continent to the nearest nanometer, which is about the size of five or ten atoms. However, a long series of calculations will not maintain such a level of precision. For many practical uses, calculations of much lower precision are sufficient. Before computers came along, buildings and bridges were built, and journeys planned; a slide rule was accurate enough to do the calculations. My best precision using a slide rule was 3-4 digits. But "real life" systems are typically nonlinear, and the sums tend to partly cancel one another out. You might start with very accurate measurements (but it's quite unlikely they are more accurate than 4-6 digits). Run a simulation based upon those figures a few dozen steps, and somewhere along the line there might have been a calculation similar to this:

324.871 659 836 648 - 324.860 521 422 697 → 0.011 138 413 951 016 4

If you've been counting digits, you might notice that the last four digits, 0164, are superfluous...where did they come from? That is the rounding error, both that which arose from representing the two numbers above in binary format, and that from the conversion of the result back into decimal form for display. But the bigger problem is that only 11 of the remaining digits are useful. Four have been lost. Further, if you were to start with decimal numbers that can be represented exactly in binary form, such as 75/64 = 1.171 875 and 43/128 = 0.335 937 5, multiplying them results in 3,225/8,192 = 0.393 676 757 812 5, which has 13 digits of precision, whereas the original numbers had seven each. Thus it typically takes twice as many digits to represent the result of a multiplication as were needed to represent the two multiplicands.
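Both effects are easy to watch in Python; a quick sketch using the numbers from the examples above:

```python
from fractions import Fraction

# Cancellation: subtracting two nearly equal doubles discards most of
# their significant digits and leaves rounding noise at the end.
a = 324.871659836648
b = 324.860521422697
print(a - b)              # 0.0111384139510... with a few digits of noise

# Exact rational arithmetic shows the digit count growing on multiplication.
x = Fraction(75, 64)      # 1.171875   (7 significant digits)
y = Fraction(43, 128)     # 0.3359375  (7 significant digits)
print(x * y)              # 3225/8192
print(float(x * y))       # 0.3936767578125 (13 digits, exactly representable)
```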

I could go on longer, but an interested person can find ways to determine error propagation in all kinds of digital systems, many of which have long been studied already. By contrast, an analog system is not limited by rounding errors. Rather, real wires and real electronic components have thermal noise, which can trouble systems that run at temperatures we might find comfortable. Further, extracting the outputs in numerical form takes delicate equipment, and the more accurate you want those output numbers to be, the more delicate and expensive the equipment gets. However, until readout, the simulation runs with no errors due to subtraction or multiplication, other than gradual amplification of thermal noise.

Suffice it to say, both direct procedural algorithms and neural network machine-learning systems are in use everywhere, trying to predict what the public is going to do, be it buying, voting, dating, relocating, or whatever. That is the main reason for science, after all: predicting the future. Medical science in the form of a doctor (or more than one) looks at a sick person and first tries to find a diagnosis, an evaluation of what the problem is. The next step is a prognosis, a prognostication or prediction; it is the doctors' expectation of the progress of the disease or syndrome, either under one treatment or another, or under none. A chemist trying to determine how to make a new polymer will use knowledge of chemical bonding to predict what a certain mixture of certain chemicals will produce. Then the experiment is carried out either to confirm the expectation (the prediction) or, if it is not confirmed, to learn what went against expectation and why. The experiments that led to the invention of Nylon took ten years. But based upon them, many other kinds of polymers later proved easier and quicker to develop. It is even so in biological science. Insect or seashell collecting can be a fun hobby, but a scientist will visit a research museum (or several) to learn all the places a certain animal lives, and when various specimens were collected, and then determine if there is a trend such as growing or shrinking population. Is the animal going extinct? Or is it flourishing and increasing its range worldwide?

In the author's view, The Formula represents the algorithms used in the business world, broadly construed, to predict what you might like, and thus present you with advertising to trigger your desire for that thing. My experience with cabinet handles shows that they often get their timing wrong. Many cool and interesting ads showed up, but it was too late. However, that isn't the author's point. The predictive methods that choose which ads to show us, that propose prospective dating partners on eHarmony or OK Cupid, or that manage a politician's image all tend to narrow our choices. A case in point from the analog world: one of the best jobs I had before going into Engineering came about because an Employment Agent, leafing through job sheets, muttered, "You wouldn't be interested in that," but I quickly said, "Try me!"

Try making some Google searches while logged in to Google, and then (perhaps using a different browser, and if you're really into due diligence, on a different computer network such as a library), making the same searches while not logged in. The "hits" in the main column will be similar, or possibly the same. But the ads on the right are tailored to your own search history and other indicators that Google has gathered.

Is all this a bad thing? Maybe. You can game the system a little, but as time goes on, your history will more and more outweigh things you do differently today. Sure, I got a sudden influx of ads about cabinet handles after searching for same, but if I had a history as a very skilled handyman (I don't!), the exact ads I saw might have been quite different. And I might have also seen ads about certain power tools intended to make the mounting of new cabinet handles even easier.

The author has four concerns and spends a chapter on each.

  1. Are algorithms objective? They cannot be. Programmers are not objective, and machine learning is dependent on the training set, which depends on the persons who create it, and they are not objective.
  2. Can an algorithm really predict human relationships? We have proverbs that give us pause, such as, "Opposites attract", and "If you're not near the one you love, you'll love the one you're near".
  3. Can algorithms make the law more fair? I was once asked by a supervisor if I thought he was fair. I replied, "Too much concern for fairness can result in harshness. We (his 'direct reports') wish to be treated not just fairly but well. We'd like a little mercy with our justice." Mr. Dormehl cites the case of an experiment with an inflexible computer program, given the speed records from a car on a long-distance trip. It issued about 500 virtual tickets. A different program, which averaged speed over intervals just a little longer, issued one ticket. (A toy version of this contrast is sketched just after this list.)
  4. Can an algorithm create art? Since all the programs created to date operate by studying what makes existing artworks more or less popular, they can only copy the past. True creation means doing what has not been done. Picasso and others who developed Cubism did so against great opposition. Now their works sell for millions. It was art even before it was popular, but the "populace" didn't see it that way for a couple decades.
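Here is the toy version of item 3 promised above. Every number in it (trip length, speed limit, burst pattern) is invented; the point is only how much the size of the averaging window changes the verdict:

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-second speed readings for a 10-hour trip, limit 65 mph, with a
# 15-second burst of speeding every 12 minutes (all invented numbers).
speeds = 63.0 + rng.normal(0, 0.5, 36_000)
for start in range(0, 36_000, 720):
    speeds[start:start + 15] += 5.0          # brief bursts at ~68 mph

LIMIT = 65.0

def tickets(window: int) -> int:
    """One 'ticket' per non-overlapping window whose average exceeds the limit."""
    n = len(speeds) // window
    return int((speeds[:n * window].reshape(n, window).mean(axis=1) > LIMIT).sum())

print(tickets(1))     # instant readings: hundreds of violations
print(tickets(300))   # 5-minute averages: none at all
```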

The book closes with a thoughtful section titled "How to Stay Human in the World of the Formula." While he has some suggestions, I think the best way is to avoid being totally predictable. In many ways, that is hard for me, because I am a man of regular habits. I'm quite happy eating the same meat-and-cheese sandwich for lunch day after day, taking the same route to a work place (or these days, a place I volunteer), eating at a certain kind of restaurant and eschewing most "fine dining" places, wearing a certain kind of garb depending on the season, playing (on acoustic instruments, not electronic devices) certain kinds of music to the exclusion of others, and so forth. But when I make a mobile, it will be quite different from any other I have ever made: different materials, different color schemes, and different numbers of hanging objects clustered—or not—in various ways. I made one out of feathers once; not my most successful mobile. When I write a formal document or a letter for sending via snail mail, though I type it because handwriting is so slow, I usually pick a new typeface in which to print it; I have a collection of nearly 2,000 font files, carefully selected either for readability or as specialized drop caps (I love drop caps, though I am careful in their use). I haven't bothered to try alternate typefaces for this blog, because there are only 7 available anyway, and the default is as good as any.

The author proposes that we "learn more about the world of The Formula". Sure. But as long as Google's PageRank and FaceBook's EdgeRank remain black boxes, and as long as everyone out there from LinkedIn to Amazon and NetFlix keeps tweaking their own black-box "recommendation engines", it will be a kind of arms race between the cleverest consumers and the marketers. But, hasn't that always been true?

Saturday, August 01, 2015

And you thought sharks were dangerous

kw: book reviews, nonfiction, animals, animal rights, performing animals, orcas

What is black and white, smarter than a chimpanzee, but can't walk? It is the apex predator of the oceans, the orca or "killer whale". Until about 90 years ago, an orca was called a "grampus" by many whalers; that's the word used in Moby Dick. Others called them "blackfish".

I once recalculated the Encephalization Quotient of various whales and porpoises, based on subtracting out the blubber so they'd be more comparable to land mammals. Human EQ ranges from 7 to nearly 8, for fit persons (Take a bright individual with an EQ of 7.5, but a sweet tooth. If he fattens up and his weight doubles, his EQ will drop below 7). Porpoises and other "small" toothed whales have EQ's mostly above 4. But even the fittest bottlenose is 40% blubber, so divide 4 by 0.6, and you get – surprise! – something above 6.6! Raw EQ for orcas ranges from nearly 3 to nearly 4, but their percentage of blubber is a bit less, so I'd put their "real" EQ into the 4.5-6.5 range.
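A sketch of that back-of-envelope arithmetic (dividing by the lean-mass fraction is my rough heuristic from above, not a formula from the literature, and the 35% orca blubber figure is an assumption consistent with "a bit less" than 40%):

```python
# Raw EQ divided by the lean-mass fraction: the rough adjustment above.
def lean_adjusted_eq(raw_eq: float, blubber_fraction: float) -> float:
    return raw_eq / (1.0 - blubber_fraction)

print(lean_adjusted_eq(4.0, 0.40))  # bottlenose: 4 / 0.6 ≈ 6.67
print(lean_adjusted_eq(3.0, 0.35))  # orca, low end:  ≈ 4.6
print(lean_adjusted_eq(4.0, 0.35))  # orca, high end: ≈ 6.2
```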

I've been to SeaWorld to see orcas perform, including "rocket hop" stunts that throw a human trainer 30+ feet above the water. This was before 2010 when a widely-publicized killing of a trainer led to a ban on human trainers (who are also performers) entering the water with orcas throughout the US.

How is it even possible for humans to work with these creatures? We need some comparisons, and John Hargrove, an orca trainer for 14 years, provides them in his new book Beneath the Surface: Killer Whales, SeaWorld, and the Truth Beyond Blackfish, written with Howard Chua-Eoan. John was one of several orca trainers interviewed in the film Blackfish, which probed the real story of the death of Dawn Brancheau.

After a performance in February 2010, tensions were high between several orcas, for reasons not clear to the trainers. Dawn was not in the water, but on a platform next to the pool right at water level, talking to Tilikum, when he moved forward, bit her arm and pulled her into the pool. He didn't try to eat her, but rammed and bit her body repeatedly. She didn't live long, and was partly dismembered. She was the third person Tilikum had killed, but the first whose death was made known publicly, because it took place in public view.

Do orcas consider us prey? Not really. We don't resemble any of their usual prey animals. Seas around the world are inhabited by a dozen or more loosely related populations of orcas. Some prefer seals, some attack primarily baleen whales, and others eat mainly fish. Being quite a bit smarter than sharks, orcas can tell we aren't seals when they encounter swimming humans, and typically leave them alone. That is in their natural environment. A theme park is hardly a natural environment!

The salt water tanks at SeaWorld parks are big, really big, from our perspective. Roughly the size of a football field, and 40 feet deep, they hold 10-20 million gallons of water, each. But a mature male orca is 30 feet (9m) long and can weigh 10 tons, though the heaviest male in captivity weighs 6 tons. Take that 6-ton animal, compared to a 200# (90kg) man: the ratio is 60:1. The length ratio is about 5:1. A 15 million-gallon tank sounds like a lot; that is about 2 million cubic feet, but:

  • Comparison 1: 2 million cu.ft. / 60 ≈ 33,000 cu.ft. (about 950 cu.m.)
  • Comparison 2: 300 ft (football field length) / 5 = 60 ft (18m)
The average house in my neighborhood is a 3-bedroom, 1- or 2-bath bungalow, colonial style, with a full basement but no garage. Indoor volume is about 15,000 cu.ft. The two largest houses on this block are 4-bed and include an attached 2-car garage. Their volume is just under 25,000 cu.ft. That's in the range of Comparison 1. On a length basis (Comparison 2), the usual house around here is 30-35' long (10m or less), and the two large houses are 40 ft long (12m). So a captive orca in a SeaWorld park has a pretty big "house" to live in, one might think. Almost a McMansion, perhaps. But he can never go out into the yard. It is perpetual house arrest. A free orca often swims 50-100 km daily, and may dive to 1,000 ft (~300m) after prey. You, in house arrest, might take advantage of a treadmill or StairMaster. They don't make those for orcas.
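For anyone who wants to check the arithmetic (using 7.48 U.S. gallons per cubic foot):

```python
GALLONS_PER_CUFT = 7.48

tank_cuft = 15_000_000 / GALLONS_PER_CUFT    # ≈ 2.0 million cu.ft.
mass_ratio = 12_000 / 200                    # 6-ton orca vs. 200-lb man = 60
length_ratio = 30 / 6                        # 30-ft orca vs. 6-ft man = 5

print(f"tank: {tank_cuft:,.0f} cu.ft.")                       # ~2,005,000
print(f"comparison 1: {tank_cuft / mass_ratio:,.0f} cu.ft.")  # ~33,400
print(f"comparison 2: {300 / length_ratio:.0f} ft")           # 60
```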

The book has another apt analogy. For about 70 years, some people who fear "flying saucers" or ET's have expressed anxiety over "alien abductions" and "experiments", including breeding experiments, being done by aliens on human captives. Now imagine: you are suddenly subject to imprisonment in your own home, by captors the size of Guinea Pigs…except they are smarter than you are, they have weapons you can't understand, and they control your food supply. Oh, and they teach you silly "behaviors" that you must perform 4, 5, 6, even 7 times daily. They take away your children, restrain you for artificial insemination, and take away the children that result, once they are weaned. They also bring in, from time to time, other children and adults with whom you are supposed to share your living space and "all get along." Except these others don't speak your language and have cultural habits you can't fathom. We're not talking about moving a Frenchman into an American home. More like tossing Mr. Joe Sixpack in with a warrior from the New Guinea uplands, a Congolese tribeswoman who has never heard the English language, and Quechua-speaking twin toddlers from Peru.

I analyzed orca intelligence above in terms of EQ. The batch of disparate humans described in the prior paragraph would soon develop some kind of patois so they could communicate, and the young twins would probably learn all three other languages available. Orcas in "forced communities" in SeaWorld parks never seem to learn one another's languages. Those captured wild were just a year or two old, so they were never raised or socialized by parents. Those born in the parks were, if they were very lucky, somewhat "raised" by their mothers, at least for a couple of years, but were typically moved to another park by year 5, only partway through whatever socialization their mother could teach them. Captive orca groups are thus socially abnormal, extremely so. They cannot communicate vocally to defuse emotional tensions, so they are more violent among themselves than wild orcas.

It is a testament to their intelligence, curiosity, and general good nature, that captive orcas tolerate humans swimming with them at all. If you were in the house described above, and a Guinea-Pig-sized ET came within arm's reach, do you think it would live long? The incidence of "aggressive incidents" and trainer deaths caused by orcas is actually stunningly small.

John tells us there are at present 30 orcas in captivity at SeaWorld parks. Contrary to park press releases, the typical captive orca has a life span of 15-20 years, if they survive beyond a year or two, which half don't. Wild orcas live 30 (male) to 50 (female) years. But a few captive orcas are in the range of 30+ years old. Even if orca "performances" become outlawed, the captive orcas cannot be put into the ocean. They don't know how to behave around wild orcas and would be either shunned, and thus die of loneliness, or killed outright. So we have a 20-to-40-year commitment to these animals, to provide some kind of living for their lifetimes. They are roughly half as intelligent as we are, perhaps more. But that intelligence is different. We cannot ethically keep them as performers, and especially, we cannot keep breeding them. The day must come when no whales are captives in tiny tanks ("tiny" meaning smaller than the Gulf of Bothnia).

John Hargrove loves the whales. He details his life, learning what he had to learn to qualify as an apprentice trainer, then getting opportunities that rapidly moved him up the ladder until he was Senior Trainer. He spent a couple of years in France, training orcas who had not yet learned how to work with humans in "their" water…and training the French trainers. He does not shrink from telling of the occasional problems with an aggressive or angry orca. I am again impressed that these animals have levels of self-control, and an ability to cool down quickly, that beat nearly any human I've known. John tells of the injuries and traumas, physical and mental, that led him to take a medical leave and then to resign from training, as badly as that hurt him emotionally.

He is deeply conflicted, wishing he could be with the whales, yet knowing that, were SeaWorld management truly ethical, there could be no further human-orca contact beyond caretaking, and many people he loves would lose their jobs. He has become an advocate for the whales. They are the real victims here. No human has yet learned to communicate in depth with an orca or any other sea mammal, in spite of the fact that orcas not only have language among themselves, but several languages around the world. So we have little hope of learning the language of space aliens if any ever show up. Meanwhile, in this "alien encounters of the third (or even fourth!) kind" scenario, we are the aliens, and our captives are totally dependent on us. We have them in a space they cannot escape, and have changed them so they cannot be introduced to "normal" society, anywhere on this globe.

Every one of us needs to spend a few minutes daily pondering that fact.

Wednesday, July 29, 2015

Numeracy instruction

kw: book reviews, nonfiction, mathematics, mathematical thinking, instruction, learning methods

When I saw A Mind for Numbers: How to Excel at Math and Science (Even if You Flunked Algebra) by Barbara Oakley, PhD, I was intrigued. The main title was in the form of a cutesy equation, which I supposed was the editor's conceit. Having read the book, I am not so sure.

Dr. Oakley knows what she is talking about, because she began as a math-phobe but learned to love the subject. Throughout the book she includes one-page testimonials by people with similar backgrounds, who are now comfortable with mathematics at one level or another.

The book is a breezily-written compendium of learning techniques and tips gathered into 18 chapters. It is not intended to be taken in all at once. Different people learn in different ways, and in many cases a single chapter, or even one point in a chapter, can unlock the math potential for someone. But I wonder…

To be human is to be mathematically adept at some level. Very young children, asked to choose one of two piles of coins, will pick the one that is spread out rather than a neat stack of the same number of coins. They equate spatial area with quantity, and don't realize that the two piles are equivalent. But I suspect they have not yet learned to count, and it takes that further level of sophistication before they have the mental equipment to evaluate the two piles fairly.

I think that is analogous to an experience I had at about age 12. Someone had shown me a few Algebra equations. I saw something like 10x=5 and wondered, "How can that be?" I thought the "x" was supposed to represent a digit, as though the stuff on the left were a number from 100 to 109. So I thought something else had to be going on. This caused quite a delay in my getting the point when I began Algebra class later that year. But I think, a month or so into the school year, when it all began to "click", that my brain had simply grown up enough to have the right tools for doing algebra.

We all do a certain amount of calculation. Most of us can quickly evaluate the change we're given at the store (if we used cash). People who bowl soon learn to keep score without writing down their calculations. When we drive (without a GPS), and we see "Chicago, 95 miles", we check the odometer, note what it shows, and can then glance at it later and know how many miles we still have to go. Many times, we can even be told a "problem" like this:
John and Mary ride bicycles toward each other at 5 mph, from 1 mile apart. Their pet bird, which flies 20 mph, flies back and forth from one to the other until they meet. How far does the bird fly?
Most of us, by age 10 or so, can figure that John and Mary each ride half a mile, because they are going the same speed. The bird flies four times as fast as either of them, so its total flight is two miles. This kind of "figuring" is actually algebra, without the equations. In fact, it would take me longer to write down the equations for solving this using "traditional" algebra, than it did to write the two sentences above. What is funny is when someone tries to tackle the problem as a series of flights of decreasing length by the bird. The equations to get that to work are gnarly!
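
To show how little machinery the "figuring" needs, here is the arithmetic both ways, as a quick sketch in Python (everything in it comes straight from the problem as stated):

    # The easy way: the riders close the 1-mile gap at 5 + 5 = 10 mph,
    # so they meet after 1/10 hour; the bird flies 20 mph that whole time.
    bird_miles = 20 * (1 / (5 + 5))      # = 2.0 miles

    # The "gnarly" way: sum the bird's individual back-and-forth legs.
    gap = 1.0      # miles separating the riders at the start of a leg
    total = 0.0    # miles the bird has flown so far
    while gap > 1e-12:
        # The bird (20 mph) and the oncoming rider (5 mph) close the
        # gap at 25 mph, so the bird covers 20/25 of it on this leg...
        total += gap * 20 / 25
        # ...while the two riders together erase 10/25 of the gap.
        gap *= 1 - 10 / 25
    print(bird_miles, total)   # both come out to 2.0 (to rounding)

The infinite series collapses to the same two miles; the closed form just skips the summation.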

So every one of us has some amount of math built in. Standard equipment. But that "some amount" varies a great deal from person to person. Not everyone can learn algebra, no matter how it is taught nor how hard they try. But most can, and by "most" I mean "more than half but not a great deal more". Some kids who had no trouble with algebra never, ever get the point of Trigonometry. My senior year of high school, we got done with the ordinary curriculum for the year a few days early, so the teacher did an experiment. He got out a few copies of a basic text in Calculus and taught it to us. In just a couple of weeks, I learned enough Calc so that I pretty much breezed through the Calc 101 course the next year in college, which used that same textbook!

I was a working mathematician, at a certain level, for decades. But there are branches of math that have never made sense to me, and others that I can puzzle out with desperate levels of effort. I had to take Differential Equations three times to pass it. I'm still not comfortable with it, but the explanation of how to use it takes only two pages in my old CRC Handbook of Chemistry & Physics, and actually contains most of what one needs to do nearly any Diff Eq problem!

And so it goes. Each human brain has a certain mathematical limit. With luck, we might grow mathematically to our full potential, but it is time-consuming. Most of us never need all that stuff. But we also grow into certain abilities over time. Just as the brain doesn't finish emotional maturation until about age 25, it must be true that certain math circuits only get set up at certain ages. It may be that the ten years between my second and my third try at learning Diff Eq made more difference in my ability than the exposure I'd had during the first two attempts.

With all that in mind, I see little point in delving into the details of what Dr. Oakley has to say. The book is a fantastic resource. Someone who needs encouragement and help in how to learn math and science will do well to read the book quickly, then return to read over certain sections with more care: those sections that seemed to make the most sense the first time through. The first two chapters will be helpful to everyone. As for the others, while the author attempts to make them generally applicable, each will actually be best suited to people with a certain kind of mind, one way or another.

Thursday, July 23, 2015

The view from the Piedmont

kw: book reviews, essays, philosophy, semiotics

A friend gave me an old paperback copy of Travels in Hyperreality by Umberto Eco. He is one of those authors of whom I had heard a little, but really knew nothing more than the name. That made the reading a bit of an adventure.

Chapters 1, 7, and 8 are long essays in the form of extended travelogues. The other five chapters are collections of shorter essays, republished from newspaper columns, dating from the late 1960s to 1980. All were translated by William Weaver.

There was a certain sameness about the second and later chapters, once I could stand back and view them all. Though they are diverse in subject, they are all the views of a skeptical intellectual who revels in digging into less-traveled corners of this or that concept. Chapter one, from which the book gets its title, stands alone as a travelogue in two senses: physical travel experienced as cultural and conceptual travel. It soon becomes clear that "hyperreality" refers to the United States of America, and most specifically to the breadth of cultural milieus that form a different kind of national map. None of them can be found "across the pond". This is probably as true today as it was 40-50 years ago, even though Europe is getting populated with McDonalds joints, Disneylands, and so forth.

There is just something about certain places that you can say, as certain advertisements in the US have it, "Often imitated, never duplicated." Such places need but one name. In Europe, they are usually great capitals: Rome, Paris, London, Oslo, Berlin, and so forth. In the U.S., some are great cities: Chicago, Las Vegas, New York, New Orleans; but some are more regional: SoCal, The Valley, 'Bama, The Rockies, or Down East. (To be fair, Europe also has regional memes: The Loire Valley, Tuscany, the Alps…)

Eco delved into certain of these, but his interest was conceptual landscapes, which frequently transcended the geographic. Thus, he begins by exploring "Fortresses of Solitude", modeled in his telling on Superman's lair, because of the American penchant for greatly expanding the meaning of "Museum" far beyond the way the word is used elsewhere. Starting with the Lyndon B. Johnson Library with its hyper-eclectic gathering of artifacts, including a full-size, detailed replica of the Oval Office (but brighter and shinier), he passes through several similar establishments sporting full-size replicas of this or that building or collection thereof, and winds up at a replica Colonial farm, complete with livestock…or as close as the proprietors could come to a replica, what with changes in farm animals over the past 3-4 centuries. I found myself wondering what he'd have thought of the Winterthur Museum in Delaware, the 175-room mansion of Henry F. du Pont, which is primarily composed of entire salons—walls, ceilings, floors, windows, furniture, art and all—bought from other mansion-builders who'd fallen on hard times all across America.

Then he dwells upon wax museums that go beyond the "statuary" genre of Madame Tussaud's, and this country has a great many of them. Wax replicas of people are one thing. Some of the items are replicas of works of art, such as one of Michelangelo's David, but colored as the statue might once have been, and perhaps as David was in life. Is that more than a replica, or less?

In another turn, he starts with Wm. R. Hearst's "Castle" in San Simeon, remarking at length upon its confused mix of artifact and artifice, the real and the fake, the old and the new and the new-but-looks-old. He touches on other such American castles, and again, I was wondering what he'd have thought of the "summer cottages" in Newport, Rhode Island, where the Vanderbilts and others escaped the oppressive summers of their "real" homes in the Carolinas. After contrasting these various kinds of "museums" with the Forest Lawn Cemetery's exhibits, he gets to the real meat of his essay, the real homes of hyperreality, the amusement parks. Quite simply, American entrepreneurs have turned the notion of a "house of amusement" inside-out, first with Disneyland, in California, and on a scale 150 times larger in Orlando, Florida, home of Disney World (and Mr. Disney meant it), yet not only the "Disneys", but also Knott's Berry Farm, the various Universal Studios properties, and sundry other places, all fitting the moniker "theme park". Why, my former favorite Mojave Desert destination, Calico Ghost Town, is well on its way to becoming a theme park, though its current admission fee is only 1/10 what it costs to visit a Disney for a day (but then, you can experience all that Calico has to offer in a day. Disney World? Not even close).

Nothing is off limits to the American amusement machine, it seems. Eco samples the religious fare; one of the "church celebrities" he saw must have been Kathryn Kuhlman, by the description: she was a walking, talking, one-woman circus if ever there was one. And his conclusion from it all? That Americans must love fakery better than the genuine, for we certainly consume enough of it. And that's what hyperreality means, after all: not just a fake but an enhanced fake, a fake "on steroids", a fake that is a great deal more enjoyable than the original.

After all that, the rest of the book is "merely" brilliant. I gradually realized that Eco is a European intellectual with a capital I. The titles of the chapters and essays are elliptical, on purpose. If I get started commenting in detail, the foregoing will be a tenth of what comes after. I think instead I'll give it a rest, and say that Eco peers here, there and everywhere, and always has something to say that is at least interesting and thought-provoking, and is often useful.

Friday, July 17, 2015

Steampunk and Gaslight in the 21st Century

kw: book reviews, science fiction, science fantasy, mysteries, cryogenics

From a distance, one simply sees a misshapen, though symmetric, skull. Once the book is in hand, the details resolve into a pair of men in tall hats flanking a pair of trees, a row of fenced tombstones, and a small, solitary female figure in the background. This cover art, coupled with the placement of Unseemly Science by Rod Duncan in the "Sci-Fi/Fantasy" shelving, promised a thorough mix of genres, and indeed, it proved a delightful mix.

Another detail in the cover art is more subtle. Upon a lengthy look, the scene is found to be snowy, with mountain shadows behind. The author's writing is similarly subtle. It took a good while for me to realize that the ice itself was the core around which the mystery revolved. Yet if I reveal more than that, it will be an "unseemly spoiler".

The milieu is of more immediate interest. The book is set in an early 21st Century England with a distinctly 19th Century flavor. An armistice in 1819, after a civil war, split the country along a line through the Midlands that divides Leicester into North and South halves. To the south is the Kingdom, centered on London, and to the north is the Republic, centered on Carlisle. The former British Empire is now popularly called the Gas-Lit Empire. Most nations of the world have invested great power in the International Patent Office.

Unlike the familiar patent authorities of modern nations, which exist to facilitate technology, the Patent Office enforces the Great Accord, which primarily limits technology to innovations that can be shown to "protect and insure the wellbeing of the common man." One area considered practically exempt from their oversight is medical innovation, based on the risky notion that any medical advance must be beneficial. I'd guess they forgot Dr. Mengele.

The protagonist is Elizabeth Barnabus, a fugitive from the Kingdom living in the Republic. Her backstory is told in an earlier book by Duncan, The Bullet-Catcher's Daughter. Upon becoming pubescent and lovely, she'd been "acquired" by a certain nobleman as a plaything (mistress), but escaped and ran northward. She lives by her wits, a kind of female Sherlock Holmes, aided by her skills in disguise. Being tall and less shapely than one might expect, she is adept at taking on a male persona and doing business as her brother when a man's work is needed. Her "brother" has been asked to look into the apparent theft of ice, which is produced in large amounts in the Welsh mountains by poor families of ice farmers, and transported southward where it is kept frozen by large, inefficient cooling machinery.

Chapters in this book are headed by quotes from two as-yet unwritten books, The Bullet-Catcher's Handbook and From Revolution. The latter is described in a glossary as a mix of writings reaching back to the Federalist Papers. I am intrigued by the titling of carnival illusionists as bullet-catchers. Having seen on a Mythbusters episode that catching a bullet with one's teeth is quite impossible, no matter how much powder you remove from the shell, I understand that the carnival illusion is one of the most skillful.

Male writers cannot totally pull off writing in a female voice. Mr. Duncan does as well as any I've read, but the vaguely familiar sense I had while reading indicated that the character of Ms Barnabus is more male-like than a female writer would have made her. I suppose the author might protest that her frequent forays into a male world, disguised as a man, make her rather mannish in general. Perhaps. She is, nonetheless, a very engaging doubly-secret detective; doubly so in that she must do her work in secret, females being forbidden from doing business in the Republic.

Dramatic tension is amplified when the Republican government takes up a bill that would enact an extradition treaty with the Kingdom. Though the Republic has tolerated numerous fugitives from the Kingdom, "proper folk" (meaning mainly those with "jobs" few of us would call "gainful employment") look askance at such immigrants, and they intend to legislate them out of existence. Most will be forcibly returned to the Kingdom, finding themselves on a rapid course to the tight end of a noose. The author has done a delightful job of rendering the above ingredients into a gripping tale of multiple betrayals and surprising heroics.

I'm in the process of scaring up a copy of the earlier book. This one ends with sufficient closure that, while one knows the author plans another volume (or more), the book is a unit unto itself. One thing is clear. If she can, Ms Barnabus means to bring about the downfall of the Patent Office. My wager'd be on that being the subject of a successor volume.

Friday, July 10, 2015

Exploring the quantum boundary

kw: book reviews, nonfiction, quantum mechanics, quantum theory, popular treatments

When I was about ten, I was disappointed in a picture I'd taken. I had been too far from the person I was "shooting", so he looked like no more than a couple of dots. Having recently learned about enlargements, I suggested getting the middle of the picture enlarged. My father remarked that the photo shop charges a lot for enlargements. Then I suggested putting it under my microscope and taking another picture, then getting that printed—I'd already been setting up a clumsy rig with a tripod holding Dad's camera at the eyepiece and making photos of the cells in thin-sliced carrots and leaves. He said I could try, but it would be very blurry, then explained about the grain in the print and in the negative. I looked, and sure enough, even at 25X the film grain made the picture look like it was printed on sand.

The next year he and I made a small telescope (I still use it), and I learned about diffraction and the magnification limit of an optical system. I realized, even if the film and print grain were a hundred times smaller, and even if the optics of the camera were flawless, diffraction would limit how much I could enlarge the final image.

This is an illustration of the Rayleigh criterion for resolving star images in a telescope. I downloaded it from the Angular Resolution article in Wikipedia. The upper section shows that the Airy Disks of the two stars are fully separated. The Airy Disk is everything inside the first dark ring (first null). The lowest section shows serious overlap, and the middle section shows the Rayleigh criterion, at which point the first null of one Airy Disk passes through the center of the other. This is the accepted resolution limit of a telescope system, or indeed, any optical system, including the eye.

What causes this pattern? It results from the interaction of light from a distant point source (or multiple sources) passing through a circular aperture. Just by the way, if you should get the notion to make a telescope with a rectangular aperture, under high magnification you'll get a quite different diffraction pattern: a bright central spot with fringes extending along two perpendicular axes, a cross rather than a set of rings.

Such diffraction patterns, I realized one day, are a visible manifestation of quantum-mechanical effects. If you could solve the Schrödinger Wave Equation for this system, the squared magnitude of its solution would look like this image. In the SWE, the solution is a complex-valued amplitude representing probabilities, and the squared magnitude of that amplitude at any point is the intensity of, for example, a beam of light or electrons, as it is spread through space by diffraction. One characteristic of the SWE is that, while there will frequently be numerous nulls, or zeroes, in the solution, there is no greatest angle or maximum distance beyond which its solution is always zero. This is why even huge telescopes such as the 10 m diameter Keck telescopes in Hawaii still have a diffraction pattern once all other aberrations are accounted for (the atmosphere is a much bigger light scatterer "down here", though).
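
To put numbers on the Rayleigh criterion: the limiting angle is about 1.22λ/D radians, for wavelength λ and aperture D. A quick sketch for a Keck-class mirror in yellow-green light:

    import math

    # Rayleigh criterion: theta = 1.22 * wavelength / aperture, in radians.
    wavelength = 550e-9     # meters, yellow-green light
    aperture = 10.0         # meters, a Keck-class mirror

    theta = 1.22 * wavelength / aperture     # ~6.7e-8 radians
    arcsec = math.degrees(theta) * 3600      # convert radians to arcseconds
    print(arcsec)                            # ~0.014 arcseconds

That is about 14 thousandths of an arcsecond; atmospheric seeing, typically near a full arcsecond, is what really limits a ground-based telescope unless adaptive optics intervene.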

So, think of it. The yellow-green light that our eyes are most sensitive to has a wavelength of 0.55µ, or 550 nm. That's pretty small, about 1/1800 mm. And, even if we are comfortable with photons, the minimal packets of light, we think of them as having a similar "size". But diffraction patterns show us that a photon can somehow "sense" the entire aperture as it "chooses" by how much to change its direction of travel. A certain experiment that has been done with both photons and electrons proves it:

  • Set up a very, very light-tight box with a dimmable light source at one end, a sheet with a hole in it about midway, and either a sheet of film or an array of sensitive detectors (e.g. a digital camera sensor) at the opposite end.
  • Let's assume the light source is accompanied by a lens system that makes a uniform beam larger in diameter than the hole in the sheet.
  • Set the "brightness" of the light source such that there will very seldom be more than one photon inside the box at any one time. That's pretty dim!
    • A 550 nm photon has an energy of 2.254 eV.
    • A 1 mW yellow-green laser set to that wavelength (you can do that with dye lasers) emits 2.77 quadrillion photons per second.
    • Light traverses a 1-meter box in about 3 ns.
    • The 1 mW laser thus emits 8.3 million photons in those 3 ns.
    • Thus you must dim the beam by a factor of more than 8 million. That is 23 f/stops, or an ND of 6.9. Two pieces of #9 welding glass is about right. (The arithmetic is sketched just after this list.)
  • Close the box, turn on the light, and wait about 3 hours.
  • Develop or download the resulting image. It will have the same diffraction pattern as if you'd left off the filters and shot a picture in 1/1000 sec.
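
Here is that arithmetic as a runnable sketch (physical constants rounded; using the exact ~3.3 ns transit time nudges the in-flight count toward 9 million, which only strengthens the conclusion):

    import math

    h = 6.626e-34      # Planck's constant, joule-seconds
    c = 2.998e8        # speed of light, m/s
    eV = 1.602e-19     # joules per electron-volt

    E_photon = h * c / 550e-9          # energy of a 550 nm photon, joules
    print(E_photon / eV)               # ~2.25 eV

    rate = 1e-3 / E_photon             # 1 mW laser: ~2.77e15 photons/s
    in_flight = rate * (1.0 / c)       # photons inside a 1 m box at once
    print(in_flight)                   # ~9e6 (the text rounds to 8.3e6)

    print(math.log2(in_flight))        # ~23 f-stops of dimming needed
    print(math.log10(in_flight))       # ~7, the required neutral density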

The experiment has been done many times, usually using a two-slit setup. Either way, it shows that both a photon and an electron somehow "self-interfere" as they are influenced by everything along the way from emitter to "final resting place."

All the above serves to get my mind in gear to write about The Quantum Moment: How Planck, Bohr, Einstein, and Heisenberg Taught Us to Love Uncertainty by Robert P. Crease and Alfred Scharff Goldhaber. The authors, professors at Stony Brook University, aim to demonstrate that "quantum stuff" keeps things from either collapsing or flying apart. That we owe our lives to it. Dr. Goldhaber, in particular, draws upon classroom experience, for he teaches a course that uses optics to introduce quantum mechanics.

The book is filled with mini-histories and mini-biographies of the "physics greats" of a century ago who wrestled with phenomena revealing that Newtonian mechanics is not up to the task of explaining all the little stuff that underlies our everyday experience. Optical diffraction is just one such phenomenon. If there were no diffraction, you could put a really powerful eyepiece on an ordinary pair of binoculars and see to the end of the universe...if your eyes were sensitive to really, really dim light (telescopes are big mainly to collect more light; high resolution is also good, but is secondary in many cases).

Einstein imagined riding a beam of light from emitter to absorber. Nowhere have I read an explanation that, from the photon's point of view, nothing happens at all. The special theory of relativity, with length compression by Lorentz contraction, and time dilation, only applies to non-photons, and in particular, particles with mass. If you take Lorentz contraction and time dilation to their limits at v=c, the photon travels no distance at all, and does so in zero time. So there is nothing to experience! From a photon's point of view, the entire universe has zero size and time has no meaning; the big bang may as well never have happened!

What if we step back a tiny bit, and imagine the neutrinos that arrived in 1987, heralding the core collapse of an immense star in the Large Magellanic Cloud, Supernova 1987a (SN1987a)? I haven't read any analysis of their apparent velocity, but it must have been only the tiniest whisker slower than c. Neutrinos do have some mass, perhaps a few billionths of the mass of an electron, so they tend to have near-c velocities. It is likely that the "clock" of those neutrinos registered somewhere between minutes and hours, depending on their actual mass and energy, during their journey of some 168,000 light years, with the distance contracted in the same proportion. Now, that is relativistic.
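
Here is a back-of-envelope sketch. Be warned that the inputs are loudly assumed: neutrino masses are not actually known, and 0.1 eV with a 10 MeV detection energy are merely plausible placeholders:

    # Proper ("onboard") travel time for an SN1987a neutrino.
    mass_eV = 0.1              # assumed neutrino mass; unknown in reality
    energy_eV = 10e6           # ~10 MeV, typical of the detected events
    distance_ly = 168_000      # light years to the Large Magellanic Cloud

    gamma = energy_eV / mass_eV              # Lorentz factor, ~1e8
    trip_seconds = distance_ly * 3.156e7     # trip time at ~c, in seconds
    proper_seconds = trip_seconds / gamma
    print(proper_seconds / 3600)             # ~15 hours on the neutrino's clock

A lighter assumed mass or a higher energy shrinks that toward minutes, which is why the estimate above is hedged.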

What did Einstein and Planck and Heisenberg do that got everyone (among physicists) all in a dither for the first half of the Twentieth Century? First, Planck applied a minimum limit to the "packets" of energy radiating from a heated object, in order to combine two competing, and incompatible, mathematical models of "black body radiation" into a single formula. Einstein later showed a simpler derivation of that formula. But at first, physicists just thought of it all as a mathematical trick. In between, Einstein had described a good theory of the photoelectric effect, which seemed to require that light be in finite packets, that we now call photons.

Photons are usually small in terms of the energy they convey. As mentioned above, the yellow-green color seen at 550 nm wavelength is carried by photons with an energy of 2.254 eV (electron-Volts). An eV is about one six-billion-billionth of a joule (1.602×10⁻¹⁹ J), and one watt is one joule per second. But molecules are also small, and the energies that underlie their structure are similarly small. UVb radiation from the sun, at just half the wavelength, and thus twice the energy, of "550 nm yellow-green", breaks chemical bonds in your skin, causing damage that can lead to cancer. So use sunscreen! (The middle of the UVb band is close to 275 nm, with a photon energy near 4.5 eV; more than enough to knock a carbon-carbon bond for a loop.)

Book after book is filled with the stories of the founders and discoverers of quantum physics. This book puts it all into a context that the authors call the Quantum Moment. They use the word "moment" the way a historian uses "era". From 1687 until 1927, the Newtonian Moment dominated about 240 years of physics discovery. Once a critical mass of physicists had to accept that quantum phenomena were real, not just mathematical tricks, the Quantum Moment arrived. The story of the epic battle between Bohr, who formulated the Copenhagen Interpretation, and Einstein, whose work stimulated Bohr and others, but from which Einstein then recoiled, is told here with more feeling and clarity than in any other account I've read.

Scientists have an emotional bond with their science. For many of them, it is their church, which they defend as keenly as any ardent fundamentalist Christian defends his church's theology. In the Newtonian Moment, phenomena whose initial state could be perfectly described were thought to be perfectly predictable. The math might be gnarly, but it could, in principle, be done. Quantum theory, and then quantum mechanics, blow by blow cracked open this notion and showed it to be a fantasy.

This is not just the problem of imperfect knowledge, rounding errors, or the need to simplify your equations to make them solvable. Heisenberg's Uncertainty Principle is not just a description of the way a measurement apparatus "kicks" a particle when you are measuring its location or velocity. What is Uncertain is not your measurement, but the actual location and velocity of the particle itself, at least according to Bohr. One implication with more recent application is the "no-cloning" principle, which makes certain applications of quantum computing impossible. However, it also makes it very possible to create unbreakable cryptographic keys, which has the governments of the world (or their equivalents of our NSA and CIA) all aquiver.

Then there's the cat. The authors give us the luscious details of Schrödinger's Cat satire, which he proposed as a slap against the notion of an "observer". Bohr and others needed some instruction from optics: every quantum particle is sensitive to, very literally, everything in the universe. All at once, and with no apparent limitation set by c. Heck, half the time, the cat is the only observer that matters. The other half, the cat is dead, and it ceases to matter to him. But, the authors point out, the air in the box is an "observer": the exchange of oxygen, water and carbon dioxide around a breathing cat are quite different from those near a dead one. So all we can say from outside the box with the cat in it, is that we can't decide the status of the cat without looking inside. We just need to remember that the term "observer" is very squishy.

I recall reading that even a pitched baseball has a "wavelength", according to the deBroglie formula. It is really tiny; in fact, it is within a factor of ten of the Planck length of about 1.6×10⁻³⁵ m. That means the deBroglie wavelength of a jet aircraft is much, much smaller than the Planck length, which is why "real world" phenomena are easily treated as continuous for practical matters.
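
The deBroglie formula is simply λ = h/mv. A quick sketch, with the masses and speeds being my own assumed round numbers:

    # De Broglie wavelengths for everyday objects (masses/speeds assumed).
    h = 6.626e-34                     # Planck's constant, joule-seconds
    planck_length = 1.6e-35           # meters

    baseball = h / (0.145 * 40.0)     # 145 g ball at 40 m/s: ~1.1e-34 m
    jet = h / (1.0e5 * 250.0)         # 100-tonne jet at 250 m/s: ~2.7e-41 m

    print(baseball / planck_length)   # ~7 Planck lengths
    print(jet / planck_length)        # ~1.7e-6 of a Planck length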

But the Cat, and the Uncertainty limit, show that the boundary between quantum and "classical" worlds is hard to pin down. Since that is the core of the Copenhagen Interpretation, it is seen to be weak at best, and in the eyes of some physicists, simply wrong. But there is no well-attested competing theory.

We must remember that the theories and mathematics of quantum "stuff" describe lots of "what" and a little bit of "how". They tell us nothing about "why". We don't know why there is a Pauli Exclusion Principle, that two electrons, and two only, can coexist in an atomic "s" shell, but only if they have opposite spins (and that "spin" is oddly different from the way a top spins). But we do know, that if it were not so, atoms would collapse in a blast of brightness, almost immediately, and the universe would collapse back into a reverse of the big bang, all at once and everywhere.

One scientist's work is not mentioned in this book, probably because he wasn't directly involved in the quantum revolution. But his work is pertinent in another way. Kurt Gödel formulated his Incompleteness Theorems in 1931, early in the Quantum Moment. Together, they show that no mathematical system can "solve" every problem that can be stated using its postulates, and that no mathematical system can be used to describe its own limitations. For example, there are rather simple polynomials that can be formulated using Algebra but can only be fully solved using Complex Analysis. Even weirder, if you know only Algebra: the simple equation X²=1 has two answers (1 and -1), and we tend to think that Xⁿ=-1 has only the answer -1 when n is odd, and is "imaginary" when n is even. But in Complex Analysis, when n=3, for example, there are three answers, two of them involving an "imaginary" part.
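
Those three cube roots are easy to check numerically; a minimal sketch:

    import cmath

    # The three complex solutions of x**3 = -1, spaced 120 degrees apart.
    roots = [cmath.exp(1j * cmath.pi * (2 * k + 1) / 3) for k in range(3)]
    for r in roots:
        print(f"{r:.3f} cubed -> {r**3:.3f}")
    # (0.500+0.866j), (-1.000+0.000j), (0.500-0.866j): each cubes to -1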

At present, then, science has three boundaries to infinite exploration:

  • Heisenberg Uncertainty: You can't know everything to infinite precision.
  • Schrödinger Undecidability: You can't predict quantum phenomena on a particle-by-particle basis. Even if you could escape the Uncertainty Principle, you couldn't do anything of great use with the results (which would fill all the computers in the known universe, just describing a helium atom to sufficient precision).
  • Gödel Incompleteness: You can't solve most of the questions being asked in the framework of quantum mechanics, not now, not ever, using the methods of quantum mechanics. QM appears to be the most Gödelian of mathematical systems, in that it asks so few questions that can be answered!

For scientists who grew up in the Newtonian Moment, it is like finding out that your church has no roof, and the rain and raccoons are getting in and taking over the place. No wonder Einstein was upset! We are in the Quantum Moment, nearly 90 years into it, and it may be another century or two before a new Moment supersedes it. Get used to it.

Tuesday, June 30, 2015

Stuff our brain makes up

kw: book reviews, nonfiction, psychology, neuroscience, hallucination

To hallucinate is to be human…and, perhaps, to be any creature with a mind. As we read in Hallucinations by Oliver Sacks, a great many stresses and neurological disorders can lead to sensing (any of the "5 senses" may be involved) things that aren't there, but for many of us, so can a great many rather prosaic matters. For example, many people are like me: at almost any time, I can close my eyes and will see things—including persons—or hear voices that aren't there. Particularly when I am sleepy, these phantasms can be quite detailed: I'll either see entire scenes being enacted or hear entire conversations (though I can seldom understand the words), or music, and sometimes sight and sound go together. Also when I am sleepy or tired, I don't necessarily have to close my eyes to hallucinate. It is likely that these kinds of things happen at times for most of us. (I was once asked why I rarely listen to music. I replied that I have a sound track running almost all the time.)

Hallucinations could be considered both a travelogue and a catalog of hallucinatory perceptions. Dr. Sacks has migraine auras; he has experimented with sundry drugs; he has suffered griefs and stresses that led to several hallucinatory episodes. While many disease syndromes, from high fevers to Parkinsonism, lead to hallucinations, I was particularly interested in the more "normal" cases. It seems that the brain's pattern matching and recognition systems easily go into overdrive, as many of us experience when we look at clouds and see all kinds of fantasies. Static images get "over-recognized" rather easily. I have a painting of a seascape, with waves and rocks; one of the rocks one day looked just like a jaguar's head to me, and I can't see it any other way now. But we also experience things for which there is no apparent external trigger. Perhaps it is the lack of a trigger that triggers them, such as closing one's eyes.

By the way, the author mentions tinnitus, or "ringing in the ears", as a kind of hallucination caused by damage to the inner ear: the brain hallucinates the sounds it is not receiving from the organ. This may be so in some cases, but certainly not all. I have low-level tinnitus, which gets louder if I pull my head back a certain distance. An audiologist used a tiny microphone in my ear to listen in, and said that pulling my head back changed the shape of the middle ear, which amplified the sound. The cause is the damaged hair cells vibrating in response to random noise (Brownian motion), not being damped as is normally the case. The inner ear may be a super-regenerative amplifier, which I'll discuss in a moment.

It may be that the only time most of us are free of hallucinations is when we are in a most ordinary state, not bored, not over-engaged, just "doing something" that fits well within our comfort zone, mentally and emotionally. I like the concept of the comfort zone, particularly in this context. Its boundary may be quite firm for some of us, and rather more nuanced for others. In my case, I think of the boundary as a wide zone of gradually increasing stress, and throughout most of this range any shift can release a mild hallucination of some sort. Thus the tendency to hallucinate in this "normal" way follows a sort of spectrum.

I think of a mechanical/electronic example. A kind of radio receiver, used in older CB radios, is "super-regenerative". It has three circuits in its detector portion. One is an extra-sensitive amplifier that will oscillate and almost blow itself out when any signal of the right frequency appears, including noise. It has extremely high positive feedback, but the key is that it "pops" faster the stronger the input signal. The second is a squelch circuit that allows the amplifier to "go crazy" for about 1/20,000th of a second, then very briefly cuts its power. The amount of squelch can be set by the operator. The third measures the maximum level achieved during each tiny time slice, and turns that series of measurements into an audio signal. So you can think of a hallucinating brain as a super-regenerative receiver with the squelch set too low.
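
For the curious, here is a toy numerical sketch of that idea. Every value in it is invented for illustration; a real detector is an analog circuit, not a loop over arrays:

    import numpy as np

    quench_rate = 20_000              # quench cycles per second
    n = 4000                          # number of quench slices to model
    t = np.arange(n) / quench_rate    # slice start times, in seconds

    # Input envelope: a hypothetical 440 Hz tone riding on the carrier,
    # plus the random noise that seeds the oscillator between signals.
    seed = 1e-6 * (1.5 + np.sin(2 * np.pi * 440.0 * t))
    seed = seed + 1e-8 * np.abs(np.random.randn(n))

    # In each slice the oscillation grows exponentially from its seed
    # until the quench cuts power; a stronger input gives a higher peak
    # (clipped at the oscillator's saturation level of 1.0).
    peak = np.minimum(seed * np.exp(10.0), 1.0)

    # The recovered audio is just the series of per-slice peak levels.
    audio = peak - peak.mean()

Set the gain high enough, or the quench too slow, and the peaks saturate on noise alone: the electronic analog of a brain with "the squelch set too low".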

A characteristic of most hallucinations is that you know them for what they are. A hallucination taken as real is a delusion. One question raised a few times in the book is whether the human tendency to religious faith is based entirely on hallucinations. Of course, to a total rationalist, all religion is delusional. But total rationalists are quite rare. According to Julian Jaynes (see The Origin of Consciousness in the Breakdown of the Bicameral Mind), half our brain informed the other half of its sensings and learnings via hallucinations that were thought to embody the voices or appearances of deities. Further evolution caused these two functions to become better integrated. Some say the tendency to generate divine apparitions and voices is a remnant of the bicameral mind, leading to every form of religious experience. I personally think that is an over-interpretation, and that there really is a God, but I'll forego theology in this review.

Hallucinations of all kinds are a class of experience that stands alongside dreams and imagination. They resemble dreams but can be much more detailed. Some dreams can be directed; this is called lucid dreaming. Hallucinations can't be directed, and usually play out as though the hallucinator is a spectator in someone else's theater. Imagination is nearly always directed but typically lacks the apparent veracity of a hallucination. We imagine something and may even speak of "seeing it in the mind's eye", but it doesn't appear to project into the world outside the way a hallucination does. Hallucination is also related to synesthesia, and perhaps this is its closest cousin. A synesthete might see colors attached to musical notes or printed numbers or letters, or might taste the sound of certain words or songs.

But hallucination is more than mixed perception. It is perception without a perceived object, a result that is quite different from the stimulus that might produce it. For example, in a healthy person, grief can trigger the sight and/or sound of the lost loved one. This kind of hallucination is most directly related to a perceived object, or the memory of one. But the "sleepy-time" hallucinations I have aren't based on any proximal object, nor memory, except, I suppose, my general fund of memories about prior events. Thus, they might be waking dreams, though they differ from dreams during sleep, which are usually accompanied by a feeling of purpose. Hallucinations are typically purposeless.

I had a great time reading an earlier book by Oliver Sacks (reviewed in May). Hallucinations was a bit harder to read through. The writing is often more analytical, written at a higher level, and perhaps a bit more detailed at times than I had tolerance for. However, I don't want to commit the error of the emperor who told Mozart, "There are too many notes." Mozart rightly replied (so it is reported), "Majesty, which notes should have been left out?" This book can be read with profit by anyone, and will provide particular comfort to those who may be seeing or hearing "things", and fear they are crazy. No, you aren't crazy if you know your hallucination from what is really "out there". Or, if you are crazy, then so are we all.

Sunday, June 21, 2015

Walk on the wild side - on Main Street

kw: book reviews, nonfiction, wildlife, cities

Do children still sing "Skip to my Lou"? One verse repeats, "Pigs in the parlor/What'll I do?". Other verses mention flies in the buttermilk, a cat in the cream jar, and a couple of birds. If you want to get creative, verses could be added about coyotes or deer in the back yard, cottontails in the corncrib, and if you were in Cape Town, baboons in the kitchen.

Most people in the cities tend to think of the city as a pretty sterile place, inhabited only by humans and their pets, maybe with pigeons and sparrows around, and a few pests such as flies thrown in. Tristan Donovan is here to tell us there is more to cities than we might imagine, in Feral Cities: Adventures with Animals in the Urban Jungle.

Much of the book contains stories about animals, not just in suburban areas and city fringes, but right in the middle of our cities around the world: Boars in Berlin, Coyotes in Chicago, the resident Cougar in Griffith Park in Los Angeles, a flock of Parrots in Brooklyn, Baboons breaking into homes in Cape Town, and the finding by researchers in Raleigh that every home is host to at least 100 species of insects and spiders.

Why should there be animals in our cities? By making cities comfortable for humans, we have made them comfortable for a multitude of opportunistic animals. In the U.S., northern cities are warmer than their surroundings, sometimes by 10°F or even more. Further south, many spaces are air conditioned, so an overheated jaybird in Tucson might make its way into the local WalMart to cool off. Cities in dry places are wetter, and homes in wet places are drier, than their surroundings. Snakes have been found with half their bodies hanging into a hot tub on a cool night, occasionally diverting a human romantic encounter from its intended course. And there is food everywhere, everywhere! Raccoons raiding garbage cans. Crows and gulls picking at road kill. Rats in the storm sewers, eating our refuse and being hunted by snakes and coyotes and wildcats. To a bobcat a rat's intended purpose is turning our crap into his lunch.

Biologists have compared animals in cities with their rural counterparts, and have found that many species are more abundant, better fed and live longer in a city than in the countryside. Why wouldn't there be animals in our cities?

I really like the turn taken in the last couple of chapters. We ought to be making our cities more friendly to species we like. For most people, a little time spent watching rabbits or otters is calming. My wife was quite delighted one day to report seeing a deer "pronking" down our street outside our hedge. A few endangered species are actually doing better in cities than in their "native" habitat. The Peregrine Falcons nesting on window ledges in skyscrapers come to mind (Bookmark the DuPont FalconCam and take a look beginning next March; just now I see only feathers in the nest).

Many doctrinaire environmentalists might shudder at the thought of making our cities into better habitat for beneficial or endangered animals. To them, cities are Evil and part of the problem; there's no way they can be part of the solution. But face it, cities are here to stay. They presently encumber only 2% of the land area, but that is growing, and their impact is greater than you might think. A certain parrot species is found in greater numbers in certain southwestern U.S. cities than in its entire home range in Mexico. They go where the living is better!

And suppose we were to succeed in creating cities in which nothing could live except humans and a short list of "approved" human pets. Then what? Should inner city kids—and their parents—be deprived of the sight of a blue jay, cardinal or indigo bunting? Should they be doomed never to see a living rabbit or raccoon? Should the endangered parrots of the U.S. southwest be "repatriated" to a "native" habitat that is getting too degraded to support them?

I don't like flies in my home, so I welcome the spiders that live here. There are at least 10 species that I've found. Only when a spider gets too big and is found crawling on the bed do I evict her. Our yard hosts rabbits and squirrels, so I do have to put small-mesh fencing around the garden, and we hope for the occasional visit by a fox to keep their numbers in check (she comes through every couple of years). We see deer droppings under the apple tree in the fall. As long as I don't corner a deer and get clipped by those front hooves, I'm happy to have one bed down there occasionally. We're planting a greater variety of flowers to draw butterflies, but avoiding the "butterfly bush" which is too concentrated and becomes a praying mantis colony beneath which one finds piles of butterfly wings! When I find a robin nest in the hedge, that section goes an extra month without being clipped until the chicks fledge. We let wasps nest in the louvers of the attic vents, but not in areas where children might play. Wasps are great predators of the insects I don't want to encounter. We encourage dragonflies, which keep the mosquito population down. A local hawk "tends to" the various little mammals such as mice and voles.

I appreciate the biologists who agree with Mr. Donovan, and are working to make our cities better for human-animal coexistence. Of course we don't want rats everywhere, but the best exterminators are Maine Coon cats, not poison baits that kill so many other animals as a byproduct, and make rat bodies poisonous to house cats and wild cats. With proper education we can even learn to live with coyotes in our midst...and we aren't going to see those exterminated anytime soon, anyway! Residents of nearby New Jersey need to learn to think like bears, so as to avoid attracting them to the wrong places while drawing them to the right ones. We need to face it: humans are part of nature. Let's open up to seeing "who else" shares our cities.

A very educational and refreshing book.