One of the most depressing pieces of punditry I have read recently was David Brooks’ May 27 piece entitled “Drilling for Certainty.” In it, Brooks laments the “bloody crossroads where complex technical systems meet human psychology.”
Without ever quite saying so, Brooks implies that human fallibility renders the complex technological systems on which our lives now depend inherently unreliable. After reading his piece, you want to: (a) crawl under your bed and hide or (b) jettison modern technology entirely and return to a safe, if drafty, cave.
Brooks’ piece is troubling in two respects. First, it reflects a form of fatalism that is conservatives’ (and scoundrels’) last refuge when things go terribly wrong. As you can verify virtually any day by reading editorials and comments in the Wall Street Journal, conservatives’ response to the financial meltdown and the Great BP Oil Spill is: “We couldn’t have done any better. Markets require discipline. Bad things happen. That’s life!” In contrast, progressives say―as the President did recently about the Oil Spill―“We can do better, and we will, starting now. Here’s how.”
The second reason Brooks’ piece is troubling is that it is emblematic of a gaping hole in our ruling class. Nearly all of our policy makers in both government and business were trained as lawyers or business people. Like Brooks, the vast majority of them were liberal-arts majors. By aptitude and training they have barely the faintest grasp of higher mathematics, science, or engineering. These fields control our massive technological infrastructure. Yet our leaders see them through a glass darkly, with the aid of fuzzy and often misleading popularizers.
You can count on the fingers of one hand the number of people in Congress with significant (let alone current) training in science or engineering. A society built on science and engineering, yet with a dearth of scientists and engineers in its ruling class, is in trouble.
The “bloody crossroads” of human psychology and technology is itself an object of scientific study, with great promise. About a decade ago, Korean Air Lines had one of the worst safety records of any developed-country airline, with an unusual series of fatal accidents. Social and industrial psychologists began to study why. They found that Korean culture is extraordinarily paternalistic and authoritarian, especially within its military. (There is a reason why Kim Jong Il rules North Korea.)
Because of this culture, co-pilots and engineers refused to speak up in the cockpit when observing errors or sources of danger. They were afraid to question or contradict the pilot, who was often a product of the South Korean military’s dictatorial culture.
The discovery of this cause of trouble fostered a program of training in teamwork, which brought Korean Air Lines’ accident rate down to the international norm in just a few years. The answer was science and social engineering, not fatalism.
But this essay is not about the enormous progress that science and engineering have made in examining us fallible human beings as part of their subject matter. That topic would require a whole book. This essay focuses on another tool of good science and engineering: redundancy.
When undressing one night, take a close look at your body. You have two eyes, two ears, two arms, two legs, and two nostrils. Men have two testicles. Although you can’t see them from the outside, women have two ovaries, and all of us have two lungs and two kidneys. Part of this redundancy reflects our bilateral symmetry. But all of it is the product of evolution’s engineering over hundreds of millions of years. Making critical systems redundant enhances survival and the chances for successful reproduction.
If Nature’s own engineering recognizes the value of redundancy in complex, critical systems, why don’t we?
Of course we do, but we forget from time to time. The Apollo Program got men to the Moon with doubly and sometimes triply redundant critical systems like flight computers and life support. Power plants (especially nuclear ones!) and financial institutions have redundant computers and backup systems. Virtually every serious computer center has redundant backup storage, usually in separate locations, in case of fire, flood, other natural disaster, sabotage, or terrorist attack.
Lack of redundancy was a cause of the Challenger shuttle disaster. The critical O-ring seals in the solid rocket boosters were redundant after a fashion: each joint had a primary and a secondary ring. But they all sat in much the same place and were therefore subject to the same brittleness and fracturing when the weather turned cold. Redundancy is not always a simple thing: it’s a matter of context that requires careful thought and planning.
I consciously practice redundancy in my own life. Since buying my first personal computer in 1985, I have rarely had fewer than two. At first the two were my own and an office computer. Lately, as operating systems and software got more complex and less reliable, my family has never had fewer than three at home. Two years ago I had to switch to my wife’s computer for a time (while she was out of the country) after the hard disk on my primary computer crashed. More recently, I had to switch to my Linux netbook for certain on-line brokerage transactions when my desktop computer proved incapable of providing reasonable response time.
The latter problem is one of those frustrating and intractable interactions between my computer’s software, my unusual ISP, and a particular on-line brokerage. Months of effort and inquiry have produced lots of finger pointing but no tangible results. Yet my Linux netbook chugs along, keeping me connected.
Redundancy works just as well as a defense against social complexity. I have written several essays (for example, 1, 2 and 3) about how complex and unreliable our mostly private health-insurance system is. So I practice what I preach: redundancy.
For my last six years before Medicare, I had no fewer than three different health-insurance policies: a primary one through my employer, a secondary one through my wife, and a third (“excess major medical”) policy, with high limits and a high deductible, through a professional organization. I felt all three were necessary because of the complexity and well-demonstrated unreliability of private health insurance. Good health care is a “critical system,” and you don’t get it without good insurance. So when private health-insurance policies don’t work well, you need three of them.
Does redundancy cost money? Of course it does. But our own evolution decided it was worth the expense. It spent a lot of energy (the biological equivalent of money) developing and feeding all those redundant or partially redundant arms, legs, eyes, ears, nostrils, lungs, kidneys and reproductive organs. And evolution is never wrong: it always trends toward greater survivability, which is why we are here. Who are we to argue with our own origins?
Exploiting redundancy properly requires a balance of humility, wisdom and confidence. We need humility to know that our best systems do fail, wisdom to see how and to back up what is critical, and confidence to proceed with redundancy, knowing that nothing we make is perfect but that we can continue to strive for perfection.
Fatalism is never a proper response to disaster. At least it hasn’t been among us Americans until recently. When you consider the complexity of air travel and the number of fallible human beings involved in it at every stage, including the design and construction of aircraft, you have to marvel that its safety record is as good as it is.
Our government established and largely controls the vast, global social engineering project that achieved and maintains that safety record, with its FAA, air traffic controllers, strict maintenance schedules, mandatory modifications of aircraft and engines, NTSB, mandatory incident reporting, thorough investigations of accidents, and black boxes to reconstruct them. That’s why English is the universal language of aviation. And we all have to keep pushing to make the system better, especially when the profit motive encourages cutting corners.
Air travel is an inherently dangerous endeavor. If we can make it so safe that millions of people who know nothing about how it works use it with justified confidence every day, we can do the same with nuclear power and even offshore drilling.
Redundancy in critical systems is one important way of achieving that result. The trick is not to nickel-and-dime good engineering, whether physical or social, but to willingly pay the price for safe and reliable systems.
Now we know that blow-out preventers are not as failsafe as we thought they were. And the Great BP Oil Spill teaches that they are indeed critical systems. So maybe henceforth all deep-sea wells will have two of them, perhaps of different designs, and government regulation will so require. That would be useful social evolution.
I don’t own a cell phone. My wife does, and I use it only rarely. One reason is that I don’t like an “on-call” 24/7/365 lifestyle. The other is my concern over possible biological effects.
Maureen Dowd brought the subject into focus yesterday in a column questioning whether cell phones will be the next cigarettes: everyday items fraught with serious medical risk. Before reading her column and comments to it, I was just concerned. Now I’m worried. Here’s why.
If you don’t mind a little high-school algebra, consider the following equation:

p / r² = P / R²

This equation expresses the law of physics by which the power of radio waves decreases with the square of distance from their point source. The lower-case letters represent the power (p) of your cell phone and its distance (r) from your brain; the upper-case letters represent the same quantities (P and R) for another source of radio energy.
Now plug in p = 2.5 watts (the power of a typical cell phone), r = 0.5 inch (the distance from its antenna to your brain), and P = 100,000 watts, for a typical high-power radio station. Then solve for R: R = r√(P/p) = 0.5 × √40,000 = 100 inches, or about 8.3 feet.
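For readers who prefer code to algebra, here is a small Python sketch of the same inverse-square calculation. The function name is mine, and the figures are the essay’s illustrative numbers, not measured values.

```python
import math

def equal_power_distance(p_watts, r_inches, P_watts):
    """Distance R at which a source of power P delivers the same
    power density as a source of power p does at distance r,
    per the inverse-square law: p / r**2 = P / R**2."""
    return r_inches * math.sqrt(P_watts / p_watts)

# A 2.5 W phone half an inch from the head vs. a 100,000 W station:
R = equal_power_distance(p_watts=2.5, r_inches=0.5, P_watts=100_000)
print(f"{R:.0f} inches, or about {R / 12:.1f} feet")  # 100 inches, about 8.3 feet
```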
So the next time you walk or drive down the road with a cell phone clamped to your head, think of a 100,000-watt radio station with its huge antenna less than nine feet from your skull. Your brain receives the same radio power in both cases. Kinda puts things in perspective, doesn’t it?
This calculation and comparison should make one thing obvious. Since Marconi invented radio, we humans have never subjected our brains to such radio power, even momentarily. For reasons of effective long-distance propagation, radio towers are actually much taller than nine feet. So you’d have to climb one and stay there to get the same radio energy, even if you lived directly under it. Few people do that, but millions walk around with cell phones clamped to their heads.
Two common misconceptions about radio energy confuse both the cell-phone-risk issue and many of the commenters on Dowd’s piece. The first is the notion that radio waves are ubiquitous (true), so that cell-phone emissions must be harmless (not necessarily true). The fallacy is failing to recognize that putting a cell phone with a couple of watts of power right next to your head is a brand-new phenomenon in human history.
You can see why by using the same formula to estimate power ratios, even if you don’t know your equipment’s specifications. Many homes have a wireless network, which of course uses radio waves. You can estimate its power, relative to a cell phone’s, by looking at the distances each is supposed to serve. Home wireless networks typically have a 500-foot limit, while cell phones have to reach towers at least 2 miles away, or 10,560 feet. That’s a distance ratio of about 21. Square it (power goes as the square of distance) to get the power ratio, which is roughly 440.
So your WiFi network has at least 440 times less power than your cell phone. On top of that, you don’t hold your laptop or wireless router right next to your head. It’s at worst at arm’s length, typically about 18 inches from your head, versus half an inch for a phone at your ear. So the distance ratio is about 36, which when squared is about 1300. Multiply that by the power ratio, and you get well over 500,000. So your home WiFi network delivers hundreds of thousands of times less radio power to your brain. Not in the same league with cell phones!
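The same back-of-the-envelope comparison can be scripted. This is a rough sketch under the free-space assumption that required transmit power scales with the square of intended range; the ranges and head distances are illustrative figures, not specifications.

```python
# Back-of-the-envelope comparison of WiFi vs. cell-phone exposure.
# Illustrative figures: WiFi serves ~500 ft; a phone must reach a
# tower ~2 miles (10,560 ft) away; a phone antenna sits ~0.5 in from
# the head, while WiFi gear is ~18 in away at arm's length.
# Assumes required transmit power scales with the square of range
# (free-space inverse-square law) -- a sketch, not a real link budget.

range_ratio = 10_560 / 500                  # ~21x longer reach for the phone
transmit_power_ratio = range_ratio ** 2     # ~440x more transmit power

head_distance_ratio = 18 / 0.5              # ~36x farther from the head
exposure_factor = head_distance_ratio ** 2  # ~1300x less power density

total = transmit_power_ratio * exposure_factor
print(f"WiFi delivers roughly {total:,.0f} times less power to the brain")
```

Either ratio alone already separates the two technologies; multiplied together, they put WiFi several orders of magnitude below a phone held to the ear.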
Radio and TV stations deliver even less power because they are so far away. That’s why they require such sensitive and finicky electronics to pick up their signals.
Another common misconception is confusing radio waves, or long-wave electromagnetic radiation, with nuclear or “atomic” radiation. The two are similar in some abstruse physical respects, but they have very different energy regimes and physical effects. Cell-phone “radiation” is worlds away from the radiation that comes from atomic explosions, radioactive materials, X-rays or cosmic rays. Therefore physicists rightly point out that cell-phone emissions cannot cause the same kind of chemical ionization that “atomic” radiation does, and which is a known cancer risk.
But that doesn’t mean that radio waves from cell phones can’t cause cellular damage and cancer by other biological mechanisms not yet known. Strong radio waves can cause microscopic Ohmic heating or minute electrical discharges inside our brains. And we know that mutations at the cellular level increase with temperature.
Biology is more complex than basic physics. Rather than dismiss the risk because one particular mechanism of cellular damage (the ionization caused by “atomic” radiation) cannot apply, we should pay attention to the increasing number of credible studies that suggest a link between cell-phone use and brain tumors. We should also pay attention to the anecdotal stories of people like Ted Kennedy, who had no cancer in his family and yet died of a brain tumor after years of cell-phone use. Genetics is a strong predictor of cancer susceptibility, and the studies done so far suggest that something else is at work.
So where does that leave me? I used to be agnostic, but now I’m a skeptic. I do plan to get a cell phone soon. But I’ll keep my land lines and use my cell phone as infrequently as possible. And when I use it, I’ll keep the antenna as far away from my head and vital organs as possible. I’ll wear the phone on my belt while walking, put it on the seat or dashboard while in a car (not driving!), and use a wired headset. And I'll do so until a series of credible scientific studies, not funded by the tech industry, convinces me that cell phones are safe.
Distance really matters because it’s squared. If cell phones are dangerous, it’s not because they are so powerful, but because we use them so close to our most vital organ, our brain. And some of us keep them there for hours a day.
Put the antenna at arm’s length, and the radio power reaching your brain decreases by two or three orders of magnitude, i.e., by hundreds or thousands of times. Texting at arm’s length is probably OK, as long as you don’t hold the phone up to your eyeballs while sending.
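The arm’s-length claim is easy to check with the same inverse-square law. The distances below are illustrative guesses (half an inch at the ear; various arm lengths), not measurements.

```python
# Inverse-square attenuation from moving a phone's antenna from
# ~0.5 inch (clamped to the ear) out to roughly arm's length.
for arm_inches in (12, 18, 24):
    factor = (arm_inches / 0.5) ** 2
    print(f"at {arm_inches} in: power down by a factor of {factor:,.0f}")
```

The factors run from the hundreds to the thousands, i.e., two to three orders of magnitude, which is the basis for the claim above.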
It’s a sad fact in our society that every issue―even medicine and product safety―becomes politicized. So we have exonerating studies funded by the tech industry, which no one trusts. And we have vigorous arguments based on theory and so-called “common sense” (radio is everywhere), which make no sense when analyzed rigorously, as above.
Now that Maureen Dowd has given the problem some visibility, industry may help solve it by engineering instead of PR. Some enterprising cell-phone maker may offer phones with removable clip-on antennas, connected by retractable shielded wires (much like those used for vacuum-cleaner power cords, but tinier). Then you could put the phone to your ear but clip the antenna on your belt, on your umbrella (in the rain), or (in your car) on the rear-view mirror post or eyeshades. Doing so would improve reception in many cases, and the increased distance from your head would vastly decrease the radio power reaching your brain.
In the meantime, what we need are serious scientific studies, funded and run by impartial, neutral institutions, like the suggestive Swedish studies that Dowd cites in her piece. While waiting for Godot, it seems prudent to treat chronic cell-phone use (with phone clamped to head) like the hazardous activity it may turn out to be, and not just while driving.