Diatribes of Jay

This blog has essays on public policy. It shuns ideology and applies facts, logic and math to social problems. It has a subject-matter index, a list of recent posts, and permalinks at the ends of posts. Comments are moderated and may take time to appear.

16 March 2019

Software is Nonlinear: An Elegy for 346 Air Victims


[Note to readers: the two lists of links following this post inadvertently omitted several recent posts. The omission has been fixed. For brief descriptions of and links to recent posts, click here. For a reverse-chronological list with links to all posts after January 23, 2017, click here. For a subject-matter index to posts before that date, click here.]

In the recent no-survivors crash of Ethiopian Airlines Flight 302, 157 people were killed. In the similar crash of Lion Air Flight 610 last October 29, 189 people died. That’s a total of 346 people killed on 737 Max 8 aircraft within a span of five months.

To put this carnage in perspective, think of terrorism. Apart from 9/11, scarcely any single terrorist attack in history has killed this many people. Yet terrorism as a modern phenomenon has changed our routines of air travel and our attitudes toward international migration almost beyond recognition. Shouldn’t a death toll this large also change our attitude toward the software that may have caused or contributed to it?

After virtually every other aviation authority in the world had done so, our own, dear FAA finally grounded the Boeing 737 Max 8 aircraft in which so many people had died. Now we surviving air travelers—the overwhelming majority worldwide—are relatively safe. At least we won’t have that risk to worry about, for the moment.

But are we survivors really safer? If our manic 24/7/365 news cycle moves on too quickly, we could lose this rare chance to evaluate a risk that soon may become commonplace. That’s the risk of flight controlled not by pilots, but by software.

Software-controlled flight fundamentally departs from the regime of mechanical, electrical, electronic and aerodynamic science that previously guided aircraft designers and pilots. Until experts and public officials governing air travel come to ken the many differences between the two regimes—digital and analog—these recent air disasters could be just the beginning of a new age of horrors.

I know, I know. Aircraft have used software as part of their control systems for decades. Software-controlled “autopilots” are now such a routine part of air travel that the phrase “on autopilot” has found its way into common parlance.

But autopilots differ from the software in the doomed 737 Max 8s in two vital respects. First, engineers design autopilots for the long, routine, stable, boring parts of flight, in which all the plane has to do is keep a steady altitude and heading. Until recently, they did not design autopilots for takeoff—the riskiest part of flying. And they certainly didn’t design them to correct as tricky and variable a problem as a nose-up stall during takeoff.

Second and more important, pilots can shut down autopilots and retake control of their aircraft at the flick of a switch. Autopilots are not designed, as the 737 Max 8 software apparently was, to substitute for re-engineering an airframe, or to paper over a failure to train pilots in a new airframe’s peculiar flight characteristics—let alone to do so by taking control away from the pilots.

Autopilot software keeps aircraft on a level, straight flight path at a specified heading. It accounts for minor variations in wind, air temperature, density, currents and the like to keep the plane on course. For other types of flight, pilots shut off autopilots and take the helm whenever they choose to do so.
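To see how modest that cruise-regime job really is, here’s a toy sketch in Python. Every name, gain and unit below is my own invention for illustration; real autopilot code is vastly more elaborate.

    # A toy sketch of cruise-regime heading hold. All names, gains
    # and units are invented; real autopilots are far more elaborate.

    HEADING_HOLD = 270.0   # desired heading, in degrees
    GAIN = 0.1             # invented proportional gain

    def rudder_correction(actual_heading: float) -> float:
        """Return a small correction proportional to a small error."""
        error = HEADING_HOLD - actual_heading
        return GAIN * error

    # A gust nudges the plane a few degrees off course; the response
    # is gentle and proportionate, never abrupt:
    for heading in [270.0, 272.0, 268.5]:
        print(heading, "->", round(rudder_correction(heading), 2))

The point is the proportionality: small disturbance, small correction, nothing abrupt.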

Flying on autopilot is like letting your untrained six-year-old daughter take the tiller of a sailboat gliding on an easy beam reach in a steady wind with calm seas. But an aircraft taking off is nothing like that. Letting software take over that phase of flight is like letting the same six-year-old try a tough jibe, or come about, in rapidly shifting winds. It’s a good recipe for capsizing.

Like the six-year-old, software can “understand” only the simple instructions you program into it. It has little flexibility or adaptability to unexpected conditions. It can’t “think outside the box.” Indeed, it can’t “think” at all.

More to the point of the Max 8 crashes, software programmed to operate only within a specified range of external variables (here, angle of attack, altitude and airspeed) can switch on and off abruptly as that range comes and goes. Apparently the mandatory or near-mandatory software on the 737 Max 8s was designed to take control, so as to bring the plane’s nose down, only when it sensed the plane going nose-up toward a stall. But the erratic vertical gyrations of both doomed planes suggest that the pilots were fighting the software for control, during each flight’s final minutes, over a whole range of nose angles, altitudes and speeds.

Whether due to external instrument failure or bad programming, the software apparently “thought” the plane was going to stall when it wasn’t. So it fought the pilots to bring the nose down while they were fighting to bring it up. The resulting vertical gyrations, observed by external radar, recurred again and again, suggesting that the software intervened on and off in ways the pilots did not expect and that probably seemed random to them.
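A crude sketch shows how such threshold-gated logic behaves. To be clear, this is not Boeing’s actual code, which has never been published; the threshold, units and names below are pure invention, meant only to show how a program gated on one sensed variable flips its intervention on and off as that variable crosses the programmed boundary.

    # A purely hypothetical sketch of threshold-gated control logic.
    # NOT Boeing's actual software; threshold and units are invented.

    STALL_AOA_DEGREES = 14.0   # invented "too nose-up" threshold

    def trim_command(sensed_aoa: float) -> float:
        """Return a nose-down trim command, in invented units."""
        if sensed_aoa > STALL_AOA_DEGREES:
            return -2.5    # intervene: push the nose down
        return 0.0         # otherwise, do nothing at all

    # A faulty sensor oscillating around the threshold makes the
    # intervention flip on and off, all or nothing, with no warning:
    for reading in [13.8, 14.2, 13.9, 15.1, 13.7]:
        print(reading, "->", trim_command(reading))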

At one point, records indicate, two pilots together applied some 100 pounds of force to the yoke, trying to bring the nose up. But the software “won,” and each plane went down on a nearly vertical path. The nose of each was apparently pointed almost straight down when it barreled, respectively, into the Java Sea and the soil of Ethiopia.

What horrible, helpless thoughts the pilots must have had in their last moments! What pangs of remorse they must have felt for failing so miserably in their responsibility to protect hundreds of innocent lives!

The pilots’ and passengers’ brief pre-oblivion terror—and their loved ones’ lifelong losses—will mean nothing unless these disasters teach us two vital lessons.

The first is an old and oft-repeated story. When engineers advise caution and lives are at stake, their expert judgment must prevail over marketing and promotion. When the reverse occurs, you get disasters like the Ford Pinto gasoline-tank explosions and the Challenger shuttle explosion, apparently caused by something as simple as a rubber O-ring stiffening in anomalously cold weather.

In Boeing’s case, a long chain of reported marketing and promotion decisions overrode engineering caution and judgment. Boeing wanted an aircraft that could go farther on less fuel, and so added bigger engines to an airframe not designed for them. Rather than lengthen the landing-gear struts and modify the wings and control surfaces to accommodate the bigger engines—which would have taken time and money—Boeing just moved the engines farther forward on the wing, changing the aircraft’s flight-control characteristics. Then, in order to market the plane as requiring no supplemental pilot training—an expensive proposition for airlines—Boeing designed software to accomplish the equivalent of an airframe redesign, at least for the critical takeoff phase. In the beginning, Boeing just “warned” pilots of these changes, without giving them the hours of personal training in flight simulators that might have made their reactions to the new flight characteristics automatic. The result was 346 deaths.

In an economy that worships profit above all else, instances of marketing and promotion overwhelming engineering caution have a long and sorry history. Sordid as that history may be, it’s not the most salient cautionary tale here. What’s truly new in these Max 8 disasters is the role of software.

Software doesn’t act or react like the things that most aircraft engineers and pilots are used to handling. It’s not at all like any mechanical, electrical, or electronic device. Its performance is not linear. Neither its regime of operation nor its modes of failure are anything like proportional to the flaws it may have.

A mechanical gear that loses a cog may make a lot of noise, but it will still turn the shaft. An optical lens with a spot of grease on it will still “see” light; the grease may only decrease its sensitivity. A bent or even shot-up wing or control surface may still support an airplane and guide its pilot home, as many military pilots have learned in wartime. A wire corroded or bent may still conduct electricity.

But a computer program with so much as a comma or parenthesis missing or misplaced, out of millions of lines of code, can produce bizarre results or (more likely) simply shut down and not work at all. And if the programmer expected the program to operate only in a limited regime of three specified variables (angle of attack, altitude and airspeed, for example), and if one or more of those variables falls outside its expected regime in real life, the program may do something random, or simply not work at all.
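Here’s a minimal sketch of that kind of failure, again with invented numbers. A table of correction gains is defined only for angles of attack from 0 to 15 degrees; feed the routine anything outside that regime and it either stops dead or returns a plausible-looking but meaningless answer.

    # A minimal sketch of out-of-regime failure. The table and
    # ranges are invented for illustration.

    # Correction gains tabulated only for angles of attack 0-15 degrees:
    GAIN_TABLE = [0.0, 0.1, 0.2, 0.4, 0.7, 1.0, 1.3, 1.6,
                  1.8, 2.0, 2.1, 2.2, 2.3, 2.4, 2.5, 2.5]

    def gain_for(aoa_degrees: float) -> float:
        index = int(aoa_degrees)     # no range check: the flaw
        return GAIN_TABLE[index]

    print(gain_for(7.3))    # 1.6 -- inside the expected regime, fine
    print(gain_for(-3.0))   # 2.4 -- a negative index silently wraps
                            #        around: plausible but meaningless
    print(gain_for(40.0))   # IndexError -- the program just stops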

Another facet of the practical chasm between real, physical systems and software is the difference between analog and digital systems. If sound sent through an analog amplifier exceeds the amplifier’s “dynamic range” (sound-volume capacity), the sound’s waveform gets “clipped.” Then the top and bottom of the analog waveform may be cut off, leaving the amplified analog result sounding “ragged” but recognizable.

In contrast, sound that exceeds the maximum amplitude of a digitizer, or the expected range of a digital multiplier, may come out entirely random and unrecognizable, depending upon the digitization frequency and how the digital “amplifier” (multiplier) handles numbers beyond its expected range. Something similar may be exactly what happened in the 737 Max 8 disasters—if, for example, the software encountered angles of attack higher or lower than those for which it was programmed.
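The contrast is easy to demonstrate with a few lines of code, using invented 16-bit sound samples. The “analog” path saturates: the worst case is a flattened but recognizable waveform. The digital path, if it overflows a fixed-width integer, wraps around to garbage.

    # A sketch of the analog-versus-digital contrast, with invented
    # 16-bit samples. Saturation degrades gracefully; wraparound
    # does not.

    def analog_clip(sample: int, ceiling: int = 32767) -> int:
        """Saturate like an overdriven analog amp: flat-topped,
        but still recognizably 'loud'."""
        return max(-ceiling, min(ceiling, sample))

    def digital_wrap(sample: int) -> int:
        """Overflow a 16-bit two's-complement register: wrap around."""
        return (sample + 32768) % 65536 - 32768

    loud = 20000 * 2   # a sample amplified past the 16-bit maximum

    print(analog_clip(loud))    # 32767: clipped, but still loud
    print(digital_wrap(loud))   # -25536: loudly, nonsensically negative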

There’s a lot still to learn about the twin 737 Max 8 disasters. Maybe the pilots or their lack of suitable training was partly at fault. Maybe some mechanical, electrical or electronic instrument failed. But, if so, it seems highly likely that the computer software’s inadequate response to those failures dramatically increased the risk of a no-survivors crash. No software or computer yet designed has anything like the flexibility and adaptability of the trained human mind.

The lesson here may inconvenience programmers but is pretty simple. In situations where software errors can kill, people must have control over the software in real time. Every bit of software whose faulty operation can be fatal must have a “kill switch” to shut it down, which its operators are trained to use in an instant. In appropriate cases, software should be designed to shut itself down upon sensing anomalous results, like a plane taking off at low altitude and assuming an angle of attack suitable only for landing.
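In code, those two safeguards might look something like the sketch below. Again, every name and threshold is invented; the point is only the shape of the logic: one switch the pilots can always throw, and a routine that stands down on its own when its inputs look anomalous.

    # A minimal sketch of the two safeguards: a pilot-operated kill
    # switch, and self-shutdown on an anomalous reading. All names
    # and thresholds are invented for illustration.

    class TrimAssist:
        def __init__(self) -> None:
            self.enabled = True

        def kill_switch(self) -> None:
            """One switch, one motion, available to pilots at any time."""
            self.enabled = False

        def command(self, altitude_ft: float, sensed_aoa: float) -> float:
            """Return a nose-down trim command, or nothing."""
            if not self.enabled:
                return 0.0
            # Self-shutdown: a landing-style nose-up reading during a
            # low-altitude takeoff is anomalous, so stand down rather
            # than intervene on bad data.
            if altitude_ft < 1000.0 and sensed_aoa > 20.0:
                self.enabled = False
                return 0.0
            return -2.5 if sensed_aoa > 14.0 else 0.0

    assist = TrimAssist()
    print(assist.command(800.0, 22.0))   # 0.0 -- anomaly: it shuts itself down
    assist.kill_switch()                 # and pilots can always do the same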

More generally, business people, managers, their customers, regulators and pols have to shed their awe of software, the sooner the better.

Everyone who surfs the Web today has near-daily encounters with flaws and inadequacies in software. There’s the name field that won’t take a hyphenated last name, or one that ends with a suffix like “Jr.” or “III.” There’s the ZIP-code field that won’t take a full nine-digit ZIP code, or rejects one with a hyphen. There’s the telephone-number field that makes you type in all digits, like no telephone number you’ve ever seen before. There’s the hyperlink that leads to nowhere, or to the wrong page. There’s the series of links that leads back to where you started, making you think that no programmer ever tested the whole damn site.

If programmers leave these flaws in websites that are the public faces of their businesses, and that millions of people see and use every day, what flaws might lie hidden in the bowels of a digital box behind the instrument panel of an aircraft, where it’s needed, if at all, in perhaps one flight out of ten, or one out of fifty?

It took our ancestors some four million years to evolve from ape-like creatures into beings with anything like our human brains. Yet already we think we’re smart enough to duplicate the flexibility and adaptability of those brains in little chips of silicon and metal, less than eighty years after the first digital computer’s development. What hubris!

As I’ve argued in another essay, what we have today is nothing remotely like “artificial intelligence.” It’s “simulated” intelligence. It can beat any human being at a game of chess or go only because the rules of chess and go are limited, finite and capable of concise mathematical expression, and because the chips in electronic computers work at least a thousand times faster than any animal’s neurons, including ours.

But try that parlor trick with life, or even with air travel! There, the variables that can arise in extreme events are far more numerous and wide-ranging than those governing movement on a chess or go board. Unexpected events that impose whole ranges of strange new variables are common, for example, in high winds, deviant jet streams, extreme weather and forest fires.

Against the infinite variety and vicissitudes of real life, no machine we have yet developed has anything like the real survival intelligence of a dog or a cat. Developing such a machine might take our species the better part of a century, if we can manage it at all. One should never underestimate human ingenuity, but the jury is still out on whether real artificial intelligence is possible using anything like current electronic technology.

So-called “self-driving” cars have already killed at least three people, including one innocent bystander (non-driver). And they’ve done so when they’ve only been deployed without human drivers in onesies and twosies, not by the millions. It now appears that software, designed to act wholly or mostly autonomously, has played a leading role in killing 346 people on airplanes.

If our AI hype and hubris continue to outrun our species’ still-feeble digital capability, these casualty rates will grow correspondingly. If we let AI control our instruments of war, as some propose, we may eventually extinguish our species, as acts of war escalate automatically, far faster than we humans can sense and analyze them.

But another, more desirable, outcome is possible. If the untimely deaths of the 346 Max 8 victims cause us to regard software and AI with far more skepticism, caution, and common sense than we have up to now, those innocent people will not have died in vain.

Links to Popular Recent Posts

For my message to Southwest Airlines on grounding the 737 Maxes, click here.
For an example of even the New York Times spewing propaganda, click here.
For means by which high-school teachers could help save American democracy, click here.
For a modern team of rivals that might comprise a dream Cabinet in 2021, click here.
For an analysis of the global decline of rules-based civilization, click here.
For a brief note on avoiding health lobbying Armageddon, click here.
For analysis of how to save real news and America’s ability to see straight, click here.
For an update on how Zuckerberg scams advertisers, click here.
For analysis of how Facebook scams voters and society, click here.
For the consequences of Trump’s manufactured border emergency, click here.
For a brief note on Colin Kaepernick’s good work and settlement with the NFL, click here.
For an outline of universal health insurance without coercion, disruption of satisfactory private insurance, or a trace of “socialism,” click here.
For analysis of the Virginia blackface debacle, click here.
For an update on how Twitter subverts politics, click here.
For analysis of women’s chances to take the presidency in 2020, click here.
For brief comment on Trump’s State of the Union Speech and Stacey Abrams’ response for the Dems, click here.
For reasons why the Huawei affair requires diplomacy, not criminal prosecution, click here.
For how Speaker Pelosi has become a new sheriff in town, click here.
For how Trump’s misrule could kill your kids, click here.
For comment on MLK Day 2019 and the structural legacies of slavery, click here.
For reasons why the partial government shutdown helps Dems the longer it lasts, click here.
For a discussion of how our national openness hurts us and what we really need from China, click here.
For a brief explanation of how badly both Trump and his opposition are failing at “the art of the deal,” click here.
For a deep dive into how Apple tries to thwart Google’s capture of the web-browser market, click here.
For a review of Speaker Pelosi’s superb qualifications to lead the Democratic Party, click here.
For reasons why natural-gas and electric cars are essential to national security, click here.
For additional reasons, click here.
For the source of Facebook’s discontents and how to save democracy from it, click here.
For Democrats’ core values, click here.
For thoughts on the last adult leaving the White House, and who will shut off the lights, click here.
For how our two parties lost their souls, click here.
For the dire portent of Putin’s high-fiving the Saudi Crown Prince, click here.
For updated advice on how to drive on the Sun’s power alone, or without fossil fuels, click here.
For a 2018 Thanksgiving Message, click here.


