Introduction: great thinkers need more respect
Why I was wrong
Why collecting big data is a problem
Why storing big data is even more so
Conclusion: will we live in an Orwellian world?
Introduction: great thinkers need more respect
Sometimes great thinkers get no respect. People expect them to predict the future accurately, in all its detail and complexity. No one can do that.
Throughout human history, the great prophets have been able to foresee only broad outlines, general trends. They get the essence right but can’t see the trees for the forest. So detractors ignore their warnings and belittle their foresight.
Thus it has been with Malthus, and thus it may be with Orwell.
I’ve already written a whole essay on Thomas Malthus’ big idea and will only summarize it here. Writing over two centuries ago, he got his timing wrong. Modern agriculture and twin revolutions in science and technology vastly increased our species’ ability to produce food and so staved off the starvation he predicted, at least so far.
Malthus was also wrong about the first commodity to go. It’s energy, not food. And he missed a huge unintended consequence: as we exhaust our planet’s limited supply of fossil fuels, we are not only running out of cheap, dumb energy; we are also changing our global climate and causing ourselves all sorts of serious headaches.
But for all the things he got wrong, Malthus got the most basic and important thing right. He wisely saw that, having bested our few predators and having begun to conquer the most dread diseases, our exploding human population would eventually outrun the resources of our finite planet. As fossil fuels get scarcer and more expensive, and as our climate becomes our enemy, we are finally beginning to understand just how much our dawning twenty-first century will belong to Malthus.
George Orwell was “just” a novelist, not a “serious” thinker. When we Yanks got past his critical date—1984—with a semblance of democracy still intact, we breathed a huge sigh of relief. We made it! Orwell was wrong!
So we don’t think about him much any more. But we should.
In his classic dystopian novel, published in 1949, Orwell made three basic predictions. First, he foresaw how the then-nascent sciences of biology, medicine and psychology could be used as tools of political control. Second, he predicted that large-scale conflicts between continent-spanning human cultures would provide convenient excuses for oppressing and manipulating the masses. Finally, he foresaw how collecting vast amounts of information on everyone would allow government and the elite to control individuals by less crude means than killing or jailing them.
Like Malthus, Orwell erred in detail. We don’t use modern biology to tailor-make individuals for different societal roles, at least not yet. And we certainly don’t do it in test tubes, an image that actually comes from Aldous Huxley’s Brave New World. But the science of genetic engineering is just in its infancy.

There’s as yet no analogue to “soma,” the drug Huxley (not Orwell) imagined rulers using to keep even skeptical citizens happy and docile. But don’t look too closely at today’s vast overprescription of Ritalin for ADHD—an ailment whose recent “epidemic” seems to have occurred mainly among minorities and the poor. You might discern a similarity.
On continental conflict, Orwell’s prediction wasn’t bad at all. After a couple of decades attempting to cooperate, we Yanks, Russia and China seem to be resolving into continent-wide conflicting cultures much as Orwell predicted. The resulting conflicts “justify” all sorts of expenditures on spying and weaponry and all sorts of infringement of civil liberties. For all we know today, these depressing trends are just now hitting their stride.
Orwell didn’t foresee the rise of Islamic terrorism, which provides the most direct and powerful excuse for spying, secrecy and oppression, in all three conflicting realms. But, hey, no prophet is perfect.
Anyway, it’s in his last prediction that Orwell really shines. He foretold that information about individuals would become an instrument of power and control. In so predicting, he was spot on.
Who would ever have imagined, back in 1949, that mundane information about all of our daily lives—our youthful experimentation and indiscretions, likes and dislikes, failed job performance, unsuccessful business ventures, extramarital affairs, and political communications and contributions—would become part of a permanent “cloud,” available forever, to persuade, blackmail, manipulate, intimidate and coerce us? Only Orwell, apparently.
Why I was wrong
About seven years ago, I published a post entitled “Search-and-Seizure Heresy.” In it, I argued that our Fourth Amendment was being misconstrued in the electronic age. Its aim, I wrote, was not to protect information, but to protect people from the coercion, disruption and intimidation that arises from general or unauthorized searches of houses and offices by armed police or military forces.
That essay still has some useful insights and an interesting practical comparison of our age with Colonial times, when our Founders drafted our Fourth Amendment. But its most basic point was wrong, and I hereby recant it. Here’s why.
In that essay, I argued that the evil of general search and seizure—i.e., dragnets—was not acquisition of information or invasion of privacy, but the search or seizure itself. In our electronic age, when the government vacuums up vast reams of data about us, it does so secretly and silently. The targets don’t even know or suspect that they are being surveilled. So there’s no coercion, disruption, or intimidation, at least not immediately. The target of surveillance doesn’t even know what’s going on.
Under these circumstances, I reasoned, the evil of search and seizure is not the mere fact of surveillance, but the use of the information once obtained. If it’s used to prevent a terrorist attack, no harm, no foul. If it’s used for illegitimate or improper purposes, such as political intimidation, blackmail, or prosecuting crimes without due process, then the courts can avoid it, or sort it out later. Aggrieved individuals can, among other things, later sue for violation of their rights.
But in reaching these conclusions, I did what I often use these pages to accuse others of doing. I didn’t follow the likely chain of causation and consequences far enough. So now I must eat humble pie.
What I failed to do was to follow my heresy to its logical and practical conclusions. Two are most important.
First, if authorities (or anyone else) can collect any information at will, they will. They will collect it all and store it all, without limit, for possible future use. Information is too valuable a commodity, and too cheap to gather, to let it go.
The technical means of vacuuming up big data today are extraordinarily cheap and simple. And they are getting cheaper, simpler and smaller by the year, in accordance with Moore’s Law. So once officials—or our businesses—incur the expense and trouble of building the collecting infrastructure, all their natural incentives are to collect everything they can and store it indefinitely, against possible future need.
They will collect big data on everyone and everything for the same reason that Sir Edmund Hillary first climbed Everest: because it is there. The data vacuum-cleaner will become a voracious tornado that never forgets, because clever people can always think of new uses for old big data later.
Second, both government officials and business people who collect big data have every incentive to lie about what they are doing. If they want good information, they don’t want people to know they are collecting it. If their subjects know, then at least some will object. Some will hack. Some will encrypt. More than a few will whine and pout. Many will conceal, mislead and prevaricate. Some will provide disinformation, just for spite. Some will even sue.
But if you are quiet as a mouse and let no one know what you are doing, you can vacuum up good information as long and as much as you like, and you can store it and comb through it forever. There is absolutely no incentive for moderation or restraint, or for telling the truth about what you are doing.
Knowledge is power. Knowing things about people that they don’t know you know can provide near-absolute power. And we all know what Lord Acton said about that.

Lest you think these consequences are abstract theory, watch both parts of the recent PBS Frontline feature entitled “The United States of Secrets.”
Not only our government, but also private business firms, big and small, have already slid down the slippery slope toward vacuuming up everything, simply because they could, and because doing so is cheap and simple in our modern digital age. More to the point, government officials at the highest levels, and the CEOs of “household name” US Internet firms, lied profusely and repeatedly about what they were doing. They didn’t want us subjects to know.
Perhaps some of the lying was justified. NSA understandably doesn’t want terrorists to know what it’s collecting and how. For similar reasons, Google and other “targeted ad” companies don’t want consumers to know how much they are watching us as we live, love, work and shop, because knowing might cause us to hide or change our behavior.
It’s all a bit like quantum physics’ observer effect: the very act of observing a particle, which requires interaction with a photon, can change the particle’s behavior. So generally speaking, those who run data vacuum cleaners don’t want the rugs to know they’re being vacuumed. Consequently, they lie and mislead, even to Congress and the American people. All this does not conduce to a transparent, open and honest society.
And when you think more deeply about it, even the “aiding terrorists” claim doesn’t seem to go very far. Don’t we want would-be terrorists to think their every phone call and e-mail is monitored? If they’re smart, they probably suspect as much anyway. And if they’re really smart, they might find ways to encrypt their real communications, and/or to feed us disinformation on more easily monitored lines.
But nearly every practical encryption scheme can be broken or circumvented, given enough time and computing power. And who has the better technology: Islamic extremists crouching in caves, or the nation that developed the Internet, routine encryption of financial transactions, and all the mathematical, electronic and engineering infrastructure to support them?
If terrorists are smart and well informed, their best option is not to try to best the Yankee technological juggernaut, but simply to avoid using cells phones, e-mail and the Internet generally, except for decoys and deception. In other words, they must resort to passing important messages by secure personal courier. Doesn’t that necessity put them at a substantial disadvantage in the digital age?
Why collecting big data is a problem
Despite all the charges and countercharges, there are no villains in this story, only misunderstood heroes. Edward Snowden gave up an easy life, not to mention a free one, for a sordid existence of exile and unceasing risk and uncertainty, all in order to remind us of Orwell’s prophecy. The NSA folks are dedicated public servants with a single paramount job in their minds: averting the next 9/11.
Each has given us a vision of real and serious risks. The NSA’s warnings are easy to visualize: another 9/11-scale terrorist attack. Snowden’s warnings are less so: a society in which there are no secrets, in which all the sordid details of our private and public lives remain open, visible and potentially accessible in the cloud forever, and in which gossip and propaganda replace wisdom, sound judgment and wise public policy.
With our twenty-four-hour news cycle and our increasingly intrusive and twittering media, we are already well on our way to Snowden’s (and Orwell’s!) dark vision. To an astounding and rapidly increasing degree, we already have a polity based in large measure on gossip, i.e., on titillating details about current events and prominent people’s lives and acts that have no substantial or logical relationship to rational decision making or sound public policy.
How much longer can we build a culture and a public life on random but titillating facts, without knowledge or wisdom, repeated endlessly by advocates with agendas, and expect to succeed in an extraordinarily competitive, globalized world? How can we hope to compete with societies like China and Germany, which, for all their faults, keep their eyes on the ball and don’t confuse raw data and gossip with rational thinking?
This is just one of the hard questions that Edward Snowden has asked us. He has given up any semblance of a normal life, at least as an American, to put this question before us. His warning is perhaps harder to understand than the risk of a new terrorist attack, but it is nonetheless real and important.
Orwell would have understood. Can we?
The spooks’ desire to vacuum up everything has a sound practical justification. Connecting the dots of terrorist plots usually involves hindsight.
Remember the 9/11 hijackers taking flying lessons but not wanting to learn how to land? The significance of that dot didn’t appear until after the Twin Towers came down. But computers and a large enough database might have revealed its significance beforehand.
That’s a perfectly plausible assumption and a worthy project for protecting all of us. Computers can apply hindsight a lot faster, and can handle a lot more data, than the best intelligence analysts. They can work so fast that their hindsight approaches foresight. And if properly programmed, computers don’t fail because data gets siloed by inter-service rivalry, like that between the CIA and the FBI.
So the rub is not the counter-terrorism operation. Every clear-thinking Yank wants it to continue unimpeded as long as a credible terrorist threat persists. The rub is the vast temptation that unlimited databases create for other, less noble or necessary uses.
Here Orwell provides a glimpse into a possible dark future.
Although it may not seem so when you look at Syria or Donetsk today, we humans became considerably less crude, savage and brutal during the last century. Stalin was one of the most brutal tyrants in human history, responsible for tens of millions of premature deaths. Mao was perhaps less brutal on a per-capita basis, but he, too, was responsible for at least millions of unnecessary, premature deaths. And his capricious economic policies in his dotage were responsible for millions more.
In contrast, today Vladimir Putin and Xi Jinping rule their respective nations pretty firmly, with but a tiny fraction of the killing and bloodshed that their predecessors used. How do they do it, and how do they do it so peacefully? By controlling information.
Both authoritarian societies—Russia and China—strictly control the flow of information to their people. Although both governments permit some freedom on the Internet and in the printed media, both control television almost absolutely, and both attempt to control the Internet. China does a far more effective job than Russia, as China has the edge in practical software technology and (according to some estimates) over 30,000 Internet “censors” monitoring the Web every day.
And lest you think we Yanks are far from those paragons of propaganda, consider Fox. That alleged “news” source is one of the most effective propaganda machines in human history. It succeeds, among other ways, by endless repetition to a captive audience (a tactic familiar from Hitler’s “big lies”), and by making propaganda seem like entertainment. The pill goes down so easily, as Fox’s jokers play on sarcasm, rant and rave.
But controlling the information people receive is only half the story. Knowing real information about them is the other. Without accurate information, you can’t make propaganda plausible and credible. With it, you can demolish the character, reputation and credibility of anyone you choose.
Take our Yankee 2008 presidential election, for example. A highly educated, moderate, restrained, soft-spoken, thoughtful man ran against an aging, intemperate, irascible pol who, in the midst of the worst economic crisis in eighty years, honestly confessed he knew nothing about economics. By all rights, Obama should have beaten McCain by a landslide. But he won by only a few percentage points in the popular vote.
So although the GOP ultimately lost, the closeness of its loss was an impressive propaganda coup. How did it achieve that coup? By using the raw material of fact to manufacture effective propaganda.
How did the birther movement grow such long legs? Because Obama was born in Hawaii, a racially mixed state that, to most Americans, is impossibly remote and exotic—not really a part of America. (Having lived there for eleven years, I know this from personal experience. Large numbers of Yankee tourists, while boarding planes in Honolulu to return home, say they are “going back to the States”—an expression that makes Hawaii’s Yankees wince.)
How did the “extremist” charge against the President gain traction? Because Obama’s preacher, the notable Reverend Wright, actually said some edgy things.
How did the “terrorist” charge gain credence? Because Obama actually did socialize briefly with a man who, decades before, had been part of the then-extreme “Weather Underground.” In the end, this dysfunctional group did little but kill and injure its own members, but it scared a whole lot of Americans.
Of course the GOP propaganda also relied in large part on the President’s race, if only indirectly. Despite all our progress since the Civil War, there are still a whole lot of Americans who are ready to believe the worst of anyone with black African genes. Having a few solid and accurate facts able to raise doubt was all clever propagandists needed to incite their suspicion, distrust and dislike.
From the GOP side, that’s what the 2008 presidential campaign was all about. If McCain had any clever and non-obvious plans and policies, they got lost in a blizzard of gossip, innuendo, and propaganda based on real but largely irrelevant factoids. Perhaps because he failed to grasp the needs of a national campaign, McCain ran one of the worst in American history.
The twisted consultants and “political operatives” who gave us that horrible campaign lost, but they are far from defeated. Nor is their kind of dark sorcery limited to the 2008 presidential campaign. It’s endemic in our society, our politics and even in our courts.
Trial lawyers now make a practice of digging up real dirt on hostile witnesses. They destroy witnesses’ credibility before juries not by evoking contradictions in their testimony, or any rational doubt relevant to the case at hand, but by sheer character assassination. They have become demagogues and propagandists in what you might think are the last bastions of pure reason in our twittering society: our courts.
There are even CLE courses in which experienced lawyers teach young ones these dark practices. Judges shouldn’t, of course, permit these tactics in the courtroom. But they often do, if only because they fear being reversed on appeal for evidentiary rulings with little relevance to the heart of the case. So they let the character assassination proceed, hoping it will not distort the result of the trial. Of course the lawyers doing the character assassination hope exactly the opposite, often with good reason.
And need I mention non-presidential politics? All those Citizens United ads, for which the 1% pay billions, and to counteract which the rest of us pay as much as we can afford, are nothing more than propaganda based, as much as possible, on real dirt.
In our courtrooms, our political arenas, and our business competition, we have entire, well-paid industries whose job is to vacuum up real dirt from which to manufacture plausible and therefore effective propaganda. In a real sense, “character assassination ‘R’ us”—in our courts, in our states, in our national politics and in our businesses.
It is in this context that the social peril of vast, uncontrolled databases on everyone becomes apparent. Big things in an important person’s past are often public knowledge. As a freedom fighter, Nelson Mandela reportedly gave some orders that caused people to be killed. Some say he was a terrorist. But he later grew in wisdom. Then he united and brought racial peace to South Africa as no one else could.
Jose Mujica, the current president of Uruguay, once was a rebel guerrilla. He was shot several times and spent nearly two decades in prison. Much later, he formed a progressive government so marvelously effective that it led The Economist to name Uruguay country of the year in 2013 for its “recipe for human happiness.” Our own President, during his high-school years, was a member of an informal club that smoked marijuana (although after Bill Clinton’s not inhaling, no one seemed to care much in 2008).
The point is not that propaganda based on dirt is always effective, or that dirt always needs big data to extract it. The points are that digging up dirt is effective far too often, that data vacuum-cleaners bring it to a whole new level, and that digging for dirt distracts us from people’s real character, and from the real merits of products and services, let alone wise policy.
When you can’t tell the good guys from the bad guys, or the competent from the incompetent, through the blizzard of negative ads based on big-data dirt, you’re going to get bad government. At the very least, you’re going to get government by the 1%, who have the cash to dig up the dirt, to borrow or steal it from government if necessary, and to hire the people who know how to use it in skillful propaganda. Government becomes an exercise in dirt-digging and propaganda, financed by people who have money to spare and who believe their dirt-based propaganda will get them more.
Everyone has something in his or her past that could be used for character assassination. The best of us have more than most. Mandela and Mujica, for example, each morphed from ruthless and perhaps brutal freedom fighters into wise and effective political leaders.
Having huge databases of everything about everyone, including dirt, will only give propagandists additional fodder, tempt us to waste energy, attention and societal resources on digging up dirt on each other, corrupt our politics and our judicial process, and put more power in the hands of the 1%, who have the money to waste on all of this and few scruples about doing so.
And lest you think the NSA’s and private-Internet-firm databases are immune from such sordid use, remember the NSA’s code name for one of its data vacuum-cleaning programs: “PRISM.” The name derives from a simple, non-electronic optical device familiar to everyone from high-school science classes: a triangular piece of glass called a “prism.”
In science classes, a prism splits the colors of white light, producing an attractive multi-colored spectrum. In the NSA’s use, prisms split fiber-optic beams in the Internet’s backbone into two parts. One goes on its normal way, making the Internet work. The other goes into NSA’s data-vacuuming equipment.
So simple, so cheap, so effective. A five-dollar optical device, which doesn’t even require electrical power, does it all. Do you begin to see why only effective legal restraint and political oversight can impede Orwell’s dark vision?
Using a simple prism from a high-school physics class, crooked pols like the ones who once burglarized the Democratic Party’s headquarters in the Watergate Building can, much more easily and unobtrusively, siphon off a substantial portion of Internet traffic and bend it to nefarious ends. So can bent businesses, petty criminals everywhere on the Internet, and globally organized crime. Where there’s such potent temptation, devious minds will surely find a way.
Why storing big data is even more so
Of course the longevity of storage makes a difference, too. Once you conceive of storing data on individuals for a decade or more, you can vacuum up their whole lives.
It’ll all be there: youthful hijinks and indiscretions, embarrassing but ultimately harmless sexual adventures, cured venereal diseases and disabilities, early experiments and failures in employment and business, regrettable statements and conclusions in a person’s formative years, failed relationships, the minor crimes that everyone commits through negligence or oversight, the risky and unfortunate associations that more mature reflection might have shunned, and sometimes even evidence of major crimes.
Government and business will get distracted from their primary jobs of making our lives better. Instead, they will dig up dirt on people they don’t like, including political rivals and business competitors. We will have Richard Nixon’s “Enemies List” on steroids.
Do we really want to collect this stuff and have it available for stealing and illicit use for a person’s entire life, and even for their offspring’s? Should the sins of fathers and mothers be visited on their sons and daughters? Should the mistakes of youth dog a good person for his or her entire life?
Only if we want to become a culture based on character assassination, which is where our political campaigns and civil trials are now demonstrably headed. No wonder Europe’s highest court, in a case arising in Spain, recently said “no” and ordered Google to let ordinary people, after a suitable interval, purge their dirt from the cloud. That decision, although no doubt costly for Google to implement, was little more than common sense and humanity.
Conclusion: will we live in an Orwellian world?
No matter how proper big-data collection’s original or “primary” purpose may be—and counter-terrorism is a good one!—the more data we collect on more people, and the longer we store it, the more likely we will live in George Orwell’s world. The data’s mere existence will tempt the ambitious, weak and unscrupulous among us to improper use, whether formal or informal, legal or illegal, authorized or not. When splitting the fiber-optic beam requires only a five-dollar glass prism, we can’t count on keeping the genie in the bottle.
So there’s a real and palpable tension here. We can never, of course, have complete security in the face of ardent terrorism. We can approach it, perhaps, but only at the cost of bringing Orwell’s dark vision to life in our own time.
Perhaps we can reduce the risk of realizing that vision with some simple, practical means. First, all collected data, without exception, should have a shelf life, after which the storage equipment purges it automatically and completely. Maybe the shelf life should depend upon the reason, if any, to believe that the subject is involved with terrorist activity or might be a valuable witness to terrorism. The ordinary shelf life should be somewhere between two and five years—ample time to get a judicial warrant to keep it stored longer, if necessary.
Second, data on persons with no known or suspected connection to terrorism should be purged periodically and automatically, perhaps within a year. For example, if no qualified analyst has accessed the data within a year, the storage program could automatically purge it, keeping a record of the purging, the subject and the date, but not the content.
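The two purging rules just described—a hard shelf life for everything, plus earlier automatic purging of data no analyst has ever touched—could literally be written into the collection system’s code, keeping only a content-free audit record of each purge. The following is a minimal, hypothetical sketch of that idea; the names (`Record`, `purge_expired`, the retention constants) are invented for illustration and describe no real agency’s system, and the actual retention periods would be set by law or warrant, not hard-coded.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative retention limits, per the proposal above (not legal advice):
ORDINARY_SHELF_LIFE = timedelta(days=2 * 365)  # 2-5 years absent a warrant
UNACCESSED_LIMIT = timedelta(days=365)         # 1 year if no analyst ever looked

@dataclass
class Record:
    subject: str
    collected: datetime
    last_analyst_access: Optional[datetime] = None
    content: Optional[str] = None

def purge_expired(records, now):
    """Drop expired records; log who and when was purged, never what."""
    kept, audit_log = [], []
    for r in records:
        age = now - r.collected
        never_accessed = r.last_analyst_access is None and age > UNACCESSED_LIMIT
        past_shelf_life = age > ORDINARY_SHELF_LIFE
        if never_accessed or past_shelf_life:
            # Record the purge itself, but deliberately omit the content.
            audit_log.append({"subject": r.subject, "purged_on": now})
        else:
            kept.append(r)
    return kept, audit_log
```

Run periodically, such a routine enforces the policy automatically, with no analyst discretion required; extending a record’s life would mean updating its shelf life under a judicial warrant, not exempting it from the sweep.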
Third, we should think hard about doing with big data what the Fair Credit Reporting Act did with financial databases: allowing data targets to review and correct false information about them. Of course the problem with anti-terrorism data is much more difficult: we don’t want terrorists or their sympathizers rummaging around in the secret data that might expose and thwart their plots. But we all know that mistakes are made, and sometimes innocent people suffer terribly, as in the case of mistaken identity casting people irrevocably onto the “no fly” list. We ought to provide some way for people to purge their inaccurate dirt, even if we have to set up special, secret courts like the FISA court to do so.
In curbing unnecessary big-data vacuuming and abuse of the information it gathers, we would do well to recall Larry Lessig’s phrase: “code is law.” Collecting terabytes of data on 307 million Americans, and on a substantial fraction of over 6 billion foreigners, is not something anyone can monitor and control effectively in real time. If nothing else, the recent history of government lies and misinformation about what data it gathers and how it uses it is instructive.
Perhaps the law can manage, with appropriate judicial orders, to reveal, correct and unwind a particularly egregious misuse of the data after the fact. But doing so will take years of effort on the part of investigators, lawyers, judges and jurors. Only the most egregious and important violations of privacy and misuse of dirt will ever rise to that level.
So if we are to have safeguards against an Orwellian world, we will have to design and build them into the big-data-vacuuming system itself. They will have to be part of the programming that runs the system. They will have to be part of the code that, in large-scale computer systems, is the only practical law. And we will have to appoint a savvy Big Data Coding Inspector General to make sure that the coded protections are appropriate, adequate, in place, and operational. This is not a case in which individual legislators, or even legislative committees or staff, can have any real and direct oversight over vital detail.
So whatever his faults, Edward Snowden has raised a real and important issue. Are we going to remain a free country, where individuals are free to experiment and make mistakes without knowing or believing that what they do will haunt them and their progeny forever? Or are we going to become a society of spooks and character assassins, ruled by gossip and clever propaganda, in which nothing is private and our individual mistakes and pasts are open to misuse and abuse by government, business, self-interested propagandists and criminal minds, which have every incentive to vacuum up as much data about us as possible, to hoard it forever against possible future need, and to deny vociferously that they are doing so?
The nature of our society and the future quality of our lives depend upon thoughtful, nuanced answers to these questions. Ultimately, they depend on practical measures to avoid excessive data collection and its improper use or release.
So far, all we have gotten for answers is caricatures. NSA has argued that everything is fair game for collection and indefinite storage because the goal (avoiding terrorist attacks) is just. That’s a self-evident non-sequitur. So is the view of some privacy advocates that nothing is fair game unless approved in advance by a judge. The whole rationale for massive data collection is that you can’t connect the dots until you have them, and you don’t know what dots are important until you have lined them up for a while. So you can’t have anything persuasive to present to a judge until you’ve started up the vacuum cleaner and have begun to connect the dots.
Surely well-meaning pols and public servants can come up with a better way to reconcile safety and privacy than these two extreme and simplistic positions. The only other alternatives are to abandon a useful tool to combat terrorism or to take a giant step down the dark road toward Orwell’s vision of an intrusive, controlling, inhumane and dysfunctional society.