Diatribes of Jay


14 January 2021

Social Platforms’ Roles in Treason and Harmful Disinformation



Do Facebook, Twitter, YouTube, Instagram and other Internet platforms have some responsibility for the January 6 Capitol Insurrection? Were they complicit in it? Did they help cause it? Were they accessories to treason? Did they give aid and comfort to real “enemies of the people”? Did they aid Vladimir Putin’s disinformation campaign to weaken the US, or spread disinformation about the pandemic?

At least one inside observer thinks so. Roger McNamee is a venture capitalist, an early investor in Facebook and a long-time observer of Internet platforms. He believes the answers to some of these questions may be “yes.” He thinks that the platforms recently kicked Trump off not out of selfless patriotism, but for fear of legal liability.

McNamee never went to law school and is not a lawyer, so his reasoning rests on common-sense cause and effect rather than legal doctrine. For Facebook, he explained it in a January 13 interview with Hari Sreenivasan on Christiane Amanpour’s show on PBS. He makes similar points in a Wired article entitled “Platforms Must Pay for their Roles in the Insurrection.”

McNamee’s analysis proceeds in two steps. First, the platforms’ algorithms prioritize extremism, outrage and conflict in order to attract traffic. In McNamee’s words, “Hate speech and disinformation theories are core to the business.”

As an example, McNamee cited the QAnon conspiracy fantasy. He said that, by Facebook’s own admission, various pages related to QAnon had three million viewers. He also cited independent reports of an internal Facebook study, which concluded that 64% of users who opened QAnon pages did so because Facebook had recommended them. And he claimed that “Covid misinformation flourished on Facebook.”
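
To see the mechanism McNamee describes in concrete terms, here is a minimal sketch, in Python, of a feed ranked purely on predicted engagement. Everything in it is an invented assumption for illustration: the Post fields, the engagement_score weights and the sample numbers. None of it is Facebook’s actual code, model or data.

```python
# Hypothetical sketch of engagement-ranked feed ordering.
# All names, weights and numbers are invented for illustration;
# this is not any real platform's code or data.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float    # model's estimate of click-through
    predicted_shares: float    # model's estimate of re-shares
    predicted_comments: float  # model's estimate of comment volume

def engagement_score(post: Post) -> float:
    """Rank purely by predicted engagement, with no penalty for
    outrage or falsehood. Weights are made up for illustration."""
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_shares
            + 1.5 * post.predicted_comments)

feed = [
    Post("Local charity drive this weekend", 0.10, 0.02, 0.05),
    Post("Outrageous conspiracy claim", 0.35, 0.20, 0.40),
    Post("City council meeting minutes", 0.05, 0.01, 0.02),
]

# Sorting by engagement alone pushes the inflammatory post to the top,
# because outrage reliably draws clicks, shares and comments.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.title}")
```

Because sorting on raw predicted engagement rewards whatever provokes the strongest reactions, a recommender built this way promotes extremism, outrage and conflict without anyone deliberately choosing to. That is precisely the dynamic McNamee describes.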

The second step is causation. Facebook, McNamee claims, was instrumental in inciting and organizing the Insurrection. Otherwise unrelated people from all over the nation took part in it. Without Facebook and similar platforms, they could neither have whipped themselves into a frenzy nor organized the Insurrection.

In legal terms, the ways Facebook programmed its algorithms to “monetize” extremism, outrage and conflict look like “but for” causes of the Insurrection. Although McNamee didn’t say so specifically, the Covid misinformation that “flourished” on Facebook is undoubtedly responsible for many instances of Covid sickness and death, including “superspreading” events.

I would add another “count” to McNamee’s analysis. As US intelligence has made clear, Vladimir Putin’s US disinformation campaign did not have electing Trump as its primary goal. Instead, it had the goal of fomenting and fostering discord and division within the United States, thereby weakening it. Because Trump was and is a master of division and discord, electing him and increasing his support (the more radical the better) were just means to that end. Facebook and the other platforms, with algorithms designed to exploit extremism, outrage and conflict, played right into Putin’s grand strategy.

No sane person believes that Mark Zuckerberg or Jack Dorsey deliberately incited the Insurrection, aided Putin’s disinformation campaign, or aggravated the pandemic. All they wanted to do was make money and, as Zuckerberg famously put it, “move fast and break things” for that purpose. Their facilitation of treason, Putin’s disinformation campaign and Covid disinformation was neither deliberate nor purposeful.

But all these things did happen, in significant part, because of the ways in which the Internet platforms did and do business and the steps they deliberately take to increase traffic. Facebook’s own employees, in internal and public protests, gave its bosses ample notice that what they were doing was not only immoral and dangerous but may also have been unlawful.

In legal terms, the platforms’ parts in the treason, the disinformation and the widespread pandemic unpreparedness seem, at the very least, negligent or reckless. The question before us is: does that matter? In a sound, self-protective society, should it?

One of history’s great advances in the rule of law was the recognition of non-intentional torts, or personal wrongs. Long before the industrial age, the law recognized only intentional torts, such as assault and battery. But as the industrial age came and matured, personal injuries and deaths caused inadvertently became more common. People suffered and died as unintended consequences of the careless or reckless uses of dangerous machinery and chemicals.

Today every driver is familiar with the notion of “negligent” driving—driving a vehicle without ordinary and reasonable care. There is also a heightened guilty state of mind—“recklessness”—which means little or no care and implies greater liability. If driving negligently or recklessly injures or kills someone, the driver can be legally liable for civil damages and, in cases of death, criminally liable for manslaughter. Similar law applies to non-intentional industrial accidents and environmental damage from such things as oil spills.

Does the same analysis apply to unintentional treason, sedition or insurrection, or to disinformation that aids a foreign power or exacerbates a pandemic? Should it?

In the UK, the answers might well be “yes.” Unlike the United States, it has an unwritten Constitution and has not reduced almost all of its law to written statutes. Instead, the UK’s law more often operates on the old English system of “common law.”

When a legal question arises that has never come up before, Britain’s courts, up to its Supreme Court (which in 2009 assumed the judicial role of the “Law Lords” in the House of Lords), rely on wise judges to determine what makes sense and where justice and the public interest lie. The UK’s common-law judges would likely not allow a profit motive to excuse acts that put the entire state at risk—whether from treason, foreign disinformation or misleading claims about a pandemic. After all, such acts may ultimately harm the very people making the judgment.

Whether US courts, including our Supreme Court, would make such judgments without statutory authority is an open question. But enacting a statute to that effect is something Congress should consider.

The notion that private firms can help incite and enable violent insurrection, aid a foreign enemy, or aggravate a pandemic, and yet escape all liability, is unlikely to promote the survival of democracy. And just as the industrial age increased the risk of inadvertent injury and death in accidents, the information age is self-evidently increasing the risk of unintended social and political consequences, including widespread social destabilization.

One thing is almost certain. Even the notorious Section 230(c)(1) of the Communications Decency Act, which wiped out the law of defamation for Internet platforms, does not give them a free pass for all the harm they cause. Specifically, it does not excuse treason, sedition or insurrection, or complicity in or aiding them. Nor does it excuse inadvertently aiding foreign disinformation campaigns or the spread of a pandemic. Its one-sentence midnight amendment only precludes a platform from being “treated as the publisher or speaker of any information provided by another information content provider.”

This language comes directly from the law of defamation. In enacting it, Congress never imagined anything like the Capitol Insurrection, Putin’s disinformation campaign, the Covid-19 pandemic, or how Facebook and other Internet platforms might help cause or exacerbate them. Therefore § 230(c)(1), as it stands, cannot preclude liability for a platform that arranges its business to foment extremism, outrage and conflict for profit, and thereby helps enable these and similar disasters.

A case of negligent or reckless enabling would proceed much like a case involving an automobile or industrial accident or an oil spill. Experts would testify on what level of “reasonable care” a platform should have taken to reduce the risk of the insurrection, sedition or treason that ensued, or of aggravating the effects of foreign disinformation campaigns or the pandemic. Employees of the platform, who saw firsthand what transpired on it, could be called to testify.

If reports of employees’ pressure and protests to rid Facebook of lies, hate and conflict are accurate, Facebook might be hard pressed to avoid liability. Thus patriotic employees on the front lines could serve as a check on the unintended social and political consequences of corporate greed.

As McNamee suspects, the prospect of this actually happening may have been a significant factor in Facebook’s deplatforming of Trump after the Capitol Insurrection. Similar analysis applies to other unintended consequences of letting platforms foment discord for profit, including riots, rumbles and hate crimes.

So if the Capitol Insurrection chilled your heart, you needn’t consider Zuckerberg, Dorsey, and their ilk, or their respective firms, entirely innocent. As a matter of common-sense cause and effect, the way they ran their businesses helped incite the insurrectionists and helped them organize their treason. Similar logic applies to the spread of disinformation in Putin’s campaign and about the pandemic.

Imposing legal liability on corporations—and personal liability on their controlling persons—is a tried-and-true way of discouraging non-deliberate wrongdoing, including accidental environmental pollution. If we want to discourage Internet platforms from polluting our society with hate, lies and disinformation, imposing legal liability for negligent and reckless acts is something to consider.

The very fact that Facebook deplatformed Trump after the Insurrection shows that imposing legal liability could work. Like traffic and industrial accidents and environmental disasters, the terrible unintended consequences of Internet platforms’ businesses could wane, if only we held them legally accountable to a reasonable standard of care. Negligent and reckless automobile and industrial accidents, including pollution “spills,” don’t discourage themselves: the laws of corporate and personal liability do.
