Opinion
When federal officials persistently pressured social media platforms to delete or downgrade posts those officials did not like, a government lawyer told the Supreme Court on Monday, they were merely offering “information” and “advice” to their “partners” in fighting “misinformation.”
If the justices accept that characterization, they will be blessing clandestine government censorship of online speech.
The case, Murthy v. Missouri, pits two states and five social media users against federal officials who strongly, repeatedly, and angrily demanded that Facebook et al. crack down on speech the government viewed as dangerous to public health, democracy, or national security. Some of this “exhortation,” as U.S. Deputy Solicitor General Brian Fletcher described it, happened in public, as when President Joe Biden accused the platforms of “killing people” by allowing users to say things he believed would discourage Americans from being vaccinated against COVID-19.
Surgeon General Vivek Murthy, who echoed that charge in more polite terms, urged a “whole-of-society” effort to combat the “urgent threat to public health” posed by “health misinformation,” which he said might include “legal and regulatory measures.” Other federal officials said holding social media platforms “accountable” could entail antitrust action, new regulations, or expansion of their civil liability for user-posted content.
Those public threats were coupled with private communications that came to light only thanks to discovery in this case.
As Louisiana Solicitor General J. Benjamin Aguinaga noted on Monday, officials such as Deputy Assistant to the President Rob Flaherty “badger[ed] the platforms 24/7,” demanding that they broaden their content restrictions and enforce them more aggressively.
Those emails alluded to presidential displeasure and warned that White House officials were “considering our options on what to do” if the platforms failed to fall in line. The platforms responded by changing their policies and practices.
Facebook executive Nick Clegg was eager to appease the president.
In emails to Murthy, he noted that Facebook had “adjust[ed] policies on what we’re removing”; had deleted pages, groups, and accounts that offended the White House; and would “shortly be expanding our COVID policies to further reduce the spread of potentially harmful content.”
Facebook took those steps, Clegg said in another internal email that Aguinaga quoted, “because we were under pressure by the administration.” Clegg expressed regret about caving to that pressure, saying, “We shouldn’t have done it.”
According to Fletcher, none of this implicated the First Amendment, because “no threats happened.” He meant that federal officials never explicitly threatened platforms with “adverse government action” while urging suppression of constitutionally protected speech.
That position is hard to reconcile with the Supreme Court’s 1963 decision in Bantam Books v. Sullivan. In that case, the Court held that Rhode Island’s Commission to Encourage Morality in Youth had violated the First Amendment by pressuring book distributors to drop titles it deemed objectionable.
Notably, the commission itself had no enforcement authority, and at least some of the books it flagged did not meet the Supreme Court’s test for obscenity, meaning the distributors were not violating any law by selling them. The Court nevertheless concluded that the commission’s communications, which ostensibly sought voluntary “cooperation” but were “phrased virtually as orders,” were unconstitutional because they aimed to suppress disfavored speech and had that predictable result.
The Biden administration’s social media meddling bears a strong resemblance to that situation. But Fletcher argued that federal officials were simply using “the bully pulpit” to persuade platforms that they had a “responsibility” to curtail dangerous speech.
“Pressuring platforms in back rooms shielded from public view is not using the bully pulpit at all,” Aguinaga noted. “That’s just being a bully.”
Free Press, an inaptly named organization that aims to promote “positive social change, racial justice and meaningful engagement in public life,” warns that a ruling against the government “could allow social-media platforms to leave up misinformation.” In other words, a ruling for the government would empower it to define “misinformation” and require its removal — something the First Amendment plainly forbids.
About Jacob Sullum
Jacob Sullum is a senior editor at Reason magazine. Follow him on Twitter: @JacobSullum. During two decades in journalism, he has relentlessly skewered authoritarians of the left and the right, making the case for shrinking the realm of politics and expanding the realm of individual choice. Jacob's work appears here at AmmoLand News through a license with Creators Syndicate.