By Steven Rosenfeld/Independent Media Institute
Will Facebook’s new political advertising rules tame extremists in American campaigns and elections, accomplishing something that neither Congress, federal regulators nor the judiciary has done?
Facebook’s new rules may be intended to take the company out of the crossfire sparked by Russian interference in the 2016 election via propaganda on its platform and by a GOP-connected political firm that stole 87 million user profiles.
But as more details emerge about its transparency rules, a striking realization is taking hold that turns widely held notions of American democracy upside down: a corporation, not the government, may be the only force in America that is both powerful enough and willing to restrain the darker elements of human nature from deploying online propaganda in campaigns and elections.
That a private corporation, not the government, may be the best hope for saving American democracy from itself sounds like a libertarian fantasy—government cannot act, so corporations must. But top legal minds have warned that this day is coming.
“The [Supreme] Court’s reluctance to allow the government to regulate false speech in the political arena could limit laws aimed at requiring social media sites to curb false political advertising,” Rick Hasen, an election scholar at University of California, Irvine School of Law, wrote last summer for a law journal.
“Non-governmental actors, rather than the courts and government, are in the best position to ameliorate some of the darker effects of cheap speech.”
Hasen, who curates the nation’s highest-profile election law blog, was predicting that the downward spiral of American political discourse might only be curbed by private referees of the proverbial public square—because the current judiciary won’t let Congress do it.
“Yes, you are reading me correctly: the First Amendment, as interpreted by the Supreme Court, may block the kind of legislation which would be effective to staunch foreign influence over our elections,” he said in an email. “And so ironically we may have to rely on the beneficence of large private corporations like Facebook to save American democracy.”
What is Facebook trying to do? It wants to bar anonymous people or groups from posting political ads on its platform, which became the most-used online platform for 2016’s political messaging. That rapid development is at the heart of numerous political upheavals, from Russian interference to a British political consulting firm allied with the GOP that stole the personal profiling data of 87 million Facebook users.
Facebook’s response, unveiled in announcements this winter and spring, is based on a simple observation and remedy. It now knows, via internal and external research, that the most extreme messaging came from people, groups, campaigns and foreigners who did not reveal or verify their true identities. Thus, its solution is to do what Republicans in Congress and libertarians on the Supreme Court have been averse to: force disclosure of who is behind political messaging, whether it is mild, muddy, torrid, vile or worse.
Facebook’s latest statements say it will verify and reveal who is paying for political ads—whether candidate- or issue-oriented—or it will not display them. It will not accept political ads from foreigners. It will show the sums spent on the ads, describe intended audiences, report how often ads were seen, and display and archive all of the ads that a person, group or page has posted on its platform. Facebook plans to implement these steps, some of which were pioneered in Canadian and European elections, by summer.
“To get authorized by Facebook, advertisers will need to confirm their identity and location,” announced Rob Goldman, VP, Ads, and Alex Himel, VP, Local & Pages. “Advertisers will be prohibited from running political ads — electoral or issue-based — until they are authorized. In addition, these ads will be clearly labeled in the top left corner as ‘Political Ad.’ Next to it we will show ‘paid for by’ information. We started testing the authorization process this week, and people will begin seeing the label and additional information in the US later this spring.”
“We’re also investing in artificial intelligence and adding more people to help find advertisers that should have gone through the authorization process but did not,” the executives continued. “We realize we won’t catch every ad that should be labeled, and we encourage anyone who sees an unlabeled political ad to report it. People can do this by tapping the three dots at the top right corner of the ad and selecting ‘Report Ad.’”
These steps, which political reformers have long called for, are a paradigm shift for the world of campaigns and election law.
Facebook is requiring people to identify themselves and stand by their words and messages, which is not a universal requirement under current law. It also is saying that the political police will be Facebook users, not government regulators, in effect crowd-sourcing enforcement. It is creating political speech standards for the Internet, which the last major federal campaign reform, 2002’s McCain-Feingold law, intentionally chose not to regulate. And Facebook will regulate issue ads at any time, whereas McCain-Feingold regulates those ads only within 60 days of an election.
Facebook’s steps may temper some of the underlying trends that have made politics more extreme in recent years, as campaigns and the public have increasingly used social media platforms—whose nature accelerates and magnifies rash communication. However, academics and voting rights advocates have many questions and concerns. Indeed, anyone who has spent time in the campaign reform sphere knows well-intended efforts are routinely evaded.
“The first big loophole is if we just allow Facebook to do this as a self-regulating process, we’re leaving wide open all other social media platforms, because even though Google and others are thinking about it, they’re not coming out with anything specific,” said Craig Holman, Public Citizen’s government affairs lobbyist. “It is just Facebook that’s proposed some sort of specific regulatory actions. We need to have a legislative framework or regulatory framework so we capture all the social media platforms.”
Holman said Facebook was the dominant player in today’s online political advertising, but that might change over time. He said recent congressional testimony by CEO Mark Zuckerberg and the new ad disclosure rules have prompted Congress to sideline legislation that would regulate online political advertising. The Honest Ads Act, a Senate bill, has not yet emerged from committee.
“Even though Facebook has endorsed the Honest Ads Act, they have this huge lobby presence—out of the blue,” Holman said. “They used to be non-existent a few years ago. And now they are spending a fortune lobbying Congress. And I know full well that they are going behind closed doors and saying, ‘Hey, we don’t need the Honest Ads Act, because we’ll do this ourselves.’ So I really believe it [self-regulation] is designed to undercut legislation.”
Holman said the content of Facebook’s proposals was “fairly good.” Facebook did not respond to a request for comment on whether it was lobbying against the Senate bill. But Holman’s concerns about self-regulation were not unique.
“Self-regulation is not the answer,” said Young Mie Kim, a University of Wisconsin-Madison professor and scholar-in-residence at the Washington-based Campaign Legal Center. Her research team, Project DATA, just released an analysis of paid ads on Facebook in the final six weeks of the 2016 campaign.
Kim’s team tracked all paid advertising on nearly 10,000 volunteers’ devices in the six weeks before the 2016 election. It found that a third of the paid advertising was political. Of the most divisive ads—content referring to abortion, LGBT issues, racial conflict, the alt-right, nationalism, terrorism, guns, etc.—80 percent were placed by “suspicious” groups, meaning groups that did not clearly or honestly identify themselves.
“The suspicious groups are groups that are by definition untrackable beyond a certain point,” Kim said. “Some groups use very generic and benign names, and are very similar to existing non-profits and known groups. There’s America First. The America First political action committee is not the [previous] group we identified. There’s groups with no information, no public footprint.”
Facebook said that it will use artificial intelligence—computers that are programmed to mimic the brain—and hire thousands of people to review political ads. But Kim is skeptical that those reviewers will know what to look for and will not be duped by sophisticated imposters, such as groups she has seen steal logos from legitimate nonprofits and falsify identities in other ways. (Facebook’s response is that it knows it will make mistakes, which is why it is asking users to report ads for internal review and assessment.)
“Facebook is making a significant step, but still lacks a precise understanding of how these unknown, unidentifiable actors took advantage of their platform in the first place,” Kim said. “For example, Facebook announced their plans to verify ‘top Pages.’ First of all, we don’t know what that means. But more importantly, my research shows these anonymous groups, with few exceptions, are not popular Pages. In other words, there were a number of suspicious groups or niche groups around a similar campaign. That means if Facebook verifies top Pages only, we might not be able to effectively combat foreign interference, for example.”
Irvine School of Law’s Hasen also was skeptical that Facebook could solve a serious problem facing American political life—the way partisans deliberately hide their identities when throwing mud.
“I remain skeptical that Facebook can actually design effective regulation of political ads which will let us know who is ultimately behind them,” he said in an email. “I’m even more skeptical that without government regulation, it will be possible for Facebook to self-police foreign interference in our elections via the site, or even if Facebook is serious about limiting such activity.”
While Hasen was referring to the fine-print details of subversive campaign tactics and disclosure requirements, Kim’s most striking comments concerned the big-picture question of whether democracy reforms can be privatized. Put another way, if the government cannot or won’t restore checks and balances, should private corporations controlling widely used media platforms step in?
“If we think about where we are heading toward, broadcast media viewership is declining. We see more and more new platforms,” she said. “The way these ads work is not just applied to Facebook. It’s just how the Internet works, and how algorithms work, and how digital ads work. The dark posts are not just a Facebook problem. Google search only serves ads to those who search the terms the advertisers bought… What if we have new platforms that have more sophisticated algorithms, and those algorithms will be determined by big money donors or Russians or foreign entities?”
“Self-regulation is not the solution,” Kim said. “Do we want to hand over our political system? Information is currency. And which information I will get, which information I will be informed about, will it be determined by a tech platform? For example, Facebook is going to determine what political ads mean—because we don’t have a clear definition of a political ad. Academics have debated what constitutes a political ad for decades.”
Facebook said it would partner with third parties to determine what issues fall under its political ad umbrella. While it did not respond to requests for comment, those parties appear to be media organizations that Facebook already works with to verify the accuracy and trustworthiness of news content.
But back to the big picture: have American democracy and the arms of constitutional government become so disabled that only actions by private corporations can reverse today’s outbreak of extremist online propaganda? The answer appears to be yes.
Facebook’s political ad transparency rules may elevate political discourse because the platform is indispensable for candidates, campaigns and voters. Whether other technology platforms and telecommunications giants follow remains to be seen.
But Facebook’s actions are sending an unmistakable larger message. Until today’s political culture and judiciary change, it looks like privatized custodians of democracy are best positioned to save it—or try to prevent human nature’s darker forces from rising again.
Steven Rosenfeld is a senior correspondent of the Independent Media Institute. He is the author of five books, including profiles of campaigns, voter suppression, voting rights guides and a WWII survival story currently being made into a film. His latest book is Democracy Betrayed: How Superdelegates, Redistricting, Party Insiders, and the Electoral College Rigged the 2016 Election (Hot Books, March 2018).