Debates over how best to regulate social media have raged for years without resolution, but recent events could accelerate discussions of improving governance, according to a consultant who has worked for some of the top leaders in Silicon Valley and on the world diplomatic stage.
"Looking at more democratic forms of decision-making and consultation, as well as simply thinking about what future governance models might look like for the internet generally, that's absolutely something that decision-makers in the industry and government have been talking about for a long time," Dex Torricke-Barton said on a panel discussion of tech ethics at Tech Conference 24 at Harvard Business School on Sunday.
"Those discussions are naturally heating up now just because we realize there are crises and challenges that are moving very, very quickly. They are going to have huge social impacts," added Torricke-Barton, a director at Brunswick Group, an international consultancy based in London.
Indeed, Sunday's session on tech ethics and regulating social media took place as national attention was focused on the social media footprints of Cesar Sayoc, who has been accused of mailing more than a dozen bombs to critics of President Donald Trump, and Robert Bowers, who has been accused of murdering 11 congregants of the Tree of Life synagogue in Pittsburgh. The recent collision between social media and terrorism has accelerated the debate on how to police social media.
Torricke-Barton is deeply familiar with political debates on tech regulation, having worked as a speechwriter for former United Nations Secretary-General Ban Ki-moon, former Google Executive Chairman Eric Schmidt and Facebook CEO Mark Zuckerberg, as well as running communications for Elon Musk at SpaceX.
Rather than favoring an independent adjudicatory body, along the lines of what Zuckerberg floated in an interview last spring, Torricke-Barton said he would prefer to see a group empowered to actually write the rules for social media.
"I thought the institution that makes more sense for Facebook and for social media platforms isn't a court. It's a parliament," Torricke-Barton said.
What does a governing body to regulate social media look like?
David Ryan Polgar, an ethicist and founder of All Tech is Human, based in West Hartford, Conn., said he anticipates a forthcoming change in governance and the overall regulation of social media.
"The future is probably going to entail some type of quasi-governmental system that we can't imagine right now that maybe gets influenced a little bit by what's going on in Europe," Polgar said, adding that the present state is untenable.
Social media companies do not want to take on the type of responsibilities that publishers bear, and they would be happy to relinquish some control over content on their platforms, according to Polgar.
"Tech companies have all this power to be this arbiter of truth, but they don't even want this power," Polgar said.
Under the current model, social media's incentives seem aligned to favor the most outrageous content, which detracts from a healthy political discussion, according to Jake Shapiro, co-founder of RadioPublic, who was also on the three-person panel.
"The ad model and the algorithm seems to be skewed toward a certain kind of psychology that will conjure forth the things that are most attention-getting, which seem to be at the polar opposites of a healthy dialogue around political discourse," Shapiro said.
Governance versus innovation
Still, regulating social media and other forms of digital communications poses a serious threat to the business model of many of the world's biggest companies -- and the world's biggest industry.
The data generated by our digital communications has informed targeted advertising for everything from political campaigns to diaper sales, and the relationship between ads and data collection undergirds the online economy. The global advertising industry is a juggernaut, with unparalleled ability to finance innovations like social media, according to Torricke-Barton.
"Advertising is basically the largest industry in the world. The global advertising industry is over $500 billion every year. It is the one source of funding that is able to afford the rise of global platforms that have connected billions of people around the world," Torricke-Barton said. "There is absolutely no other source of funding that is able to build out these things to that extent."
That point resonated with Nancy Wang, a student at Harvard Business School, who worries that if more of the public begins prioritizing privacy and the regulation of social media over access to ad-supported platforms, it could reduce the capital fueling innovation.
"The reason why we have seen such exponential growth in innovation is because so much of it is driven by ad dollars," Wang said after the panel discussion.
Latoya Peterson, founder of the blog Racialicious, an advisory board member of the Data & Society research institute and a keynote speaker at Sunday's event, said the United States Congress has shown itself to be severely lacking in its collective understanding of the principles that underpin technological innovation, like Facebook's free-to-users, ad-supported platform.
While she agreed the European Union has done a better job than Congress of protecting data privacy, she argued that tasking a quasi-governmental body with regulation would present other quandaries.
"The problem with the quasi-governmental stuff is, who's watching the watchmen?" Peterson said after her talk, pointing to the Unicode Consortium, which regulates emoji, as an example of a quasi-governmental group that is "still full of drama and problems."
Tech ethics in the spotlight
Discussions of a potential outside authority for regulating social media platforms have spilled into the public in recent months.
In an interview with Vox published in April, Zuckerberg considered the idea of an independent decision-making body to regulate speech on his platform.
"You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don't work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world," Zuckerberg said.
At HUBweek in Boston in October, Microsoft's president and chief legal officer, Brad Smith, recommended "a global standard for ethical principles and for the protection of things like privacy so that the price for global admission is adherence to a global standard."
Expanding the debate
No matter the form of the regulatory body, Torricke-Barton argued that adding more voices to the debate on regulating social media and other technologies would help leaven discussions that are now often dominated by the most ardent advocates on either side.
"I do think democratizing some element of the community management is really important, because a lot of the debates we have are still being driven either by elites or by a very hardcore group of users who have very distinct views on the how the community should be managed," Torricke-Barton said.
"We have no idea if that's actually what the views of most of the community are," he added. "You often hear from people who are either very, very outraged, or very, very defensive of the status quo. I want to hear from the folks in the middle."