Digital Rights Groups Also Call for More Social Media Transparency

May 9th, 2018 3:08 PM

Several digital rights organizations have banded together to urge social media sites to be more transparent. The Electronic Frontier Foundation (EFF), the ACLU Foundation of Northern California, the Center for Democracy and Technology, and other groups produced detailed guidelines for companies like Facebook and Twitter.

The guidelines are similar to the ones released by the Media Research Center last week, which also called for greater transparency from social media platforms, such as Facebook, founded by Mark Zuckerberg. More than 60 conservative leaders, including Media Research Center president L. Brent Bozell III and Rep. Lamar Smith (R-Texas), signed on to support that statement.

The Santa Clara Principles on Transparency and Accountability in Content Moderation, which came out of the Content Moderation at Scale conferences in Santa Clara, Calif. and Washington D.C., set up “minimum levels of transparency and accountability” in order to “serve as the basis for a more in-depth dialogue” with social media giants.

According to an outline provided by the Center for Democracy and Technology, the Santa Clara Principles focus on providing numbers for removals, giving notice of content removals, and allowing for appeals.

The principles state that “Companies should publish the number of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines,” that “Companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension,” and that “Companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.”

Each section has its own list of minimal details that social media companies should provide to users whose content has been removed or whose accounts have been suspended. For instance, for the notice section, the Santa Clara Principles call for social media companies to provide the “specific clause of the guidelines” that content was found to violate, the way the content was found (whether it was “flagged by other users, governments, trusted flaggers”), and information about the appeals process.

In a statement on the principles, EFF senior staff attorney Nate Cardozo said, “Our goal is to ensure that enforcement of content guidelines is fair, transparent, proportional, and respectful of users’ rights.”

EFF director for international freedom of expression Jillian C. York said, “Users deserve more transparency and greater accountability from platforms that play an outsized role — in Myanmar, Australia, Europe, and China, as well as in marginalized communities in the U.S. and elsewhere — in deciding what can be said on the Internet.”

“Users need to know why some language is allowed and the same language in a different post isn’t,” York continued. “They also deserve to know how their posts were flagged — did a government flag it, was it flagged by the company itself? And we all deserve a chance to appeal decisions to block speech.”