Elon Musk Buys Social Network Twitter. Source: Chesnot / Getty

Before the big news broke about billionaire Elon Musk buying Twitter, civil-society groups called for big tech to address ongoing issues in platform accountability by fixing the feed ahead of the upcoming elections. A coalition of 60 groups, Change the Terms, wants big tech to “fix the feed” and stop amplifying hate and lies. 

The “fix the feed” demand consists of three simple steps:

– Fix the Algorithm: Stop promoting the most incendiary, hateful content

– Protect People Equally: Staff up to protect democracy for all, across all languages

– Show Us the Receipts: Disclose your business models and moderation practices

Given his personal use of Twitter and his objection to content moderation, Musk could usher in changes that undo basic measures for addressing disinformation and hate online. And with an election season ramping up and most state primaries getting underway, the unchecked amplification of hate speech and disinformation could have profound implications for yet another election.

Like many in the “say whatever you want without consequences” camp, Musk treats content moderation and attempts to curtail hate speech and disinformation as infringements on his First Amendment rights. But that’s not how free speech works: content moderation in a non-governmental space isn’t a First Amendment issue.

“As an initial matter, the First Amendment does not apply to the policies of a private company; it only applies to actions taken by a U.S., state, or local government,” explains Change the Terms. “We carefully wrote the definition of hateful activity to cover types of speech that courts have said are not protected as free speech: incitement, violence, intimidation, harassment, threats, and defamation.” 

The Change the Terms coalition and others like them aren’t trying to take away people’s ability to express themselves. Claiming censorship when you are promoting genocide, violence targeted at particular groups, or other speech the courts have deemed unprotected mischaracterizes what free expression actually covers.

The Change the Terms coalition includes groups like the Southern Poverty Law Center, Free Press, Color of Change, the Center for American Progress, the National Urban League, Common Cause and Kairos. Given the power and importance of digital spaces like Twitter in how the public consumes information and news, terms of service that reinforce social good matter.

Being a private entity doesn’t mean a corporate space can do whatever it pleases, particularly when, as large social media sites have shown, it has a direct impact on elections and on democracy itself. Yosef Getachew, director of the Media and Democracy Program at Common Cause, described the problem as one of “prioritizing profits over the public good.”

“Social-media platforms have been complicit in the spread of disinformation and other harmful content that has suppressed votes and sparked real-world violence,” Getachew said in a statement. “Their actions have allowed high-profile disinformation spreaders and other bad actors to continue using social media to spread content designed to undermine trust in our elections. With midterm elections fast approaching, platforms must adopt these safety protocols, including the robust and consistent enforcement of their civic-integrity policies 365 days a year.” 

In an interview with NewsOne, Jelani Drew-Davi, director of campaigns for Kairos, mentioned there was reason to be concerned about the future of Twitter and content moderation.

“Elon Musk, from two weeks ago, and from even before that, has been very clear about the ways in which he sees content moderation and wants Twitter specifically to be a different place,” Drew-Davi said. “It’s really clear that he is intending to roll back content moderation policies, under the veil and guise of free speech, which ultimately will just lead to more disinformation and more hateful content on the platform that does affect Black and Brown people, LGBT folks and women more often and deeper than it does anybody else.”

Requests for social media giants to change their terms of service and community standards are not new. And while some people may say social media, particularly Twitter, isn’t the real world, these sites can have a real-world impact as words and plans move from online to offline action.

“When things start to pop off, it’s not just online. It has a real impact on people offline as well,” Drew-Davi said. “There have also been calls to deplatform white supremacists off of Twitter for spreading hateful content. But also this information is truly making the experience for users, especially Black and Brown users, one that is not safe both in words and in character count but also when it comes to offline action.”

Drew-Davi said the clearest example of online hate moving into action was the violence on the ground in Charlottesville and the murder of Heather Heyer in 2017. As previously reported by Recode, several platforms subsequently updated their terms of service and community standards to prohibit known white nationalist groups from using their spaces.

“I think about the content moderation changes in demand that groups have been asking of Twitter over the years,” they began. “It’s mostly been around making sure that hateful content is suppressed and disinformation, like completely wrong, false information is also suppressed on the platform.”

Social media companies are essentially free to do as they please, since they do not fall under the express purview of government regulators. But the combination of disinformation and hate speech has moved people to take actions that directly attack the democratic institutions they claim to respect.

As the nation witnessed on Jan. 6, 2021, allowing unchecked disinformation and calls for violence can lead to dire in-person results. Drew-Davi said federal regulation could be one way to help curtail the misuse of platforms.

“Federal regulation of platforms is another bigger kind of piece of conversation where there should be comprehensive legislation that addresses things like online harassment, disinformation and biases of an algorithm,” Drew-Davi said. “And it does not currently exist. That is beyond Elon Musk via Twitter. It is also about a user experience, and a real-life people experience on online platforms.”

SEE ALSO:  

Elon Musk Buying Twitter: 5 Reasons Why Black People Should Be Wary

Group Demands Congressional Action In Regulating Racialized Disinformation On Online Platforms 

‘Lets Reel In The Mess’: Facebook Outage Renews Calls For Accountability As BlackPlanet Courts New Users 

Groups Want Big Tech To ‘Fix The Feed’  was originally published on newsone.com