
Meta’s ‘friendly’ Threads collides with unfriendly internet

On July 7, Mark Zuckerberg introduced Meta’s new app, Threads, positioning it as a welcoming platform for online public discourse in contrast to Twitter, which is owned by Elon Musk. Despite this idealistic vision, maintaining a friendly atmosphere on Threads may prove challenging. Meta Platforms, the parent company of Facebook and Instagram, plans to enforce the same rules on Threads as its other platforms. However, Threads’ integration with social media services like Mastodon and its appeal to news enthusiasts and political figures may present new obstacles for Meta.

Unlike Meta’s existing apps, Threads will not be covered by the company’s fact-checking program, according to spokesperson Christine Pai. However, posts labeled as false by fact-checking partners on Facebook or Instagram will retain those labels when shared on Threads. Meta declined to comment on the reasons for this divergence. Adam Mosseri, head of Instagram, acknowledged that Threads is more conducive to public discourse than Meta’s other apps but said it would aim to focus on lighter topics such as sports, music, fashion, and design.

Nevertheless, Meta’s efforts to distance itself from controversy were tested almost immediately. Shortly after launch, Threads accounts shared content about the Illuminati and “billionaire satanists,” and users traded heated arguments on subjects ranging from gender identity to violence in the West Bank. Conservative figures, including Donald Trump’s son, accused Meta of censorship after warning labels appeared on their accounts; a Meta spokesperson said the labels were an error.

Further content moderation challenges will arise once Meta connects Threads to the fediverse, enabling Threads users to communicate with users on servers Meta does not control. Meta’s spokesperson said Instagram’s rules would apply to those users as well, but online media experts said that how Meta handles these interactions will be decisive. Alex Stamos, director of the Stanford Internet Observatory, warned that without access to back-end user data, Meta would struggle to enforce its moderation policies, particularly against spammers, troll farms, and abusers.

Solomon Messing of the Center for Social Media and Politics at New York University pointed to the complexities surrounding illegal content such as child exploitation, nonconsensual sexual imagery, and arms sales: it remains unclear what responsibility Meta would bear for such material when it indexes content from other servers. Stamos wrote that he expected Meta to limit the visibility of fediverse servers with abusive accounts and to impose stricter penalties on those sharing illegal content.
