Meta said its decision to shut down CrowdTangle was part of a broader effort to build more internal monitoring tools. The company has been under pressure to do more to curb the spread of misinformation on its platforms, particularly after the 2016 U.S. presidential election, when Russian operatives used Facebook to spread false information and sow discord. Meta, like other social media companies, has also been criticized for the role its algorithms play in amplifying harmful content, such as vaccine misinformation. The company said in a statement that it remains committed to providing transparency tools and is working to improve the Meta Content Library based on feedback from researchers.
Many researchers, however, argue that the new tool falls short of what CrowdTangle offered. They say it is not as user-friendly and does not provide the same level of insight into how information spreads on Facebook and Instagram. News organizations have also raised concerns that they will no longer have access to data that was crucial for monitoring trends and identifying potential misinformation. Some worry that Meta’s decision to shut down CrowdTangle is a step backward for transparency and accountability on social media platforms.
In response to the shutdown, researchers, watchdog organizations, and journalists have called on Meta to reconsider. They argue that losing access to CrowdTangle hampers their ability to monitor harmful content, such as hate speech, disinformation, and voter suppression, and makes it harder to hold social media platforms accountable for what appears on their sites. The backlash has been strongest among those in the research and journalism communities who relied on the tool to track and analyze social media trends.
Meta’s decision comes at a time of growing scrutiny of the company’s role in shaping public discourse. Meta has faced criticism for its handling of misinformation, political ads, and hate speech on its platforms, and critics say it needs to do more to prevent the spread of harmful content and to be more transparent about its content moderation practices. Shutting down CrowdTangle, they argue, casts doubt on the company’s commitment to accountability and transparency just as that pressure is intensifying.
As researchers, watchdog organizations, and journalists continue to push back, Meta faces mounting pressure to provide alternative tools that meet the needs of those who relied on CrowdTangle. Although the company says it is improving the Meta Content Library in response to researcher feedback, many remain skeptical that the new tool can fill the void. The shutdown has underscored how difficult it is for outside observers to monitor social media platforms and hold them accountable for the content that appears on their sites.