
According to recent research conducted by Global Witness, social media platforms TikTok and X have exhibited significant far-right political bias in Germany, ahead of the country’s federal election on Sunday. The study analyzed the content displayed to new users through algorithmically sorted “For You” feeds, revealing that both platforms skewed heavily towards amplifying content in favor of the far-right AfD party.

The non-governmental organization (NGO) created test accounts on the platforms, following the accounts of major political parties and their leaders, to investigate the algorithmic bias. The results showed that 78% of the political content recommended to TikTok test accounts was supportive of the AfD party, exceeding the party’s current polling numbers. On X, 64% of the recommended content favored the AfD.

Global Witness’ findings also indicated that non-partisan users in Germany were exposed to right-leaning content more than twice as much as left-leaning content in the lead-up to the election. TikTok displayed the greatest right-wing bias, with 74% of its content leaning right, followed closely by X at 72%. Meta’s Instagram was also found to lean right, although to a lesser extent, with 59% of its content being right-wing.

Testing “For You” for Political Bias

To run the tests, Global Witness set up fresh, non-partisan accounts on TikTok, X, and Instagram, each following the major political parties and their leaders. The researchers then engaged with the recommended content, watching videos and scrolling through threads, to mimic the behavior of a politically neutral user. Across all three platforms, the content surfaced in the "For You" feeds showed a substantial right-wing skew.
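To make the methodology concrete, the sketch below shows how feed-level percentages like those reported could be derived once each recommended post has been manually labelled by political lean. This is a hypothetical illustration, not Global Witness' actual analysis pipeline; the function name, labels, and data are invented.

```python
# Minimal, hypothetical sketch: compute each party label's share of the
# politically classifiable posts in one test account's recommendations.
# Not Global Witness' actual pipeline; all labels and data are invented.
from collections import Counter


def partisan_shares(labels: list[str]) -> dict[str, float]:
    """Return each party label's share of the politically classifiable posts."""
    political = [label for label in labels if label != "non-political"]
    counts = Counter(political)
    total = len(political)
    return {label: count / total for label, count in counts.items()}


# Hypothetical hand-labelled "For You" recommendations for one test account.
feed_labels = ["afd", "afd", "spd", "afd", "non-political", "cdu", "afd"]
print(partisan_shares(feed_labels))
# -> {'afd': 0.666..., 'spd': 0.166..., 'cdu': 0.166...}
```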

Ellen Judson, a senior campaigner at Global Witness, expressed concern about the lack of transparency in social media algorithms, stating, “One of our main concerns is that we don’t really know why we were suggested the particular content that we were.” She attributed the bias to the algorithms’ design, which prioritizes engagement over democratic objectives.

Global Witness’ findings are consistent with previous research on social media bias in the US, Ireland, and Romania. The organization has shared its results with EU officials, who are responsible for enforcing algorithmic accountability rules on large platforms.

The European Union’s Digital Services Act (DSA) aims to improve transparency and accountability in social media algorithms. However, some elements of the regulation, such as Article 40, which enables vetted researchers to access non-public platform data, have yet to be fully implemented.

Toward Algorithmic Transparency?

Judson emphasized the importance of transparency in social media algorithms, stating, “I think the transparency point is really important.” She hopes that the EU will investigate the alleged bias and take action to address it. The DSA empowers enforcers to levy penalties of up to 6% of global annual turnover for infringements and temporarily block access to violating platforms.

The EU has already opened investigations into the three social media firms implicated in the Global Witness research. Enforcement of the regulation has yet to deliver swift results, however, and the EU is also wary of being accused of crimping freedom of expression. Global Witness and other civil society organizations are monitoring the situation closely, awaiting full implementation of the DSA’s provisions and the EU’s response to the alleged bias.

The outcome of this investigation and the EU’s enforcement of the DSA will have significant implications for social media platforms and their role in democratic processes. As Judson noted, “We’re asking the Commission to investigate whether there is political bias. [The platforms] say that there isn’t. We found evidence that there may be. So we’re hoping that the Commission would use its increased information-gathering powers to establish whether that’s the case, and … address that if it is.”

