Recommendation algorithms run by social media giants TikTok and X show significant far-right political bias in Germany ahead of Sunday’s federal election, according to new research by Global Witness.
The non-governmental organization (NGO) examined the content served to new users through the platforms’ algorithmically sorted “For You” feeds and found that both were significantly skewed toward amplifying content supporting the far-right AfD party.
Global Witness’ tests found the most extreme bias on TikTok, where 78% of the political content algorithmically recommended to its test accounts, coming from accounts the test users did not follow, supported the AfD. (The NGO points out that this figure is far higher than the party’s current polling support, which stands at around 20% of German voters.)
According to Global Witness, 64% of the suggested political content on X was pro-AfD.
The NGO also tested for general left- or right-leaning bias in the platforms’ algorithmic recommendations. Its results indicate that, in the run-up to Germany’s federal elections, non-partisan social media users are exposed to more than twice as much right-leaning content as left-leaning content.
TikTok again showed the strongest right-wing tilt, with right-leaning material accounting for 74% of recommendations, though X was not far behind at 72%.
Meta’s Instagram was also found to lean right across the three tests the NGO conducted, though to a lesser degree, with 59% of the political content being right-wing.