
YouTube’s recommendation algorithm is more likely to suggest conservative videos regardless of users’ political beliefs, The Register reports.
Researchers at the New York University Center for Social Media and Policy asked 1,063 American adults to install a browser extension that tracks their video viewing experience.
As a starting point, the team selected 25 videos, both political and non-political. Participants were asked to pick one of them and then follow YouTube’s subsequent recommendations. After each video, the viewer had to choose one of the five suggested videos; for each participant, the team randomly assigned a fixed recommendation slot to click every time. The study ran from October to December 2022, with each participant stepping through the suggested videos on YouTube 20 times daily.
The extension recorded which videos the service recommended at each step. The team then assessed the ideological slant of each video to measure the impact of echo chambers and any hidden biases in the system.
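The protocol described above — each participant repeatedly clicking one fixed recommendation slot while the slant of each viewed video is scored — can be sketched as a toy simulation. Everything below is hypothetical: the ideology scores are synthetic, and none of this reflects the team’s actual data or code; it only illustrates how average drift could be measured from such walks.

```python
import random

def simulate_walk(steps=20, slot=2, seed=0):
    """Simulate one participant's walk through recommendations.

    Ideology scores are synthetic: -1 = far left, +1 = far right.
    At each step the platform offers five recommendations scattered
    around the current video, and the participant always clicks the
    slot they were assigned (as in the study's fixed-slot design).
    """
    rng = random.Random(seed)
    ideology = 0.0  # start from a neutral video
    trajectory = [ideology]
    for _ in range(steps):
        # Five offered recommendations, sorted left-to-right by slant.
        offers = sorted(ideology + rng.gauss(0.0, 0.2) for _ in range(5))
        ideology = offers[slot]  # fixed slot position for this participant
        trajectory.append(ideology)
    return trajectory

def mean_drift(n_participants=200, steps=20):
    """Average end-minus-start ideology score across simulated walks."""
    total = 0.0
    for i in range(n_participants):
        slot = i % 5  # each participant keeps one fixed slot throughout
        traj = simulate_walk(steps=steps, slot=slot, seed=i)
        total += traj[-1] - traj[0]
    return total / n_participants
```

In this framing, a positive `mean_drift` over many participants would correspond to the rightward nudge the researchers report, while per-slot trajectories would show the narrowing ideological range.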
“We found that the YouTube algorithm does not lead the vast majority of users down extremist rabbit holes, although it pushes users into increasingly narrow ideological content ranges,” the study says.
They found that, on average, the recommender system nudges viewers slightly to the right of the political spectrum, regardless of their ideological beliefs.
“We believe this is a new discovery,” the researchers said.
The team also found that the system encourages users to watch more right-wing or left-wing media, depending on their starting point, and that the ideological intensity of the recommendations increased as the walks progressed. For example, when a participant started with moderately liberal material, the recommendations shifted further left over time, though only slightly and gradually.
A similar effect was previously observed on Twitter, whose recommendation algorithm tends to promote posts from right-wing politicians and news outlets more than those from the left. Political scientists at New York University have suggested that this may be due to the provocative nature of conservative content, which tends to drive more engagement.
Recall that in September, Mozilla researchers found that users’ feedback actions on YouTube videos have little effect on the behavior of its recommender algorithms.