Filter Bubbles and the Unfeeling: How AI for Social Media Can Foster Extremism and Polarization
Ermelinda Rodilosso

Philosophy & Technology, Journal Year: 2024, Volume and Issue: 37(2)

Published: June 1, 2024

Abstract Social media have undoubtedly changed our ways of living. Their presence concerns an increasing number of users (over 4.74 billion) and pervasively expands into the most diverse areas of human life. Marketing, education, news, data, and sociality are just a few of the many areas in which social media now play a central role. Recently, some attention toward the link between social media and political participation has emerged. Works in the field of artificial intelligence have already pointed out that there is a close link between the use of machine learning algorithms and possible epistemic isolation, which could lead to radicalization. The idea supporting this paper is that artificial intelligence for social media can actively put users' deliberative capacity at risk and foster extremism. To prove these claims, I proceed along two lines of inquiry. First, I focus on filter bubbles, namely the result of selections made by algorithms that recommend contents that meet users' expectations and opinions. To analyze this phenomenon, I refer to the Deweyan model of experience. Second, I connect the filter bubbles problem to participatory democracy and Nussbaum's concept of compassion. The purpose is to provide a philosophical foundation that can both (1) effectively serve as a method for analyzing these algorithms and their potential problems in relation to extremism, and (2) be adopted as a standard to counter the danger of extremism associated with artificial intelligence for social media.
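The abstract describes filter bubbles as the outcome of algorithms that recommend only content matching a user's existing expectations and opinions. The toy sketch below is a hypothetical illustration of that feedback loop, not the paper's own method or any platform's actual recommender: item names, topic tags, and the simple topic-count ranking are all assumptions made for illustration.

```python
# A minimal, hypothetical sketch of how a preference-matching recommender
# can narrow exposure over time (illustrative only; not the paper's model).
from collections import Counter

# Toy catalogue of items, each tagged with a single leaning as its topic.
CATALOGUE = (
    [f"left_{i}" for i in range(50)]
    + [f"centre_{i}" for i in range(50)]
    + [f"right_{i}" for i in range(50)]
)

def topic(item: str) -> str:
    """Extract the topic tag from an item name."""
    return item.split("_")[0]

def recommend(history: list[str], k: int = 5) -> list[str]:
    """Recommend the k unseen items whose topic best matches past clicks."""
    seen = set(history)
    counts = Counter(topic(item) for item in history)
    # Rank unseen items by how often the user engaged with their topic.
    ranked = sorted(
        (item for item in CATALOGUE if item not in seen),
        key=lambda item: counts.get(topic(item), 0),
        reverse=True,
    )
    return ranked[:k]

def simulate(rounds: int = 10) -> None:
    """Start from a single click and let the feedback loop run."""
    history = ["left_0"]  # one initial engagement seeds the profile
    for r in range(rounds):
        recs = recommend(history)
        history.extend(recs)  # assume the user consumes what is recommended
        diversity = Counter(topic(item) for item in recs)
        print(f"round {r + 1}: recommended topics = {dict(diversity)}")

if __name__ == "__main__":
    simulate()
```

Running the simulation shows every round recommending items from the single topic the user first engaged with, which is the epistemic narrowing the abstract associates with filter bubbles.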

Language: English

Citations

9