Freedom of Expression and Alternatives for Internet Governance: Prospects and Pitfalls
Emma Ricknell

Media and Communication, Year: 2020, Vol. 8(4), pp. 110–120

Published: Oct. 15, 2020

This article dives into the ongoing debate on how to address concerns of personal safety and respect online, as well as the consequences of exposure to polarizing and in various ways harmful information, while at the same time safeguarding the democratic essentials of freedom of expression and participation. It does so by examining the issue from a less common angle, namely who governs the Internet platforms where much of the toxic material appears. By applying a model of free speech regulation conceptualized by legal scholar Jack Balkin (2018a, 2018b), the article explores different theoretical future scenarios of governance involving three main players: governments, private companies, and speakers. The analysis finds that depending on which player is at the forefront, the outcomes from the standpoint of participation may be drastically different. While there is potential for a transformation that can enable more ownership, transparency, and agency for citizens as well as news media, some paths will instead place ever-increasing control over online speech in the hands of interests other than the users themselves.

Language: English

What is platform governance?
Robert Gorwa

Information Communication & Society, Year: 2019, Vol. 22(6), pp. 854–871

Published: Feb. 11, 2019

Following a host of high-profile scandals, the political influence of platform companies (the global corporations that operate online 'platforms' such as Facebook, WhatsApp, YouTube, and many other services) is slowly being re-evaluated. Amidst growing calls to regulate these companies and make them more democratically accountable, and as policy interventions are actively pursued in Europe and beyond, a better understanding is needed of how platform practices, policies, and affordances (in effect, how platforms govern) interact with the external forces trying to shape those practices and policies. Building on digital media and communication scholarship as well as governance literature from political science and international relations, the aim of this article is to map out an interdisciplinary research agenda for platform governance, a concept intended to capture the layers of relationships structuring interactions between key parties in today's platform society, including platform companies, users, advertisers, governments, and other political actors.

Language: English

Cited by

447

Reframing platform power
José van Dijck, David B. Nieborg, Thomas Poell

et al.

Internet Policy Review, Year: 2019, Vol. 8(2)

Published: June 30, 2019

This article addresses the problem of platform power by probing current regulatory frameworks' basic assumptions about how tech firms operate in digital ecosystems. Platform power is generally assessed in terms of economic markets in which individual corporate actors harness technological innovations to compete fairly, thereby maximising consumer welfare. We propose three paradigmatic shifts in the conceptualisation of platform power. First, we suggest expanding the notion of consumer welfare to citizen wellbeing, hence addressing a broader scope of platform services' beneficiaries. Second, we recommend considering platform companies as part of an integrated ecosystem, acknowledging its interrelational, dynamic structure. And third, we shift attention from level playing fields towards societal infrastructures where hierarchies and dependencies are built into their architecture. Reframing platform power may be a necessary condition for updating and integrating regulatory regimes and policy proposals.

Language: English

Cited by

210

Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas
Oliver L. Haimson, Daniel Delmonaco, Peipei Nie

et al.

Proceedings of the ACM on Human-Computer Interaction, Year: 2021, Vol. 5(CSCW2), pp. 1–35

Published: Oct. 13, 2021

Social media sites use content moderation to attempt to cultivate safe spaces with accurate information for their users. However, content moderation decisions may not be applied equally for all types of users, and can lead to disproportionate censorship related to people's genders, races, or political orientations. We conducted a mixed methods study involving qualitative and quantitative analysis of survey data to understand which social media users have content and accounts removed more frequently than others, what types of content and accounts are removed, and how removal experiences differ between groups. We found that three groups in our dataset experienced content and account removals more often than others: conservatives, transgender people, and Black people. However, the types of content removed from each group varied substantially. Conservative participants' removed content included content that was offensive or allegedly so, misinformation, Covid-related content, adult content, or hate speech. Transgender participants' content was often removed as adult content despite following site guidelines, was critical of a dominant group (e.g., men, white people), or specifically concerned transgender or queer issues. Black participants' removed content was frequently related to racial justice or racism. More broadly, conservative participants' removals often involved content deemed harmful according to site guidelines, while transgender and Black participants' removals often involved content expressing marginalized identities that followed site policies or fell into moderation gray areas. We discuss potential ways forward to make content moderation more equitable for marginalized users, such as embracing and designing specifically for such gray areas.

Language: English

Cited by

170

Algorithmic content moderation: Technical and political challenges in the automation of platform governance
Robert Gorwa, Reuben Binns, Christian Katzenbach

et al.

Published: March 11, 2020

As government pressure on major technology companies builds, both firms and legislators are searching for technical solutions to difficult platform governance puzzles such as hate speech and misinformation. Automated hash-matching and predictive machine learning tools – what we define here as algorithmic moderation systems – are increasingly being deployed to conduct content moderation at scale by major user-generated content platforms such as Facebook, YouTube, and Twitter. This article provides an accessible primer on how algorithmic moderation works; examines some of the existing automated tools used to handle copyright infringement, terrorism, and toxic speech; and identifies key political and ethical issues for these systems as reliance on them grows. Recent events suggest that algorithmic moderation has become necessary to manage growing public expectations for increased platform responsibility, safety, and security on the global stage; however, as we demonstrate, these systems remain opaque, unaccountable, and poorly understood. Despite the potential promise of algorithms or 'AI', we show that even 'well optimized' moderation systems could exacerbate, rather than relieve, many existing problems with content policy as enacted by platforms, for three main reasons: automated moderation threatens to (a) further increase opacity, making a famously non-transparent set of practices even more difficult to understand or audit, (b) further complicate outstanding issues of fairness and justice in large-scale sociotechnical systems, and (c) re-obscure the fundamentally political nature of speech decisions executed at scale.
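The hash-matching half of the systems this abstract describes can be illustrated with a minimal sketch. All names and the toy blocklist below are illustrative assumptions, not any platform's actual implementation; production systems typically use perceptual hashes (e.g., PhotoDNA-style fingerprints) rather than exact cryptographic digests, precisely because exact matching is brittle:

```python
import hashlib

# Toy blocklist of SHA-256 digests of known-violating files (illustrative only).
BLOCKLIST = {
    hashlib.sha256(b"known banned content").hexdigest(),
}

def is_flagged(upload: bytes) -> bool:
    """Exact hash-matching: flag an upload if its digest is on the blocklist.

    Any single-byte change produces a completely different digest and evades
    the match, which is why real deployments favor perceptual hashing that
    tolerates small edits such as re-encoding or cropping.
    """
    return hashlib.sha256(upload).hexdigest() in BLOCKLIST

# Usage: an exact copy is caught, a near-duplicate is not.
print(is_flagged(b"known banned content"))   # exact copy
print(is_flagged(b"known banned content!"))  # one byte appended
```

The brittleness visible in the second call is one concrete source of the opacity and fairness concerns the article raises: the system's behavior depends entirely on an unpublished blocklist and the hashing scheme chosen.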

Language: English

Cited by

111

Taking fundamental rights seriously in the Digital Services Act's platform liability regime
Giancarlo Frosio, Christophe Geiger

European Law Journal, Year: 2023, Vol. 29(1-2), pp. 31–77

Published: Jan. 1, 2023

Abstract: This article highlights how the EU fundamental rights framework should inform the liability regime of platforms foreseen in secondary law, in particular with regard to the reform of the E-commerce Directive by the Digital Services Act. In order to identify all possible tensions between the platform liability regime on one hand and fundamental rights on the other, and to contribute to a well-balanced and proportionate European legal instrument, this article addresses these potential conflicts from the standpoint of users (those who share content and those who access it), platforms, regulators, and other stakeholders involved. Section 2 delves into the intricate landscape of online intermediary liability, interrogating how the E-Commerce Directive and the emerging Digital Services Act grapple with the delicate equilibrium between shielding intermediaries and upholding the competing rights of stakeholders. The article then navigates in Section 3 the fraught terrain of intermediary liability as articulated by the European Court of Human Rights (ECtHR) and the Court of Justice of the European Union (CJEU) under the aegis of the Convention and the Charter. The section poses an urgent inquiry: can the DSA's foundational principles reconcile these frameworks in a manner that fuels democracy rather than stifles it through inadvertent censorship? Section 4 considers the relationship between fundamental rights and the DSA reform. It conducts a comprehensive analysis of key provisions of the DSA, emphasising how they underscore the importance of fundamental rights. In addition to mapping out strengths, it also identifies existing limitations within the DSA and suggests pathways for further refinement and improvement. The article concludes by outlining avenues for achieving a balanced and rights-compliant regulatory framework for platforms in the EU.

Language: English

Cited by

26

The Politics of Platform Regulation
Robert Gorwa

Oxford University Press eBooks, Year: 2024, Vol. unknown

Published: May 23, 2024

Abstract: As digital platforms have become more integral to not just how we live, but also how we do politics, the rules governing online expression, behavior, and interaction created by large multinational technology firms—popularly termed 'content moderation,' 'platform governance,' or 'trust and safety'—have increasingly become the target of government regulatory efforts. This book provides a conceptual and empirical analysis of an important emerging tech policy terrain: 'platform regulation.' How, why, and where exactly is it happening? Why now? And how can we best understand the vast array of strategies being deployed across jurisdictions to tackle this issue? The book outlines three strategies commonly pursued by actors seeking to combat issues relating to the proliferation of hate speech, disinformation, child abuse imagery, and other forms of harmful content on user-generated content platforms: convincing, collaborating, and contesting. It then offers a theoretical model for explaining the adoption of these different strategies in various political contexts and episodes. The model is explored through detailed case study chapters—driven by a combination of stakeholder interviews and new policymaking documents obtained via freedom of information requests—looking at the development of platform regulation in Germany, Australia, New Zealand, and the United States.

Language: English

Cited by

11

The Rule of Law on Instagram: An Evaluation of the Moderation of Images Depicting Women’s Bodies
Alice Witt, Nicolas Suzor, Anna Huggins

et al.

University of New South Wales Law Journal, Year: 2019, Vol. 42(2)

Published: June 1, 2019

This article uses innovative digital research methods to evaluate the moderation of images that depict women’s bodies on Instagram against the Western legal ideal of the rule of law. Specifically, this article focuses on the contested rule of law values of formal equality, certainty, reason giving, transparency, participation and accountability. Female forms are the focal point for our investigation due to widespread concerns that the platform is arbitrarily removing some images and, possibly, privileging certain body types. After examining whether 4944 like images depicting (a) Underweight, (b) Mid-Range and (c) Overweight bodies were moderated alike, we identify an overall trend of inconsistent moderation. Our results show that up to 22 per cent of removed images are potentially false positives – images that do not appear to violate Instagram’s content policies but were removed. The platform’s opaque moderation processes, however, make it impossible to determine whether an image was removed by the platform or the user. The article concludes that the apparent lack of consistency in moderation, combined with platforms' largely unfettered power to moderate content, raises significant normative concerns which pose an ongoing risk of arbitrariness for women users and users more broadly. We propose ways that platforms can improve moderation and advocate for the continued development of empirical analysis of platform governance. These improvements are crucial to help demonstrate consistency where it exists and allay suspicions and fears where it does not.

Language: English

Cited by

55

Content Moderation As a Political Issue: The Twitter Discourse Around Trump's Ban
Meysam Alizadeh, Fabrizio Gilardi, Emma Hoes

et al.

Journal of Quantitative Description: Digital Media, Year: 2022, Vol. 2

Published: Oct. 4, 2022

Content moderation — the regulation of the material that users create and disseminate online — is an important activity for all social media platforms. While routine, this practice raises significant questions linked to democratic accountability and civil liberties. Following the decision of many platforms to ban Donald J. Trump in the aftermath of the attack on the U.S. Capitol in January 2021, content moderation has increasingly become a politically contested issue. This paper studies this process with a focus on the public discourse on Twitter. The analysis includes over 9 million tweets and retweets posted by over 3 million unique users between 2020 and April 2021. First, the salience of content moderation was driven by left-leaning users, with "Section 230" the most prevalent topic across the ideological spectrum. Second, the stance towards Section 230 was relatively volatile and polarized. These findings highlight relevant elements of the ongoing political contestation surrounding the issue, and provide a descriptive foundation to understand the politics of content moderation.

Language: English

Cited by

37

What’s Wrong with Automated Influence
Claire Benn, Seth Lazar

Canadian Journal of Philosophy, Year: 2021, Vol. 52(1), pp. 125–148

Published: Sep. 24, 2021

Abstract: Automated Influence is the use of Artificial Intelligence (AI) to collect, integrate, and analyse people’s data in order to deliver targeted interventions that shape their behaviour. We consider three central objections against Automated Influence, focusing on privacy, exploitation, and manipulation, showing in each case how a structural version of the objection has more purchase than its interactional counterpart. By rejecting the focus of "AI Ethics" in favour of a structural, political philosophy of AI, we show that the real problem with Automated Influence is the crisis of legitimacy that it precipitates.

Language: English

Cited by

36

Content Moderation Remedies
Eric Goldman

Michigan Technology Law Review, Year: 2021, Vol. 28.1, p. 1

Published: Jan. 1, 2021

This Article addresses a critical but underexplored aspect of content moderation: if a user’s online content or actions violate an Internet service’s rules, what should happen next? The longstanding expectation is that services should remove violative content or accounts from their services as quickly as possible, and many laws mandate that result. However, services have a wide range of other options—what I call “remedies”—they can use to redress violations of the applicable rules. This Article describes dozens of remedies that services have actually imposed. It then provides a normative framework to help services and regulators navigate these remedial options and address the difficult tradeoffs involved in content moderation. By moving past the binary remove-or-not remedy framework that dominates current discourse about content moderation, this Article helps improve the efficacy of content moderation, promote free expression, promote competition among services, and support services’ community-building functions.

Language: English

Cited by

35