
THE CONTENT MODERATION REPORT: Social platforms are facing a massive content crisis — here's why we think regulation is coming and what it will look like


Content moderation has become a top priority for social platforms — including Facebook, YouTube, Twitter, Pinterest, and LinkedIn — amid rising public and political scrutiny and escalating content-related crises.

Brands, lawmakers, and even social media execs are coming to grips with the reality that platforms are not up to the task of moderating content on their own: the scale of the problem, and the difficulty of fixing it, roughly correlates with the scale of the platforms themselves.

Even if 99% of harmful content is removed from a large platform, an enormous amount can still remain. Even leaders at social platforms believe their companies should have oversight from external stakeholders, with some either publicly stating that belief or privately meeting with regulators. Tech execs including Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey have each come out in support of new regulation around platform content.

Social platforms are committed to getting content moderation right, but a paradigm shift in how online content is governed on platforms is likely coming. Additional stakeholders — like brands, lawmakers, or even users themselves — will increasingly be called in to aid the effort.

Most significantly, regulators around the world are likely to intensify pushes to influence or control online content moderation, and first attempts are likely to be led by the EU, which was also the first mover on data privacy regulation. In the near term, we expect regulation to take shape around two areas: requiring companies to make more detailed disclosures of their moderation efforts, and creating rules aimed specifically at moderating political content.

In The Content Moderation Report, Business Insider Intelligence analyzes one of the most pressing issues currently facing social platforms — content moderation — and lays out how we expect the debate to evolve and why we think regulation is soon to emerge.

This report also uses proprietary data collected from our 2019 Digital Trust Survey, an online survey of 1,974 respondents from Business Insider's proprietary panel fielded from April 23 to May 16, 2019. Respondents to our Digital Trust Survey are social media users and tend to be younger, male, affluent, North American, and early adopters of tech.

As a result, respondents in our sample are more likely to be aware of changes on tech platforms, and they also have disposable income to spend on the ads they see on platforms. So, while our sample isn't representative of the general population, it offers a window into sentiment among a high-value segment.

The companies mentioned in this report are: AT&T, Epic Games, Facebook, Google, Johnson & Johnson, LinkedIn, PepsiCo, Pinterest, Snapchat, Twitter, Verizon, Walmart, YouTube. 

Here are a few key takeaways from the report:

  • Platforms with inadequate or ineffective content moderation face stiff consequences, such as brand boycotts, reduced trust among users, and even platform abandonment. Frustration over ineffective moderation could lead users to leave altogether: More than a quarter (27%) of respondents to our 2019 Digital Trust Survey said they'd stop using a social platform if it continued to allow harmful content.
  • A broader group of stakeholders — like brands, lawmakers, or even users — will increasingly aim to influence or control how online content is governed. "Multistakeholderism" could appeal to at least some platform users: 70% of respondents to our 2019 Digital Trust Survey believe that stakeholders other than social companies should have final say in determining what content is permitted on platforms.
  • Regulators around the world could intensify their pushes to control online content moderation. Over the past year, several governments have proposed new rules — most notably, the EU's "Digital Services Act" — that would influence how tech platforms moderate content. 
  • As confidence in platforms' ability to self-regulate wanes, nonregulatory solutions are being proposed and pursued. Two such nonregulatory solutions emerged in the past year: the Global Alliance for Responsible Media (GARM) in June and Facebook's Oversight Board in September.
  • As scrutiny mounts, smaller platforms might have greater success luring ad spend by positioning themselves as trusted environments. Respondents to our 2019 Digital Trust Survey consider LinkedIn, Pinterest, and Snapchat to be the platforms least likely to show them deceptive content out of all platforms surveyed.

In full, the report:

  • Analyzes how the content moderation debate has evolved, in particular in the EU and US. 
  • Identifies how social platforms currently moderate user-generated content and how those efforts have expanded.
  • Discusses why and how regulation and other emergent forms of oversight are likely to take shape around how online content is governed. 
  • Contains 39 pages and 17 figures.

Interested in getting the full report? Here's how to get access:

  1. Purchase & download the full report from our research store. >>  Purchase & Download Now
  2. Join thousands of top companies worldwide who trust Business Insider Intelligence for their competitive research needs. >> Inquire About Our Enterprise Memberships
  3. Current subscribers can read the report here.
