Authors: Maham

Problem

Misinformation is harder to spot on TikTok because of the speed at which it goes viral.

Example: Trial of Johnny Depp and Amber Heard

  • A mass of anti-Amber Heard content led to the hounding of a real person and likely influenced the jury's decision.
  • Jury members admitted to watching content about the trial online.
  • At the height of the trial, Amber Heard became one of the most abused people on the planet, even as domestic violence charities worldwide expressed concern about the public's response to the trial.

  • The nature of TikTok's recommendation algorithm means that sensational content, including falsified or exaggerated facts, conspiracy theories, and disinformation, spreads faster and at a larger scale than on other social media platforms for high-profile or much-discussed cases.
  • Examples include the trial of Johnny Depp and Amber Heard and major geopolitical events such as the war in Ukraine and elections.

Future direction

  • Limit the reach of harmful misinformation or inflammatory content without infringing on freedom of expression or undermining content creators' ability to use comedy and satire.
  • Perhaps an app that logs into TikTok, or a browser extension, that uses AI to flag content on a user's feed that is likely false information - similar to Twitter's Community Notes feature, though an automated approach may be less diligent than human review (see the sketch below).
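A minimal sketch of how the flagging step of such an extension might look, assuming the caption or transcript text can already be extracted from each video on the feed. The FeedItem class, flag_item function, candidate labels, and threshold are illustrative placeholders, and the Hugging Face zero-shot classifier is only one possible scoring approach, not a validated misinformation detector; TikTok does not expose an official API for this.

```python
# Illustrative sketch: score each video's caption/transcript and attach a
# "needs context" note above a threshold, in the spirit of Community Notes.
# All names, labels, and thresholds here are hypothetical placeholders.

from dataclasses import dataclass, field

from transformers import pipeline  # one possible scoring backend, not a requirement


@dataclass
class FeedItem:
    video_id: str
    caption: str
    transcript: str = ""
    flags: list = field(default_factory=list)


# General-purpose NLI model used for zero-shot classification; a production
# system would need a model trained and evaluated specifically for this task.
_classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

_LABELS = ["unverified factual claim", "opinion or commentary", "satire or comedy"]
_FLAG_THRESHOLD = 0.75  # illustrative cut-off


def flag_item(item: FeedItem) -> FeedItem:
    """Attach a 'needs context' note if the text looks like an unverified claim."""
    text = f"{item.caption} {item.transcript}".strip()
    if not text:
        return item

    result = _classifier(text, candidate_labels=_LABELS)
    top_label, top_score = result["labels"][0], result["scores"][0]

    # Only flag confident "factual claim" predictions; leave comedy and opinion
    # alone so the tool does not suppress satire or ordinary commentary.
    if top_label == "unverified factual claim" and top_score >= _FLAG_THRESHOLD:
        item.flags.append(f"Needs context: classified as '{top_label}' ({top_score:.2f})")
    return item


if __name__ == "__main__":
    demo = FeedItem(
        video_id="example",
        caption="Leaked court documents PROVE the jury was paid off!",
    )
    print(flag_item(demo).flags)
```

In a real extension, any flag would be rendered as an overlay or note on the video rather than printed, and borderline scores could be routed to human reviewers rather than flagged automatically.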