Users whose notices have been rejected by online platforms should have a right to a second
assessment through an internal complaint mechanism, so that they can challenge wrongful
platform decisions, as highlighted in the findings of this experiment.
Furthermore, wrongful content decisions often result from insufficient staffing of human
content moderators, a lack of moderator training, and/or a shortage of moderators proficient
in the variety of languages used. It is important to ensure that platforms disclose details of the
human resources they have in place for content moderation in a public annual report.
II. Don’t grant a free pass to online platforms to leave unlawful abuse
online
In reality, what motivates a platform to delete unlawful content notified through its official
reporting mechanisms, be it racist hate speech or incitement to violence, is the fear of
being held accountable. However, law-makers risk giving online platforms a free pass to
leave unlawful content online with no accountability. Policymakers should ensure that all
notices are thoroughly assessed by the online platforms, without lowering the standard for
assessment. Otherwise, they risk enabling a free flow of unlawful hate speech and lowering
the bar for already under-resourced content moderation systems and practices that, in the
case of Facebook, have already been criticised by international organisations and civil society
groups for contributing to real-life violence against ethnic and religious groups in Myanmar
and India, the latter being the biggest market in the world where Facebook operates.
III. Provide users with an effective help-line from authorities and
online platforms
Users are often left alone when dealing with online violence on social media. Victims
describe a sense of helplessness and isolation. The ongoing Russian invasion of Ukraine has
shown the platforms’ ability to react, mobilise and assign resources when under political
pressure. We need regulation that mandates the necessary support on a day-to-day
basis:
● Enable authorities to help users whose rights are violated by requesting that platforms
remove or suspend access to the illegal content in question;
● Require online platforms to establish contact points for consumers that do not rely
solely on automated means of communication and are available in an official
language of each Member State;
● In order to ensure effective communication with platforms and enforcement of the
rules, there should be a point of contact in every Member State, accessible to
users and authorities. This point of contact should be able to receive notifications as
well as documents, including those initiating proceedings against the platform, in a
legally binding way. This would lower the threshold for victims of online violence to
defend themselves before a court.