Content moderation is the practice of reviewing user-created content shared on social platforms against the platform's terms of use. This overview pops the hood on the work, giving a peek into what it really entails.
2. First, let's define user-generated content (UGC)
Definition: text, reviews, pictures, videos, etc. created and posted on the internet by individuals rather than by a company, platform, or brand.
6. They work with Product and
Engineering to build safety features
- User Reports: users can flag content they feel violates guidelines.
- Onboarding Flows: a new user is educated on what behaviors are expected as they sign up.
- User Validation: users are asked to prove who they are.
- Content Interventions: content is evaluated in real time, and the platform sends a warning: "Are you sure you want to post that?"
7. When issues arise,
they investigate and respond
- Evaluating user reports
- Discussing internally
- Issuing consequences
They manage a queue of reports every day.
9. A Trust & Safety leader's job is never done
They are always on.
Over and over:
- Guidelines are established
- Content is created
- Violations are investigated
- Guidelines are updated
10. A Trust & Safety leader's job matters a great deal
Benefits to the user: they are kept safe and enjoy the platform.
Benefits to the company:
A. Establishes the value of the platform with the user
B. Drives platform improvement
C. Creates exponential impact