Children's Views on Reporting Online Harms
Research by Childnet, released in March, shows that many children 'are not reporting online harms, often due to a lack of confidence that anything will be done about the issues they face.'
Key findings include:
Only 10% of children aged 8-17 strongly agree that they know how to report different types of content on the platforms they use.
Nearly a third of children (32%) say they sometimes do nothing when they are upset or worried about something online, with more than one in 20 (7%) regularly doing nothing in response.
Young people are more than twice as likely to block someone online (44%) as to report them (21%).
Fewer than half of children across all age groups believe that potentially harmful content they report will be removed.
Read the full report at:
How to Report Harmful Online Content
Report Harmful Content is an online service provided by the UK Safer Internet Centre and operated by SWGfL. It is a national reporting centre designed to allow anyone to report harmful content online. Reports can be made about content falling into these eight categories:
Online Abuse
Bullying or Harassment
Threats
Impersonation
Unwanted Sexual Advances (Not Image Based)
Violent Content
Self-Harm or Suicide Content
Pornographic Content
The website also provides guidance for reporting content on specific platforms, such as YouTube, Facebook, Twitter, TikTok, and Roblox.
Visit: https://reportharmfulcontent.com to make a report or read advice.