Einar Sigurdsson

Top 10 Resources for Understanding the Effects of Harmful Content

With the incessant spread of harmful content across the web, it is more critical than ever to understand where this content comes from, the effects it has on victims, and what to do if you come across this material while visiting otherwise innocuous sites. To help spread awareness and education, we’ve curated a list of 10 digital resources for navigating the complexities of online harm.

 


Recommended digital resources to understand the complexities of online harm:



1. Documentary: The Cleaners


The 2018 documentary The Cleaners presents the raw truth of what content moderators go through when ‘cleaning up’ online platforms, reminding social media users that there is a real person out there moderating their feeds. Hours of exposure to disturbing content can cost an individual their mental health, and adding poor working conditions on top creates a recipe for disaster. Watch this documentary on Vimeo to get an inside look at content moderation from the moderators’ perspective.


2. Publications by the National Center for Missing and Exploited Children


The National Center for Missing and Exploited Children (NCMEC) has long been dedicated to keeping children safe online. With the inception of their CyberTipline in 1998, they made it possible for anyone to report child sexual abuse material (CSAM) they might accidentally come upon while browsing the web.


Among its many downloadable PDF publications, Coping with Child Sexual Abuse Material is a helpful resource intended for families. It features a number of hotlines and law enforcement resources, as well as coping skills if you or your child has been exposed to or is a victim of CSAM. Another important publication is NCMEC’s Be the Solution: Helping Victims of Child Sexual Abuse Material: A Guide for Law Enforcement, available for download with an agency email. The PDF offers insights from CSAM survivors, caregivers, and experienced law enforcement officers to prepare other officers to help survivors.


3. Stanford University’s The Journal of Online Trust & Safety

Stanford University recently launched The Journal of Online Trust & Safety, an open-access online publication focused on harmful content, prioritizing critical conversations about CSAM, incitement and terrorism, self-harm, and more. The journal’s inaugural issue was released in October 2021; as of 2023, five issues have been published. It accepts research submissions, editorials, and letters for its peer-reviewed and commentary sections. In a few short years, the journal has become a respected resource for spreading awareness and knowledge of the complexities of harmful content and online safety.


4. Research on online harms by Ofcom and Revealing Reality


Ofcom, the UK’s communications regulator, has played a prominent role in the Online Safety Bill, having been appointed by Parliament to enforce it. Needless to say, the organization is highly knowledgeable about the growing problem of harmful content on online platforms.


Ofcom partnered with Revealing Reality, an independent social research company, to complete the extensive study How People are Harmed Online: Testing a Model from a User Perspective. This is just one Ofcom study among others in their Making Sense of Media Programme. This study in particular traces how individuals are harmed online from beginning to end, analyzing how those harms play out on online platforms.


5. Global Legislative Map by Global Internet Forum to Counter Terrorism


In an effort to help tech companies stay up to speed with legislation surrounding harmful content, the Global Internet Forum to Counter Terrorism (GIFCT) created the Global Legislative Map, a digital map pinpointing these laws around the world.


With 17 categories and 24 applicable countries, you can narrow a search by topic or regulation, and the countries in scope are highlighted on the map. The map tracks both current and emerging legislation, including proposals and technical papers, making it a practical way for online platforms to keep up with the rapid pace of content laws.


6. Federal Trade Commission 2022 Report


In 2022, the Federal Trade Commission produced a substantial report, Combatting Online Harms Through Innovation, which urges the industry to enhance AI and other technologies to detect and remove harmful content online. This resource gives context to the state of content moderation technology, with an in-depth explanation of its abilities, limitations, and possibilities for combatting online harm as these digital tools improve.


7. Spectrum Labs podcast


AI specialists at Spectrum Labs share their knowledge of harmful content in their podcast, The Brief, covering topics ranging from trust and safety and content moderation to child safety and discrimination. With clients across numerous tech industries, these podcasts bring niche perspectives to online safety discussions.


8. Educational Resources from International Centre for Missing and Exploited Children


The International Centre for Missing and Exploited Children (ICMEC) provides a resource page titled Online Safety Education with links to various curricula, statistics, and guides for teaching online safety. It is geared toward school administrators and teachers aiming to educate their students on “digital literacy and citizenship skills.” The resources focus on the impact of harmful content and how to prevent it, starting in the classroom.


9. Report Harmful Content website


Provided by the UK Safer Internet Centre and operated by South West Grid for Learning (SWGfL), Report Harmful Content is a comprehensive website that breaks down when, where, and how to report harmful content online. It offers advice and education on a multitude of subjects dealing with harmful content and other online abuses, and it lists the community guidelines of certain social media platforms along with where to report on each. The collective also takes reports from UK residents, based on the “8 harms” defined on its website, though it provides information on where users in other countries can go for help.


10. INHOPE events


Funded by the European Union, INHOPE is a worldwide network of 52 hotlines that combat illegal content online, with a focus on CSAM. INHOPE hosts many events, including a quarterly seminar as well as several webinars made available on its website. These events feature experts with deep knowledge of the technology, methodologies, and research involved in combatting CSAM and child exploitation online.



