Harmful Algorithmic Content Promotion
26 items
2 sources
Harmful content promoted by online platform recommendation systems, requiring government assessment and action.
Cross-Source Insight
Harmful Algorithmic Content Promotion has been flagged across 2 independent accountability sources:
4 inquiry recs
22 PFD reports
This issue has been identified by multiple independent accountability bodies, suggesting it is a recurring systemic concern.
Inquiry Recommendations (4)
40 — Publish interim online harms code of practice
Recommendation: The government should publish, without further delay, the interim code of practice in respect of child sexual abuse and exploitation as proposed by the Online Harms White Paper (published April 2019).
Gov response: On 15 December 2020, the UK government published the Interim Code of Practice on Child Sexual Abuse and Exploitation.
Accepted
Delivered
71 — Pre-screen material before upload
Recommendation: The government should require industry to pre-screen material before it is uploaded to the internet to prevent access to known indecent images of children.
Gov response: On 10 November 2020, the UK government stated that it had launched the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse alongside the US, Australia, Canada and New Zealand. The interim code of …
Not Accepted
FR-12 — Pre-screening by Internet Providers
Recommendation: The Inquiry recommends that the UK government makes it mandatory for all regulated providers of search services and user-to-user services to pre-screen for known child sexual abuse material.
Gov response: We accept the need to hold companies to account for removing, reporting and limiting the spread of child sexual abuse material on their services. The UK’s world-leading Online Safety Bill will address this by …
Accepted in Part
In progress
FR-20 — Age Verification Online
Recommendation: The Inquiry recommends (as originally stated in its Internet Investigation Report, dated March 2020) that the UK government introduces legislation requiring providers of online services and social media platforms to implement more stringent age verification measures.
Gov response: We accept the need to protect children from harmful and age-inappropriate content. The Online Safety Bill requires all in-scope companies to assess whether their service is likely to be accessed by children and, if so, …
Accepted in Part
In progress
PFD Reports (22)
Oliver Long
Concerns: The self-exclusion scheme (GamStop) fails to protect individuals from unlicensed overseas gambling sites, which target vulnerable users. There is a critical lack of public health information regarding these risks.
Overdue
Oliver Gorman
Concerns: There are inadequate age restrictions on dangerous aerosol products and unclear warnings about the risk of instant death. Social media platforms also fail to take responsibility for harmful content promoting such misuse.
Responded
Rhiannon Williams
Concerns: Online suicide forums and social media platforms provided information on self-harm and on how to mislead professionals, raising concerns about the adequacy of the Online Safety Act 2023 in preventing access to such harmful content.
Responded
Miranda Avanzi
Concerns: The widespread and easily accessible availability of explicit, step-by-step suicide guides online, often without age verification, poses a significant risk, enabling vulnerable individuals to self-harm.
Overdue
Bethany Langton
Concerns: The easy online availability of lethal sodium nitrite, combined with suppliers' unawareness of its misuse and the slow removal of suicide-related guidance online, facilitates self-harm.
Overdue
Isabella Shere
Concerns: Quora's platform contains easily accessible, unmoderated content related to self-harm and suicide, lacking age verification and featuring engagement functions that normalise serious subject matter for children.
Responded
Deborah Cooper
Concerns: Books providing explicit instructions on methods for ending one's life are freely available on Amazon.co.uk. Concerns are raised about the marketing, supply, and lack of regulation for such publications.
Responded
Adrian Gallagher
Concerns: An online book providing explicit, step-by-step suicide instructions, including methods to avoid detection, is readily accessible with inadequate age verification, posing a significant risk to vulnerable individuals.
Responded
Kimberley Liu
Concerns: Unregulated websites facilitate dangerous, unchecked sales of prescription-only sedative medications, actively instructing customers to evade detection, which exploits vulnerable individuals and poses a suicide risk.
Responded
Chloe Macdermott
Concerns: Online forums encourage suicide by providing methods without age restrictions or help signposting, and harmful content is not effectively removed. Lethal products are also easily purchased via international online retailers and delivered to the UK without effective border controls.
Overdue
Bronwen Morgan
Concerns: Vulnerable individuals are able to access websites that facilitate and promote self-harm and suicide methods, enabling them to acquire information and means to cause their own death.
Overdue
Luke Ashton
Concerns: Inadequate player protection tools and a flawed algorithm failed to identify and intervene with a problem gambler. The operator's reliance on minimal regulatory standards, rather than best practice, exacerbated risks.
Responded
Molly Russell
Concerns: Internet platforms lack age verification, age-specific content control, and parental monitoring features, exposing children to harmful material through algorithms and unrestricted access.
Responded
James Forryan
Concerns: Easily accessible websites openly promote and provide guidance on suicide methods, contributing to deaths. There is a lack of sufficient regulation and enforcement against such harmful online content.
Responded
Frances Thomas
Concerns: Outdated e-security guidance from the Department for Education led to inadequate web filtering, a lack of oversight of blocklists, and insufficient scrutiny of age-inappropriate online content in schools.
Responded
Berenice Bell
Concerns: Websites promoting or assisting suicide are easily accessible, and platforms lack adequate independent scrutiny to remove age-inappropriate and harmful content.
Overdue
Joseph Nihill
Concerns: Online platforms actively promoted suicide methods and dangerous substances to vulnerable young men, undermining mental health support and posing a foreseeable risk of drawing individuals into self-harm.
Overdue
Jerrelle McKenzie
Concerns: The deceased accessed Dinitrophenol (DNP), a drug banned in the UK since 1938 due to its harmful effects, via the internet, likely influenced by social media, leading to his overdose.
Overdue
Callie Lewis
Concerns: An online suicide forum provided dangerous advice, enabling individuals to mislead mental health professionals and perfect suicide methods, thus frustrating necessary assessments and interventions.
Responded
Bradley Trevarthen
Concerns: School friends were aware of the deceased's increasing suicidal ideation and methods explored online but failed to report it to adults, partly due to fear of device bans.
Responded
Ben Walmsley
Concerns: The school's IT system lacked a mechanism to alert staff when students attempted to access blocked self-harm content, relying solely on teacher monitoring and risking missed safeguarding opportunities.
Overdue
Karnel Haughton
Concerns: Uncensored online videos promote dangerous 'choking game' activities, yet there is no national guidance for schools or support for parents, risking further injuries and deaths.
Overdue