Chloe Macdermott

PFD Report (Partially Responded) · Ref: 2023-0534
Date of Report: 19 December 2023
Coroner: Paul Rogers
Coroner Area: Inner West London
56-Day Response Deadline (est.): 13 February 2024
Responses: 6 of 9 received · Report over 2 years old · No further published responses identified
About PFD responses: Organisations named in PFD reports must respond within 56 days explaining what actions they are taking. (Source: Courts and Tribunals Judiciary)

Coroner's Concerns (AI summary)
Online forums encourage suicide by providing methods without age restrictions or help signposting, and harmful content is not effectively removed. Lethal products are also easily purchased via international online retailers and delivered to the UK without effective border controls.
Responses
Amazon UK
2 Feb 2024
Dear Mr Rogers,

Re: The Inquest Touching the Death of Chloe Elizabeth MACDERMOTT
A Regulation 28 Report – Action to Prevent Future Deaths

In response to your report dated 19 December 2023 (the "Report"), please find below our written response in accordance with paragraph 7(2) of Schedule 5 to the Coroners and Justice Act 2009. Thank you for sharing your Report with us and for bringing to our attention your findings and concerns. We are greatly saddened to hear about the death of Chloe Elizabeth MacDermott and we extend our deepest condolences to her family.

We understand your Report refers to the cause of death as being 'sodium nitrate/nitrite toxicity'. Amazon's UK Store prohibits the sale of poisons as defined under Schedule 1A of the UK Poisons Act 1972. This includes high concentration sodium nitrate and high concentration sodium nitrite mixtures/substances. More details can be found here. As you may know, in many other jurisdictions, sodium nitrite is a legal and widely available product offered by retailers to preserve foods (such as meats and fish) and for use in laboratories as a reagent. High concentration sodium nitrite is not intended for direct consumption but, unfortunately, like many products, it can be misused.

At Amazon we work hard to earn and maintain customer trust, meaning that safety is a top priority for us. Last year, we invested more than $1.2 billion to protect our stores and our customers, building robust safety and compliance programs. We are constantly iterating on and improving our approach. Since October 2022, we have globally restricted the sale of high concentration sodium nitrite to Amazon Business customers (a channel solely focused on business-to-business transactions) to minimize the potential for product misuse. Therefore, in countries where Amazon Business is not yet available or where local policies prohibit such sale (such as the UK store), high concentration sodium nitrite is not available for purchase.

We trust this is helpful and again express our sincere condolences to Ms MacDermott's family for their tragic loss. Please do let us know if you require any further information.
Ofcom
12 Feb 2024
Dear Mr Rogers, We write in response to the Regulation 28 report to Prevent Future Deaths, received 19th December 2023, which was issued to Ofcom following the death of Chloe Macdermott (‘the Report’). On behalf of Ofcom, I would like to offer my deepest condolences to Ms Macdermott’s family and loved ones. Ensuring that online services, such as those mentioned in the Report, fulfil their duties in regard to harmful suicide content is a key priority for Ofcom as we begin to implement the Online Safety Act.

i. Response to Regulation 28 report following inquest into the death of Chloe Macdermott

In this response, we set out our proposed actions in relation to the issues raised by the Report, where these fall within the scope of the Online Safety regime, and the timetable for these actions. These actions are pursuant to the new duties and powers assigned to Ofcom by the Online Safety Act 2023 and relate to Ofcom's plans for implementation of the Act as the UK's regulator for online safety.

We thank the coroner's office for bringing to our attention the specific, tragic circumstances surrounding Ms Macdermott's death. Intelligence about the real-world effects of online harms and their links to specific services will be crucial as we develop our approach to the Online Safety regime. The report outlines a number of detailed matters of concern, and our response below highlights the steps we are taking to promote compliance with the requirements of the regime across all relevant regulated services.

We are currently in the process of putting in place regulation to implement the Online Safety regime. Until the relevant procedural steps outlined below are completed, the duties on regulated services are not yet fully in force. Further detail on Ofcom's planned approach to online safety regulation is set out below. Whilst the regime is at this early stage, we are supporting all in-scope services to become compliant with the provisions of the Act and intend to engage constructively with services.

Following reports[1] of alleged illegal and harmful suicide content on , Ofcom contacted the service, which subsequently announced via its website that UK users would be blocked. As of 31st January 2024, we are aware that the site is accessible by UK users. This is a situation which we will continue to monitor.

The Online Safety Act 2023

The Online Safety Act 2023 ('the Act') received Royal Assent on 26 October 2023 and makes persons that operate a wide range of online services legally responsible for keeping people safer online. The Act covers certain categories of internet services that have links with the UK, including what are known as user-to-user services and search services. The Act defines a user-to-user or search service as having links to the UK if it meets any one or more of the following criteria:
• Has a significant number of UK users; or
• Has UK users as one of its target markets; or
• Is capable of being used by UK users, and there are reasonable grounds to believe that there is a material risk of significant harm to UK users.

Any service which meets one or more of the above criteria, and which is not exempt[2], will be expected to comply with the relevant duties under the Act. Among other things, the Act:
• Appoints Ofcom as the regulator for online safety and confers upon us a number of powers and duties (set out in detail below).
• Imposes a number of duties on those regulated services which focus on improving the systems and processes online services operate to ensure the safety of their users, rather than on the presence of individual pieces of content.
• Requires regulated services to assess the risks their services pose to users in relation to illegal content and content that is harmful to children and take steps to mitigate and manage those risks.
• Requires Ofcom to issue a number of regulatory publications to help regulated services understand how they can comply with their legal duties.
• Requires Ofcom to publish resources to help companies assess, understand and manage risk.
• Ofcom will also produce Codes of Practice, setting out recommended measures services can take to comply with the relevant duties under the Act in order to mitigate the risk of harm.

As explained further below, the duties on all regulated user-to-user services relating to protecting their users from illegal harms will require those services to understand and take steps to manage and mitigate the risks of users encountering illegal suicide content[3], or their services being used for the commission or facilitation of this offence. User-to-user services will also have to swiftly take down illegal suicide and illegal self-harm content when it is identified. Where regulated services are likely to be accessed by children, they will also have to take steps to prevent child users from encountering content that encourages, promotes or provides instructions for suicide or deliberate self-injury.[4]

Ofcom would only have a role in connection with the prevention of supply or marketing of particular substances within the UK insofar as this relates to a type of content or activity that would be caught under duties in the Online Safety Act. Based on the information available in the report, it is not clear that material relating specifically to the supply or marketing of the substance itself would fall in scope of the duties under the Act.

There are also additional duties which apply to certain user-to-user services which will be 'categorised' based on user numbers and functionalities (these services will be known as 'Category 1 services'). These duties are designed to make these services more transparent and accountable to their users about the steps they take to protect them from harm, and to enable adult users to have more control over the type of content they encounter, including by having access to tools to reduce their potential exposure to suicide and self-harm content.

A set of separate duties applies to regulated search services. These duties focus on those services understanding the risks of harm and taking steps to minimise the risk of individuals encountering, in search results, illegal suicide and self-harm content and content that encourages, promotes or provides instruction for suicide or deliberate self-harm to children.

Although the Act is now law, there are numerous procedural steps needed for the new regime to be fully implemented, and these steps need to be completed before services' legal duties under the regime, and Ofcom's ability to enforce those duties, come into force. These steps include, for example, the completion of public consultations (the first, on illegal harms, is currently open); services understanding and managing the risks of harm to their users; and Parliament approving Ofcom's final Codes of Practice. We explain our plans to implement the regime below. In the meantime, we are already encouraging in-scope service providers to take meaningful steps to improve safety on their platforms. To this end, we are committed to driving industry improvements by engaging with the largest and riskiest services via continuous 'regulatory supervision'.

[1] BBC News, '"Failure to act" on suicide websites linked to 50 UK deaths', 24 October 2023.
[2] A number of exemptions also apply, as set out in Schedule 1 to the Act. See: Vol 1, Section 3 of our Illegal Harms Consultation.
[3] In other words, content that would 'amount to' an offence of assisting or encouraging suicide under s.2 of the Suicide Act 1961 or s.13 of the Criminal Justice Act (Northern Ireland) 1966.
[4] There are similar duties on regulated search services, which focus on those services understanding the risks of harm and taking steps to minimise the risk of individuals encountering illegal content and content that is harmful to children in search results.

ii. Ofcom's implementation of the Online Safety Act

To coincide with Royal Assent, we set out our approach to implementing the Act on our website; this included an implementation road map setting out our key phases of work over the next three years. We set out in summary below our intended plans for implementation, and in diagram form in Figure 1. This timeline shows our key milestones and documentation but is not a comprehensive guide to everything we will produce over the first three years of the regime.

Figure 1: Ofcom's timeline for Online Safety implementation [diagram not reproduced]

Phase One: The first step in this was the publication of the Consultation: Protecting people from illegal harms online ('illegal harms consultation') on 9 November 2023. As part of this consultation, we published proposals for Codes of Practice which set out how services can comply with their illegal content duties in the Act.[5] We also published draft Illegal Content Judgements Guidance on how services can identify illegal suicide and illegal self-harm content.

Phase Two: We also published our Consultation: Guidance for service providers publishing pornographic content on 5 December 2023, which includes draft guidance on age assurance and other duties for services providing pornographic content. This will be followed by a broader consultation on measures to protect children from encountering other harmful content, including legal suicide and self-harm content, in Spring 2024.

Phase Three: From late 2024, we will publish a call for evidence on additional duties on categorised services, which includes user empowerment measures that will apply to 'Category 1' services, followed by a full consultation on proposals in Q1 2025.

As part of our preparatory work for implementation, we have been actively engaging with a range of expert stakeholders including government, law enforcement, and charities such as the Samaritans to develop our understanding, expertise and evidence base in relation to suicide and self-harm, and to ensure that we are aware of developing areas of risk. We have also been concentrating on growing our internal expertise in relation to this complex and important harms area. We will continue our programme of engagement with relevant experts as we consult on our initial proposals on how services can comply with their duties.

[5] It is not mandatory for services to implement all measures in our Codes of Practice. However, where Codes of Practice are not implemented in full, services are obliged to demonstrate the steps they have taken to achieve the same safety outcomes.

Phase One

Ofcom's illegal harms consultation: assessing risks

The Act requires Ofcom to produce a register of risks for illegal harms, and guidance to assist services in conducting their own risk assessment. Our draft guidance sets out a four-step risk assessment process which we propose as the best way to ensure that a service's assessments meet their obligations. We are also consulting on our 'Risk Profiles', which set out an explanation of factors in service design and operation that increase risk of harm. Services will be required to take account of our Risk Profiles when conducting their risk assessments. The information contained in the Risk Profiles is sourced from Ofcom's own Register of Risk. For illegal suicide and self-harm content, we set out risk factors relating to:
• service type;
• user base;
• functionalities of the service; and
• recommender systems.

We are using the consultation process to help us finalise this work.

Ofcom's illegal harms consultation: Codes of Practice

The Act requires Ofcom to produce Codes of Practice setting out the measures that in-scope services may take to comply with their duties under the Act.[6] The Codes will recommend proportionate systems and processes across a number of areas, including: moderation, governance, and user complaints. While services are not required to implement all measures in our Codes of Practice, in the event that they choose not to take the steps recommended, they will need to be able to explain how their chosen approach allows them to be compliant with their legal duties. We published our illegal content Codes of Practice in draft form alongside our illegal harms consultation.[7] The proposed measures in our Codes of Practice would require services to, among other things:
• have a named person, who is accountable to the most senior governance body, for compliance with illegal content safety duties, and reporting and complaints duties;
• have in place effective and easy-to-find content reporting and complaint mechanisms, so that users that encounter illegal content (including illegal suicide and, if the offence is brought into force, self-harm content) can report it and see action taken;
• in the case of medium or high-risk services that use algorithms to recommend content to users, measure the risk that changes to algorithms increase the chance of users' exposure to illegal content (including illegal suicide and self-harm content);
• in the case of user-to-user services: have in place content moderation systems or processes that are designed to take down known illegal content (including illegal suicide and self-harm content) swiftly; and
• in the case of search services: have systems and processes in place that are designed so that search content that is illegal content is deprioritised or deindexed for UK users.

In addition, our draft Codes of Practice include a proposal that search services should provide crisis prevention information in response to search requests that contain general queries regarding suicide and queries seeking specific, practical or instructive information regarding suicide methods. This information should include a helpline and links to freely available supportive information provided by a reputable mental health or suicide prevention organisation. It should also be prominently displayed to users in the search results.

[6] Section 41 of the Act.
[7] Ofcom, 'Consultation: Protecting people from illegal harm online', November 2023. See: Volume 4: How to mitigate the risk of illegal harms – the Illegal Content Codes of Practice; Annex 7: Illegal Content Codes of Practice for user-to-user services; and Annex 8: Illegal Content Codes of Practice for search services.

Ofcom's illegal harms consultation: Illegal Content Judgements Guidance

Our illegal harms consultation includes a draft version of Ofcom's Illegal Content Judgements Guidance.[8] This document provides guidance to in-scope services on how they may identify illegal content (content which may be reasonably inferred to amount to a relevant offence), including under Section 2 of the Suicide Act 1961. In our draft guidance, we note that the intentional act of encouraging or assisting the suicide (or attempted suicide) of another person is an offence, and we have proposed that, in certain contexts, the provision of specific, practical or instructive information on suicide methods (for example, about how to take one's life) and content inducing someone to enter into a 'suicide pact' are likely to be able to be inferred to be illegal content. Our draft guidance therefore suggests that content of this type should be removed from services in order for providers to be compliant with their illegal content safety duties. We are seeking further evidence and feedback from stakeholders on these proposals as part of our consultation.

After Ofcom's illegal harms consultation and statement

Once we have completed our illegal harms consultation, we are required to publish a statement setting out our response to issues raised by stakeholders, and our final policy decisions. The Online Safety Act requires Ofcom to submit our Codes of Practice on illegal harms to the Secretary of State and to publish associated guidance within 18 months of Royal Assent. Once we issue our statement, services will have three months to undertake their illegal content risk assessments. At this point we will also submit the Codes of Practice to the Secretary of State which, subject to their approval, are to be laid in Parliament for 40 days. Following approval by Parliament, the Codes will come into force 21 days after they have been issued. At this time the illegal harms safety duties become enforceable, and we can begin investigations and, following the conclusion of those, impose sanctions if we find that services are not compliant with these duties.

Phase Two

Phase Two will focus on child safety and include a consultation on protecting children, to be published in Spring 2024. The consultation will include our proposals for:
• Draft guidance for services on carrying out their Children’s Access Assessments
• Ofcom’s analysis of the causes and impacts of harms to children
• Draft guidance on carrying out Children’s Risk Assessments
• Draft Codes of Practice setting out recommended measures to protect children online.

[8] Ofcom, 'Consultation: Protecting people from illegal harm online', November 2023. See: Volume 5: How to judge whether content is illegal or not? (Illegal Content Judgements Guidance) and Annex 10: Online Safety Guidance on Judgement for Illegal Content.

The Act places requirements on services that are likely to be accessed by children to protect children from content which is legal but harmful to them. Services which are required to comply with the safety duties protecting children will be under a duty to operate a service using proportionate systems and processes designed to prevent children of any age from encountering 'primary priority content that is harmful to children'. Content which 'encourages, promotes or provides instructions for suicide' or 'for an act of deliberate self-injury' has been designated as 'primary priority content' under the Act.

Regulated services will have three months to carry out Children's Access Assessments after we publish our final Guidance. If they conclude that they are likely to be accessed by children, then they will have to carry out Children's Risk Assessments. We intend to publish our final Guidance on Children's Access Assessments in early 2025. Our main statement on the Children's Safety Duties will follow in Spring 2025. This will allow services to complete their initial Children's Access Assessments and determine whether they need to comply with the children's safety duties before the requirement to carry out Children's Risk Assessments comes into force. At that point (Spring 2025), relevant services will have three months to carry out a Children's Risk Assessment. At the same time, we will submit the children's Codes of Practice to the Secretary of State. Subject to the Secretary of State's approval, they will then be laid in Parliament for 40 days. Following approval by Parliament, the Codes will come into force 21 days after they have been issued. At this time the children's safety duties become enforceable, and we can begin investigations and impose sanctions for non-compliance. Assuming Parliament immediately approves the Codes, we expect the duties to become enforceable in Summer 2025.

Phase Three

Phase Three of online safety focuses on transparency, user empowerment, and other duties which will apply to Category 1 services.[9] The user empowerment duties in particular will include a duty to include, to the extent that it is proportionate to do so, features which adult users may use or apply if they wish to increase their control over certain kinds of content, including content which encourages, promotes or provides instructions for suicide or an act of deliberate self-injury. We plan to issue a Call for Evidence regarding our approach to Phase Three in early 2024.

Regulation of Video Sharing Platforms

In addition to our preparation for the implementation of the online safety regime, Ofcom has also been responsible for regulating UK-established video sharing platforms (VSPs) since November 2020. The scope of the VSP regime is much narrower than that of the Act, applying only to services that meet the definition of a VSP under Part 4B of the Communications Act 2003 and have the required connection with the UK,[10] and to protecting users from material included in videos only. We include a brief summary of the main provisions of the VSP regime which are potentially relevant to the broader points covered in the report.

[9] 'Category 1' refers to certain user-to-user services categorised based on user numbers and functionalities. Services in this category are subject to additional duties related to transparency, user empowerment and protection of democratic and journalistic content. 'Category 1 threshold conditions' are set by the Secretary of State, with advice provided by Ofcom. Ofcom will then be responsible for designating services into categories according to these thresholds.
[10] See section 368S of the Communications Act 2003. A full list of notified services may be found on the Ofcom website: Notified video-sharing platforms - Ofcom; they include, among others, TikTok, Twitch and Snap.

Under the VSP Framework, VSP providers are required to take such safety measures as are appropriate to protect children (under-18s) from videos containing 'restricted material'. Importantly, they must also ensure that the measures they take are implemented effectively. 'Restricted material' includes material which might impair the physical, mental, or moral development of under-18s. The relevant legislation does not specify particular examples of such material. However, in our guidance for providers on measures to protect users, we state that VSPs should consider a range of harms, including self-injurious content which may cause physical harm, such as material promoting eating disorders, self-harm and suicide.

It is for each VSP provider to decide what safety measures to take that are appropriate and proportionate for protecting children on their service from videos containing potentially harmful material. Where we have concerns about the measures a platform has in place, we will consider taking further action to push for improved outcomes for children. This may include, where appropriate, targeted supervision or the use of enforcement powers. Our learnings under the VSP regime will help us prepare for the broader online safety regime.

iii. Conclusion

The death of Ms Macdermott highlights the tragic impact that suicide and self-harm content can have. Ofcom is committed to addressing the risk of harm from such content effectively and proportionately. Government and Parliament have signalled the importance of tackling such content by designating illegal suicide content as a priority offence and legal suicide content as primary priority content that is harmful to children, and our strategic priorities reflect this. As we have set out in our approach to implementing the Online Safety Act, once the regime is in force we expect change. Specifically, we anticipate implementation of the Act will ensure people in the UK are safer online by delivering four outcomes:
• stronger safety governance in online firms;
• online services designed and operated with safety in mind;
• choice for users so they can have meaningful control over their online experiences; and
• transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust.

We have set out that we will expect all in-scope services to have appropriate trust and safety measures tackling the full range of harms listed in the Act. In particular, we want to see wider deployment of, and improvements in, services' measures to address the areas which pose the greatest risk to people, including illegal and harmful suicide content, in order to protect UK users, especially children. We are committed to working with industry to ensure compliance with these duties, and to this end our draft illegal harms Codes of Practice include specific measures which we propose would allow services to meet their duties in an effective and proportionate manner. We will ensure that through consulting on our proposals we seek input and engagement from external experts. We will also work directly with services to promote compliance, including, where appropriate, through targeted supervision.

Where we identify non-compliance, we will not hesitate to take enforcement action where appropriate to protect users from harm. We will have the powers to require non-compliant service providers to take remedial action and to impose financial penalties where necessary, as well as powers to seek business disruption measures in serious cases.[11]

Evidence included in reports from coroners and other experts will play an important role in our policy proposals and response as we implement the regime, and we will of course take the evidence in your report into account as we continue our policy development. We hope that this response provides helpful information about the significant steps Ofcom is taking as we continue to work through the implementation of the Act. If further information or clarification is required, we would be happy to provide this.
British Transport Police
12 Feb 2024


12th February 2024

Mr Rogers,

Thank you for your correspondence to Assistant Chief Constable , British Transport Police, in the form of the Regulation 28 Report dated 19th December 2023 concerning the sad death of Chloe Elizabeth Macdermott.

May I first express my deepest condolences to Chloe’s family, friends and loved ones.

ACC is the National Police Chiefs' Council's lead for Suicide Prevention by virtue of being the current chair of the NPCC Suicide Prevention Steering Group. Under ACC 's leadership, the Steering Group aims to foster collaborative partnerships across England, Wales, and Scotland. The Steering Group's focus is on understanding, responding to, and preventing suicide, aligning with the NPCC Policing Vision 2025. The Steering Group's purpose includes developing a comprehensive understanding of suicide trends, formulating prevention strategies, providing a platform for networking and collaboration, establishing an evidence-based foundation for effective tactics, and offering guidance for a nationally consistent approach to suicide prevention.

Through this work, ACC has been made aware of recent concern about an increase in the number of people purchasing materials on international sites on the internet, particularly Sodium Nitrate and Nitrite, for the suspected purpose of suicide. Current intelligence suggests that purchase and use are decreasing; however, this remains a concern.

In collaboration with experts from the Department for Health and Social Care (DHSC), Home Office and other key stakeholders, NPCC Suicide Prevention Steering Group members have raised awareness and informed responses across the policing sector regarding the emerging trend of Sodium Nitrate and Nitrite use in suicides. We have also supported the National Crime Agency, which is conducting a criminal investigation into the supply of Sodium Nitrite by a specific foreign national and working closely with the CPS and international authorities.

In the UK, Sodium Nitrate and Nitrite are reportable substances (Sodium Nitrate as an Explosive Precursor and Sodium Nitrite as a Reportable Poison) under the Poisons Act. Given recent events, ACC has discussed the regulation or restriction of Sodium Nitrate and Nitrite with the Home Office's Chemical Reporting Team, and the Department of Health and Social Care has taken work forward to ensure, as far as possible, that Sodium Nitrate and Nitrite are not sold to individuals. Legal suppliers have been directed to mix the substances with other products, namely salt, which will lead to vomiting upon ingestion. We have been advised that there are no current plans to change the status of Sodium Nitrate or Nitrite from 'reportable'. The Home Office, in collaboration with legitimate suppliers, is proactively challenging suspicious sales. Some suppliers are considering a mixed product formulation to induce vomiting upon ingestion, thereby reducing the viability of Sodium Nitrate or Nitrite as a suicide method.

In October 2023, legislative changes were made regarding the sale of poisons and explosive precursors by suppliers based in England, Wales and Scotland. These changes, which cover substances classed as reportable Explosive Precursors (like Sodium Nitrate) or Poisons (like Sodium Nitrite), aim to reduce the risk of harm by setting out obligations for suppliers related to regulated and reportable substances. Since 1st October 2023, it has been a criminal offence to sell certain substances without an Explosives Precursors and Poisons (EPP) licence.

When specific intelligence indicates that an identifiable individual in the UK has purchased items like Sodium Nitrate and Nitrite, police will conduct a 'safe and well' check. Often, voluntary surrender of these items can be negotiated, with recovery and appropriate support referrals made. Where this is not possible, increased health surveillance measures are encouraged with partner agencies. In cases of death linked to chemical suicide, efforts are made to track shipments and download electronic devices to quickly identify the product's origin and prevent further victims.

Regarding police powers of entry, search, and seizure, Section 19 of PACE allows for the seizure of 'evidence' when an offence is established or suspected. However, the offence of Encouraging or Assisting Suicide under the Suicide Act 1961 (as amended in 2010) is complex. This is particularly true when the perpetrator operates from outside the UK, and/or the 'victim' is an active and willing participant. In such scenarios, the law is unclear, and there is no power of entry to a victim's premises unless an immediate threat to life is believed to exist under Section 17 of PACE. Identifying and seizing potential self-harm items in a typical household is impractical due to the ubiquity of such items.

ACC is aware of, and remains concerned about, the number of online suicide and self-harm content sites. Sites such as Sanctioned-Suicide are incredibly concerning, and as a result the NPCC has been working closely with Government colleagues to create and implement a range of actions to reduce and restrict access to such websites. The Government's new five-year Suicide Prevention Strategy for England, launched last year, contains numerous actions that will enable all relevant and influential organisations to support and achieve the Government's ambition. While the Government takes the lead in Suicide Prevention across England, working and collaborating with partners including law enforcement to limit access, share research, evidence and lessons learnt, and respond to and manage suppliers, we will deliver on the actions outlined in the Strategy.

ACC has engaged with various partners in UK Law Enforcement regarding the monitoring of online pro-suicide forums. However, this is resource-intensive, and at present there is no national agency charged with this responsibility; in any event, we believe there is no legal basis for a police force to pursue such an initiative. Such monitoring would involve a large number of forums, many of which do not facilitate crime, raising significant lawful data processing issues, including whether it amounts to a Bulk Personal Dataset or would require a Directed Surveillance Authorisation, and a potentially disproportionate interference under Article 8. Indeed, the National Crime Agency (NCA) has stated that this type of activity is not consistent with the NCA's overall mission of combatting serious and organised crime or the significant ambitions of the NCA Strategy, in particular in respect of degrading the most harmful organised crime groups. There is also a lack of fit with the Strategic Priorities set for the NCA by the Home Secretary.

Ofcom is the regulator under the new Online Safety Act, with responsibility for making online services safer for the people who use them. Services that fall under its remit will have to follow certain rules, including protecting users from illegal content and activity online as well as protecting children from harmful content. This includes content encouraging self-harm and/or suicide. The Act covers over 130 'priority offences'. The main duties in the Act relating to illegal content require services to assess the risk of harm arising from content or activity on the service and to take proportionate steps to manage and mitigate that risk. This includes preventing users from encountering content amounting to one of the offences and, for search services, minimising the risk of users encountering content amounting to an offence. Services also have duties to swiftly take down certain types of non-priority illegal content. The Act places new legal requirements on providers of the following three types of internet service: services that allow 'user-to-user' interactions or 'user-generated content', search services, and providers of pornographic content. The duties in the Act apply to services with links to the UK regardless of where in the world they are based. The Act grants Ofcom a range of enforcement powers and requires it to publish guidance on how it will exercise them. In January 2024, Ofcom launched the first of four consultations with stakeholders on putting into effect the illegal content duties and enforcement powers under the Act.

Recognising the importance of informed responses, the Steering Group has disseminated briefing materials to all NPCC force and regional suicide prevention leads. In June 2023, ACC wrote to all Chief Constables, highlighting the need for preparedness in addressing sodium nitrite related challenges. The Steering Group actively participates in cross-government efforts to address this suicide method. Our focus is on reducing access to sodium nitrate and limiting public awareness of its use in suicides, consistent with national strategies.

While the Steering Group plays a crucial role in suicide prevention strategy, it is important to note that the quality and extent of crime investigation, including the investigation of the death in question, are matters for individual Chief Constables. As such, ACC 's role does not extend to directing specific actions in investigations conducted by the Metropolitan Police or any other force. We are committed to preventing avoidable deaths and providing support to those at risk, continuously exploring innovative strategies to enhance our prevention efforts.

ACC believes that, where necessary, proportionate and reasonable steps within the law are being taken to identify imports of Sodium Nitrate and Nitrite and other emerging novel suicide methods, and that police forces and health professionals continue to provide an appropriate response and support to those identified as at risk. The NPCC remains committed to working with the Home Office and DHSC, who lead cross-government work on Suicide Prevention, to explore all options to prevent avoidable deaths. We hope that responsible and appropriate reporting is supported and adhered to at all times, with links to organisations that can and do support those who may find themselves in crisis at any time: https://www.samaritans.org/about-samaritans/media-guidelines/

Regards,

Inspector (on behalf of ACC )
Google
13 Feb 2024
Dear Paul Rogers,

Regulation 28 - Prevention of Future Deaths (Chloe Elizabeth MacDermott)

We refer to your Report on the Prevention of Future Deaths dated 19 December 2023 ("the Report"). This letter serves as a formal response to the queries and points raised within the Report.

We are deeply saddened to hear of the tragic circumstances relating to the death of Chloe Elizabeth MacDermott. We understand from the Report that prior to her death on 23 May 2021, Ms MacDermott researched ways to end her life using the internet and internet chat rooms, including a specific site ("the Site"). The Report highlights that the Site is a forum that permits material to be exchanged and viewed within chatrooms, including the exchange of information and methods of suicide. Further concerns are raised in the Report, including the lack of age restrictions on the Site, the lack of prominent self-help signposting on the Site and the lack of content moderation on the Site.

It is unclear from the Report whether Ms MacDermott used Google Search (or another search engine) to find the Site; nevertheless, we would like to take this opportunity to set out Search's approach to keeping users safe from harmful and illegal suicide and self-harm content. Google Search serves as an index of information on the open web. When an individual enters a search query, it uses algorithms to return search results linking to the relevant web pages in the index, ranked from most to least relevant. Safety is core to how Google develops and operates its services, and we understand our responsibility to keep users safe, while still ensuring the free flow of information. As with all of our search ranking systems, we're continually making improvements to ensure that we're providing people with the highest quality information possible, while also not showing people shocking or potentially harmful results that they are not explicitly seeking.

In relation to suicide and self-harm content, we take a combination of approaches, including (i) prohibiting policy-violative content in Search features; (ii) providing information and resources via hotline oneboxes; and (iii) providing specialised ranking approaches for suicide and self-harm queries. We set out more detail on these measures below.

We note that people use Search for suicide-related queries for many different reasons, including looking for support to manage their thoughts in moments of crisis, or seeking information as to how to support loved ones. We recognise how important it is to increase awareness around help-seeking behaviours, while decreasing risk-taking and reducing stigma. We have therefore developed the measures below through extensive consultation with both internal and external experts in psychology, mental health, and related areas. These include not only academics and clinicians, but also practitioners who provide direct services to vulnerable populations. We also note that some of these measures have been implemented since May 2021, as we are continually evolving our processes in response to information, like user behaviour and sensitive content types, to ensure we are promoting authoritative and trusted content, while demoting content that could be harmful to users. We also recognise the importance of implementing robust systems and processes that provide our users with a level of protection that surpasses legal requirements.

Policies and processes relating to suicidal content on Google Search

We are aware of the prevalence of user-to-user pro-suicide forums and the danger they pose. This means that we adopt a variety of measures to keep users safe from this type of content.

● Where Search receives a valid removal request under UK law, Google will delist the URL from the search results. The removal process we employ involves taking down URLs that are visible on Search. Our URL removal process is designed to target specific URLs that appear in our Search results, rather than implementing a blanket removal for entire domains. This approach allows us to address individual URLs that may contain unlawfully published content while preserving the accessibility of valuable and relevant information on the internet. For the avoidance of doubt, if Google were to receive a valid removal request relating to a URL featuring on the Site and it was determined to be illegal under UK law, the URL would be delisted.

● Content that promotes or glorifies suicide or self-harm can often fall into the category of legal but harmful, rather than illegal. Our ranking algorithms consistently classify these types of suicide forums as low quality, meaning that they will not feature highly in search results, particularly in response to general suicide-related queries. Conversely, higher quality results and information bars will be promoted (such as links to the Samaritans, NSPCC and other relevant charities). By integrating links and phone numbers on Search to reputable mental health organisations, Google Search aims to make it easier for users to seek help and guidance when they may need it. We note that, as it stands, the Site does not rank highly in Search for general queries related to self-harm or suicide.

● As a further measure to support both vulnerable users and the organisations they turn to for help, Google Search has taken proactive steps by working with the Samaritans to implement "OneBoxes", which display prominently in response to suicide-related queries. These information boxes are designed to provide users with quick and easy access to authoritative resources and support for mental health needs, including a 24-hour helpline and links to the Samaritans' official website. We also note that when users search specifically for the Site, our systems will surface this Samaritans OneBox, with a phone number for national hotlines.

● Google Search also enforces content policies on Search features, such as Autocomplete predictions, to prevent surfacing dangerous content, and we design our systems to avoid showing violative predictions. We don't allow content that could directly facilitate serious and immediate harm to people (such as content that promotes self-harm, or content that provides instructions on committing suicide).1 This means that Google will not predict a search query such as "how to commit suicide". In response to concerns about this Site, Google will also not predict a search query for the Site or the specific method of suicide mentioned in the Report (e.g. by predicting "sanctioned suicide" or "suicide sodium nitrate").

As outlined above, many websites featuring suicide content are not necessarily currently illegal under UK law, and nor are search engines required by law to delist them. Google strongly supports the objectives of the Online Safety Act 2023 ("OSA"), which will give the industry regulator, Ofcom, enforcement powers against user-to-user services (like the Site) where such services do not have sufficient systems and processes to protect users from illegal content or children from content that is harmful to them. The regime will also give clarity over circumstances where search engines will be required to minimise users' access to certain content. Google remains committed to continually improving services to prevent users from finding and experiencing harmful content, whilst also providing users with authoritative information on wide-ranging topics.

To the extent that you have any specific queries about Google Search, please direct them to Google LLC, the entity that provides this service on behalf of UK users. Google LLC is a US company incorporated in Delaware, with its principal place of business at 1600 Amphitheatre Parkway, Mountain View, California, CA 94043, United States. Otherwise, if we can be of any further assistance on this matter, please do not hesitate to contact us.
DSIT
21 Feb 2024
Dear Mr Rogers,

Thank you for your correspondence of 3 January to the Secretary of State for Culture, Media and Sport, , regarding your Regulation 28: Report to Prevent Future Deaths regarding the tragic death of Chloe Elizabeth MacDermott. Your correspondence was passed to the Department for Science, Innovation and Technology due to our responsibility for points raised in the report, including the Online Safety Act (the act).

First and foremost, I would like to extend my deepest condolences to the family and friends of Chloe. The government recognises that the internet can, in some cases, be used to access appalling content with devastating consequences. As you may know, the act received Royal Assent in October last year. This legislation will force companies to take more responsibility for the safety of their users. I will address matters (2)-(8) from the report in my response, as per my department's remit and having engaged with the Ministry of Justice on the areas which fall within their remit.

As you will be aware, under section 2(1) of the Suicide Act 1961 (as amended by section 59 of the Coroners and Justice Act 2009) it is an offence for a person to do an act capable of encouraging or assisting the suicide or attempted suicide of another person, with the intention that their act will encourage or assist the other person to commit or attempt to commit suicide. The person committing the offence need not know the other person or even be able to identify them. An offence may be committed whether or not a suicide or attempted suicide takes place. Section 2(1) of the Suicide Act 1961 applies when the act of encouraging or assisting suicide is committed in England or Wales. Forums causing concern, such as the one mentioned in your report, are often hosted outside the UK, which may prevent successful prosecution due to issues related to jurisdiction and enforcement. Under common law rules in cross-border cases, however, the courts of England and Wales will have jurisdiction over an act done outside the jurisdiction if it has a "substantial connection" with the jurisdiction (unless it could be argued that the conduct ought to be dealt with by the courts of another country). Whether or not this test is met in any individual case will be a matter for the Court.

Providing information about or discussing the issue of suicide, where there is no intention to encourage or assist suicide, is not an offence. There can, of course, be a fine line between simply providing information about suicide, sharing content that unintentionally has the effect on an individual of encouraging them to take their life, and someone intentionally encouraging a person to take their own life. There is a difficult balance to be struck between reducing the availability of harmful material and allowing the positive contribution that the internet can provide for those seeking help and support to overcome suicidal feelings. The 1961 criminal offence has a high threshold to avoid criminalising people who are expressing suicidal feelings and those offering them support.

I will now turn to the provisions in the act, and how the regulatory framework will address online suicide content, such as the type of content that Chloe may have accessed before her death. The act's illegal safety duties will apply to all in-scope 'user-to-user' services which allow users to post content online, or to interact with each other - including online forums. This means that once these duties are in force, such services will be required to have systems and processes in place to proactively prevent users from encountering priority illegal content via their service and to minimise the length of time for which such content is present. The priority offences are set out in schedules 5, 6 and 7 and include content that amounts to an offence under the Suicide Act 1961. This will help to protect all users - children and adults - from encountering this appalling content.

The act also creates a new offence of intentionally encouraging or assisting serious self-harm. This offence applies to acts committed outside the UK if the act is committed by a person habitually resident in the UK. In other circumstances, the common law rules on jurisdiction will apply, as with offences under the Suicide Act 1961. All in-scope user-to-user services will be required to put in place systems and processes that allow them to rapidly remove content where it reaches the threshold of this offence.

Search services will also have targeted duties that focus on mitigating and minimising the risk of users encountering illegal search content, including illegal suicide and self-harm search content. This includes, under section 27(4), a requirement for search services to take or use, where proportionate, measures relating to user support. The regulator now responsible for online safety, Ofcom, will recommend measures that search services can put in place to achieve these objectives. These could include removing results for sites that are known to host illegal suicide and self-harm content, such as the forum you reference in your report, as well as signposting users that search for suicide methods away from this material and towards sources of support, which you highlight as a measure that was non-existent on the site Chloe had accessed. These duties will play a key role in reducing the volume of user traffic directed to websites with illegal suicide and self-harm content, reducing the ease with which users can find these kinds of horrifying sites and content.

It is especially important that children are protected from encountering content that may encourage them to self-harm or take their life. The act therefore places additional requirements on relevant in-scope online services to shield their child users from content that is legal but nonetheless presents a significant risk of harm to them. All in-scope user-to-user services that are likely to be accessed by children will need to put in place systems and processes to prevent children of all ages from encountering legal content that encourages, promotes, or provides instructions for suicide or deliberate self-injury. Ofcom will recommend the measures companies can put in place to fulfil this objective. Many services prohibit this kind of content on their service, but where services actively permit it, the act places more robust requirements on them, in acknowledgement of the serious risk this content poses to children. User-to-user services that allow these types of content are required to put in place systems and processes to prevent children from accessing this damaging content, including the use of highly effective age verification or age estimation solutions.

The government is acutely aware that users may decide to use virtual private networks (VPNs) to bypass protections, which is an issue you have raised. Under the act, in-scope services will need to consider any risks arising from how their service is used as part of their illegal content and child safety risk assessment duties, and then effectively mitigate and manage the risks that they identify. Service providers may therefore be required to think about how safety measures they put in place could be circumvented - including through the use of VPNs - and take steps to prevent that, or risk being treated as non-compliant. Ofcom is currently running a public consultation on its draft 'illegal content duties' codes of practice; details are on its website: https://www.ofcom.org.uk/consultations-and-statements/category-1/protecting-people-from-illegal-content-online.

There will be sites and services that choose not to comply with the act's regulatory framework. In these instances, Ofcom has a suite of robust enforcement powers to support its regulatory functions. These extend to instances where companies are based overseas but have a significant number of UK users or the UK as a target market. Ofcom's powers include the ability to apply to the court for business disruption measures. These are court orders that require third parties (those who provide an access facility to such services, such as internet service providers) to prevent, restrict or deter access to non-compliant services in extreme circumstances.

Every suicide is a tragedy, and it is essential that the vital protections delivered for users by the act come into force as soon as possible. My department is working closely with Ofcom to ensure that the implementation period for the framework is as short as possible, so that all users, especially vulnerable users, get the online protections they so greatly need and deserve.
DHSC
8 May 2024
Dear Mr Rogers,

Thank you for your correspondence of 19 December to the Secretary of State for Health and Social Care about the death of Chloe Elizabeth MacDermott. I am replying as Minister with responsibility for mental health and suicide prevention, and I thank you for the additional time provided to the department to respond.

Firstly, I would like to say how deeply saddened I was to read of the tragic circumstances of Mrs MacDermott’s death, and I offer my sincere condolences to her family and loved ones. The circumstances your report describes are very concerning and I am grateful to you for bringing these matters to my attention.

I would like to assure you that the Government remains concerned about the prevalence of suicide and self-harm content online. The Government is taking a leading role in tackling methods of suicide, collaborating with partners across the world in policy, law enforcement and society more broadly to limit access and to share research, evidence and lessons learned. There are multiple actions in place to reduce and restrict access to the website referenced in your report, and others like it. This will include seeking to tackle at source the suppliers of harmful substances for the purposes of suicide.

The Department leads a cross-government and cross-sector group established specifically to identify and proactively tackle emerging methods of suicide. This involves close working across government and with others to ensure we are taking rapid, targeted action to address these methods, and the group has been prioritising tackling sodium nitrite. Through this close working, there are currently over 30 live actions and interventions that collectively are:

• reducing public access to methods, including by reducing the sale and importation of methods where appropriate;
• reducing references to, and limiting awareness of, emerging methods, including by tackling online content and working with the media to ensure responsible reporting; and
• monitoring data and trends to inform rapid and targeted responses, improving the data we collect, and how that information is best shared to inform responses.

The suicide prevention strategy for England, published on 11 September 2023, identifies promoting online safety and tackling methods of suicide as priority areas for action. In November 2023 the Department also launched a national near-real-time suspected suicide surveillance system to improve the timeliness of reporting and of action to prevent suicides. This will support the rollout of a new national alert system to notify schools, universities and charities of emerging methods of suicide and associated risks.

To address your concerns about unfettered access to harmful content and about restricting such content, we have made enquiries with the Department for Science, Innovation and Technology (DSIT). Under the Online Safety Act 2023 (OSA), all in-scope services, such as user-to-user platforms and search services, will have new duties to prevent users being harmed by illegal content that they encounter via their services. User-to-user platforms will also need to take steps to reduce the risk that their services are used to perpetrate offences. These duties extend to the unlawful supply, or offer to supply, of controlled drugs: platforms and search services will need to take steps to prevent users encountering content relating to the illegal sale of drugs via their services, and platforms will need to remove this content when it does appear. You may wish to contact DSIT for more detailed information regarding how the Online Safety Act will address illegal and harmful self-harm and suicide content.

The form of the substance or product used by the deceased is not clear from your report. My department has written to you separately (e-mail of 28 March) to clarify this point and will be happy to look again if this information is supplied. In the meantime, we have made some enquiries with the Food Standards Agency (FSA) which may be of interest to you.

Responsibility for the regulations that cover the use of nitrates and nitrites in food sits with the FSA in England, Wales and Northern Ireland; in Scotland, responsibility rests with Food Standards Scotland (FSS). Nitrates and nitrites are and remain important preservatives, and are one of many ways that a food business can choose to hinder the growth of harmful micro-organisms. All food additives, including nitrates and nitrites (E 249 – E 252), are subject to assimilated legislation, Regulation (EC) No 1333/2008 on food additives, which establishes conditions of use for all food additives authorised in Great Britain. The group of nitrates/nitrites regulated includes E 251, sodium nitrate. The legislation sets out the acceptable conditions of use, the foods in which they may be used and, where necessary, maximum permitted levels. The FSA website provides information on permitted food additives for the public and food business operators.

Nitrate is commonly found in curing salt (also known as ‘pink salt’, or the branded product called ‘Prague powder’, which is a mixture of table salt and sodium nitrite, or of table salt, sodium nitrate and sodium nitrite), which can be sold online for home curing and is often sold as a component of DIY sets such as ‘make your own ham, bacon and chorizo’ kits. However, there are mandatory labelling provisions in assimilated Regulation (EU) No 1169/2011, which requires foods, including food additives, to be labelled with conditions for use and instructions for use, including any required dilution factors. The DIY kits come with instructions on safe use. When sold to the final consumer, such kits must be labelled as ‘for food use’ and can only be sold in a mixture with salt or a salt substitute. If a food does not bear the appropriate safety labelling, then under the assimilated General Food Safety legislation, Regulation (EC) No 178/2002, such an omission would mean the product cannot lawfully be placed on the market. If found to be non-compliant with the legislation, the product can be recalled and there can be prosecution for placing unsafe food on the market.

I note that the Home Office is a recipient of this report and, for your concerns around border and/or customs controls, I refer you to the Border Agency as the lead on preventing the importation of drugs at UK borders. A priority of the Government’s 10-year drug strategy is to “break drug supply chains.” The Government made further commitments in its Serious and Organised Crime Strategy 2023-2028, including to deliver an “end-to-end plan to tackle drugs supply”, which includes strengthening border controls on illicit commodities. The National Crime Agency (NCA) leads and coordinates the UK law enforcement response to serious and organised crime and may also be able to provide further information on this topic.

I hope this response is helpful. Thank you for bringing these important concerns to my attention.
Action Should Be Taken
It is for each addressee to respond to matters relevant to them.
Report Sections
Investigation and Inquest
On the 5th December 2023 evidence was heard touching the death of Chloe Elizabeth MACDERMOTT. She died on 23rd May 2021 aged 43 years.

Medical Cause of Death

I (a) toxicity

How, when, where Chloe Elizabeth MACDERMOTT came by her death: Chloe Elizabeth Macdermott had been struggling with her mental health for some years prior to her death. She became increasingly suicidal and researched ways to end her life. On or about 21st May 2021 she formed an association with two other persons with whom she planned to end her life. She had purchased using Amazon US. On 22nd May 2021, whilst her husband was away from home, she contacted the persons with whom she had discussed committing suicide and an agreement was made to act that night. Chloe and one other person in a different part of the UK ingested around midnight between 22nd and 23rd May 2021. Chloe died in the early hours of 23rd May 2021 from the effects of toxicity on her bed in her home.

Conclusion of the Coroner as to the death:

Suicide

Circumstances of the death:

Extensive evidence was heard by the court in the form of written and oral evidence, including expert evidence.

Of particular significance for the purpose of this report are the following matters:

(1) Chloe was able to purchase the product used over the internet and have it delivered to her home in the UK. Enquiries showed the product was purchased using Amazon in the United States.

(2) and other such forums encourage suicide, assist it by provision of information about suicide methods, counsel suicide by providing information about it and thereby potentially facilitate the commission of a criminal offence in the United Kingdom.

Matters of Concern:

(3) is a forum that permits material to be exchanged and viewed within its open chatrooms whereby suicide is encouraged, assisted, counselled and procured through the provision and exchange of information and methods.

(4)

(5) No age or other restrictions are in place to prevent access by children, vulnerable teenagers and vulnerable adults.

(6) No prominent signposting is in place to organisations from whom help is available to prevent suicide.

(7) Posts are made by users containing details of methods of suicide without any effective administration to remove such harmful content.

(8)

(9) The availability of through the internet and its delivery to individual users in the UK with a non-commercial or agricultural use.

(10) The ability for UK users to purchase through Amazon in the United States and to take delivery in the United Kingdom without effective border and/or customs controls.
Related Inquiry Recommendations

Public inquiry recommendations addressing similar themes

Pre-screening by Internet Providers · IICSA · Harmful Algorithmic Content Promotion
Age Verification Online · IICSA · Harmful Algorithmic Content Promotion
Publish interim online harms code of practice · IICSA · Harmful Algorithmic Content Promotion
Pre-screen material before upload · IICSA · Harmful Algorithmic Content Promotion

Data sourced from Courts and Tribunals Judiciary under the Open Government Licence.