Rhiannon Williams

PFD Report Ref: 2025-0139
Date of report: 12 March 2025
Coroner: Kirsten Heaven
56-day response deadline: 7 May 2025
Status: all responses received (2)
About PFD responses

Organisations named in PFD reports must respond within 56 days explaining what actions they are taking.

Source: Courts and Tribunals Judiciary

Coroner’s Concerns
During the inquest the evidence revealed matters giving rise to a concern. In my opinion there is a risk that future deaths will occur unless action is taken. In the circumstances it is my statutory duty to make a report under paragraph 7, Schedule 5 of the Coroners and Justice Act 2009 and Regulations 28 and 29 of the Coroners (Investigations) Regulations 2013.

Rhiannon accessed an online ‘suicide forum’ where she was able to find information on how to take her own life and where she obtained advice and information on misleading professionals and her family as to her thoughts and intentions. Rhiannon also accessed a social media platform to obtain information about the method she used to take her own life. A similar concern in respect of ‘suicide forums’ was raised by Patricia Harding, Senior Coroner for Central and South East Kent, in a Prevention of Future Deaths report of 2019. The response from the Department for Digital, Culture, Media & Sport of 14 January 2020 referred to the Online Harms White Paper. I am aware of the Online Safety Act 2023 and also of BBC reporting (https://www.bbc.co.uk/news/uk-67082224) touching upon whether this Act is in fact sufficient to deal with the risks posed by ‘suicide websites’. I am concerned about the risk to life posed by the website and social media platform considered in this inquest and so draw them to your attention.
Responses
Department for Science, Innovation and Technology
6 May 2025
AI summary: The Department outlines the existing Online Safety Act framework and Ofcom's role in enforcement, noting Ofcom's investigation into a suicide forum. DSIT officials continue to work with DHSC on the Suicide Prevention Strategy, which prioritises online safety.
Dear Ms Heaven,

Thank you for your Regulation 28 Report to Prevent Future Deaths dated 12 March 2025, regarding the tragic death of Rhiannon Williams. First and foremost, I would like to express my sincere condolences to the family and friends of Rhiannon. Every death is a tragedy, all the more so when it involves a young person who had their whole life ahead of them. The circumstances of Rhiannon’s death are deeply concerning, and I am grateful to you for bringing these matters to my attention.

As you reference in your report, the Online Safety Act (‘the Act’) received Royal Assent in October 2023. The Act lays the foundation for strong protections against illegal content for all users and harmful material for children, and the government is committed to working with Ofcom, the independent regulator, to ensure it is implemented quickly and effectively.

As the Act is a regulatory framework, I turn first to the relevant criminal law in this area, which is the responsibility of the Ministry of Justice. As you will be aware, under section 2(1) of the Suicide Act 1961 it is an offence for a person to do an act capable of encouraging or assisting the suicide or attempted suicide of another person, with the intention that their act will encourage or assist suicide or an attempt at suicide. The criminal offence has a high threshold to avoid criminalising people who are expressing suicidal feelings and those offering them support by, for example, sharing their own experiences. Whether a prosecution for encouraging or assisting suicide can be brought is a matter for the Crown Prosecution Service following a police investigation.

I will now turn to the regulatory framework of the Online Safety Act, which tackles illegal and legal forms of online suicide content in a number of ways. The Act's duties apply to search services and platforms that allow users to post content online or to interact with each other. This includes a broad range of websites, apps and other services, including social media services, consumer cloud storage sites, video sharing platforms, online forums, dating services, and online instant messaging services. It applies to services even if the companies providing them are outside the UK. As of 17 March 2025, the Act’s illegal safety duties came into force, requiring in-scope services to have proportionate systems and processes in place to proactively prevent users from encountering priority illegal content via their service, and to minimise the length of time for which such content is present.
Content is ‘priority illegal content’ where it amounts to a ‘priority offence’. Under the Act, intentionally encouraging or assisting suicide is a priority offence. Priority offences reflect the most serious and prevalent illegal content and activity, against which companies must take proactive measures, as well as ensuring their services are not used to facilitate or commit a priority offence. Under the Act, all in-scope services (regardless of their size) must now proactively prevent all users from being exposed to illegal user-generated suicide content. The Act also introduced a new communications offence which criminalises relevant acts capable of intentionally encouraging or assisting the serious self-harm of another person, and this was commenced on 31 January 2024.

On 5 March, Ofcom published an article setting out what online service providers operating in the UK need to do to protect people from suicide and self-harm material. On 3 March 2025, in advance of the illegal safety duties coming into force, Ofcom announced that it had launched an enforcement programme to assess industry compliance with the first set of duties for in-scope services to submit illegal harms risk assessments to the regulator. Services are required to keep these assessments up to date and to update them before making a significant change to the design or operation of their service which could put their users at risk. They will ensure that providers thoroughly identify the risks on their own sites, enabling them to mitigate and manage potential harms arising from them. Ofcom were clear that platforms that failed to provide a sufficient response by 31 March could face enforcement action. On 9 April, Ofcom launched an investigation into whether the provider of an online pro-suicide forum has failed to comply with its duties under the Act. This enforcement action taken by Ofcom is the first investigation opened under these new laws, demonstrating that pro-suicide forums are a high priority for enforcement.

The strongest protections in the Act are for children. In-scope services that are likely to be accessed by children will have a duty to protect children from encountering legal but nonetheless harmful content. This includes taking steps to prevent children from encountering ‘primary priority content’, which includes content that encourages, promotes, or provides instructions for self-harm, eating disorders or suicide. Providers of user-to-user services have a duty to use highly effective age assurance (HEAA) to prevent children encountering this type of content. The child safety codes of practice are currently being finalised by Ofcom and will set out ways in which providers can comply with the child safety duties.
They are expected to be laid in Parliament in April 2025, with the child online safety regime expected to be fully in effect by Summer 2025.

We also know that users may find or encounter online suicide content via search services. Under the Act, search services have targeted duties that focus on minimising the risk of users encountering illegal search content, including illegal suicide and self-harm search content. In practice, this could look like removing results for sites that are known to host illegal suicide and self-harm content. These duties also include a requirement to take or use, where proportionate, user support measures, which might, for example, signpost users towards sources of support. Combined, these duties are expected to play a key role in reducing the volume of user traffic directed to websites with illegal suicide and self-harm content, reducing the ease with which users can find these kinds of sites and content. In addition, search services must minimise the risk of children encountering primary priority content (PPC) in search results or on the pages they land on when they click on them.

In your report you express particular concern around content found on online pro-suicide forums. I share your concern around these ‘small but risky’ sites, which is why in September 2024 I wrote to Ofcom’s CEO about their approach to regulating these services. Ofcom have established a dedicated supervision taskforce which is developing and delivering a workplan focussing on high priority themes like suicide offences. Additionally, such services are required to comply with the existing illegal content and child safety duties of the Act which are already in force. As has been shown by Ofcom’s investigation on 9 April, if there is evidence of non-compliance Ofcom will undertake enforcement action against such services. Ofcom’s enforcement powers are robust and will promote strong compliance with the Act.
Ofcom will be able to direct platforms to come into compliance, issue substantial fines to non-compliant services (up to 10% of qualifying worldwide revenue or £18 million, whichever is greater), or apply to the courts for business disruption measures that require third parties to withdraw their services from, or restrict access to, non-compliant services in the UK. Ofcom can take enforcement action against any company that provides services to UK users, wherever they are located. This is necessary given the global nature of the online world.

This government has committed to tackling suicide and will continue to look at ways we can extend protections under the Act, while maintaining our focus on swift implementation. Alongside this, officials from my department continue to work closely with officials from the Department of Health and Social Care (DHSC), who coordinate the government’s approach to preventing suicides and improving people’s mental health, and lead on the Suicide Prevention Strategy for England. The Strategy identified online safety as a priority area for action and committed to tackling harmful suicide and self-harm content. In addition, the Strategy proposes targeted support for priority groups such as children and young people.

I would like to thank you for bringing your concerns to my attention. I hope that this response is helpful and demonstrates the government's commitment to tackling suicide, and my own commitment as the Secretary of State responsible for the Online Safety Act, to work with Ofcom to ensure the Act is implemented as quickly and effectively as possible.
OFCOM
7 May 2025
AI summary: Ofcom has opened an investigation into the specific suicide forum mentioned in the report to assess its compliance with the Online Safety Act 2023. They are committed to holding providers accountable and will take enforcement action for non-compliance.
Dear Ms Heaven,

I write in response to your Regulation 28 Report to Prevent Future Deaths (‘the Report’), dated 12 March 2025, following the inquest into the death of Rhiannon Auriol Williams. First, on behalf of Ofcom I would like to offer our deepest condolences to Ms Williams’ family and loved ones following their tragic loss, and in such terrible circumstances. I would like to thank the Coroner for bringing to our attention the potential role of suicide forums and social media platforms in the tragic circumstances of Ms Williams’ death. This is important to us as we begin to implement and enforce the Online Safety Act 2023 (‘the Act’). I would like to assure you and Ms Williams’ family that Ofcom is committed to taking action to ensure that online suicide forums and social media platforms take seriously their new duties to protect people from illegal and harmful suicide content.
1. Overview of our response to the Report

In the Report, you set out your concerns regarding how Ms Williams was able to:
• access advice and information about suicide methods on a suicide forum ( ) and social media (TikTok), and
• obtain advice on how to mislead her family and professionals as to her thoughts and intentions on a suicide forum ( ).

You also included a link to a 2023 BBC article which discussed the suicide forum in question and touched upon whether the Online Safety Act would be sufficient to deal with the risk posed by suicide forums. This letter sets out Ofcom’s response to these concerns in more detail below (see page 6), including more detail on the investigation opened into ( ) on 9 April 2025. In announcing this investigation publicly, we opted not to name ( ) as the suicide forum we are investigating owing to the risk of increasing traffic to the forum. We would ask that you bear this in mind if stating or publishing anything publicly relating to this response.

Before setting out Ofcom’s response to the concerns in the Report, it may be useful to explain the relevant duties that the Act imposes on Ofcom and on online user-to-user and search services,1 the

1 ‘User-to-user services’ and ‘Search services’ are terms used and defined in the Act. A user-to-user service (U2U service) is an internet service by means of which content that is generated directly on the service by a user of the service or uploaded

Classification: HIGHLY SENSITIVE

timelines for the duties under the Act coming fully into force, and the three different phases of our implementation of the Act.
2. Background on Ofcom’s implementation of the Online Safety Act

The Online Safety Act 2023

The Act makes persons that operate a wide range of online services legally responsible for keeping people safer online. It covers companies that provide user-to-user or search services and have links to the UK, which is understood according to the following criteria:
• Has a significant number of UK users; or
• Has UK users as one of its target markets; or
• Is capable of being used by UK users, and there are reasonable grounds to believe that there is a material risk of significant harm to UK users.

Any service which meets one or more of the above criteria, and which is not exempt,2 will be expected to comply with the relevant duties under the Act. Among other things, the Act also:
• Makes Ofcom the regulator for online safety and confers upon us a number of powers and duties which we discuss in more detail below.
• Requires Ofcom to issue a number of regulatory publications to help regulated services understand how they can comply with their legal duties (discussed further below). These include Codes of Practice (for example our Illegal Harms user-to-user Codes of Practice) setting out recommended measures service providers can take to mitigate risks of harm in compliance with their duties, and resources to help companies assess, understand and manage risk. Examples of such documents that are most relevant to the Report are Ofcom’s Illegal Harms Risk Assessment Guidance, Risk Profiles, Register of Risks and Illegal Content Judgements Guidance.

Provisions of the Act: legal duties on service providers

The Act imposes a number of duties on service providers, in order to improve their systems and processes to keep their users safe from encountering harms. These include:
• Duties on user-to-user and search service providers to assess the risks their services pose to users in relation to illegal content (including illegal suicide content); and content that is harmful to children (including content that encourages, promotes or provides instructions for suicide), and take steps to mitigate and manage those risks;
• Duties on user-to-user service providers to swiftly take down illegal content (including illegal suicide content) when it is identified, and to prevent children from encountering content that is harmful to them (including content which encourages, promotes or provides instructions for suicide); and

to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service. Search services are services that are, or include, a search engine, which allows users to search more than one website or database. 2 A number of exemptions also apply as set out in Schedule 1 of the Act. See: Overview of regulated services.

• Additional duties for categorised service providers, for example providing, to the extent that it is proportionate to do so, features which adult users may use or apply if they wish to increase their control over certain kinds of content (including content that encourages, promotes or provides instructions for suicide).

Timeline for duties under the Act coming fully into force

Although the Act is now law, there are numerous procedural steps needed for the new regime to be fully implemented, and these steps need to be completed before all of the legal duties providers need to comply with under the Act – and Ofcom’s ability to enforce those duties – come into force. Below, we set out more detail on the different phases leading to the implementation of the Act and their enforcement status at the time of writing.

As part of our preparatory work for implementation, we actively engaged with a range of expert stakeholders including government, law enforcement, and charities to develop our understanding, expertise and evidence base in relation to suicide, and to ensure that we are aware of developing areas of risk. We have also commissioned research to grow internal expertise and develop our evidence base in this complex and important harms area.3 We will continue our engagement with relevant experts as the programme evolves. For more information on our timelines for implementation, we published our updated roadmap for the main stages of our work in October 2024, and our intended plans for implementation are summarised in diagram form in Figure 1 in the annex below.

Phase One: Illegal Harms

On 16 December 2024 we published our statement on protecting people from illegal harms online (‘Illegal Harms Statement’), which included our final Illegal Harms Codes of Practice and guidance. On the same day, the Codes were submitted to the Secretary of State, who laid them in Parliament for 40 days.
Concurrently, regulated service providers had until 16 March 2025 to undertake their illegal content risk assessments. Following approval by Parliament, the Codes came into force on 17 March
2025. This means the illegal harms safety duties are now enforceable, and we can take enforcement action against providers that are not complying with their illegal harms duties. We can begin investigations and – following the conclusion of those – impose sanctions if we find that services are not compliant with these duties.

Illegal Harms – Assessing and mitigating risk of illegal harms

As explained above, the Act requires Ofcom to produce a Register of Risks for Illegal Harms and guidance to assist services in conducting their own risk assessments. Our final Risk Assessment Guidance sets out a four-step process which we propose as the best way to ensure that a service’s assessments meet their obligations. If regulated service providers concluded from their risk assessment that they are at risk of illegal harm(s), they need to take steps to mitigate the risk of harm they have identified on their service. The Act also requires Ofcom to produce Codes of Practice setting out the measures that in-scope providers may take to comply with their duties under the Act.4 According to our Illegal Harms Codes

3 See, for example, our research on suicide content and search services: ‘One Click Away: a study on the prevalence of non-suicidal self injury, suicide, and eating disorder content accessible by search engines’. See also our research into children’s experience of suicide, self-harm and eating disorders content: ‘Experiences of children encountering online content relating to eating disorders, self-harm and suicide’. 4 Section 41 of the Act.

of Practice,5 user-to-user service providers that have concluded they are at risk of illegal suicide content being posted on their service should take the following steps to mitigate the harm from this material:
• Ensure their content moderation teams are appropriately resourced and trained to deal with that content;
• Allow users to report illegal content, which includes suicide material, through reporting and complaints processes that are easy to find, access and use;
• Test their algorithms, checking whether and how design changes impact the risk of illegal content, including illegal suicide content, being recommended to users;
• Set clear and accessible terms and conditions explaining how users will be protected from illegal content, which includes suicide material; and
• Have systems for removing illegal content, which includes illegal suicide material, quickly as soon as they become aware of it.

While service providers are not required to implement all measures in our Codes of Practice, in the event that they choose not to take the steps recommended, they will need to be able to explain how their chosen approach allows them to be compliant with their legal duties.

Phase Two: Protecting children from harms online

Although the Report concerns the death of an adult, we take this opportunity to set out information about services’ duties to protect children from content harmful to them. This is because such content includes content that encourages, promotes or provides instructions for suicide without intent. Services that are likely to be accessed by children have additional duties to protect children’s online safety. Our Protection of Children Statement, where we set out our final decisions in relation to children’s risk assessments and safety measures, was published on 24 April 2025. The Government laid the Codes in Parliament on the same day, and subject to Parliamentary approval they will become enforceable from 25 July 2025.

Protecting children – Assessing access and risks

We published our final Children’s Access Assessments Guidance in January 2025, and all user-to-user and search services were required to carry out Children’s Access Assessments by 16 April 2025. A children’s access assessment is a process for establishing whether a service is ‘likely to be accessed by children’ within the meaning of the Act.6 If providers concluded that their service is likely to be accessed by children, they now (following the publication of our Protection of Children Statement) need to carry out a children’s risk assessment by 24 July 2025 and comply with the children’s safety duties in the Act. The Act requires Ofcom to produce a Children’s Register of Risks, and guidance to assist services in conducting their own children’s risk assessments.
Our Children’s Risk Assessment Guidance sets out a four-step risk assessment process which we propose as the best way to ensure that services meet their risk assessment obligations. Following the outcome of the children’s risk assessment, service providers will need to take steps to mitigate the risk(s) of harm they have identified on their service.

5 See here for the illegal content Codes of Practice for search services. See here for the illegal content Codes of Practice for user-to-user services. 6 Section 37 of the Act explains the meaning of “likely to be accessed by children”.

Protecting children – Mitigating the risk of harm

The Act imposes duties relating to content which is harmful to children. The Act explains that content which “encourages, promotes or provides instructions for suicide” (‘suicide content’) is a form of ‘primary priority content that is harmful to children’.7 The Act is clear that primary priority content should not be shown to children. Where regulated services are likely to be accessed by children, they will also have to understand the risks of, and take steps to prevent (in the case of user-to-user services) or minimise the risk of (in the case of search services), child users encountering suicide content. The Protection of Children Codes of Practice set out the measures that service providers can take to comply with the children’s safety duties across a number of areas, including age assurance; recommender systems; content moderation; governance and accountability; user support; and user reporting and complaints.8

Phase Three: Additional duties on categorised services

All online service providers within scope of the Act must protect all UK users from illegal content and, where applicable, protect children from content that is harmful to them. In addition, a small proportion of these service providers will be categorised and designated as Category 1 (user-to-user), 2A (search) or 2B (user-to-user) services if they meet certain thresholds set out in secondary legislation by Government. These categorised service providers will be required to comply with a range of additional requirements, largely focused on bringing an appropriate level of safety, transparency, and accountability to the online world, reflecting the nature of such services. Parliament confirmed the thresholds for Categories 1, 2A and 2B in law on 27 February 2025.
Examples of duties some of these providers will need to comply with include transparency reporting and providing, to the extent that it is proportionate to do so, features which adult users may use or apply if they wish to increase their control over certain kinds of content, including legal content which encourages, promotes or provides instructions for suicide. We consulted on transparency reporting requirements in Summer 2024 and hope to publish the first register of categorised services in Summer 2025. We expect to publish our consultation on all other additional duties no later than early 2026.

7 The Act requires Ofcom to provide guidance for providers of user-to-user and search services, which gives examples of content, or kinds of content, that Ofcom considers to be, or considers not be, harmful to children (‘Guidance on Content Harmful to Children’). This is intended to support service providers that may need to make judgements about whether content on their service amounts to content that is harmful to children as defined in the Act. 8 For more information, see here for the Protection of Children Code of Practice for user-to-user services. See here for the Protection of Children Code of Practice for search services.

3. Our response to the ‘Matters of Concern’

In the Report, you express concern that Ms Williams was able to access information about suicide methods on ( ) and TikTok. You also express concern that Ms Williams was able to obtain advice on ( ) on how to mislead her family and professionals as to her thoughts and intentions. Please see below our response to your concerns, explaining how Ofcom’s implementation of the Act will help mitigate the risk to life posed by illegal or harmful content on services such as ( ) and similar service providers, and social media platforms such as TikTok.

Whether content of the type that Ms Williams was exposed to is illegal under the Act

Our Illegal Content Judgements Guidance (ICJG) sets out that it is an offence to intentionally encourage or assist the suicide (or attempted suicide) of another person. Online material which amounts to this offence is considered to be illegal suicide content under the Act. The ICJG chapter on the suicide offence states that “content which provides specific, practical or instructional information on suicide methods is likely to be capable of constituting assistance for the purposes of this offence in an online context”. For example, this could include “details on the most effective way of taking one’s own life, or tips about how to do so in a way which avoids interruption from others.”9 We further say that content which constitutes assistance is only illegal “if there is also intent, on the part of the poster, to assist suicide, or attempted suicide.”10 The type of material that you express concern about in the Report, namely details about suicide methods, may therefore amount to the suicide offence under the Act depending on the circumstances. If so, it would be considered as illegal content under the Act.
Service providers in scope of the Act, which include suicide forums and social media platforms, have a duty to implement measures, such as content moderation systems, to protect UK users from such content and ultimately remove it when they become aware of it.

Ongoing action undertaken by Ofcom

Ensuring that UK users – and especially children – are protected from illegal content, including content that amounts to the suicide offence, is a priority for Ofcom. Now that the illegal harms duties are in force, we are taking a range of actions to drive compliance in this area. Ofcom is committed to driving change through both supervisory engagement and enforcement action, as appropriate, and is focusing its resources on the largest and/or riskiest user-to-user and search providers. This includes both large social media platforms that may contain content relating to suicide, and small dedicated online forums that feature discussion of suicide and therefore present a high risk of harm to UK users.

Targeted engagement through supervision

In addition to our formal powers to ensure that service providers comply with their duties under the Act, we have established Supervision teams which are already actively engaging with selected service providers to drive compliance with the Act. The providers we have chosen to supervise are those that pose particular risk, either because of their size, functionality or because of the nature of the service. This includes the largest services such as popular social media platforms, search services, messaging and gaming services, as well as smaller services that present particular risk due to the

9 Ofcom, 2024. Illegal Content Judgements Guidance (ICJG), p. 175. 10 Ofcom, 2024. Illegal Content Judgements Guidance (ICJG), p. 175.

nature of their site, such as image and video sharing sites, chat forums and file sharing services. The goal of this targeted and sustained engagement is to secure compliance with regulatory duties and improve the safety of UK users, by understanding in detail services’ measures, assessing how well they protect users, and pushing for timely improvements where necessary. Evidence, like the information included in the Report, helps inform our supervisory engagement and prioritisation, such as which providers we should be engaging with and what we should focus our engagement on.

We have been actively engaging with TikTok since 2022 under the Video Sharing Platform (VSP) regime (a precursor to the Online Safety Act). One of our key priorities under the VSP regime has been to ensure that children are protected from finding and viewing age-inappropriate videos. Based on the information included in the Report, we will be engaging with TikTok on some of the issues you have raised.

In addition to our broader supervisory programme, we have a dedicated taskforce dealing with ‘small but risky’ services. The taskforce’s focus is bringing smaller, riskier services into compliance with their duties under the Act. For the purposes of this work the services in question are those which:
• are typically low reach – generally under or around 1% of UK population as active monthly users;
• have high risk features or functionalities – as defined by our published guidance; and/or
• are services brought to our attention for other risk factors.

Given the risk they present to UK users, suicide discussion forums are an area of interest for us and a potential target for this taskforce.

Enforcement
Our preference is to work with services to encourage voluntary compliance. But we will, if necessary, launch enforcement action where we determine that a service provider is not complying with its duties, for example where we consider it is not taking appropriate steps to protect users from harm. Under the Act, where we identify compliance failures, we can impose fines of up to £18m or 10% of the service provider’s qualifying worldwide revenue (whichever is greater). In the most serious cases of non-compliance, we can seek a court order imposing business disruption measures, which may require third parties (such as providers of payment or advertising services, or ISPs) to withdraw services from, or limit access to, a regulated service in the UK. To support our investigations, we have powers to request information and obtain skilled person’s reports, and can take enforcement action where service providers fail to respond or provide inaccurate information.

We expect our early enforcement action to focus on ensuring services are adequately assessing risk and putting in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to CSAM, pornography and illegal suicide content. To date we have launched three sector-wide compliance programmes as the illegal content duties have come into force, to monitor whether providers are complying with their illegal content risk assessment duties and record keeping duties under the Act. We are also taking targeted action against specific services, the first of which is directly relevant to the matters of concern included in the Report.

Ofcom’s investigation into a suicide forum

In line with the approach outlined above, on 9 April 2025, we opened an investigation into . The investigation will examine whether there are reasonable grounds to believe that this service provider has failed, or is failing, to comply with its duties to:
• put appropriate safety measures in place to protect its UK users from illegal content and activity;
• complete – and keep a record of – a suitable and sufficient illegal harms risk assessment; and
• adequately respond to a statutory information request.

Timeline leading up to the investigation
Ofcom’s Supervision team repeatedly reached out to before the Act came into force, in an attempt to promote compliance with its duties under the Act. In early 2025, Supervision also sent advisory letters to inform and remind the provider of its upcoming duties and the consequences of non-compliance. As this approach did not resolve our concerns, the matter was escalated to our Enforcement team.

On 3 March 2025, a legally binding information notice was sent to under the Risk Assessment Enforcement Programme. We had been clear that failure to adequately respond to our request for service providers to submit a record of their illegal content risk assessment may result in enforcement action, and that, as soon as the duties took effect for providers, we would not hesitate to take action where we suspect there may be serious breaches which appear to pose a risk of very significant harm to UK users. Following correspondence with , we conducted an initial assessment of our compliance concerns to decide whether an investigation was warranted. As our concerns again remained unresolved, we opened our investigation into the online suicide forum three weeks after the illegal harms duties came into force.

Ofcom’s investigatory process
Ofcom’s Online Safety Enforcement Guidance sets out how Ofcom will normally approach enforcement under the Act. This includes our approach to information gathering and analysis and the procedural steps we must take to fairly determine the outcome of the investigation. A typical process is set out at Figure 2 in the Annex below. Although we are not in a position to give specific details on how long this investigation will take, we expect the initial stages to span several months while we gather additional information from the provider and assess its compliance with the Act. We will provide regular updates on this investigation on our website.

Conclusion
We thank the Coroner again for bringing to our attention the circumstances of Ms Williams’ death. Ms Williams’ death highlights the impact that suicide content online can have on people, and Ofcom is committed to holding providers to account where they fail to address the risk of harm from this type of content. We will work directly with service providers to promote compliance, including, where appropriate, through targeted supervision with a view to protecting adults and children from suicide material that is illegal or harmful. Where we identify non-compliance, we will not hesitate to take appropriate enforcement action. Evidence included in reports from coroners and other experts will also play an important role in any further policy proposals as we continue to implement the Act.

We hope that this response provides helpful information about the significant steps Ofcom is taking as we continue to work through the implementation of the Act. If further information or clarification is required, we would be happy to provide this.
Report Sections
Investigation and Inquest
On 7 March 2025 an inquest was heard into the death of Rhiannon Auriol Mary Williams.

The medical cause of death was:
1a Asphyxia
1b Combined Drug Toxicity and Neck Ligature and Submersion

II Selective Serotonin Reuptake Inhibitor Withdrawal

The conclusion of the inquest was:

Suicide
Circumstances of the Death
Rhiannon Auriol Mary Williams was aged 24 at the time of her death. Rhiannon was an exceptionally bright, talented and creative person who excelled academically and who was loved and supported by her family. Rhiannon’s mental health deteriorated at a time when she suddenly stopped taking SSRI medication and, as a result, she experienced withdrawal symptoms which caused anxiety. Rhiannon was also concerned that she may have autism spectrum disorder, and her counsellor documented that she was displaying traits of this disorder.

For at least a year prior to her death, and possibly longer, Rhiannon experienced suicidal thoughts. From May 2023 Rhiannon’s suicidal thoughts and distress worsened, although the full extent of her thoughts and distress was kept hidden from her family. In the months before her death Rhiannon started researching websites relating to suicide and, in the month of her death, undertook a TikTok search of ‘drowning in the bath’.

On 15 September 2023 Rhiannon accessed a website called . This website describes itself as a community that discusses mental illness and suicide from the perspective of suicidal people, as well as the moral implications of the act itself, and actively encourages people to commit suicide. It also encourages individuals to hide their thoughts and actions from their loved ones. There is no moderation of this website. Rhiannon used this website to download a detailed document entitled ‘ guide’ which describes in detail how to use these medications to bring about death. There is evidence that Rhiannon used the internet to obtain . There is also evidence that Rhiannon had written down and practised the method that she eventually used on the day of her death – information also likely obtained from the above website / social media.

On 16 September 2023 Rhiannon was found . Toxicology findings confirmed the presence of at an elevated level which would have had a sedatory effect and likely impaired Rhiannon’s respiration.
Related Inquiry Recommendations

Public inquiry recommendations addressing similar themes

• Pre-screening by Internet Providers (IICSA, Harmful Algorithmic Content Promotion)
• Age Verification Online (IICSA, Harmful Algorithmic Content Promotion)
• Publish interim online harms code of practice (IICSA, Harmful Algorithmic Content Promotion)
• Pre-screen material before upload (IICSA, Harmful Algorithmic Content Promotion)

Data sourced from Courts and Tribunals Judiciary under the Open Government Licence.