Adrian Gallagher
PFD Report
All Responded
Ref: 2024-0010
All 3 responses received
Deadline: 22 Feb 2024
Response Status
Responses
3 of 3
56-Day Deadline
22 Feb 2024
All responses received
About PFD responses
Organisations named in PFD reports must respond within 56 days explaining what actions they are taking.
Source: Courts and Tribunals Judiciary
Coroner’s Concerns
is available to anyone to purchase online, directly from the company. The company appears to have some link to the UK, as there is a UK helpline number. The appears to provide step-by-step instructions on how to end your life using certain methods, including how to make the death appear to be due to natural causes and therefore avoid referral to the coroner. Whilst the introduction suggests it is aimed at those who are elderly and suffering long-term, there is also reference to suicide for other reasons within the book, and it is likely to appeal to vulnerable mental health patients. According to the evidence of the police officer, you can also buy drugs to end your life through this website. The only check on age and ID appears to be after a purchase, to allow you access to online forums where you can get further advice on best methods. The has been banned in Australia (and possibly other countries) as it is deemed to encourage/assist in suicide. The , in some format, is also available on Amazon (and I am writing to Amazon directly to flag this).
Responses
The NCA acknowledged the concerns and is engaging with Ofcom to scope out collaboration on combating illegal suicide content online. They noted existing strategies and foreshadowed legislation to address harmful online content.
Dear Ms Davies,

Thank you for your letter of 28 December 2023 regarding the death of Adrian Gallagher. I would like to say how deeply saddened I was to read of the circumstances of Mr Gallagher’s death, and I’d like to offer my condolences to his family and loved ones.

I’m grateful to you for raising your concerns about the availability of information regarding suicide methods online. As a result of your letter we have been engaging with Ofcom, the online safety regulator under the Online Safety Act, to scope out how we can work effectively together, within our respective roles, to combat illegal suicide content online. The NCA understands that Ofcom is in the process of consulting on its draft codes of practice and guidance that will explain the steps regulated online services will need to take to meet their legal duties under the Act to protect their users in the UK from this sort of content.

The handbook in question describes a number of drugs that are classified as Class A and Class B under the Misuse of Drugs Act 1971. These are being tackled under the HMG Drugs Strategy, by prioritising breaking drug supply chains and limiting the volume of illicit drugs available in the UK.

The NCA also welcomes the Suicide Prevention Strategy, published on 11 September 2023 by the Department of Health and Social Care, alongside a commitment to reduce the lives lost to suicide. The strategy sets out national ambitions for suicide prevention over the next 5 years, which include everyone having a role to play in suicide prevention. As a consequence of the strategy,
all areas of the country now have local suicide prevention plans, including guidance on providing bespoke support to specific groups and communities of concern, including those who have been in contact with mental health services. I understand this is supported by a £57 million investment through the NHS Long Term Plan.

Within the strategy, the NCA particularly welcomes the references to continuing to support search engines and social media platforms to remove content that encourages suicide and to provide ready access to suicide prevention services, and the read-across to the Online Safety Act. The strategy also aims to clearly define who is responsible and accountable for keeping the public safe from this content. Work so far has involved high-quality signposting and support being made prevalent across a wide range of platforms. A recent example of this is the Google OneBox, a pop-up alert that provides details about how to contact Shout or the Samaritans.

The strategy foreshadows the recently enacted Online Safety Act, which I am sure you are aware of. This Act requires all in-scope companies to tackle illegal content, such as suicide and self-harm content. The largest services will also be required to offer adults optional tools to limit their exposure to legal content that encourages, promotes or provides instructions for suicide or self-harm.

A further extension of this is the Criminal Justice Bill, introduced to the House of Commons on 14 November 2023 and, at the time of writing, at its report stage in the House of Commons. Clauses 11 and 12 of this bill look to replace section 184 of the Online Safety Act with a broader offence covering “any act capable of encouraging or assisting serious self-harm of another person”, with an emphasis on harmful intent.

I also wanted to bring to your attention the work that the Department of Health and Social Care is doing with Samaritans on the delivery of their online excellence programme.
This involves developing a hub of excellence in suicide prevention and the online environment, working in partnership with Facebook, Instagram, Google, YouTube, Twitter and Pinterest, and aims to promote consistently high standards across the sector. This will be crucial in changing the availability of harmful content online.

Thank you for bringing these important concerns to our attention. I hope you find this brief summary of the work being done across government and the NCA helpful. If I can be of any further assistance please do not hesitate to contact me.
T/Deputy Director, Borders & Commodities, National Crime Agency
The Department for Science, Innovation and Technology (DSIT) highlights that the Online Safety Act has received Royal Assent, establishing a legal framework requiring tech companies and search services to prevent exposure to and remove illegal suicide content. Ofcom has enforcement powers, including substantial fines, for non-compliant services, including those based overseas.
Dear Ms Davies,
Thank you for your correspondence of 28 December 2023, and for the opportunity to respond to this Report to Prevent Future Deaths, regarding the tragic death of Adrian Brendan Gallagher. I would firstly like to extend my deepest condolences to the family and friends of Adrian. I was very sorry to read about the circumstances surrounding his death.
The Online Safety Act (the act), as you may know, received Royal Assent in October last year and will ensure that tech companies take more responsibility for the online safety of their users. The new laws will apply to search services and all companies that allow users to post content online or to interact with each other (known as user-to-user services), including, for example, online forums. The safety duties in the act will not apply to websites that neither host user-generated content, nor facilitate interaction between users, nor operate as search services. With this in mind, I will set out how the act will protect users from illegal suicide and self-harm content once the relevant duties are in force.
When the act's illegal safety duties are in force, all in-scope user-to-user services will be required to have systems and processes in place to proactively prevent users from being exposed to priority illegal content via their service, and to minimise the length of time for which such content is present. The priority offences are set out in schedules 5, 6 and 7 and include content that amounts to an offence under the Suicide Act 1961, as well as content facilitating the sale of illegal drugs, as per the Misuse of Drugs Act 1971. This will help to protect all users - children and adults - from encountering such content.
The act also creates a new communications offence of encouraging or assisting serious self-harm, which applies to acts both inside and outside the UK if the act is committed by a person who is either in the UK when they commit the offence or habitually resident in the UK. In other circumstances, a prosecutor may be able to rely on the common law position that, where a case has a "substantial connection" to the UK, the offence can be prosecuted in the UK unless it could be argued that the conduct ought to be dealt with by the courts of another country. Providers will be required to put in place systems and processes that allow them to rapidly remove content that amounts to this offence.
Search services will also have targeted duties that focus on mitigating and minimising the presentation of illegal search content, such as image or text search results, to users. These duties will play a key role in reducing the volume of user traffic directed to websites with illegal suicide and self-harm content, whether or not the content on these websites is user-generated. This will reduce the ease with which users can find these kinds of horrifying sites and content.
There will be sites and services that choose not to comply with the act's regulatory framework. In these instances, Ofcom has a suite of enforcement powers to support its regulatory functions. These extend to instances where a company is based overseas, if the service has a significant number of UK users or the UK as a target market. Ofcom's powers include the ability to impose substantial fines of up to £18 million or 10% of global revenue (whichever is higher), as well as applying to the court for business disruption measures. These are court orders that require third parties (those who provide an access facility to such services, such as internet service providers) to prevent, restrict or deter access to non-compliant services in extreme cases.
Every suicide is a tragedy, and my department is working with other departments to tackle the deeply concerning issue of vulnerable users gaining access to harmful content online, whether on services that are in scope of the act or otherwise. The act introduces much needed protections for users and we are working closely with Ofcom to ensure that the implementation of the framework is as quick as possible.
The Department of Health and Social Care works with cross-government and cross-sector partners, having implemented over thirty actions to reduce awareness of and limit access to suicide methods. Officials continue to engage with police, charities, and online marketplaces regarding harmful publications, and support programmes like the Samaritans' Online Excellence Programme.
Dear Ms Davies,
Thank you for your Regulation 28 report to prevent future deaths dated 28th December 2023 regarding the death of Mr Adrian Brendan Gallagher. I am replying as Minister with responsibility for suicide prevention at the Department of Health and Social Care.
Firstly, I would like to say how deeply saddened I was to read of the circumstances of Mr Gallagher’s death. Whilst I know that it will come as little comfort to his family, I nevertheless hope they will accept my heartfelt condolences.
The Government remains extremely concerned about the prevalence of abhorrent suicide and self-harm content online, and sadly, we are aware of the content that you have raised.
We keep under constant review what the most appropriate actions are to take to reduce the harm caused by publications like the . In doing this, we work closely with other government departments, charities and experts to review actions regularly and continue to monitor the impact. This includes learning lessons from international examples where these are relevant.
I welcome your decision to write to Amazon to flag these concerns. My officials continue to work with the police and suicide prevention charities to revisit the conversations they have had with online marketplaces on the sale of harmful publications. I would be interested to see a copy of Amazon’s response to you should they reply, to support those conversations.
The Government is taking a leading role in tackling methods of suicide, collaborating with partners across the world in policy, law enforcement and society more broadly to limit access, and share research, evidence and lessons learned. This includes seeking to tackle at source the suppliers of harmful substances for the purposes of suicide.
The Department of Health and Social Care leads a cross-sector, cross-government working group to tackle emerging methods, working with those partners. As a result of this group, there are now over thirty actions in place to reduce awareness and limit access to methods of suicide, including the one used by Mr Gallagher. The group meets regularly and remains vigilant to any further methods that we receive intelligence on, and will not hesitate to act should it be required.
As a Government, we are committed to tackling online safety and reducing the prevalence of harmful suicide and self-harm content.
The Department for Science, Innovation and Technology (DSIT) will respond outlining how the Online Safety Act will address illegal and harmful self-harm and suicide content. The Department of Health and Social Care will continue to work closely with DSIT and Ofcom to support this work. We will also continue our support of programmes such as the Samaritans' Online Excellence Programme, which provides support and guidance for some of the biggest online platforms.
To close, we want to assure you that online safety and tackling methods of suicide are critical components of the Suicide Prevention Strategy, and we, alongside our expert advisory group, will continue to do what we can to prevent future deaths of this manner.
I hope this reply is helpful. Thank you for bringing these concerns to my attention.
Report Sections
Investigation and Inquest
On 17 November 2017 I commenced an investigation into the death of Adrian Brendan GALLAGHER aged 24. The investigation concluded at the end of the inquest on 19 December 2023. The conclusion of the inquest was that: This was a death due to suicide.
Circumstances of the Death
Adrian Gallagher had a history of mental health struggles dating back to 2013, with no definitive diagnosis. On 12 June 2017 he was admitted to Hollins Park Hospital as an informal patient, but was discharged at his request on 16 June 2017. The same day, he was taken to hospital having been found intoxicated at a bridge, with suicidal ideation. He was formally sectioned under the Mental Health Act the following day and re-admitted to Hollins Park.

During the admission, Adrian’s condition appeared to stabilise with changes to his medication, and he was allowed long periods of unescorted leave. By August 2017 he was awaiting a bed at Lea Court, a rehabilitation unit, and was spending the majority of the day away from the hospital. His presentation during this period did not give the hospital team or his parents cause for concern in relation to self-harm or suicidal ideation.

On 9 November 2017 Adrian returned to hospital following a period of leave at his father’s house. No concerns or changes to his presentation were noted. Sadly, he was found deceased in bed the following morning, with his death confirmed at 08.50 on 10 November 2017. His death was due to an intentional overdose.

Police interrogation of Adrian’s phone after his death identified that on 12 September 2017 he made a purchase from . It is not clear from the phone records what that purchase was, but it was the evidence at the inquest that the most likely purchase was .
It was the evidence of the attending police officer that, although not easy to do, you can also buy pentobarbital through the site.
Similar PFD Reports
Reports sharing organisations, categories, or themes with this PFD
Related Inquiry Recommendations
Public inquiry recommendations addressing similar themes
Data sourced from Courts and Tribunals Judiciary under the Open Government Licence.