Sun, January 4, 2026

MPs Accuse Facebook & Instagram of Failing Young People After Manchester Attack

A damning report from the Digital, Culture, Media and Sport (DCMS) Committee has accused Meta, the parent company of Facebook and Instagram, of failing to adequately protect young people on its platforms. Based on evidence gathered during a review into online safety, the report highlights how Facebook and Instagram algorithms amplified extremist content, potentially influencing Ali Harbi Ali, who murdered Sir David Amess in 2021, and it draws uncomfortable parallels with the radicalization pathway of Salman Abedi, the perpetrator of the 2017 Manchester Arena bombing.

The core of the criticism is Meta's persistent failure to address algorithmic amplification: the way its platforms prioritize content based on user engagement, often leading individuals down rabbit holes of increasingly extreme material. The report argues that while Meta has made some superficial changes following previous scrutiny, including removing certain keywords and implementing reporting mechanisms, these measures are insufficient to counter the underlying problem: algorithms designed to maximize engagement, regardless of the harm they may inflict.

The committee’s investigation was initially triggered by a review into disinformation and misinformation during the 2019 general election. However, it broadened its scope to examine online safety more generally, particularly concerning children and young people. The report draws heavily on internal Facebook documents leaked by whistleblower Frances Haugen (as detailed in the "Facebook Files" – see further information below), which revealed that Meta’s own researchers were aware of the negative impacts of their algorithms on mental health, body image, and exposure to harmful content. These documents demonstrated a consistent pattern: acknowledging problems but prioritizing profit over user safety.

The Manchester Arena bombing serves as a particularly poignant case study in the report's findings. While not directly claiming that Meta caused Abedi's actions, the committee notes that his online activity showed increasing exposure to extremist content on Facebook and Instagram. Abedi was reportedly drawn into radical Islamist groups through targeted advertising and algorithmic recommendations, and the report suggests that Meta's algorithms played a role in amplifying this content and pushing it towards vulnerable individuals like him. This echoes concerns raised previously about the platforms' ability to identify and remove terrorist propaganda effectively.

The DCMS committee isn't just pointing fingers; it is demanding concrete action. Its recommendations include:

  • Greater Transparency: Meta must be more transparent about how its algorithms work, allowing independent researchers access to data to assess their impact.
  • Algorithmic Accountability: The company needs to take responsibility for the consequences of its algorithmic choices and implement safeguards to prevent harmful content from being amplified. This includes prioritizing user safety over engagement metrics.
  • Age-Appropriate Design: Platforms must be designed with children’s wellbeing in mind, including stricter age verification measures and tailored content moderation policies. The Online Safety Bill currently progressing through Parliament aims to enforce this (see further information below).
  • Independent Oversight: An independent body should oversee Meta's online safety practices and hold the company accountable for its failures.

The report’s timing is significant, coinciding with the ongoing debate surrounding the UK’s Online Safety Bill. This bill seeks to impose a legal duty of care on social media platforms to protect users from harmful content. Meta has been highly critical of the bill, arguing that it would stifle free speech and be overly burdensome to implement. The DCMS committee's report directly challenges this narrative, asserting that Meta’s resistance is rooted in protecting its business model rather than genuine concerns about freedom of expression.

The "Facebook Files," leaked by Frances Haugen, have been instrumental in shaping the public perception of Meta's practices and providing ammunition for lawmakers worldwide. These documents revealed internal discussions within Facebook acknowledging the harms caused by their platforms, including increased rates of anxiety, depression, and body image issues among young users. They also highlighted a reluctance to address these problems due to concerns about impacting user engagement and advertising revenue (more information on the Facebook Files can be found here: [ https://www.theguardian.com/technology/2021/oct/17/facebook-papers-what-we-know ]).

The Online Safety Bill, currently undergoing parliamentary scrutiny, is intended to address many of the concerns raised in the DCMS report and by whistleblowers like Haugen. It places a legal obligation on social media companies to proactively identify and remove harmful content, with significant fines for non-compliance (more information about the bill can be found here: [ https://bills.parliament.uk/bills/3158 ]).

The report’s conclusions are a stark reminder of the power and potential dangers of social media algorithms. While Meta claims to be committed to user safety, the DCMS committee's findings suggest that its actions have consistently fallen short of that commitment. The pressure is now on for Meta, and other social media giants, to fundamentally rethink their approach to online safety and to prioritize the wellbeing of users, particularly young people, over profit margins. The Manchester Arena bombing stands as a tragic illustration of what can happen when these platforms fail to do so.


Read the full article at The Independent:
[ https://www.independent.co.uk/news/business/manchester-tips-mps-facebook-instagram-b2893632.html ]