PARLIAMENTARY DEBATE
Online Safety: Children and Young People - 26 November 2024 (Commons/Westminster Hall)

[Peter Dowd in the Chair]

Lab
  11:19:58
Lola McEvoy
Darlington
I beg to move,

That this House has considered online safety for children and young people.

Just give me one second to get my notes in order, Mr Dowd.
Con
Dame Caroline Dinenage
Gosport
Will the hon. Lady give way?
Lola McEvoy
I will!
  14:30:49
Dame Caroline Dinenage
The hon. Lady has called a debate on a really important issue. Could she set out why she thinks that now is a really important time to discuss this vital topic?
  14:30:49
Lola McEvoy
I will—and I thank the hon. Lady for her intervention.

It is a pleasure to serve under your chairmanship, Mr Dowd. It is my great honour to open this debate on online safety for our children. I welcome the Minister answering for the Department for Science, Innovation and Technology, and the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), answering for the official Opposition. I tabled this as my first debate in Westminster Hall, because I believe this issue is one of the most defining of our time. I promised parents and children in my constituency of Darlington that I would tackle it head-on, so here I am to fulfil that promise.

I would like to put on the record that I have long been inspired by the strength of the parents of Bereaved Families for Online Safety—a group of parents united by the unbearable loss of their children and by their steadfast commitment to get stronger online protections to prevent more children’s deaths. I say to Ellen, who is here with us this afternoon: thank you for your courage—you have experienced unimaginable pain, and I will do everything I can to prevent more parents from going through the same.

The consensus for action on this issue has been built, in no small part due to the incredible drive of parents to campaign for justice. It is felt in every corner of the country, and it is our job as a Government to step in and protect our children from online harm. In my constituency of Darlington, at door after door right across the town and regardless of background, income or voting intention, parents agreed with me that it is time to act to protect our children. I am taking this issue to the Government to fight for them.

I am standing up to amplify the voice of the girl who sends a picture of herself that she thought was private but arrives at school to find that it has been shared with all her peers; she is not only mortified but blamed, and the message cannot be unsent. I am standing up to amplify the voice of the boy who gets bombarded with violent, disturbing images that he does not want to see and never asked for, and who cannot sleep for thinking about them. I am standing up for the mother whose son comes home bruised and will not tell her what has happened, but who gets sent a video of him being beaten up and finds out that it was organised online. I am standing up for the father whose daughter refuses to eat anything because she has seen video after video after video criticising girls who look like her. I say to all those who have raised the alarm, to all the children who know something is wrong but do not know what to do, and to all those who have seen content that makes them feel bad about themselves, have been bullied online, have seen images they did not want to see or have been approached by strangers: we are standing up for you.
Lab
  14:34:01
Mr Tanmanjeet Singh Dhesi
Slough
I congratulate my hon. Friend on securing this debate on online safety for children and young people. I have a keen personal interest, as a father of two young children. Earlier this year, Ofcom published 40 recommendations about how to improve children’s safety online, including through safer algorithms, and the Government rightly pointed to the role that technology companies can play in that. Does my hon. Friend agree that these companies must take their responsibilities much more seriously?
  14:34:30
Lola McEvoy
I absolutely agree that the companies must take those responsibilities seriously, because that will be the law. I am keen that we, as legislators, make sure that the law is as tight as it possibly can be to protect as many children as possible. We will never be able to eradicate everything online, and this is not about innovation. It is about making sure that we get this absolutely right for the next generation and for those using platforms now, so I thank my hon. Friend for his intervention.

The first meeting I called when I was elected the MP for Darlington was with the headteachers of every school and college in my town. I asked them to join together to create a town-wide forum to hear the voices of children and young people on what needs to change about online safety. The first online safety forum took place a couple of weeks ago, and the situation facing young people—year 10s, specifically—is much worse than I had anticipated.

The young people said that online bullying is rife. They said it is common for their peers to send and doctor images and videos of each other without consent, to spread rumours through apps, to track the locations of people in order to bully them through apps, to organise and film fights through apps, to be blackmailed on apps, to speak on games and apps to people they do not know, and to see disturbing or explicit images unprompted and without searching for them. They also said it is common to see content that makes them feel bad about themselves. This has to stop.

The last Government’s Online Safety Act 2023 comes into force in April 2025. The regulator, Ofcom, will publish the children’s access assessments guidance in January 2025. This will give online services that host user-generated content, search services and pornography services in the UK three months to assess whether their services are likely to be accessed by children. From April 2025, when the children’s codes of practice are to be published, those platforms and apps will have a further three months to complete a children’s risk assessment. From 31 July 2025, specific services will have to disclose their risk assessments to Ofcom. Once the codes are approved by Parliament, providers will have to take steps to protect users. There is to be a consultation on the codes in spring 2025, and I urge everybody interested in the topic—no matter their area of expertise or feelings on it—to feed into that consultation. The mechanism for change is in front of us, but my concern is that the children’s codes are not strong enough.
Lab
  14:37:13
Dan Norris
North East Somerset and Hanham
I congratulate my hon. Friend on securing this important debate. Could she comment on the use of artificial intelligence to create child sexual abuse materials? That is a key issue now. Many years ago, I trained with the National Society for the Prevention of Cruelty to Children as a child protection officer, and what I learned back then is that we have to get ahead of all the technologies in order to deal with the challenges effectively. Does she have any thoughts on that point? She may be coming to it in her own remarks.
  14:36:42
Lola McEvoy
I thank my hon. Friend for raising that great threat. My area of expertise on the issue is children’s and service users’ voices. There is definitely space for Ofcom and the Government to try to regulate the illegal creation of images through AI. When I asked children in my constituency whether they had ever seen something that they knew was made by AI, they said yes—they had seen images of people that they knew were not real—but the notifications and warnings to tell them that it was AI were not as explicit as they could be. In other words, they could tell for themselves, but the notifications were not comprehensive enough for other children, who may not have noticed. This is a real danger.

There will always be content created online that we cannot police. We have to accept—as we do with any other piece of legislation—that there will be criminal actors, but I have called this debate because there are ways to protect children from harmful content, including by using the right age verification model. I am keen to focus my contribution on trying to protect children from content, in the round, that is harmful to them.

As I said before, the mechanism for change is in front of us, but my concern is that the children’s codes are not strong enough. The children in my town have told me—and I am sure everybody here knows it—that the current age verification requirements are easily passed through, and that content on some sites is deeply disturbing and sent to them without them asking for it. That means that the sites are hosting content that is deeply disturbing for children, and that the age verification is not fit for purpose. We need to talk either about stopping those sites from hosting that content, which is very difficult, or about changing the age verification process.
LD
  14:39:05
Dr Danny Chambers
Winchester
I want to talk about the scale of the problem that the hon. Lady touches on. The Children’s Commissioner for England reveals that 79% of children have encountered violent pornography before the age of 18, with the average age of first exposure being 13. Everything the hon. Lady is saying is very important, but this is not a niche problem; it is something that parents in Winchester have spoken to me about repeatedly in the four months since I was elected.
  14:39:18
Lola McEvoy
It is indeed prolific, for all our children—the whole generation. It is interesting that, among the different experts I have spoken to, there is consensus; the argument has been won that children are unsafe online and that is affecting them deeply, across the country. It is our job—it falls to legislators—to rectify the issue. I do not wish to defend online platforms, but they will do what the law tells them to do. They want to operate in this country. They want to make money. There is nothing wrong with that; they just have to adhere to the law. It is our job to make sure that the law is tight to protect our children. That is the crux of the issue.
Lab
  14:39:20
Alistair Strathern
Hitchin
My hon. Friend is powerfully illustrating the responsibility on all of us to step up to the needs of this moment. Parents in my constituency—at schools including William Ransom and Samuel Lucas—have been leading the way in taking further proactive action, signing up to a smartphone-free pledge to delay the age at which their young people have access to smartphones. Hundreds across the constituency have already signed up to the pledge. Does my hon. Friend agree that that underlines the strength of parental feeling on online safety and some of the wider associated issues, and that it highlights our responsibility to legislate—not just to celebrate the benefits of technology, but to do all we can to protect young people from the very real dangers it presents, too?
  14:39:20
Lola McEvoy
A smartphone-free pledge is a great idea, and I will take it to Darlington. Parents are further down the line than we are on this; children are further down the line than we are; campaign groups are further down the line than we are. We are lagging behind. We have taken action—the last Government passed the Online Safety Act. I think it is time for us to make sure that there is nothing missing from that Act. In my view, there are some areas where we could go further.

Children in Darlington have said to me that they are getting these unsolicited images—from the algorithms. These images are being fed to them. They are not from strangers, or bogeymen from another country, although that might happen. The most common complaint is that the algorithm is feeding them content that they did not ask for, and it is deeply disturbing, whether it is violent, explicit or harmful. Once they have seen it, they cannot unsee it.

That is why I am arguing to strengthen the codes. I am not sure that we should be retrofitting harmful apps with a code that may or may not work, and having to tweak a few bits of the algorithm to check whether it will actually protect our children. I think we can take stronger action than that.
Lab
  14:39:26
Dan Aldridge
Weston-super-Mare
Numerous mental health charities and a number of civil society experts have raised with me that there are powers within the Online Safety Act that must be used by the regulator. Indeed, the Secretary of State for DSIT made it very clear last week that he backed the Act and those powers. Does my hon. Friend agree that the regulator could and should act with more powers than it has?
  14:40:51
Lola McEvoy
I am loath to tell Ofcom that it does not have enough power. As I understand it, the powers are there, but we need to be explicit, and they need to be strengthened. How do we do that? The reason I outlined the timelines is that the time to act is now. We have to explicitly strengthen the children’s codes.

There are many ways to skin a cat, as they say, but one of the simpler ways to do this would be to outline the audience that the apps want to market to. Who is the base audience that the apps and platforms are trying to make money from? If that is explicitly outlined, the codes could be applied accordingly, and strengthened. If children are the target audience, we can question some of the things on those apps and whether the apps are safe for children to use in and of themselves.
  14:40:51
Mr Dhesi
With children able to access online content a lot more easily nowadays, many of my Slough constituents feel that it is critical that the content itself is appropriate and safe. Does my hon. Friend share my concerns about the rise of extreme misogynistic content and its impact on young people, especially considering that research has shown that it is actually amplified to teens?
  14:45:36
Lola McEvoy
I thank my hon. Friend for raising the really important—indeed, deeply concerning—issue of the rise of anti-women hate, with the perpetrators marketing themselves as successful men.

What we are seeing is that boys look at such videos and do not agree with everything that is said, but little nuggets make sense to them. For me, it is about the relentless bombardment: if someone sees one video like that, they might think, “Oh right,” and not look at it properly, but they are relentlessly targeted by the same messaging over and over again.

That is true not just for misogynistic hate speech, but for body image material. Girls and boys are seeing unrealistic expectations of body image, which are often completely fake and contain fake messaging, but which make them reflect on their own bodies in a negative way, when they may not have had those thoughts before.

I want to drive home that being 14 years old is tough. I am really old now compared with being 14, but I can truly say to anybody who is aged 14 watching this: “It gets better!” It is hard to be a 14-year-old: they are exploring their body and exploring new challenges. Their hormones are going wild and their peers are going through exactly the same thing. It is tough, and school is tough. It is natural for children and young people to question their identity, their role in the world, their sexuality, or whatever it is they might be exploring—that is normal—but I am concerned that that bombardment of unhealthy, unregulated and toxic messaging at a crucial time, when teenagers’ brains are developing, is frankly leading to a crisis.

I return to an earlier point about whether the parts of apps or platforms that children are using are actually safe for them to use. There are different parts of apps that we all use—we may not all be tech-savvy, but we do use them—but when we drill into them and take a minute to ask, “Is this safe for children?”, the answer for me is, “No.”

There are features such as the live location functionality, which comes up a lot on apps, such as when someone is using a maps app and it asks for their live location so they can see how to get from A to B. That is totally fine, but there are certain social media apps that children use that have their live location on permanently. They can toggle it to turn it off, but when I asked children in Darlington why they did not turn it off, they said there is a peer pressure to keep it on—it is seen as really uncool to turn it off. It is also about being able to see whether someone has read a message or not.

I then said to those children, “Okay, but those apps are safe because you only accept people you know,” and they said, “Oh no, I’ve got thousands and thousands of people on that app, and it takes me ages to remove each person, because I can’t remember if I know them, so I don’t do it.” They just leave their location on for thousands of people, many of whom may be void accounts, and they do not even know if they are active any more. The point is that we would not allow our children to go into a space where their location was shown to lots of strangers all the time. Those children who I spoke to also said that the live location feature on some of these apps is leading to in-person bullying and attacks. That is absolutely horrifying.
SNP
  14:48:51
Kirsty Blackman
Aberdeen North
On that point, is the hon. Member aware that if someone toggles their location off on Snapchat, for example, it constantly—in fact, every time the app is opened—says, “You’re on ghost mode. Do you want to turn your location back on?” So every single time someone opens the app, it tries to convince them to turn their location back on.
  14:49:14
Lola McEvoy
I thank the hon. Member for raising that issue, because there are lots of different nudge notifications. We can understand why, because it is an unregulated space and the app is trying to get as much data as possible—if we are not paying for the service, we are the service. We all know that as adults, but the young people and children who we are talking about today do not know that their data is what makes them attractive to that app.
  14:50:00
Dan Aldridge
I thank my hon. Friend for allowing me to intervene again. In my previous role as head of public policy at the British Computer Society, the one thing that my colleagues and I talked about a lot was the lack of focus on education in the Online Safety Act. I commend the previous Government for passing that legislation, which was very brave. The Act has tried to do some wonderful things, but what is missing is that we have failed to empower a generation of young people to act safely online, to be able to take back the power and say, “No, I am not going to do that.” We have failed in that so far. How do we build that in for the future?
  14:50:53
in the Chair
Peter Dowd
Order. I would like to bring to the attention of Members that we have had a huge number of interventions and we are 20 minutes into the debate. The Minister and Opposition spokesperson will get up at just after half past 3. It is a matter for the speaker whether she takes more interventions, but that does mean that the amount of time for those who have asked to speak will be significantly more restricted than I originally planned. That is just a housekeeping matter to be aware of. There is also an issue about the length of interventions: they are getting a bit long. On a matter of this importance, I do not want to restrict interventions and contributions, but I ask Members to please bear that in mind.
  14:51:51
Lola McEvoy
Okay, I will make progress. On the live location element, which I have discussed, I am not sure that there is any advantage in children using that, unless it is a specifically regulated live location app that the parents have given consent for their child to use.

I do not know whether chatting to strangers on games is suitable for children. Adding peers to a group and enjoying playing with them on games is fine, but there could be strangers from other countries, with no indication of their age. One child told me that he had found out, after about three weeks, that the person he had been playing with was a 50-year-old man on another continent. That man was probably mortified, as was the child, and they stopped playing together. Why are we leaving it up to them? That is such a high-risk strategy for those apps; we need to think about that.

It is down to Parliament to decide what is safe for our children, and to enforce it. Asking platforms to mark their own homework and police themselves will undoubtedly lead to more children seeing inappropriate, harmful content and sharing it with others. I would like the Government to strengthen the children’s codes, and consider changing the onus from reactive safety measures that make apps safe for children, when we suspect they are children, to proactively making apps or platforms safe for all children in the first place, and creating adult-only apps that require strong age verification, because adults can consent to giving their data.

A number of ways to protect children online are being debated, as I am sure we will hear this afternoon. I feel strongly that retrofitting apps once children have been exposed to harmful content or strangers, or have shared things they should not, is not the safest or most effective way to do this. A number of options around age verification are on the table, but I would like the Government to consider that being a child is tough and that children have a right to make mistakes. The issue is that those mistakes involve mass communications to peers and a permanent digital footprint, because someone has consented, aged 13, to give away their data.

We need to see whether any child can consent to give away their data, and therefore whether apps that identify their audience as children should be allowed to keep data at all. Should children be in chatrooms with strangers across the world? Should children be allowed to share their live location with strangers or people they have accepted as contacts? Should children be allowed to view unregulated livestreams or addictive-by-design content? Those questions have been raised not only by children themselves but by parents and national advocacy charities and leaders in this space. There is a consensus that we have to take action on this issue, so let us make the most of it.
  14:53:48
in the Chair
Peter Dowd
Order. I remind Members that they should bob if they wish to be called in the debate.
SNP
  14:54:15
Kirsty Blackman
Aberdeen North
I could talk for hours on this subject, Mr Dowd, but, do not worry, I will not. There are a number of things that I would like to say. Not many Members present sat through the majority of the Online Safety Bill Committee as it went through Parliament, but I was in every one of those meetings, listening to various views and debating online safety.

I will touch on one issue that the hon. Member for Darlington (Lola McEvoy) raised in her excellent and important speech. I agree with almost everything she said. Not many people in Parliament have her level of passion or knowledge about the subject, so I appreciate her bringing forward the debate.

On the issue of features, I totally agree with the hon. Member and I moved an amendment to that effect during the Bill’s progress. There should be restrictions on the features that children should be able to access. She was talking about safety by design, so that children do not have to see content that they cannot unsee, do not have to experience the issues that they cannot un-experience, cannot be contacted by external people who they do not know, and cannot livestream. We have seen an increase in the amount of self-generated child sexual abuse material and livestreaming is a massive proportion of that.

Yesterday, a local organisation in Aberdeen called CyberSafe Scotland launched a report on its work in 10 of our primary schools with 1,300 children aged between 10 and 12—primary school children, not secondary school children. Some 300 of those children wrote what is called a “name it”, where they named a problem that they had seen online. Last night, we were able to read some of the issues that they had raised. Pervasive misogyny is everywhere online, and it is normalised. It is not just in some of the videos that they see and it is not just about the Andrew Tates of this world—it is absolutely everywhere. A couple of years ago there was a trend in online videos of young men asking girls to behave like slaves, and that was all over the place.

Children are seeing a different online world from the one that we experience because they have different algorithms and have different things pushed at them. They are playing Roblox and Fortnite, but most of us are not playing those games. I am still concerned that the Online Safety Act does not adequately cover all of the online gaming world, which is where children are spending a significant proportion of their time online.

A huge amount more needs to be done to ensure that children are safe online. There is not enough in place about reviewing the online safety legislation, which Members on both sides of the House pushed for to ensure that the legislation is kept as up to date as possible. The online world changes very rapidly: the scams that were happening nine months ago are totally different from those happening today. I am still concerned that the Act focuses too much on the regulation of Facebook, for example, rather than the regulation of the online world that our children actually experience. CyberSafe Scotland intentionally centred the views and rights of young people in its work, which meant that the programmes that it delivered in schools were much more appropriate and children were much better able to listen and react to them.

The last thing that I will mention is Girlguiding and its girls’ attitude survey. It is published on an annual basis and shows a huge increase in the number of girls who feel unsafe. That is because of the online world they are experiencing. We have a huge amount of responsibility here, and I appreciate the hon. Member for Darlington bringing the debate forward today.
  14:58:21
in the Chair
Peter Dowd
I will keep this to an informal four-minute limit. Regrettably, if Members speak beyond that, I will have to introduce a formal figure.
Lab
Ms Julie Minns
Carlisle
It is a pleasure to speak under your chairmanship, Mr Dowd. Some 20 years ago, I started a new job with an as yet unbranded mobile network operator. At the time, the network had no masts, no handsets and no customers. Text messaging was just catching on, the BlackBerry was in its infancy and wireless application protocol was the new kid on the block. For those who do not know what WAP was, it was a bit like having Ceefax on a handset; for those who do not know what Ceefax was, I cannot really help.

My counterparts and I at the four mobile networks were acutely aware that the introduction of 3G would change how we used our phones. I will, however, confess that understanding what that change would look like—all while using dial-up at home—was something of a stab in the dark. Nevertheless, no matter how challenging, we knew that the advent of 3G required the mobile industry to take greater responsibility to protect the safety of our customers, in particular those under the age of 18. The networks moved from walled garden internet, where access was controlled by age verification and personal identification number, to a world where internet was freely available.

The mobile networks published the first self-regulatory code of content on mobile. It was a world first, and something that UK mobile operators were rightly proud of, but the pace of change was rapid; within months, we networks published a further self-regulatory code to govern location-based services, which, as we have heard already, present a clear danger to young people. We knew then that location tracking could be used in grooming and other predatory behaviour. We published the code, but the pace of change over the past 20 years has been unrelenting, and we now arrive at a point at which almost everything we do happens online.

The role of the mobile network is no longer as a gatekeeper to services, but rather as a pipe to over-the-top services such as YouTube, WhatsApp and TikTok. Those services can be more readily controlled by both the service provider and the handset manufacturer. That is not to absolve the networks of responsibility, but to acknowledge that they operate in a mobile value chain. I might pay £25 a month to my mobile network, but if I renew my handset every two years at a cost of £800, I am paying far more to the handset manufacturer than to the mobile network operator. I believe there is a strong argument that those who derive the greatest financial value from that value chain bear far greater responsibility for keeping children and young people safe online than is currently the case.

I turn now to one specific aspect of online harm. Having worked closely with the Internet Watch Foundation during my time in industry, I am fully aware of—and I thank it for—its important work in assessing child sexual abuse material and removing it from the internet. I have visited and met the IWF teams who have to view and assess some of the most upsetting content. Their work is harrowing and distressing, but, sadly, it is essential.

Last year, the IWF assessed more than 390,000 reports and confirmed more than 275,000 web pages containing images or videos of children suffering sexual abuse. Each page contained hundreds, if not thousands, of indecent images of children. The IWF reported that 2023 was the most extreme year on record, with more category A sexual abuse imagery discovered than ever before, 92% of it self-generated child sexual abuse imagery. That means that the children have been targeted, groomed and coerced into sexual activities via webcams and devices with cameras.

For the first time, the IWF also encountered and analysed more than 2,400 images of sexual abuse involving children aged three to six. Some 91% of those images were of girls, mainly in domestic settings such as their own bedrooms or bathrooms. Each image or video is not just a single act; every time it is viewed or downloaded is another time that that child is sexually abused.

That is why I conclude my remarks with a clear ask to both the online and offline media and broadcast channels of our country: please stop describing these images as “kiddie porn” and “child pornography”. I did a search of some online news channels before I came to this debate; that language is still prevalent, and it has to stop. These images are not pornography. They are evidence of a crime and evidence of abuse. They are not pictures or videos. They are depictions of gross assault, sadism and bestiality against children. They are obscene images involving penetrative sexual activity with teenagers, children and babies. If there is one thing we can agree on in this debate, it is that the media in this country must start describing child sexual abuse material for what it is. Language matters, and it is time the seriousness of the offence was reflected in the language that describes it.
in the Chair
Peter Dowd
I am going to have to introduce a formal time limit of three and a half minutes.
LD
Caroline Voaden
South Devon
It is a pleasure to speak under your chairmanship, Mr Dowd. I congratulate the hon. Member for Darlington (Lola McEvoy) on bringing forward this important debate. The internet has undeniably introduced a valuable resource for learning that has transformed society, but technology has also brought with it significant risks that I believe we in this House have an urgent duty to address. Nobody knows that more acutely than all those parents who have tragically lost their children after online abuse, who are bravely represented today here in the Public Gallery by Ellen.

The statistics are sobering. Recent figures from Ofcom reveal that one in five children in the UK has experienced some form of online harm, including cyber-bullying, exposure to inappropriate content and exploitation. The NSPCC reports that more than 60% of young people have encountered online bullying, but I think the risk goes much further than that. We know that the average age at which a child first views pornography is estimated to be 12, with some evidence now suggesting it is as young as eight years old. Free and widely available pornography is often violent, degrading and extreme, and it has become the de facto sex education for young people.

The pornography crisis is radically undermining the healthy development of children and young people, and contributing to increasing levels of sexual inequality, dysfunction and violence. Those figures reflect how children’s lives are affected by these dangers, and as parliamentarians we have a duty to keep our children safe and free from harm—online as well as offline. Nine in 10 children have a mobile phone by the age of 11, and around a quarter of three-year-olds now have their own smartphone. I do not know about you, Mr Dowd, but I find that statistic particularly troubling.

I believe it is crucial to differentiate smartphone use from the broader digital environment. Smartphones, as we know, are engineered to be addictive, with notifications that stimulate the release of dopamine, the same chemical that is linked to pleasure. It is too easy for children to become trapped in a cycle of dependency and peer pressure, addicted to feeds and competing for likes on social media. Addiction is exactly what the tech companies want. Research from the Royal Society for Public Health shows that social media harms mental health—we all know that—particularly among young users. Around 70% of young people now report that social media increases their feelings of anxiety and depression.

The Children’s Commissioner, Rachel de Souza, believes that Ofcom’s children’s codes, which the hon. Member for Darlington talked about, are not strong enough and are written for the tech companies rather than for the children. She says that we need a code that protects our children from the “wild west” of social media. In South Devon I often hear from parents overwhelmed by the digital environment their children are navigating. They want to protect their children, but they feel ill equipped to manage those complexities. Hundreds of them have signed up to the smartphone-free pledge, and are pressuring schools to take part as well. We need to give them support, by backing what they want to do with legislation.

I believe we need a legislative framework that will restrict the addictive nature of smartphones, tighten age restrictions and restrict access to social media platforms for all children under 16. We have to protect them. Those measures are crucial for online child safety, and I believe there is a broad consensus in the House that big tech must be held accountable for the harm it perpetuates. We must abide—
in the Chair
Peter Dowd
Order. I call Jess Asato.
Lab
Jess Asato
Lowestoft
It is a pleasure to speak under your chairmanship, Mr Dowd. I welcome this debate, brought forward by my hon. Friend the Member for Darlington (Lola McEvoy). Prior to being elected as an MP, I spent almost a decade working in organisations supporting vulnerable women and children. My experience in that area over those years was very much a case of one step forward, two steps back.

Efforts to make our children’s increasingly online lives safer have been constantly outpaced by technological change. The law, the police and the courts have been unable to keep up with that change, and in its wake children have been the unwitting guinea pigs in a huge social experiment. The Online Safety Act has the potential to reset the relationship between children and the internet if the principles of safety by design are truly followed by tech companies and our regulator Ofcom. Of course we welcome age verification, which will finally come into force next year and will prevent children from accessing violent and harmful pornography.

There remains much more that we need to do in this space. That is why I am pleased to co-sponsor the safer phones Bill—the Protection of Children (Digital Safety and Data Protection) Bill—sponsored by my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). Smartphones, and social media in particular, are clearly negatively impacting on the mental health of our children, as well as their sleep and learning. Only last week, in an evidence session hosted by my hon. Friend, we heard that smartphones are contributing to a significant increase in short-sightedness among children, who are glued to their phones and spending less time outdoors. We risk creating a generation suffering from myopia, and yet—perhaps because as adults we are also glued to our phones—we have not yet acted in the best interests of our young people. We regulate the toys we give to children so that they do not contain harmful lead and are age appropriate, yet no such regulation applies to smartphones. What international board of child psychologists was consulted? What paediatricians? What parents? What children?

A particularly worrying new trend that is outpacing our ability to counter it is the rise of nude deepfakes, or AI-generated sexually explicit images. They are becoming an increasingly worrying issue in schools and more than half a million children already have experience of them, according to new data from Internet Matters. Despite the fact that creating and sharing nude deepfakes of children or non-consenting adults is illegal, the programs that make them are still readily accessible. We would not ban the possession of zombie knives without banning their sale; that is why last week I called on the Government to ban nudifying tools and apps.

We seem to be setting up our children to fail, to be harmed and to be criminalised. Some 99% of the images created are of women and girls—indeed, the apps often do not work on boys. The Government have an ambitious target to halve violence against women and girls within a decade, a target that can only be achieved if we tackle the root cause by looking online. I would be grateful if the Minister could look at how nudifying apps could be banned as part of this Government’s commitment to keep women and children safe.
DUP
Jim Shannon
Strangford
I congratulate the hon. Member for Darlington (Lola McEvoy) on setting the scene so very well and on her insightful knowledge of the subject. I am very much a supporter of the Online Safety Act, and I have spoken about it on many occasions in the past. I believe we need strong protections for our children and young people; there is just so much danger out there, and it only seems to be getting worse. I have heard some horror stories of the dangers online, so it is great to discuss such matters and try to get answers from the Minister, who I wish well in the position she now holds.

Many will be aware—my staff are certainly fully aware—that my knowledge of the world of social media is somewhat limited; I am just about using text messages on the phone. However, social media and AI have brought tremendous advantages. The Office for National Statistics revealed that 83% of 12 to 15-year-olds now own a smartphone with full internet access. It is rare to see a young person who does not have one. My grandchildren, young as they are, seem to have all the knowledge that this old boy does not.

Cyber-bullying, grooming and online exploitation are, however, at the forefront of the dangers. The Police Service of Northern Ireland revealed that, in 2023, crimes involving children being contacted online by sexual predators rose by nearly a third. Officers working with the specialist unit say that they had the busiest year since its establishment in 2010. How worrying is that trend? Grooming can happen anywhere.

Another issue of importance that I want to focus on is self-harm and suicide among younger generations. Suicides among younger people in Northern Ireland are up by 8% on last year—and last year’s figures were horrendous. More than three quarters of people saw self-harm content online for the first time at the age of 14 or younger, and individuals with a history of self-harm report being 10 years old or younger when they first viewed it. Such things are incredibly worrying. We need safeguards to stop those as young as 10 seeing that damaging content, including on self-harm.

I am aware that issues regarding content on eating disorders are also prevalent. My office has been contacted about them by countless parents; it is a massive issue for my office. There is a clear danger to life from some of this content, which has led to hundreds of young girls and boys being referred to specialist clinics and counselling to help them through it. For any parent or family, that is just heartbreaking.

The online safety strategy and action plan was brought to the Executive in Northern Ireland by the Department of Health in 2020, to run until 2025, when it is due to be reviewed. Thankfully the Online Safety Act 2023, led from Westminster, applies to Northern Ireland, and with 40% of young people using social media there is a clear need for that legislation. Again, I hope that it can be strong enough to combat the dangers that are out there.

Ever mindful of your timescale, Mr Dowd, and to give others the opportunity to speak, I will conclude. The online world and its advancements are truly a wonderful thing—even for someone like me, who does not know how it works—but there are clear problems with some of the aspects surrounding it. I hope that we can work together, alongside Ministers and large social media companies, to do our best for our young people, to use the online world to their advantage and to give them the best start in life. We want them to have that best start, but we want them to be safe—that is what we are asking for. I look to the Minister for her input and her reassurance that the things we have asked for can actually happen.
Lab
  15:15:35
Jake Richards
Rother Valley
I congratulate my hon. Friend the Member for Darlington (Lola McEvoy) on securing this debate. Her daughter was born just a few months before mine, when we were both mere parliamentary candidates trying to juggle our election campaigns and family life, and failing. Just last night in the voting Lobby, we swapped notes on how it is going now that we are Members of Parliament, and I think we are both failing on that as well, but we are trying our hardest. Having spoken to her in depth about this issue, I know that she is keen to champion it and that she will be successful in doing so.

My daughter is just 15 months old, and when I look at the online world around us, I have deep concern and worries for her and children growing up across the country. The issue of online safety must be grasped urgently, and I hope that this Parliament will finally seize the initiative. Many of the hundreds of new MPs come to this issue afresh. It is great to see new colleagues and friends here today. This is a generation who have, to some extent, grown up online, are aware of the huge benefits that technology and social media have offered and are adept, to some extent at least, at using those networks—I am still not wholly sure how TikTok works—but who have also seen the increasingly toxic results for young people as technology has developed, and that has been exacerbated by the pandemic.

I have been in numerous schools across my constituency of Rother Valley to speak with students, teachers and teaching staff, and the pupils I meet are impeccably behaved and interesting and interested in my role as their MP, but when I have a cup of tea with the teachers afterwards, they so often tell me about the negative effect of smartphone apps, online bullying and the frankly shocking content that youngsters are exposed to. I speak to parents across Rother Valley who are deeply concerned about the content available to children, whether it be sexual or of an addictive or exploitative nature. Many feel that they are losing parental control, to some extent at least, to the magnet of online activity. I have run an online survey for constituents about the issue of smartphones and online safety for children over the last few months and have been inundated with these worries. This is an epidemic, which is why this debate and this subject are so important.

There is good work happening to combat the worries. Recently I met a group in my constituency that is run by Sara Cunningham and works in schools across the country in combating misogyny, online violence and pornography. That incredible organisation is doing brilliant work, but it cannot be left to the third sector to regulate this issue or pick up the mess. It is surely time for Government and regulation to take a greater role. I hope that that can be done on a cross-party basis, because this issue crosses the political divide. I would like personally to praise former Prime Minister Theresa May for at least beginning to champion the issue. I hope that, now, this Government and the Minister who is present today can take the issue forward.
Con
  15:18:42
Dame Caroline Dinenage
Gosport
I start by paying tribute to the hon. Member for Darlington (Lola McEvoy), who made really powerful and impactful comments, as have all those who have spoken today. I join her tribute to the bereaved families who have done such incredible work to campaign on this vital issue. I should, before anything else, direct everybody to the work of my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), who was the architect—the genesis—behind the Online Safety Act. I was one of the many Ministers who took over that baton for a couple of years and pulled the Act together.

As the hon. Member for Darlington said, only when we meet families who have been deeply impacted by online dangers and online harms does the impact of this really land with us. For me, meeting Ian Russell, whose daughter Molly took her own life in 2017 as a result of the content that she had seen online, underlined how incredibly disastrous for young and vulnerable people the harms of the internet can be.

As the hon. Lady said, it is not just about the sites that are hosting inappropriate content; it is about the algorithms that take someone’s fears and anxieties and put them into an echo chamber where they are normalised and reinforced, which is the most dangerous part of this. Unfortunately, it is the algorithms that social media companies prize above everything else; they are the most jealously guarded parts of their organisations. Molly was one example resulting from that, but there are so many other examples of suicide, self-harm, anxiety, eating disorders and body image issues that come out of that world.

A year on from the Online Safety Act, it is interesting to see how it is being implemented, particularly against the backdrop of the speed at which technology is evolving. It is frightening because, virtually every week in our constituencies, we see examples of the harms that are out there. In my constituency, just in the last couple of weeks, junior-age children were using the online world to bully and harass each other. That is something that used to stay within the school gates. Bullying still happened—I am so elderly, and it happened when I was at school—but it was something that was left behind at the school gates; it did not follow you home. Also, 27% of children have seen pornography by the age of 11, which gives them a very toxic view of sex and relationships.

The Online Safety Act will hopefully encourage providers to do what they say they are doing when it comes to protecting children online, but the Minister has a huge responsibility to make sure that that happens, and to hold not just them but Ofcom to account to make sure that it is robustly implementing the guidelines that it is setting up. There are some amazing champions of that—Baroness Kidron has made incredible strides in the other place—but we need to make sure that Ofcom has not only the powers but the capacity. It has a huge amount under its jurisdiction and there is a huge amount of pressure. I know that the Minister will work very hard to ensure that it is held to account and equipped with what it needs.
Lab
  15:22:21
Leigh Ingham
Stafford
Huge congratulations to my hon. Friend the Member for Darlington (Lola McEvoy) for securing this debate, which I know is of grave concern not only for my constituents in Stafford, Eccleshall and the villages, but for parents and caregivers throughout the country.

I am concerned that online harm has a disproportionate impact on girls and young women. Take, for example, the report just mentioned on exposure to harmful content, which found that 60% of girls aged 11 to 16 had received negative comments about their appearance online. I am very concerned about that growing impact on young people, particularly girls and young women.

Even more troubling is the increase in severe online abuse, such as grooming. In cases between 2023 and 2024 where the victim’s gender was identified, an overwhelming 81% of the children targeted were girls. I believe the increase in online harm to be directly connected to the increase in violence against women and girls.

I therefore join calls for significantly enhanced rules on social media platforms to safeguard our young people. That must tackle both the blunt and sharp ends of online harm: the insidious exposure to harmful content and the more direct and egregious abuses, such as grooming.
Con
  15:23:39
Sir Jeremy Wright
Kenilworth and Southam
It is a great pleasure to serve under your chairmanship, Mr Dowd. I congratulate the hon. Member for Darlington (Lola McEvoy) not just on securing this debate but on the way in which she made her case. I want to focus on a couple of the more technical aspects of the Online Safety Act, which are important in fulfilling the objectives that we all share this afternoon, which, as she rightly said, are to make sure that the vehicle that we now have in the OSA delivers the right outcomes for the safety of children online.

I am grateful to my hon. Friend the Member for Gosport (Dame Caroline Dinenage); she is right that I had ministerial responsibility for the Act. I think, frankly, it is harder to find Conservative Ministers who did not have responsibility for it at some point or another, but what we all tried to do was make sure that the structure of the Act would support the objectives that, again, we all share.

I will mention two specific things, which I should be grateful if the Minister would consider. I do not expect her to respond to them this afternoon, but if she would consider them and write to me, I should be very grateful.

It seems to me that we need to make sure that as responsibility for implementing the Act moves from us as legislators to Ofcom as the regulator, Government and Parliament and the regulator are on the same page. There are two areas where I am concerned that that might not be the case. The first is the question whether harm to children is all about content. I do not think it is. We have heard this afternoon that many aspects of risk and harm to children online have nothing to do with the specific nature of an individual piece of content.

The Act is important, and I believe it does support Ofcom’s ability to act in relation to harms beyond specific matters of content. For the Minister’s benefit, I have in mind section 11 of the Act on risk assessment—as she will know, because she knows it off by heart. For everybody else here, section 11 deals with risk assessment, and on that a great deal hangs. If we do a risk assessment, the obligation is to do something about risks, and that hangs on what risks are identified in the assessment. So the risk assessment matters.

As I read the Act, section 11 says that, yes, we must risk-assess for individual harmful pieces of content, but under section 11(6)(f) we also must risk-assess for the different ways that the service is used, including functionalities or other features of the service that affect how much children use the service—which goes back to a point made earlier. Those are the sorts of things it is important to underline that we expect Ofcom to attend to.

I am grateful for the Government’s statement of strategic priorities, but the point made about this being a fast-moving landscape is fundamental. Again in the Act, the codes of practice are vital, because they set out the things that platforms ought to do to keep children safe. If the platforms do the things set out in the codes, they are broadly protected from further regulatory intervention. We need to act urgently to ensure that the codes of practice say what we want them to say. At the moment my concern is that Ofcom may simply codify current good practice rather than urging platforms to advance and maintain it. Those are the two areas that I hope the Minister will think about in relation to the draft codes and the need for an ongoing relationship between us in Parliament and Government and Ofcom to ensure that the Act continues to deliver as we want it to.
Lab
  15:27:30
Josh MacAlister
Whitehaven and Workington
I congratulate my hon. Friend the Member for Darlington (Lola McEvoy) on securing this important debate.

I would like to say a few words about the context of this debate and the parallels between it and some of the debates in the last century, specifically to do with road safety. Despite the car being a relatively common feature on our roads from about 1900, it was not until the 1930s, when there were already 1 million cars on the road, that we decided to introduce any age limit on driving. It was not until 1983 that wearing a seatbelt became compulsory. At that time, many people, including MPs here in Parliament, argued that the law would be impossible to police, was an overreach of the state and would not save any lives. In fact, when it was introduced, deaths dropped dramatically and we got the best out of the rise of the motor vehicle. There is a strong parallel between the introduction of seatbelt measures and what we now need to do as a Parliament on online safety.

The Online Safety Act was an incredibly welcome piece of legislation, but it was the very first measure and must be seen as a stepping-stone piece of legislation rather than a destination in its own right. Most people involved in the creation of the legislation and those at Ofcom themselves would probably recognise that description. Where we need to go next, I believe, is to address issues of excess screen time, social media use and the wider harms that come from the fact that the average 12-year-old is now spending 21 hours a week on their smartphone. There are obvious harms from that. My hon. Friend the Member for Darlington highlighted social anxiety and peer-to-peer comparison and the mental health impacts of that. There are very clear impacts on sleep and on the classroom, and the evidence behind that is growing. There is also an enormous impact in that those 21 hours a week used to be spent doing other things: activities that children now no longer do because that time goes to their devices. That presents a complete generational rewiring of childhood, which needs to be considered closely.

That is why it is really welcome that last week the Government announced that they will commission a study into this area. The evidence has moved on considerably since the chief medical officer last looked at this in 2019. With fresh eyes looking at the evidence now, I believe that the chief medical officer will give very different advice. That is why I have introduced the safer phones Bill—the Protection of Children (Digital Safety and Data Protection) Bill.

I would like three things to happen. First, the age of digital consent for data sharing should be raised from 13 to 16. That would put not just Ofcom, but the Information Commissioner’s Office in a position to regulate this, and I would like extra powers for parent groups to come together to ensure that that is enforced. Secondly, Ofcom needs additional powers to make sure that it can go beyond just the content, as my hon. Friend the Member for Darlington mentioned. Finally, we need to look at this as a public health issue, as well as a tech regulation issue.
LD
Victoria Collins
Harpenden and Berkhamsted
It is an honour to serve under your chairmanship, Mr Dowd. It has also been a real honour to be part of this debate, and I have been scribbling away because so much genuine passion has been put into it. Do I have 10 minutes, Mr Dowd?
in the Chair
Peter Dowd
Yes.
  15:44:33
Victoria Collins
My cogs are turning—everyone in this debate wants to make a difference, and the time is now. That is the critical point. There is far too much illegal and harmful activity on social media and online, whether that is racist abuse, incitement to violence or the grooming of children—so much has been brought up.

Keeping children safe online is more difficult, but more important, than ever before. Several Members have mentioned that they spoke to their local parent groups and schools. I met children from The Grove school in Harpenden. One child said, “How old do you think I should be to have a smartphone?” And I said, “Well, how old would you like it to be?” He said, “Eleven.” I said, “Why?” He said, “Because that is when my brother got his.” It was really interesting that the teachers said, “We are discussing this as a school right now because the kids are asking those questions.” What also came through was the importance of listening to young people, because they are the ones who are crying out for change and saying that something is not right.

We have heard from many Members, including the hon. Member for Darlington (Lola McEvoy), who set up the debate in a way that none of us could follow, speaking with passion about the people behind this—the parents and the families. That is what we are all here for. We heard from the hon. Member for Rother Valley (Jake Richards) about how covid exacerbated problems, which highlighted the importance of discussing this issue now. The hon. Member for Gosport (Dame Caroline Dinenage) talked about Ian Russell and Molly; I think most of us are aware of that story. Ian has come to Parliament many times to talk about the impact, and we must never forget his family and so many more behind them. The hon. Member for Whitehaven and Workington (Josh MacAlister) spoke of the parallels between this issue and road safety, reminding us that we have to act now because, if we do not, we will look back and realise that we were doing a disservice to so many. We have to keep up on safety.

So much of this debate has been about identifying the issues with online safety, such as what the algorithms are sending us, location and chat features, the content and so much more. The hon. Member for Aberdeen North (Kirsty Blackman) talked about self-generated explicit content and the pervasive misogyny that so many have mentioned. The hon. Member for Carlisle (Ms Minns) reminded us that so-called child pornography is evidence of a crime and that we need to get the language right. That is key. Sexual inequality and violence are pervasive because of that content.

The hon. Member for Whitehaven and Workington spoke about the addictiveness of phones, and the hon. Member for Lowestoft (Jess Asato) highlighted the fact that mobile phone use is impacting short-sightedness. The hon. Member for Whitehaven and Workington mentioned sleep and asked what we are doing about the 21 hours a week spent on phones. So much of this is about what I call “digital mental health”, which refers to what is happening as a whole, beyond the algorithm and the impact of the content. The hon. Member for Strangford (Jim Shannon) mentioned self-harm, and I will certainly keep in mind the term “generational rewiring”, which the hon. Member for Whitehaven and Workington used.

When it comes to legislation, we have not acted fast enough and we have not gone far enough. As has been said, we need to move towards safety by design, but we also need legislation that is reactive and agile enough to keep up with the change. As Liberal Democrats, we were proud to push for the Online Safety Act to go further, and we successfully called for it to include online fraud and scams, as well as to outlaw cyber-flashing.

The hon. Member for Aberdeen North talked about online games, and the fact that we need to stay up to date. The hon. Member for Gosport mentioned holding Ofcom to account. The hon. Member for Stafford (Leigh Ingham) talked about grooming laws, and how we need blunt and sharp elements in the instruments that we use. The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) reminded us that behind all this, we must get the technicalities right in the Online Safety Act, highlighting that this is not just about the content, but about keeping up with the speed and agility of the change.

As a Liberal Democrat, I would like to highlight what we are calling for. The importance of being proactive has been mentioned many times, and that means calling for safety by design. We are also calling for an independent advocacy body for children’s safety online. We would like to create a new online crime agency to effectively tackle illegal content and online activity, such as revenge porn, threats and incitement to violence on social media. We would also like to include a digital Bill of Rights to protect everyone’s rights online. That includes balancing the rights to privacy, freedom of expression and participation. The regulation of social media must respect the rights and privacy of those who use it legally and responsibly, but should not have a laissez-faire approach.

Another important element is education. The hon. Member for Darlington said that we cannot tackle all of this content. We cannot get all of this right, but it is important that we also empower young people and parents to be able to say what is right and wrong, and to help them to feel empowered to make a change, whether that is by using tools, or by speaking up and saying, “Actually, this is not right.” We should make sure that they feel they have that voice.

My hon. Friend the Member for South Devon (Caroline Voaden) mentioned that big tech needs to be held accountable—absolutely. We have to make sure that those who are building the platforms are the ones who ensure their safety by design, and that they keep up with that.

I close with a reminder that keeping young people safe online is more difficult, but more important, than ever before. We must act sooner rather than later and use all the tools at our disposal, whether that is through Ofcom and regulatory changes, by incentivising companies or by educating parents and children. Frankly, from the debate I have heard today, I have hope that if we work together, we can make sure that those changes are enacted swiftly and are kept up to date.
Con
  15:38:27
Dr Ben Spencer
Runnymede and Weybridge
It is a pleasure to serve under your chairmanship, Mr Dowd. I would like to pay tribute to the hon. Member for Darlington (Lola McEvoy) for securing this debate. She spoke powerfully and knowledgeably on a wide range of issues, particularly on the children’s codes, and her requests for reform and improvements.

There were many contributions from hon. Members in this important debate, but one that really struck me, and which I would like to draw particular attention to, was the contribution from the hon. Member for Carlisle (Ms Minns). When hon. Members speak in debates, there are few times when all Members listen. She spoke rightly and powerfully about the awful statistics—I say “statistics”, but I really mean the number of horrendous acts of child sexual abuse that have been and are taking place, and the impact that that will have on those children and, indeed, all people who are exposed to it. All of us, as parliamentarians, need to be very mindful of that. Each and every one is an individual tragedy.

Protecting children from harmful or illegal content is something that all Members are committed to, and it is right that we work together to protect children. I welcome the Online Safety Act brought in by the last Conservative Government. That groundbreaking legislation had the protection of children at its heart, introducing effective, pragmatic laws and restrictions to combat some of the horrors we have heard about. It was great to have several of the architects of the Online Safety Act taking part in the debate and asking pertinent questions of the Minister, whose job it is to ensure that this piece of legislation works for us, our children and our families.

As a responsible Opposition, it is now our job to pose the questions and to support the Government in delivering protections for our children. I will make my speech in that spirit, particularly with a series of questions that I have for the Minister about the Act’s implementation.

I commend the Secretary of State for Science, Innovation and Technology for meeting bereaved parents who have lost children to harmful online content, and for publishing the draft statement of strategic priorities for online safety. I pay tribute to those in the Gallery whose families have been tragically affected by online harms.

The Secretary of State has stated that the Government will implement safety by design to stop more harm occurring in the first place. We support the Government’s aspiration to deliver safe online experiences for all users, as we did in the previous Government. It is important that we consider whether the expectation should fall on users, particularly those who are most vulnerable, to take precautionary steps to avoid severely harmful content. When the Government talk of safety by design, it is crucial that they place the onus on social media companies to ensure the safety of their users. Given the role that algorithms play in pushing themed content to users, what plans do the Government have to empower users to exercise greater personal control over the algorithms?

The Government outlined the need to ensure that there are no safe havens online for illegal content and activity. Although we wholeheartedly support that aim, to what extent will removing the ease of mainstream access push such content further out of sight and possible regulation? We support the Government’s desire to improve transparency and accountability across the sector, but while there is a desire to increase algorithmic transparency, how do the Government intend to improve regulatory co-ordination in the pursuit of achieving that? In addition, the inculcation of a culture of candour via the transparency reporting regime will be challenging. How will that be facilitated?

In January 2024, Instagram and Facebook announced that they would block under-18s from seeing harmful content relating to eating disorders, self-harm and suicide, but it has been highlighted that the content is so prevalent that it can still be found easily online. What steps do the Government intend to take to ensure that the existing legislation is enforced?

We must ensure that children are protected from material that is not age-appropriate, such as pornography. That is why the last Government tightened up age restrictions by requiring social media companies to enforce age limits consistently and to protect their child users. It is right that services must assess any risk to children from using their platforms and set appropriate age restrictions, ensuring that child users have age-appropriate experiences and are shielded from harmful content. Again, this should be followed closely to ensure that platforms—or indeed, children—are not finding ways around restrictions. Currently, age checks are not strong across all platforms, and I would welcome the Minister’s thoughts on how the Government plan to strengthen them. The restrictions introduced by the last Government are a good start but, as was noted in the debate, as technology changes, we must keep up.

The Government talk of ensuring that age assurance technology to protect children is being effectively deployed. How do they intend to ensure that that happens and to ensure that companies are investing in the most up-to-date technology to facilitate it? Will the Government proactively stress-test that capability?

We must stand against the harms that come our children’s way. We must build on the success of the previous Conservative Government by ensuring that all restrictions and laws work. We must embrace technology and understand that the internet and social media, in general, are a force for good, embedded in our daily lives, while also understanding that checks and balances are essential if we are to ensure a safe online environment for all users.
  15:44:51
Feryal Clark
The Parliamentary Under-Secretary of State for Science, Innovation and Technology
It is a pleasure to serve under your chairmanship, Mr Dowd. I congratulate my hon. Friend the Member for Darlington (Lola McEvoy) on securing this debate. As hon. Members can see, debates in Westminster Hall take a whole different form from debates in the House; they are a lot more informative and collegiate, and Westminster Hall is a much nicer place to debate. I welcome the parents in the Public Gallery and thank them for their commitment and the work they continue to do to make sure that this issue stays on our agenda and continues to be debated. I know they have met my colleagues, and I look forward to meeting them as well.

I am grateful to all hon. Members for their incredibly powerful and informative contributions to today’s debate. As the mother of two young children, I always have online safety on my mind. Every time I am on my phone in front of my children, or want to keep them distracted by putting on a YouTube video, it scares me, and the issue is always at the back of my mind. It is important that we as parents always have the safety of our children in mind. My hon. Friend the Member for Rother Valley (Jake Richards) talked about being a parent to really young children while being an MP or candidate. As a mother who had two children during the last Parliament, I can assure him that it does get easier. I am happy to exchange some tips.

The growth in the use of phones and social media has been a huge societal change, and one that we as parents and citizens are grappling with. I am grateful to all hon. Members here who are engaging in this debate. The Government are committed to keeping children safe online, and it is crucial that we continue to have conversations about how best to achieve that goal. We live in a digital age, and we know that being online can benefit children of all ages, giving them access to better connections, education, information and entertainment. However, we know that it can also accentuate vulnerabilities and expose children to harmful and age-inappropriate content. We believe that our children should be well-equipped to make the most of the digital opportunities of the future, but we must strike the right balance so that children can access the benefits of being online while we continue to put their safety first.

Last week, the Secretary of State visited NSPCC headquarters to speak to its Voice of Online Youth group. That is just the latest meeting in a programme of engagement undertaken by the Secretary of State and my colleague in the other place, Baroness Maggie Jones. Getting this right has been and will continue to be a long process. Many hon. Members here will remember the battle to get the Online Safety Act passed. Despite the opposition—some Members in this place sought to weaken it—there was cross-party consensus and a lot of support, and so it was passed.
Ind
  15:44:51
Richard Burgon
Leeds East
On a number of occasions during the passage of the Online Safety Bill in this House, I raised the story of my constituent Joe Nihill from Leeds, who sadly took his own life after accessing very dangerous suicide-related content. I want to bring to the Minister’s attention that before Ofcom’s new powers are put into practice at some point next year, there is a window where there is a particular onus on internet service providers to take action. The website that my constituent accessed, which encouraged suicide, deterred people from seeking mental health support and livestreamed suicide, has been blocked for people of all ages by Sky and Three. Will the Minister congratulate those two companies for doing that at this stage and encourage all internet service providers to do the same before Ofcom’s new powers are implemented next year?
  15:50:01
Feryal Clark
I thank the hon. Member for making that point and I absolutely welcome that intervention by internet providers. As I will go on to say, internet providers do not have to wait for the Act to be enacted; they can start making such changes now. I absolutely agree with him.

Many colleagues have raised the issue of the adequacy of the Online Safety Act. It is a landmark Act, but it is also imperfect. Ofcom’s need to consult means a long lead-in time; although it is important to get these matters right, that can often feel frustrating. None the less, we are clear that the Government’s priority is Ofcom’s effective implementation of the Act, so that those who use social media, especially children, can benefit from the Act’s wider reach and protections as soon as possible. To that end, the Secretary of State for Science, Innovation and Technology became the first Secretary of State to set out a draft statement of strategic priorities to ensure that safety cannot be an afterthought but must be baked in from the start.

The hon. Member for Strangford (Jim Shannon) raised the issue of suicide and self-harm. Ofcom is in the process of bringing the Online Safety Act’s provisions into effect. Earlier this year, it consulted on the draft illegal content codes, which cover some of the most harmful types of content, including content encouraging suicide, and on the draft child safety codes of practice. We expect the draft illegal content codes to be in effect by spring 2025, with the child safety codes following in the summer.

Under the Act, user-to-user and search services will need to assess the risk that they might facilitate illegal content and must put in place measures to manage and mitigate any such risk. In addition, in-scope services likely to be accessed by children will need to protect children from content that is legal but none the less harmful to children, including pornography, bullying and violent content. The Act is clear that user-to-user services that allow the most harmful types of content must use highly effective age-assurance technology to prevent children from accessing it.

Ofcom will be able to use robust enforcement powers against companies that fail to fulfil their duties. Ofcom’s draft codes set out what steps services can take to meet those duties. The proposals mean that user-to-user services that do not ban harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict those parts of the service that host harmful content. The codes also tackle algorithms that amplify harm and feed harmful material to children, which have been discussed today. Under Ofcom’s proposal, services will have to configure their algorithms to filter out the most harmful types of content from children’s feeds, and reduce the visibility and prominence of other harmful content.

The hon. Member for Aberdeen North (Kirsty Blackman), the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) and others discussed strengthening the codes. Ofcom has been very clear that it will look to strengthen the codes in future iterations. The Government will encourage it to do so as harmful online technology and the evidence base about such technology evolves.
  15:53:40
Sir Jeremy Wright
Will the Minister give way?
  15:53:49
Feryal Clark
I am short of time, so I will have to proceed.

For example, Ofcom recently announced plans to launch a further consultation on the illegal content duties once the first iteration of those duties is set out in spring next year. That iterative approach enables Ofcom to prioritise getting its initial codes in place as soon as possible while it builds on the foundations set out in that first set of codes.

My hon. Friends the Members for Slough (Mr Dhesi) and for Lowestoft (Jess Asato) and the hon. Member for Aberdeen North raised the issue of violence against women and girls. In line with our safer streets mission, platforms will have new duties to create safer spaces for women and girls. It is a priority of the Online Safety Act for platforms proactively to tackle the most harmful illegal content, which includes offences such as harassment, sexual exploitation, extreme pornography, intimate image abuse, stalking and controlling or coercive behaviour, much of which disproportionately affects women and girls. All services in scope of the Act need to understand the risks facing women and girls from illegal content online and take action to mitigate those risks.

My hon. Friend the Member for Carlisle (Ms Minns) set out powerfully the issues around child sexual exploitation and abuse. Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is crystal clear: the creation, possession and distribution of child sexual abuse images is illegal. The strongest protections in the Online Safety Act are against child sexual abuse and exploitation. Ofcom will have strong powers to direct online platforms and messaging and search services to combat that kind of abuse. It will be able to require platforms to use accredited, proactive technology to tackle CSEA and will have powers to hold senior managers criminally liable if they fail to protect children.

I am running short of time, so I shall make some final remarks. While we remain resolute in our commitment to implementing the Online Safety Act as quickly and effectively as possible, we recognise the importance of these ongoing conversations, and I am grateful to everyone who has contributed to today’s debate. I am grateful to the brave parents who continue to fight for protections for children online and shine a light on these important issues. The Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer), asked a host of questions. I will respond to him in writing, because I do not have time to do so today, and I will place a copy in the Library.
in the Chair
Peter Dowd
I call Lola McEvoy to briefly respond to the debate.
  15:58:27
Lola McEvoy
Thank you, Mr Dowd. I am grateful to the Minister for her response.

We have had an insightful and cohesive debate, and I thank all Members for their time and expertise. It is clear to me—and, I am sure, to all of us—that innovation has outstripped legislation, leaving our children and young people shouting for help. Crime is organised and exacerbated on these platforms, and the police cannot stop it without our help. Twenty-four-hour access means that content and bullying have caused school refusals, and our educators cannot teach our children without our help.

Children and young people never share everything with their parents, but the sheer quantity of material, along with the functions of content providers, means that parents cannot protect their children without our help. Children’s mental health services are drowning after huge surges in the number of those needing support. Many issues are caused or exacerbated by online platforms, and our NHS cannot get our children well without our help. Today has demonstrated cross-party agreement for action, as well as agreement that this is one of the great issues of our time. We have our consensus, so now let us use it.

Question put and agreed to.

Resolved,

That this House has considered online safety for children and young people.

Contains Parliamentary information licensed under the Open Parliament Licence v3.0.