PARLIAMENTARY DEBATE
Online Harms - 7 October 2020 (Commons/Westminster Hall)

[Sir Edward Leigh in the Chair]

Lab
Holly Lynch
Halifax
I beg to move,

That this House has considered online harms.

It is always a pleasure to serve under your chairmanship, Sir Edward. I am delighted to have secured this debate this afternoon. I know that lots of colleagues are keen to participate, and many of them have much greater expertise in this policy area than I do. I have never been more overwhelmed on securing a debate by offers of briefings, information, research and support from organisations that are dedicated to trying to make a difference in this area. Given the strength of feeling and the depth of the evidence base, it is remarkable that we have not made more progress.

I was approached by the Petitions Committee, which asked whether four online petitions could be considered as part of this debate. Those petitions are entitled: “Make online abuse a specific criminal offence and create a register of offenders,” “Make online homophobia a specific criminal offence,” “Hold online trolls accountable for their online abuse via their IP address” and “Ban anonymous accounts on social media”. The petitions have collectively been signed by more than half a million people and I am pleased to say that there were 773 signatories from my Halifax constituency.

I had intended to include a list and thank all those who sent briefings, but there were so many that it would take me about 12 hours to read out that list. I would therefore just like to mention the National Society for the Prevention of Cruelty to Children, Barnardo’s, the Antisemitism Policy Trust, John Carr OBE, the Mayor of London, Sadiq Khan, and my good and honourable Friend the Member for Ogmore (Chris Elmore), who has vast knowledge and expertise in this area, not least in his capacity as the chair of the all-party parliamentary group on social media. I will reference others throughout my speech. I thank them all for the information and support in shaping the focus of my efforts.

During lockdown, we have seen how the internet has facilitated digital connection and social media has provided a lifeline to the outside world for so many. None of us in this room is ignorant of the good that social media can do; however, as lawmakers, we are all collectively responsible for the utter failure to regulate it and for the societal damage that that is causing.

The online harms White Paper published last year confirms that nearly nine in 10 UK adults and 99% of 12 to 15-year-olds are online. The NSPCC estimates that in the first three months of 2020, online sex crimes recorded against children surpassed 100 a day—that is roughly one every 14 minutes. Barnardo’s also contacted me about some of the harrowing online experiences it has been supporting children through as part of its new “See, Hear, Respond” campaign over the course of the lockdown—the sorts of experiences that would significantly damage adults, let alone children.

As MPs, we all know what it is like to be in the public eye and to be on the receiving end of online abuse, but I started to ramp up my work in this area when I was approached by a brilliant woman, Nicky Chance-Thompson, who is the chief exec of the magnificent Piece Hall in my constituency, which everyone should come and visit when they have the opportunity. She is a deputy lieutenant and the Yorkshire Choice Awards Business Woman of the Year 2019. She is also on Northern Power Women’s power list.

When Caroline Flack tragically died in February this year, Nicky bravely approached me and others to share her own experience as a woman in the public eye and to call on all of us to get a grip of online abuse before any further lives are lost. Nicky published an article with the Yorkshire Evening Post describing how she was a victim and survivor of online abuse, which rides high on social media. She said:

“Cowards hiding behind fake profiles can say anything they like about anyone, and there appears to be no consequences for them nor recourse for the victims…Misogyny is unpalatably frequent. Many women in high profile or public positions cop it simply for doing their jobs or being successful.”

She urged everyone involved to speak up and take action because “silence is killing people.”

Nicky’s article was published by the Yorkshire Evening Post as part of their “Call It Out” campaign, which has been spearheaded by editor, Laura Collins. It proved to be the catalyst for a broader initiative between Nicky, myself, editors of the Yorkshire Post and Yorkshire Evening Post, James Mitchinson and Laura Collins, Stop Funding Hate, the Conscious Advertising Network and the Journalism Trust Initiative, led by Reporters Without Borders. We came together to agree a constructive way forward to make progress on cleaning up the internet. We interrogated the online harms White Paper; its joint ministerial statement bears the names of two former Cabinet Members who both left Government over a year ago, which hardly screams urgency, but it does state:

“While some companies have taken steps to improve safety on their platforms, progress has been too slow and inconsistent overall.”

I am afraid that, in itself, is a reflection of the Government’s inaction.

We talk a great deal about public health right now, but, as my hon. Friend the Member for Newcastle upon Tyne North (Catherine McKinnell) said in a discussion I had with her about her Petitions Committee investigation into online abuse, we will look back on this period in history with disbelief and shame that we did nothing in the face of what can only be described as a public health ticking time bomb. She compared unregulated online abuse and hate to smoking, and that analogy is entirely right.

Until a landmark study in the 1950s, whether a person chose to smoke was nothing to do with Government, and even when the body of research provided evidence for the link between tobacco use and lung cancer and other chronic diseases, Governments were slow to involve themselves in efforts to stop people smoking, or to get them to smoke less or not to start in the first place. If we think about where we are now on smoking, although smoking cessation budgets have been slashed in recent years, we proactively fund stop smoking services, have school education programmes and heavily regulate what is available to purchase and how it is advertised.

We do that because we recognised that smoking was having a detrimental impact on physical health. We invested, not only because it was the right thing to do, but because it was more cost-effective to intervene than to allow so many people to become so unwell as a consequence. Compare that with online abuse and hate and the impact we know it is having on the wellbeing and mental health of society, particularly young people.
Con
Mrs Maria Miller
Basingstoke
I commend the hon. Lady for securing this debate. She mentioned the importance of regulation, and as she was speaking I was reflecting on the regulation that is in place to govern the BBC and broadcast media, because it was felt that, if communication was going straight into the living room of every home in this country, it needed to have a firm regulatory footing. Does she not think that a similar approach to this sector could have prevented some of the harms that she is talking about today?
Holly Lynch
The right hon. Lady makes an important point. I am about to come on to some of the different ways that we need to extend the regulation that is already there. She makes the point that that information was going straight into homes; information online is coming straight into somebody’s hand in front of their face, so why do we not extend the same types of regulation to it? I will come on to that in more detail, but I thank her for that point.

As I said, 99% of 12 to 15-year-olds are online, and seven in 10 young people have experienced cyber-bullying, with nearly 40% of young people saying they experienced cyber-bullying on a high-frequency basis, according to the Royal Society for Public Health’s “#StatusofMind” report. Those of us in this Chamber know better than anyone the impact that social media is having on public discourse and on the ability to have safe spaces for the exchange of different opinions, which are vital in any democracy.

One of the reasons the Yorkshire Evening Post was so motivated to launch the Call It Out campaign was realising the impact of the barrage of online abuse directed predominantly, but not exclusively, towards its female journalists. Editor Laura Collins, who I commend for her leadership on this issue, told me this week that the sentiment of one comment on Facebook responding to an article about the local restrictions in Leeds was not uncommon: it said, “Whoever is publishing these articles needs executing by firing squad”. The newspaper reported it to Facebook on 28 September and nine days later is yet to receive a response.

Our “Clean Up The Internet” initiative, somewhat underwhelmed by the White Paper, feared that the Government did not have the will to truly transform the way the internet is used, so we considered what else would need to happen. Online social media platforms have said far too often that they just provide the platform and can only do so much to oversee the content shared on it, but that holds no water at all where paid ads are concerned. It is a glaring omission from the White Paper that it does not consider misinformation and disinformation, which can be not only shared widely for free, but promoted through online advertising.

As we have heard, advertising in print or on broadcast platforms is regulated through Ofcom and the Advertising Standards Authority, and it must be pre-approved by a number of relevant bodies. There are clear rules, powers and consequences. The internet, however, to quote the NSPCC campaign, is the “wild west”. We must therefore extend that regulation to online advertising as a matter of urgency.

The urgency is twofold. The spread of misinformation and disinformation relating to the pandemic, whether it is conspiracy theories about its origins or even its existence, fake cures or promoting the sale of personal protective equipment by bogus companies, when we are trying to combat a virus, can have fatal consequences. So-called clickbait advertising and the monetisation of items dressed up as news, with the most outrageous and sensational teasers inevitably receiving the most clicks and generating the most income, means that credible news from real journalists with integrity to both their conduct and their content, like those at the Yorkshire Post and the Yorkshire Evening Post, is being driven out of that space. The online business model does not work for those who play by the rules, because there simply are not any.

Let us move on to what else would make a difference. I hope that the Minister will be able to answer a number of questions today about the progress of legislation and regulation. We have had the initial response to the White Paper, but when can we expect to see the Bill published? If we consider that the process began when the Green Paper was published in October 2017 and that the Government have suggested it may be 2023 before new legislation comes into effect, that will be six years, which is an incredibly long time in the life of a child—almost an entire generation.

Opportunities to strengthen protections for children online have been continually missed. During lockdown, large numbers of children have been harmed by entirely avoidable online experiences. If the Government had acted sooner, those consequences may not have been as severe or widespread.
Lab
Chris Elmore
Ogmore
I congratulate my hon. Friend on securing the debate and thank her for her tribute to me. I pay tribute to her for the work she does in her constituency and across Yorkshire on this issue.

In terms of protection of children, one of the most concerning things I have seen during the pandemic relates to the Internet Watch Foundation, which is Government-funded and reports to the police and central Government about the number of URLs focusing on paedophilia and child exploitation images. Takedown has reduced by some 80% since the pandemic started. I have raised that with Ministers in the Cabinet Office and the Department for Digital, Culture, Media and Sport. Does she agree that the Government need to take that far more seriously and put funding in place to ensure such things can be taken down and that children are protected from the most extreme online harms?
Holly Lynch
My hon. Friend, who has vast experience in this area, references some of the most extreme and harrowing online experiences, which our children are now becoming exposed to on a regular basis. We absolutely must re-resource this area to get a grip of it and prevent children from becoming victims, which happens every day that we do not tighten up the rules and regulations surrounding the use of the internet.

I also ask the Minister whether legislation will include—it should—regulation of, or rather the removal of, misinformation and disinformation online. Will it seek to regulate much more of what is harmful and hateful but is not necessarily criminal from a public health perspective, if nothing else? Will the proposed duty of care be properly underpinned by a statutory framework? Just how significant will the consequences be for those who do not adhere to it?

The Government announced the suspension of the implementation of an age-verification regime for commercial pornography sites on 16 October 2019, despite the fact that it only needed a commencement date. It is not at all clear why that was or when it will be reintroduced. I hope that the Minister can enlighten us about when the regime will come into effect.

The Local Government Association has raised important concerns. Local authorities have statutory safeguarding responsibilities on issues such as child exploitation, as we have just heard, suicide prevention and tackling addiction, all of which become incredibly difficult when a child or young person—or an adult, for that matter—goes online. It had to produce the “Councillors’ guide to handling intimidation”, which recognises the growing need among councillors for support in dealing with intimidation, which now occurs predominantly online. That is another damning indication of just how bad things have become.

I have worked with these groups on this issue and have been overwhelmed with suggestions for what more could be done. First, no one should be able to set up an entirely anonymous profile on social media platforms. The rise in bots and people hiding behind anonymous profiles who push hate and abuse should simply no longer be allowed. People would not necessarily have to put all their information in the public domain, but they would need to provide accurate information in order to be able to set up an account or a profile. That approach is explicitly called for in two of the public petitions attached to the debate, demonstrating that there is public support for it. It would allow us to hold both the platform and the individuals responsible to account for any breaches in conduct.

Imagine if being held to account for posting something that is defined as abusive under the online harms Bill, such as hateful antisemitic content, meant that an appropriate agency—be it Ofcom, the police or the enforcement arm of a new regulator—could effectively issue on-the-spot fines to the perpetrator. If we can identify the perpetrator, we can also work with the police to determine whether a hate crime has occurred and bring charges wherever possible. The increased resources that are necessary for such an approach would be covered by the revenue generated by those fines. That type of approach would be transformative. Can the Minister respond to that point—not necessarily to me, but to all those who have signed the petitions before us, which ask for that kind of thinking?

Fearing that the Government lack the will to adopt the radical approach that is required, the working group that I spoke about will look to get more and more advertisers on board that are prepared to pull their advertising from social media platforms if the sorts of transformations that we are calling for are not forthcoming. I put everyone on notice that that work is well under way.

On securing the debate, I was approached by colleagues from all parties, and I am pleased that so many are able to take part. Given just how broad this topic is, I have not said anything about extremist and radical content online, gang violence, cyber-bullying, self-harm, explicit and extreme content, sexual content, grooming, gaming and gambling, and the promotion of eating disorders. I am sure others will say more about such things, but I fear the Government will say that there is so much to regulate that they are struggling to see the way forward. There is so much there that it is a dereliction of duty every day that we fail to regulate this space and keep damaging content from our young people and adults alike.

We know that this is an international issue, and Plan International has just released the results of its largest ever global survey on online violence after speaking to 14,000 girls aged 15 to 25 across 22 countries. The data reveal that nearly 60% have been harassed or abused online, and that one in five girls have left a social media platform or significantly reduced their use of it after being harassed. This plea goes to the social media companies as well: if they want to have users in the future who can enjoy what they provide, they must create a safe space. Currently, they simply do not. It is an international issue, but we are the mother of Parliaments, are we not?

The Government seem so overwhelmed by the prospect of doing everything that they are not doing anything. I urge the Minister to start that process. Take those first steps, because each one will make some difference in bringing about the change that we have a moral obligation to deliver.
in the Chair
Sir Edward Leigh
I remind Members that the convention still applies: if you wish to speak, you should be present at the beginning. There are quite a large number of people on the call list, so please restrict your comments to about four minutes; otherwise, I will have to impose a time limit. I will call Members as on the call list, starting with Andrew Percy.
Con
Andrew Percy
Brigg and Goole
It is a pleasure to serve under your chairmanship, Sir Edward. I pay tribute to the hon. Member for Halifax (Holly Lynch), a fellow Yorkshire MP, for securing the debate and for the content of her speech.

I will primarily focus on antisemitism online, particularly in my role as co-chair of the all-party parliamentary group on antisemitism. Before I do that, I want to raise the issue of financial scams, which many of us have been lobbied on. I had a very sad case of a constituent who was recently affected by such a scam. They are truly shocking and harmful scams that take place online. I hope that as the Government move forward in this space and introduce legislation, they will have scams in mind too.

We all know the history of the rise of antisemitism in recent years, both on the far left and on the far right in this country, and across Europe and the wider world. It is sad that antisemitism has continued to grow and to find space online during this period of coronavirus. We have seen a fall in the number of physical incidents, probably because of the lockdown, but sadly we have seen all too much of a continuation online. Between the start of this year and June the Community Security Trust recorded 344 online incidents. There would have been many more were it not for the narrow reporting parameters. We could easily be up into the millions if we could measure antisemitism under the broadest scope. Those examples are as shocking as what we all know. There are many Members present who are members of the all-party parliamentary group on antisemitism and who have taken a stand on the issue.

During the period in question there has been Zoombombing of Jewish events with vile racist antisemitic commentary. Sadly, there have been covid conspiracy theories growing online. It is disappointing, first, that there is an anti-mask movement—sadly that exists across the world—but it often moves close to antisemitism. A constituent recently contacted me about that, with some barking mad idea about masks and how terrible they are, to which I replied, “Next thing you will be telling me it is all the fault of the Rothschilds,” to which—no word of a lie—I received a response saying “Actually, the Rothschilds knew this was going to happen.” That is how this stuff spreads. It is a simple step from one to the other.
Lab/Co-op
Stephen Doughty
Cardiff South and Penarth
The hon. Gentleman is making some crucial points. Does he share my disgust, and is he appalled, that YouTube at the start of this debate is providing links to a notorious antisemitic radio station called Radio Aryan and, indeed, a whole channel dedicated to antisemitic material? I will not read the name out. The content is there right now, as we are having this debate. YouTube has not removed it.
Andrew Percy
It is absolutely shocking. It should not take legislation to deal with it; it is obvious that the content should not be there. We need the Government to legislate, as I shall come on to in a moment, but it takes no brain surgeon to figure this stuff out. Sadly, too many platforms do not do enough.

Then of course there was the shocking Wiley incident, when he was tweeting on average every 87 seconds, which is incredible. There were, on a conservative estimate, 600 tweets of vile antisemitic abuse, which were seen online by more than 47 million people. Let us just consider some examples of it. He tweeted:

“If you work for a company owned by 2 Jewish men and you challenge the Jewish community in anyway of course you will get fired.”

Another one was:

“Infact there are 2 sets of people who nobody has really wanted to challenge #Jewish & #KKK but being in business for 20 years you start to undestand why:”

Then—something completely disgusting:

“Jewish people you think you are too important I am sick of you”

and

“Jewish people you make me sick and I will not budge”.

It took days. As I said, it took, at a conservative estimate, 600 tweets before anything was done about it. Instagram videos were posted. When one platform closed it down it ended up elsewhere. That is despite all the terms and conditions in place.

Enforcement is, sadly, all too invisible, as the hon. Member for Cardiff South and Penarth (Stephen Doughty) has highlighted, with regard to Radio Aryan. I was pleased that Wiley was stripped of his honour, but he should never have been able to get into the position of being able to spout that bile for so long. The best we have been able to do is strip him of an honour. It is completely and utterly unacceptable.

There is a similar problem with other platforms. I want to talk briefly about BitChute. It is an alternative platform, but we see the same old tropes there. Videos get millions of views there. It is a nastier version of YouTube—let us be honest—with videos in the name of the proscribed group National Action, a channel, for example, with the name “Good Night Jewish Parasite”, livestreaming of terrorist content, racist videos about Black Lives Matter protesters and much more; but it is a UK-based platform with UK directors, and while action is taken against individual videos there is, sadly, not enough recourse. Given the time limits, I shall quickly ask two questions and make two comments on legislation and where we are heading.

The online harms White Paper suggested a number of codes of practice, but that seems to have been rowed away from somewhat in recent weeks and months, so that there will be reliance, instead, on the terms and conditions. I do not think that that is enough. I hope that the Minister will confirm that enforceable codes of practice will flow. I hope also that, if she has some time, she will perhaps meet me, the Antisemitism Policy Trust and other partners to discuss the matter in more detail.

Will the Minister consider introducing senior management liability for social media companies? The German model for fines is often talked about, but it has not worked. The maximum fine so far issued in Germany is, I think, two million dollars or pounds, which is nothing for Facebook. It can afford to build that into its programme.

There is plenty more I could have said—I am conscious of the time—but I hope the Minister will commit to meet with us and respond to those two points.
in the Chair
Sir Edward Leigh
I remind Members that unless we keep to four minutes, we will not get everybody in.
Lab
Chris Elmore
Ogmore
Thank you, Sir Edward; the pressure is on. I congratulate my hon. Friend the Member for Halifax (Holly Lynch), as I have already said. I remember a debate on online harms some four years ago, when I first entered the House, when only three Members were in this room. Clearly our numbers are restricted today, but it is great to see a full Westminster Hall, as more and more Members come to realise the huge problems that the online platforms create.

Being aware of the time, I want to stick to two areas where I think the platforms are failing. First, I have raised anti-vax issues right across the summer, and as the pandemic started. In the last year an additional 7.8 million people have visited anti-vax Facebook pages or followed the Twitter, Instagram or YouTube accounts of organisations that are trying to make a quick buck out of people’s fears by selling false vaccines and treatments, asking them not to consult a doctor if they have any symptoms—“Don’t get tests because you can cure these things with different types of herbal tea”.

Across all the platforms—none is particularly worse than the others in my view, because they all have a responsibility—the argument that comes back is: “It’s a point of view: a position they could take, if you could form an argument, about this possibly being the way forward on covid.” Sadly, I have heard Members of this House suggest that covid is no worse than flu, despite all clinical professionals saying that is not the case. This gets picked up on anti-vax platforms, which quote Members saying, “You don’t have to wear a mask, you don’t have to get a vaccine and you don’t have to worry about it, because it’s no worse than flu”. Even if the Member has not said that, they twist their words into that position. How the platforms manage that is a huge concern.

I welcomed Facebook’s intervention yesterday to take down President Trump’s comments about covid. It is nice to see an intervention at that level, confirming that he indeed spouts fake news. It is about time Facebook did a lot more of that to address what is happening in this pandemic.

My second point is about the protection of children and young people. I have a huge concern about cyber-bullying and the targeting of young people, and specifically the growing number of young people being coerced, via gaming or the platforms or livestreaming, into committing sexual acts of harm against themselves, with that material then moving on to the dark web. The Internet Watch Foundation says that Europe is the grooming capital of the world—it is mainly in the Netherlands, but it is on the increase in this country. I have already mentioned the concern of the IWF and the Met about the need for the Government to put more resources into getting these URLs taken down. There is a real fear among the tech community that young people are being taught how to abuse themselves by people who are grooming them. I know the Minister cares about this—we have spoken about it before. It needs to be rectified.

My two asks, in the half a minute left to me, are that we introduce the Bill as quickly as possible and that it is robust and clear, and takes on the platforms. I agree with the hon. Member for Brigg and Goole (Andrew Percy) that it cannot be about the platforms setting their own regulations and then Ofcom deciding what should or should not be controlled and fines being issued. There should be culpability.

My final ask to the Minister is to create a social media levy fund that allows research into this issue. Make the platforms pay for the fact that they are not willing to do half of this work themselves, for the sake of young people, politicians, people in public life and people in the street who believe the fake news or the anti-vax information, because they are fearful of something. If they do not take responsibility, they should be fined for the dishonour of not dealing with these problems.
in the Chair
Sir Edward Leigh
Well done—four and a half minutes.
Con
Fiona Bruce
Congleton
The 2015 Conservative manifesto made a commitment that

“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material”.

That is crucial, because of what the Children’s Commissioner says about the damaging impact of such sites on young people’s views of sex or relationships and

“belief that women are sex objects.”

In 2016 the Government therefore rightly introduced proposals for age verification, or AV, and some of us here spent many hours scrutinising, amending and ultimately passing part 3 of the Digital Economy Act 2017. Commercial providers would have to implement age verification systems requiring users to provide proof of age—that they were over 18—or the provider sites would be blocked. That is critical when only a small proportion of those sites are UK-based; the top 50 are all based outside the UK.

Concerningly, however, in 2019 the Government suddenly announced that they were not going to implement part 3 of the 2017 Act, which was then the subject of an angry urgent question. At the same time, though, Ministers gave reassurances that they regarded protecting children from pornography as “a critically urgent issue” and that their purpose was not to abandon plans to introduce AV on commercial pornography sites but to introduce AV instead through the online harms Bill, which would address all online harms in the same piece of legislation.

The indications were that that Bill would be ready for pre-legislative scrutiny in early 2020. I am more than saddened that that was not the case. The Government produced an online harms White Paper and a consultation in April 2019. The consultation closed in June 2019; the Government’s full response to it is still awaited, with no draft Bill yet in sight.

We have heard that the draft Bill might now be published in mid-2021, meaning that, subject to pre-legislative scrutiny, it could be 2023 before it is on the statute book, six years after this House passed part 3 of the 2017 Act—six years during which increasing numbers of children, some as young as five, have had unfettered access to online pornography.

Parents, children’s charities and many colleagues here in Parliament are deeply concerned. This week, Savanta ComRes polling has been published showing that the public are not happy. In mid-September, 2,100 adults were polled across the UK, 63% of whom said that the Government should implement part 3 of the 2017 Act now and introduce additional protections against other online harms through the online harms Bill, when that legislation has been passed. Only 21% thought the Government should delay introducing statutory AV on pornographic sites until all the other mechanisms for addressing online harms are ready. If we discount the “don’t knows”, 74% said the Government should implement part 3 of the 2017 Act now.

Finally, I suspect that the provisions that the Government may introduce could be even weaker than those in part 3 of the 2017 Act, having received replies to written parliamentary questions indicating that the proposed duty of care will apply not to all commercial pornographic sites but only to those that enable user-generated functionalities; sites without such functionality would be excluded on the basis that they usually require payment, which acts as a deterrent to children accessing them.

The Government should neither delay nor water down their manifesto commitment. I call on them to implement part 3 of the 2017 Act immediately and to introduce additional online safety protection through the online harms Bill urgently. We can never make the internet safe, but we can make it safer.
SNP
Dr Lisa Cameron
East Kilbride, Strathaven and Lesmahagow
It is a pleasure to serve under your chairmanship, Sir Edward.

I thank the hon. Member for Halifax (Holly Lynch) for bringing this extremely important issue to Westminster Hall for debate. She spoke very eloquently and comprehensively on these issues, highlighting the lack of consequences under current legislation and the absolutely devastating impact of online abuse.

In the time that I have, I will speak briefly about a couple of different issues. The first is mental health. In this generation, one in five individuals online are children, so there will be an absolute tidal wave of mental health issues linked to online abuse. Indeed, that is already happening: I see it with my own children and their peers, and I see it across my constituency and, indeed, the UK. As chair of the all-party parliamentary group on mentoring, which provides the secretariat for the Diana Award, I recently spoke to the ambassadors on its youth board and heard that online bullying is one of the most fundamental issues that children have to cope with every single day of their lives. When I was young, bullying used to happen at school, but a child could come home from school and there was a gap, and they could receive support from elsewhere. Nowadays, it is constant. Now children are linking with their peers all the time—the majority of that is online, as well as occasionally in person, now that they are back at school.

The mental health impact must be addressed, and the legislation is crucial to that. It is a starting point, but we also need to support the youth ambassadors in schools who want to do anti-bullying work, be mentors to peers and ensure that they progress that good work to build resilience in their generation against this abuse. We look to the Minister to ensure that the legislation is robust and provides support in schools. I echo the words and sentiments of the hon. Member for Brigg and Goole (Andrew Percy), who spoke about the importance of tackling antisemitism, particularly online. The Community Security Trust, which records the data, found 185 online incidents in 2015 and 697 in 2019. That is a massive increase of 277%. That is another issue that really has to be tackled.

Finally, will the Minister look closely not only at the individuals who are online anonymously, but at those who impersonate others? I had that myself at the last election. As soon as I gave up the “MP” in my handle, someone online took that handle and started to tweet as though they were me. I am raising this not from my own perspective particularly—although it could have been extremely damaging—but because, as everyone here knows, constituents contact us online to tell us about their personal issues as a first port of call. Will the Minister consider meeting me to discuss that important issue further?
DUP
Carla Lockhart
Upper Bann
I commend the hon. Member for Halifax (Holly Lynch) for securing this important debate on an issue that I am passionate about, being motivated by my own experience of online abuse. A constituent has spoken to me about their experience and their concerns. I speak also as a mother who has concerns about what the online world will look like when my son starts accessing the internet.

Covid-19 has magnified the benefits of the internet. Our social media platforms have been so important in getting the safety message out, as well as being important for remote learning and home schooling. More than ever, we talk to our family online. Those are just some examples of how the internet can be a tremendous power for good. However, the internet also remains one of the most dangerous and potentially destructive inventions of the last century. It is a place where online bullying and harassment occurs daily, where child abuse is endemic, and where damaging information can be shared, suicide and self-harm promoted, and terrorism encouraged and espoused.

In recent months, these abhorrent sides of the internet have certainly got a tighter grip. If what happens every second on the internet happened on the streets, we would not let our children anywhere near it and we would expect the full force of the law to be applied to it. Yet for well over a decade, Government and law enforcement have allowed the intangible, remote world wide web to self-regulate and have tolerated that attitude of self-regulation. It is my belief that this must end.

I wholeheartedly endorse the six tests outlined by the NSPCC last week on what must be delivered by the legislation, including liability, empowering a regulator, effective sanctions, and recognising and dealing with the gravity of issues such as abuse, self-harm and suicide. Can anyone give me a good reason why we cannot take these steps?

I wish today to highlight a number of concerns. The first is about age verification, which is an absolute must in the forthcoming legislation. The Government U-turn of 16 October 2019 on age verification has never been accompanied by an explanation that made sense. Given that age verification providers were in place and ready to provide robust, high-specification, secure, anonymised verification, the alleged technology issues explanation does not stack up. In addition, why would the Government hold back on something that could be delivered now, as opposed to trying to marry it with other online harms that will be picked up in the current legislation? The Minister at that time told us that provision in the Conservative manifesto for statutory age verification on commercial pornography sites was “a critically important issue”. If it is critically important, it needs tackling immediately. I also ask the Government to prove us wrong on the perception that they are intent on watering down the protections for children in this regard.

On anonymity, we need to tackle these faceless trolls. We need to strip away the ability to have several accounts, all of which are used to abuse individuals. We are all accustomed to confirming our identity, be it for employment or going through the airport. Why not online?

Finally, let us look at what else we can do to increase awareness and educate children, parents and society at large. I had the privilege last week of meeting a young lady from my constituency, Mrs Deborah Webster, who has carried out significant research into the impact of legal but harmful content on our young people. Her findings are staggering—so much so that she now goes into schools to carry out a digital resilience programme. I think we all need some digital resilience in today’s society, but it needs to be underpinned by education and the curriculum. This is a huge opportunity for the Government to make a difference in almost every home in the country. It ought not to be missed.
Con
Mrs Maria Miller
Basingstoke
It is a great pleasure to serve under your chairmanship, Sir Edward, and I congratulate the hon. Member for Halifax (Holly Lynch) on securing the debate. I also thank the NSPCC, CARE, UK Safer Internet Centre, Girlguiding and Refuge for their excellent briefings. As the hon. Lady pointed out, this is an enormously complex issue. A number of petitions touching on areas of online harm have attracted around half a million signatories in total, as she said. That shows the Minister not only the strength of feeling but the importance of the Government’s providing a comprehensive response to this.

Let us be honest: when this sector wants to act, it does. It acted back in 2010 on online child abuse images, by putting in place protocols around splash pages, and it has acted on some issues around electoral fraud and fake news. However, the problem is that the industry does not consistently react, because it does not feel that it needs to. That has to change.

The Government have shown a clear intent to act in this area, through the 2017 Green Paper, the White Paper and the promise of legislation. The core concept that the Government want to put forward—as we understand it, anyway—is a duty of care: to make companies take responsibility for the safety of their users and to tackle the harm caused by their content, their activities and their services. Those are basic things that one would think were already in place, but they are not. They are to be applauded as a starting point, but again let us be clear that it is only a starting point, because setting up a regulator and regulatory frameworks do not provide a route of redress for victims. Lawyers know that a duty of care will not enable people to pursue a complaint to the regulator about an individual problem; it will just give the regulator an opportunity to fine companies or hit them over the head with a big stick.

People can bring a claim through ordinary legal proceedings, but that is limited by the existing legal framework, which we know is inadequate. The Law Commission is belatedly looking at a number of these areas, but it feels like the horse has already bolted. We might have to wait months or even years for its recommendations to come through, be reviewed and then be put forward in further legislation. It would be wholly unacceptable for the Government to bring forward a Bill with only measures to regulate, not legislation that actually has teeth.

We also need to deal with the inadequacies of the legislation, and I suggest that the Government should focus on at least three areas. When it comes to image-based abuse, the law is a mess. We have layer upon layer of legislation that does not give the police the necessary tools to protect victims. The second area is age verification, which my hon. Friend the Member for Congleton (Fiona Bruce) has already gone through. That is a promise we have not yet delivered on, and this Bill has to deliver on it. The third area is the importance of putting in place legislation that protects victims of intimidation during elections, which again the Government have promised to look at.

In conclusion, the coronavirus lockdown has served to create a perfect storm for online abuse. The Government have to act, and act quickly. Regulation alone is not enough; we need legislative reform as well.
Lab
Darren Jones
Bristol North West
It is a pleasure to serve under your chairmanship again, Sir Edward, and I congratulate my hon. Friend the Member for Halifax (Holly Lynch) on securing this important and timely debate. I also thank House officials for ensuring that Westminster Hall is open once again, so that we can have these debates. Before I begin my remarks, I will note my declarations of interest: my chairmanship of the parliamentary internet, communications and technology forum all-party parliamentary group, and of the APPG on technology and national security; my chairmanship of Labour Digital and the Institute of Artificial Intelligence; and my previous professional work on these issues as a technology lawyer, as noted in the Register of Members’ Financial Interests.

The online harms Bill will be a big and important piece of legislation, covering a range of difficult issues, from defining content that is harmful but not illegal and how we protect children, through to ensuring an effective regulatory framework that delivers a meaningful duty of care. Given the time, I will not rehearse the many important arguments for getting this right; I will keep my remarks short, both to give the Minister enough time to give substantive and full answers and so that other colleagues have a chance to contribute. The Secretary of State confirmed to the House in early September that the full response to the White Paper would be published this year—that is, 2020—and that legislation would be introduced early next year, which is 2021. On that basis, I have three sets of questions.

First, can the Minister confirm whether the publication of the full response to the White Paper is currently allocated to her Department’s forward grid, and if so, when it is pencilled in for publication? My understanding is that it will be published between now and December. Could she also tell us whether the Department has secured a legislative slot with the Leader of the House for First Reading, and if so, give us a rough idea of when that might be? Does the Department envisage a period of pre-legislative scrutiny before Second Reading? If it does, what role will the House of Lords play in that?

Secondly, can the Minister reassure us that the initial scope of the duty of care and the enforcement powers being made available to the regulator have not been watered down, and that she agrees with me that, while it is difficult to define what is harmful but not illegal, Parliament is the body best placed to do so, not private companies? Will she also reassure us that the passage of this Bill will not be linked to negotiations with the United States on the UK-US trade deal, given that we know that the United States has placed liability loopholes for platforms in trade deals with other countries?

Finally, will the Minister confirm that the answer I received from the Security Minister on the Floor of the House—that the online harms Bill will include provisions for enhancing sovereign defensive and offensive digital capabilities—is correct? If so, will she tell us whether the progression of the Bill is linked to the ongoing integrated review?
in the Chair
Sir Edward Leigh
Textbook timekeeping.
Lab/Co-op
Stephen Doughty
Cardiff South and Penarth
It is a pleasure to see you in the Chair, Sir Edward. I congratulate my hon. Friend the Member for Halifax (Holly Lynch) on securing today’s debate. These issues are absolutely crucial; I have spoken about them many times in this place, particularly during my time on the Select Committee on Home Affairs, and have exposed gross failures by social media companies on a number of occasions. Of course, this goes well beyond the usual suspects. We have heard today about a range of sites, including gaming sites, BitChute, Gab, Discord and others that are less well known than the YouTubes, Facebooks, Instagrams and Twitters of this world. Even Tripadvisor, I have been told, was being used to share links to extremist content, which I am sure many of us find absolutely shocking.

I am also informed very much by my experiences in my Cardiff South and Penarth constituency over the last eight years. I have seen online videos glamorising drugs gangs and violence not removed due to claims of freedom of expression. These were videos that showed young people dripping in blood disposing of evidence after stabbing somebody—a simulated event, but one that was clearly glorifying utterly unacceptable and disgusting behaviour. I have seen jihadi organisations, including proscribed organisations, recruiting and spreading their messages of terror, and obvious offenders such as Radio Aryan given the freedom by YouTube, Facebook, Twitter and others to operate and spread their message of hate without being taken down. I have seen online attacks on black, Asian and minority ethnic communities, antisemitism, Islamophobia, and attacks on the LGBT+ community. I have had my own experience of such online attacks, including threats about real-world events, and having to deal with those through the police.

My hon. Friend the Member for Ogmore (Chris Elmore) spoke about fake news, whether that is anti-vax, about 5G or about foreign powers spreading disinformation in our country. I am told that the Russian state, for example, seeded 32 separate narratives about the poisoning of the Skripals and its shameful activities with chemical weapons on our own soil, in order to spread disinformation.

We have heard about the huge damage to mental health, as well as issues such as cyber-stalking and the activities of paedophiles, but I want to draw the House’s attention to two specific issues today regarding the extreme right wing. The first is an organisation called the British Hand. Children as young as 12 are being drawn into extreme far-right groups, often by peers of a similar age, through the ease of anonymous social media accounts on bespoke sites such as Discord and Telegram, but also through Instagram, YouTube and Facebook. One particular cell, exposed by Hope Not Hate, is the British Hand group, allegedly led by a 15-year-old in Derby who recruits through Instagram and Telegram to encourage acts of violence, particularly against migrants. The group has been sharing videos of the Christchurch shooting and pro-National Action material, which is readily available on its Instagram page and through private group chats, inciting members to commit acts of violence, to join organisations such as, I am sorry to say, Sir Edward, the Army and the Army cadets, and to participate in the study of homemade weapons. That group must be dealt with and I hope the Minister will have something to say about it.

Secondly, there is the Order of Nine Angles. Alongside my hon. Friend the Member for Barnsley East (Stephanie Peacock), I have repeatedly called for the group to be banned. On 12 September, Mohamed-Aslim Zafis was sitting outside his mosque in Toronto when he was stabbed to death. The Canadian Anti-Hate Network has uncovered that the alleged perpetrator, William Von Neutegem, is linked to the far right and particularly the Nazi satanist group the Order of Nine Angles. The group spread their message on YouTube using videos with references to neo-Nazism and the occult, and with chanting, a nine-pointed star and the monolith of a homemade altar associated with Order of Nine Angles ceremonies. This comes after serious events in the United States as well. They need to be banned. Hope Not Hate has been doing excellent work on this and I declare my interest as a supporter of theirs in Parliament.

This problem has got to be dealt with. We cannot see any more delay from the Government, either in banning these organisations or bringing forward online harms legislation. We cannot wait until 2023; this was needed years ago. It is needed in all of the areas that Members have spoken about today, but I would particularly like to see a focus on these extreme right organisations that have been given such a free rein.
Lab
Carolyn Harris
Swansea East
May I say how wonderful it is to be back in Westminster Hall after so long? I thank all the House staff who have made that possible and I congratulate my hon. Friend the Member for Halifax (Holly Lynch) on securing this important debate.

Many Members have spoken about the abuse and grooming that happens online. I completely agree that that needs to be targeted and legislated for. Some may have assumed that I would be talking about gambling today, but I am not. I want to focus on a specific area that is a cause of untold damage and should be included in the scope of any legislation about online harms.

Online sellers are creating a significant problem by not protecting consumers who are purchasing electrical goods from their sites. We know from experience that such goods may be substandard, counterfeit, recalled or incompatible with use in this country. There is a severe lack of transparency that is putting many lives at significant risk.

Electricity causes more than 14,000 house fires annually, almost half of all accidental house fires. Each year, thousands are injured due to electrical incidents. In a survey carried out by Electrical Safety First, a staggering 93% of customers said that they would expect online sellers to protect them from purchasing counterfeit or substandard products, yet even when buying from well-known global platforms far too often that is not the case. Millions of people are falling victim with potentially tragic consequences.

This year has unsurprisingly seen a record rise in online sales, meaning a record number of unsuspecting customers purchased potentially unsafe goods. When the hair, beauty and wellbeing sector was forced to close salons and spas for longer than other businesses, many people will have gone online to purchase equipment to cut the family’s hair or bought items that would enable them to do their own beauty treatments at home. In fact, research suggests that as many as 21 million consumers did this. With stocks running out fast, many people were just grateful to buy whatever they could get hold of, probably without even considering whether the product would be safe or not.

With covid restrictions set to continue, the increase in online shopping could have devastating consequences. Sales of electrical goods online are likely to be even higher this Christmas, so we have to do everything in our power to limit the harm and protect innocent consumers.

Since I first became an MP in 2015, I have campaigned for the need for the safer sale of electrical goods through online marketplaces. As a nation, our shopping habits were already moving online, but the coronavirus and subsequent lockdowns have escalated that trend at a far greater pace than we could ever have imagined. That is why I urge the Government to include the sale of unsafe electrical products in the expected online harms legislation.

Tighter controls are needed on the platforms selling those products, to ensure that people are purchasing items confident in the knowledge that their goods meet safety standards. Online sellers must take responsibility for their own checks and procedures to guarantee that the goods they are selling are genuine, safe and not subject to manufacturer’s recall.

Online harms legislation is vital to protect people from a wide range of potential dangers. Almost 90% of adults use the internet, and none of us can deny how potentially dangerous so many aspects of the digital world are. We cannot ignore the fact that our legislation needs to catch up. Terrorist activity, online bullying, gambling, child safety and the safety of vulnerable adults are all areas in desperate need of legislation, but so is the hidden harm of the potential purchasing of life-threatening electrical goods. When the Government bring in this new legislation, it is essential that the sale of unsafe electrical goods is included within the remit. Potentially fatal products are an online harm and it is our responsibility to ensure those sales are legislated against.
Lab
Fleur Anderson
Putney
It is an honour to serve under your chairmanship, Sir Edward. I congratulate and thank my hon. Friend the Member for Halifax (Holly Lynch) for securing this important and, I hope, influential debate.

Online harms are one of the biggest worries faced by parents across my constituency and across the country. As a parent, I am very worried about what is happening in the safety of my own home, which I cannot control. Speaking to other parents, I know that is a shared concern. In our own homes, children can have free and unfettered access to pornography and to people inciting young people to violent hate and extremist views. Women can be threatened with the sharing of their intimate images, which can cause long-lasting damage. Our online world must be a safe and positive place for us all to explore, including our children, but at present it is not. Providers are not taking action. Parents just cannot keep up. Self-regulation is definitely not cutting it and online harm in our society is spiralling out of control.

The 2015 Conservative manifesto pledged that

“we will stop children’s exposure to harmful sexualised content online, by requiring age verification for access to all sites containing pornographic material.”

Well, it is time to come good on that commitment. If the Government had acted sooner, large numbers of children would not have been harmed by avoidable online experiences during lockdown. The consequences of ongoing inaction are severe and widespread. Our children can never unsee images they have stumbled across in all innocence in their own home. There are more children online for more time with more anxiety, yet there is less regulation, less action taken by providers and more sex offenders online.

I want to highlight three key issues. The first is pornography. According to the NSPCC, in the first three months of this year, more than 100 sex crimes against children were recorded every day. Lockdown led to a spike in online child abuse, meaning that the true figure is now much higher. The second issue is youth violence. The Mayor of London and deputy mayor for policing and crime have been vocal about the role of the internet in spreading violent messages and the incitement to commit serious youth violence. That is around us every day. The third issue is threats around sharing intimate images. One in 14 adults and one in seven young women have experienced threats about sharing intimate images. As a mother of two daughters, I am really concerned about that, and I know that parents across the country share that concern.

Although the sharing of intimate images was made a crime in 2015, threatening to share them can be just as damaging, yet it is not illegal in England and Wales, although it is in Scotland. The threats are used to control people, cause damage and affect mental health, and one in 10 survivors said that the threats had made them feel suicidal. There is also a substantial body of evidence that exposure to pornography is harmful to children and young people and can have a damaging impact on their views of sex and relationships, leading to a belief that women are sex objects, with links to sexually coercive behaviour and higher rates of sexual harassment and forced sex. We simply cannot let this situation go unregulated any longer, so I have some questions for the Minister.

When will the First Reading of the online harms Bill be? Is there urgency to tackle online harms? Will the Minister commit to introducing legislation to outlaw threats to share intimate images as part of the Domestic Abuse Bill? Can she introduce a statutory instrument to designate the British Board of Film Classification as the regulator? That could be done very quickly and would enable age verification for pornographic websites. Will the online harms Bill contain strong and robust action, with a framework of comprehensive regulations and a new regulator that can adapt to issues that will come up in future that we do not even know about yet?

It is time for tough action. We have really strict limits on hate speech and pornography in other areas of life, yet where most children spend most of their time is precisely where the Government are failing in their duty of care.
in the Chair
Sir Edward Leigh
We now come to the summing-up speeches. I call Gavin Newlands.
SNP
  00:02:54
Gavin Newlands
Paisley and Renfrewshire North
It is a pleasure to see you in the Chair, Sir Edward. We have had an absolutely excellent debate, as evidenced by the number of Members who have attended. The debate opened with a thoughtful and powerful contribution from the hon. Member for Halifax (Holly Lynch), and I congratulate her on securing it. She spoke alarmingly of the 100 online sex crimes against children each day—or more than 100 now—of the endemic misogyny online, and of the serious danger of doing nothing, comparing the long-term effects of doing nothing to smoking.

The hon. Member for Brigg and Goole (Andrew Percy) spoke of the ridiculous anti-mask and anti-vaxxer narrative and how it has gained traction online. He shared some of the vile antisemitic comments that often get posted online and go unchallenged on some platforms. The hon. Member for Ogmore (Chris Elmore) went into a lot more detail on the anti-vax brigade, who are too often emboldened by too many in the public eye, including Members in this place. I am sure he is delighted that Trump’s dangerous posts are being taken down. The hon. Member for Congleton (Fiona Bruce) expressed her disappointment, which I share, at the delay thus far in Government action, and she spoke of the survey that found that 63% of adults wanted the Government to implement part 3 of the Digital Economy Act 2017 immediately.

My hon. Friend the Member for East Kilbride, Strathaven and Lesmahagow (Dr Cameron) spoke of the tidal wave of mental health problems that we can all see in our constituencies and perhaps even in our own families, and the prevalence of online bullying. I think we all agree with what the hon. Member for Upper Bann (Carla Lockhart) said: if every incident that happened online happened in the street, we would not let our children out the door. She spoke of the importance of age verification being part of any Government action.

The right hon. Member for Basingstoke (Mrs Miller) spoke of providers and platforms and the absolute necessity for them to show a duty of care to their customers. She also spoke about the inadequacies of the existing legal framework. I look forward to the Minister’s answers to the three sets of on-point questions from the hon. Member for Bristol North West (Darren Jones). The hon. Member for Cardiff South and Penarth (Stephen Doughty) spoke of foreign powers spreading disinformation and the dangers of the extreme right wing online, which is definitely a sentiment I agree with.

The hon. Member for Swansea East (Carolyn Harris) surprised us all by not talking about gambling. She spoke about the fraudulent sale of dangerous, substandard and counterfeit goods. Lastly, the hon. Member for Putney (Fleur Anderson) said that the internet should be a positive place for our children, but that online harm is spiralling out of control. We have had quite contrasting contributions to the debate, but there was certainly a consensus among them that the Government must take action now. As has just been said, the internet should be an enormous and progressive force for good, whether for our economic development or for connecting with family and friends across the world, but all too often our experience can be a negative one, be that through the daily undermining of civil discourse, identity theft or being bullied or abused.

The internet has become an integral, indispensable and in many ways pervasive part of daily life, with nearly 90% of UK adults online. For 12 to 15-year-olds that figure is 99%, and I can definitely state that my 14-year-old is not one of the 1% in this case. One thing I think we all agree on is that the sheer pace of the development of the internet and our use of it, particularly over the past few months, has been difficult for Parliaments, Governments and therefore laws to keep pace with. These reforms are absolutely vital; they were already overdue and they have been subject to repeated delay. The pandemic has only added to the urgent need for their completion, as the world has moved online to an even greater extent.

The last formal update on the White Paper came in a report in February, but during the past six months, the NSPCC has reported an increased risk to children online during lockdown, while cases of covid-19-related fraud and scams have become prevalent. We know of fraudsters routinely targeting victims through sponsored Google and Facebook links and harvesting personal details from fake call centres.

There is a long-standing problem with serious organised criminals impersonating investment products; the Investment Association has reported that in the three months following the start of lockdown, reports of cloning scam activity spiked. Pandemic misinformation and online conspiracy theories have real consequences in the real world too, from increased numbers of people saying they will refuse a vaccine to the burning down of 5G masts.

That all increases the urgency for reform, but, as has been mentioned during the debate, the recent report from the House of Lords Democracy and Digital Technologies Committee said that the Bill might not come into effect until 2024, as the Government drag their feet on a draft Bill. In her response to the debate, may I seek an assurance from the Minister that that is not the case and that it will come in sooner than that?

The longer these delays continue, the more difficult it becomes for the Government to deny that they are due to the influence of extensive lobbying by large tech companies, coupled with a fear of potential damage to US-UK trade talks. Again, I seek an assurance in the Minister’s contribution that any trade talks with the US will have no influence on the Government’s approach or their timetable for taking action. The UK’s reputation as a secure financial centre is also at stake; with Brexit already leading firms to look to relocate, it is even more vital for the Government to avoid giving them another reason to do so.

It goes without saying, but I will say it anyway, that the Scottish Government firmly believe that online abuse is unacceptable and that everybody deserves to be treated fairly, regardless of age, disability, gender, gender identity, race, religion, belief or sexual orientation. The Scottish Government have funded respectme, Scotland’s fantastic anti-bullying service, which acts as a source of information for young people in Scotland. The organisation has created and made available publications to raise awareness of the issue of cyber-bullying. It has highlighted that online bullying is still bullying, and it implores us not to get caught up on the medium of abuse. We absolutely must tackle online abuse as robustly and reactively as offline abuse. There should be greater steps taken to inform the public of their right to report online abuse to the police, and training given to police forces on how to handle such cases.

A number of our own MPs, as has been mentioned, have experienced online abuse. Although politicians have chosen to be in public life, and with that comes an acceptance of public criticism, there is a crystal-clear difference between criticism—even harsh criticism—and abuse. I stand with all hon. Members who have suffered abuse, particularly my hon. Friend the Member for East Kilbride, Strathaven and Lesmahagow during the last election. I will also make the point that, while men are harassed online, when women are the target, online harassment quickly descends into sexualised hate or threats. Online gender-based violence is a clear example of the deeply rooted gender inequalities that still sadly exist in our society.

We have also heard that children are deeply vulnerable to online abuse. We must do more to keep them safe. The Scottish Government have a national action plan on internet safety for children and young people. The plan emphasises the role that wider society, including the online sector, must play in enhancing internet safety for children and young people. The Scottish Government continue to work to ensure that professionals and communities have the appropriate skills and knowledge to provide support to children and young people.

As in England and Wales, there are a number of offences in Scots law that can cover online bullying and harassment. The Scottish Government are looking to add further protections in this area by publishing a hate crime Bill, which will consolidate, modernise and extend existing hate crime legislation, ensuring that it is fit for 21st-century Scotland. The Scottish Government have engaged extensively with more than 50 organisations, including Police Scotland, the Crown Office and others that work in the criminal justice system. The Bill does not prevent people from expressing controversial, challenging or offensive views, nor does it seek to stifle criticism or rigorous debate in any way, but it will target individuals whose behaviour is threatening or abusive and is intended to stir up hatred. The Scottish Government will continue to consult and listen to all views as the Bill progresses, to ensure that the correct balance is struck.

It is crystal clear from today’s debate that there is a consensus for action on this vital issue. We just need the Government to get on with it. The longer they wait, the more lives are ruined by online crime and abuse.
Lab
  00:00:00
Chi Onwurah
Newcastle upon Tyne Central
It is a great pleasure to serve under your chairship, Sir Edward. I congratulate my hon. Friend the Member for Halifax (Holly Lynch) on securing the debate and other Members on their contributions, which have been thoughtful, well-informed and passionate on this critical subject.

I also declare an interest: as a chartered engineer, I spent 20 years building out the networks that have become the internet. Over that time, but most particularly in the 10 years since I entered Parliament, our lives have been increasingly lived online, with 80% of UK adults using the internet daily or almost every day. Social platforms such as Facebook, Google, YouTube, Instagram and Twitter are woven into the fabric of our lives. Together with a vast array of online apps for everything from video conferencing to healthy eating, they are a critical enabler of an active life as citizen, consumer and economic contributor.

The covid-19 pandemic has accelerated the shift online. At the height of the lockdown, UK adults were spending on average over four hours a day online. For those not digitally excluded, it brought huge benefits, keeping us in touch virtually as physical touch became antisocial. However, as we have heard, particularly from my hon. Friends the Members for Swansea East (Carolyn Harris) and for Cardiff South and Penarth (Stephen Doughty), the right hon. Member for Basingstoke (Mrs Miller) and the hon. Members for East Kilbride, Strathaven and Lesmahagow (Dr Cameron) and for Congleton (Fiona Bruce), the internet is at times an increasingly dark, challenging and inhospitable place. No matter how vulnerable or how well informed people are, they have little control over content, which is curated by tech platforms, allowing the spread of disinformation, sexual exploitation, fake news, extremism, hatred and other harmful content.

The importance and timeliness of today’s debate can be seen in the number of hon. Members in the Chamber, in yesterday’s United States Congress tech antitrust report and in today’s report from UBS, which reveals the eye-watering levels of wealth in the tech sector. Yet, as we have heard, the UK Government have done nothing. Regulation has not kept pace with technology, crime or consumers, leaving growing numbers of people increasingly exposed to significant online harms. It did not have to be this way.

In 2002, the then Labour Government saw the growth of new communications technologies and undertook a comprehensive, forward-looking review of the issues they raised. The result was the Communications Act 2003 and a new regulator, Ofcom, with the power to ensure that these issues were resolved in the public interest. That regulatory framework was given a 10-year lifespan—I know that because I was head of technology at Ofcom at the time.

In 2012, the Conservative-led Government saw the growth of our online lives, social media and big data, and did—nothing. The 2012 review of online harms may be the most important review we never had. As we have heard, the Government cannot even respond to their own belated and limited online harms consultation in a timely manner, leaving it to big tech to continue to control our online lives.

I consider myself a tech evangelist. I believe that tech is an engine of progress like no other. I believe it can improve the lives of my constituents and enable a more equal, more productive and more sustainable skills-based economy through a fourth industrial green revolution. However, people need to be protected online and empowered to take control of their online lives. The Government need to be on the side of the people, not tech lobbyists.

Hon. Members have set out many of the critical issues, so I will focus my remaining remarks on four areas: children, finance, disinformation and regulation. As emphasised particularly by the hon. Member for Upper Bann (Carla Lockhart) and my hon. Friend the Member for Putney (Fleur Anderson), the Government are failing in their duty to safeguard children. Worsened by increasing social isolation due to the pandemic, online abuse is being normalised for a whole generation. The previous Chancellor of the Exchequer, the right hon. Member for Bromsgrove (Sajid Javid), called the pandemic the “perfect storm” for child abuse. The UK Safer Internet Centre found 8.8 million attempts to access child sexual abuse material in one month alone. How will the Government address that, and what will they do to support schools? The centre found that schools desperately need help and support in levelling up online safety. Will the Government replace the UK Safer Internet Centre’s EU funding, so that it can continue to do its good work as we leave the European Union?

On financing, the platform giants’ business model is driven by algorithms that serve up more and more extreme content, which drives extreme behaviours such as radicalisation and self-harm. The model depends on eyeballs and is financed through advertising. Google and Facebook control the online advertising market, which facilitates so much online harm. What plan do the Government have to address the failings of that model or to give the Competition and Markets Authority and the advertising regulators the powers to do so? It is despicable that, nearly three years after her death, the family of Molly Russell have had only limited access to her data and have been denied access to the algorithms and all the content that helped facilitate her suicide. Will the Minister ensure that that changes?

The tech giants’ model also means that Google and Facebook have control of the online high street, directing the traffic on it, even as Amazon unfairly outcompetes the high street in our real-world towns. How will the Government address economic online harms and enable competition?

Our ability to build back from covid will depend on the successful deployment of a vaccine. As we have heard from the hon. Member for Brigg and Goole (Andrew Percy) and my hon. Friend the Member for Ogmore (Chris Elmore), however, misinformation on vaccines—as well as on 5G, the Holocaust and just about everything—is freely available and promoted on social media. The Government’s counter-disinformation officer has a full-time dedicated staff of zero. When will they take disinformation seriously, and what will they do about it?

We are a constructive Opposition. It might appear that I have been liberal in my criticism of the Government, but that is born from my experience, the experience of hon. Members present and, most importantly, the experience of constituents up and down the country. Far too many people’s lives are detrimentally affected by what they experience online. As a constructive Opposition, we have proposals as well as criticisms. The Government have been too slow to act, and tech giants have thought themselves unaccountable for too long. However, they can be made accountable. Self-regulation has failed, but robust, reasonable, rational, forward-looking and principles-based regulation can succeed. It is shocking that, in all this time, the Government have not established what those principles should be. Is anonymity a right, or is it a privilege? Is identity a right? How do we decide when legal online content becomes harmful?

Labour has made it clear that we need a digital bill of rights and a legal duty of care to give more powers and protections. We need a statutory regulator for online platforms to crack down on the harm, the hate and the fake. However, we have also launched the Our Digital Future consultation to build consensus on the underlying principles by which our online lives should be guided—it is still taking submissions, if hon. Members would like to contribute. We are also committed to eradicating the digital divide—indeed, the many new digital divides—as a result of which marginalised people have become increasingly excluded from the online world.

Many bodies have contacted me and asked me to raise their concerns about issues from dangerous goods online to data adequacy, small business competition to fake reviews, age verification to facial recognition, and antisemitism to intellectual property. I cannot do them all justice. The Government must outline a clear plan to address the multitude of online harms. It cannot be limited to the platforms simply policing their terms and conditions. Enforcement and redress are required, and I repeat the questions posed by my hon. Friends the Members for Halifax and for Bristol North West (Darren Jones), although I despair of answers. The Government must get a grip if our lives are to flourish online without fear or favour.
  00:00:57
Caroline Dinenage
The Minister for Digital and Culture
It is a pleasure to serve under your stewardship, Sir Edward. I thank the hon. Member for Halifax (Holly Lynch) for tabling this incredibly important topic for debate. This is my first opportunity since taking this role in February to speak publicly about online harms, and I am grateful for the chance to do so. I am also grateful to all Members who have taken part in the debate and raised some incredibly important topics.

My hon. Friend the Member for Brigg and Goole (Andrew Percy) summed up an important challenge at the beginning: it should not take Government legislation to sort this out, but, unfortunately, it does, now more than ever. That was brought home to me over the summer, when I talked to the father of Molly Russell, a young lady whose story started with online bullying and then led on to her seeking information online as to how to take her own life, which she did. That was a conversation that I never want to have with another parent again. It was utterly chilling. That is why my dedication to making sure the legislation is fit for purpose is stronger than ever.

The hon. Member for Ogmore (Chris Elmore) challenged me to ensure that the legislation is robust, clear and soon, and I take that challenge. I have had a number of other challenges from across the room, and given that I have only a few moments to respond, I will get through as many as I can. Anyone I do not get to, I will write to.

As hon. Members know, the Government published the online harms White Paper last year, setting out our plans for legislation to make the UK the safest place in the world to be online. User safety is very much at the heart of our approach. The new duty of care that we intend to establish for companies towards their users will ensure that they have appropriate systems and processes in place to deal with the harmful content on their services and to keep their users safe.

Our approach will require companies to have clear and accessible mechanisms for users to report harmful content and to challenge it—take it down, in fact—where necessary. Companies will be expected to enforce their terms and conditions transparently and consistently. The duty of care will be overseen by a regulator, which will have oversight of these mechanisms and strong enforcement powers to deal with non-compliance.

The White Paper spoke about some of these powers, but we have also consulted on further powers, including business disruption activities, internet service provider blocking and personal sanctions for senior managers. Further information will be published in the full Government response.

Following the public consultation, we published the interim Government response earlier in the year, which shared the findings from the consultation and indicated the direction of travel. We intend to publish the full Government response within the next few weeks and to have the legislation ready early next year.

A range of other issues have been raised today, and I will get through as many as I can. The hon. Member for Upper Bann (Carla Lockhart) and many other hon. Members suggested that there might be some watering down of the legislation compared with the White Paper. In fact, the hon. Member for Bristol North West (Darren Jones) thought that it might be part of some of our trade negotiations. That is not the case. There will be no watering down—in fact, the opposite.

The protection of children is at the heart of our approach to tackling online harms, and a number of hon. Members have raised that. There is huge recognition that the online world can be particularly damaging for children. We understand that. It is their mental health and their very well-being that are at stake. Our online harms proposals will provide for a higher level of protection for children than for the typical adult user. We will expect companies to use a range of tools to protect them, including measures such as age assurance and age verification technologies to prevent them from accessing inappropriate content.

My hon. Friend the Member for Congleton (Fiona Bruce) spoke about the Digital Economy Act 2017. This will go further than the focus of the Digital Economy Act. One criticism of that Act was that its scope did not cover social media companies. One of the worst places where a considerable quantity of pornographic material is available to children is on social media. Our new approach will therefore include social media companies and all sites on which there is user-generated content, including major pornography sites.

It is important that we no longer see age verification for pornography in isolation, but as part of this wider package to protect children across a range of sites and harmful materials. This technology is new and emerging, and it is important that we take every opportunity to get at the front end of it. That is why we are collaborating with the Home Office, GCHQ and a wide range of stakeholders on research into the verification of children online, and considering the technical challenges of knowing who is a child online. We ran a successful technical trial to test the use of age-assurance technologies at scale. The initial findings have been promising, and I look forward to developing that work shortly.

In recent years, there has been a massive rise in online abuse, harassment and intimidation, with a large majority of that on social media. I am clear that any abuse targeted towards anybody is unacceptable, but we heard from many Members that certain users are disproportionately targeted. For example, we know that issues such as revenge porn are rising. The UK Safer Internet Centre recently cited the fact that, this year, the revenge porn helpline has already dealt with 22% more cases than in the whole of 2019. That is not acceptable.

We are clear that what is illegal offline should be illegal online, including a number of things raised today, such as incitement to violence and the selling of faulty and potentially hazardous goods. We need to make sure that social media companies take as much responsibility as they can, but we also need to make sure that law enforcement agencies are equipped to take action where they need to. In some cases, the law is not fit for purpose to deal with the challenges of the online world, as we heard from my right hon. Friend the Member for Basingstoke (Mrs Miller). That is why we instructed the Law Commission to review existing legislation on abusive and harmful communications. It is also undertaking additional reviews, including on the taking, making and sharing of intimate images, which is obviously incredibly upsetting for victims. Given the nature of lawmaking, a patchwork of offences has been developed over time to address this issue. The Law Commission is now considering the best way to address these harms and whether there are any gaps in legislation. We are working alongside it to consider the right legislative vehicle to take this issue forward.

Finally, we have seen some horrific examples involving disinformation and misinformation over the covid period, including the burning down of 5G masts because of some horrific conspiracy theories. We stood up the cross-Whitehall counter-disinformation cell earlier in the year and, to give reassurance to those who asked for it, we have been working since the beginning of the summer with colleagues across Government and with social media companies on how to respond to anti-vax campaigns, so that is very much in hand.

As well as calling for action from companies, it is key that users are empowered with the knowledge and skills to keep themselves safe, which is why our online media literacy strategy will come out alongside the White Paper work. With that, I will end, to leave time for the hon. Member for Halifax to conclude the debate.
  00:19:40
Holly Lynch
I am grateful to all hon. Members who have taken part in this important debate, and not least to the Labour shadow spokesperson, my hon. Friend the Member for Newcastle upon Tyne Central (Chi Onwurah), for her important contribution. She again drew on her expertise in setting out a powerful position.

I am also grateful to the Minister for her response. She gave what I felt was quite a personal commitment to get a grip of this. However, while we asked a lot of specific questions, we heard phrases like “range of tools”, “terms and conditions”, “misinformation and disinformation” and “is in hand”. It was not necessarily the absolute clarity we were asking for. Will social media companies have heard that response and realised that the transformation that we ask for is on its way? I am not sure they will have, so I politely put it to the Minister that we very much hope that the personal commitment she gave will be followed through, and that where she is afforded more time in the future, she will be able to share more with us.

I will end with a personal plea, echoing the sentiments of my hon. Friend the Member for Putney (Fleur Anderson) and the hon. Members for East Kilbride, Strathaven and Lesmahagow (Dr Cameron) and for Upper Bann (Carla Lockhart). As a mother of a two-year-old boy, the clock is ticking, and I need assurances that, by the time he is old enough to use the internet, we will have got a grip of this. Others shared experiences of their children using it here and now, and they are witnessing the impact that it has on them. We also have responsibilities beyond that, as Members of Parliament and lawmakers.

Motion lapsed (Standing Order No. 10(6)).

Contains Parliamentary information licensed under the Open Parliament Licence v3.0.