PARLIAMENTARY DEBATE
Online Harms - 19 November 2020 (Commons/Commons Chamber)


[Relevant Documents: Online abuse and the experience of disabled people, Petitions Committee, First Report of Session 2017-19, HC 759 and the Government response, HC 2122; and Oral evidence taken before the Petitions Committee on 21 May and 2 July 2020, on Tackling Online Abuse, HC 364.]
Con
Jeremy Wright
Kenilworth and Southam
I beg to move,

That this House recognises the need to take urgent action to reduce and prevent online harms; and urges the Government to bring forward the Online Harms Bill as soon as possible.

The motion stands in my name and those of the hon. Member for Kingston upon Hull North (Dame Diana Johnson) and my hon. Friend the Member for Congleton (Fiona Bruce). I begin by thanking the Backbench Business Committee for finding time for what I hope the House will agree is an important and urgent debate. I am conscious that a great number of colleagues wish to speak and that they have limited time in which to do so, so I will be as brief as I can. I know also that there are right hon. and hon. Members who wished to be here to support the motion but could not be. I mention, in particular, my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Digital, Culture, Media and Sport Committee, who is chairing the Committee as we speak.

I hope that today’s debate will largely be about solutions, but perhaps we should begin with the scale of the problem. The term “online harms” covers many things, from child sexual exploitation to the promotion of suicide, hate speech and intimidation, disinformation perpetrated by individuals, groups and even nation states, and many other harms. Those problems have increased with the growth of the internet, and they have grown even faster over recent months as the global pandemic has led to us all spending more time online.

Let me offer just two examples. First, between January and April this year, as we were all starting to learn about the covid-19 virus, there were around 80 million interactions on Facebook with websites known to promulgate disinformation on that subject. By contrast, the websites of the World Health Organisation and the US Centers for Disease Control and Prevention each had around 6 million interactions. Secondly, during roughly the same period, online sex crimes recorded against children were running at more than 100 a day. The online platforms have taken some action to combat the harms I have mentioned, and I welcome that, but it is not enough, as the platforms themselves mostly recognise.
Con
Sir John Hayes
South Holland and The Deepings
You may have noticed, Mr Deputy Speaker, that I am ostentatiously wearing purple. I have been commissioned to do so because it is World Pancreatic Cancer Day. We have been asked to emphasise it, because raising awareness of that disease is important.

My right hon. and learned Friend is right to highlight the horror of degrading and corrupting pornography. Indeed, the Government have no excuse for not doing more, because the Digital Economy Act 2017 obliges them to do so. Why do we not have age verification, as was promised in that Act and in our manifesto? It is a straightforward measure that the Government could introduce to save lives in the way my right hon. and learned Friend describes.
Jeremy Wright
I agree with my right hon. Friend, but I will be careful, Mr Deputy Speaker, in what I say about age verification, because I am conscious that a judicial review case is in progress on that subject. However, I agree that that is something that we could and should do, and not necessarily in direct conjunction with an online harms Bill.

Digital platforms should also recognise that a safer internet is, in the end, good for business. Their business model requires us to spend more and more time online, and we will do that only if we feel safe there. The platforms should recognise that Governments must act in that space, and that people of every country with internet access quite properly expect them to. We have operated for some time on the principle that what is unacceptable offline is unacceptable online. How can it be right that actions and behaviours that cause real harm and would be controlled and restricted in every other environment, whether broadcast media, print media or out on the street, are not restricted at all online?

I accept that freedom of speech online is important, but I cannot accept that the online world is somehow a sacred space where regulation has no place regardless of what goes on there. Given the centrality of social media to modern political debate, should we rely on the platforms alone to decide which comments are acceptable and which are unacceptable, especially during election campaigns? I think not, and for me the case for online regulation is clear. However, it must be the right kind of regulation—regulation that gives innovation and invention room to grow, that allows developing enterprises to offer us life-enhancing services and create good jobs, but that requires those enterprises to take proper responsibility for their products and services, and for the consequences of their use. I believe that that balance is to be found in the proposed duty of care for online platforms, as set out in the Government’s White Paper of April last year.

I declare an interest as one of the Ministers who brought forward that White Paper at the time, and I pay tribute to all those in Government and beyond, including the talented civil servants at the Department for Digital, Culture, Media and Sport, who worked so hard to complete it. The duty of care it proposes would require all online companies that deal with user-generated content to keep those who use their platforms as safe as they reasonably can.
DUP
Jim Shannon
Strangford
We have covered some important information. Does the right hon. and learned Gentleman agree that there needs to be a new social media regulator with the power to audit and inspect social media algorithms to ensure that they do not cause harm? Such a regulator would enable that to happen.
Jeremy Wright
I agree that we need a regulator and will come on to exactly that point. The hon. Gentleman is entirely right, for reasons that I will outline in just a moment.

I recognise that what I am talking about is not the answer to every question in this area, but it would be a big step towards a safer online world if designed with sufficient ambition and implemented with sufficient determination. The duty of care should ask nothing unreasonable of the digital platforms. It would be unreasonable, for example, to suggest that every example of harmful content reaching a vulnerable user would automatically be a breach of the duty of care. Platforms should be obliged to put in place systems to protect their users that are as effective as they can be, not that achieve the impossible.

However, meeting that duty of care must mean doing more than is being done now. It should mean proactively scanning the horizon for those emerging harms that the platforms are best placed to see and designing mitigation for them, not waiting for terrible cases and news headlines to prompt action retrospectively. The duty of care should mean changing algorithms that prioritise the harmful and the hateful because they keep our attention longer and cause us to see more adverts. When a search engine asked about suicide shows a how-to guide to taking one’s own life long before it shows the number for the Samaritans, that is a design choice. The duty of care needs to require a different design choice to be made. When it comes to factual inquiries, the duty of care should expect the prioritisation of authoritative sources over scurrilous ones.

It is reasonable to expect these things of the online platforms. Doing what is reasonable to keep us safe must surely be the least we expect of those who create the world in which we now spend so much of our time. We should legislate to say so, and we should legislate to make sure that it happens. That means regulation, and as the hon. Gentleman suggests, it means a regulator—one that has the independence, the resources and the personnel to set and investigate our expectations of the online platforms. For the avoidance of doubt, our expectations should be higher than the platforms’ own terms and conditions. However, if the regulator we create is to be taken seriously by these huge multinational companies, it must also have the power to enforce our expectations. That means that it must have teeth and a range of sanctions, including individual director liability and site blocking in extreme cases.

We need an enforceable duty of care for online platforms to begin making the internet a safer place. Here is the good news for the Minister, who I know understands this agenda well. So often, such debates are intended to persuade the Government to change direction, to follow a different policy path. I am not asking the Government to do that, but rather to continue following the policy path they are already on—I just want them to move faster along that path. I am not pretending that it is an easy path. There will be complex and difficult judgments to be made and significant controversy in what will be groundbreaking and challenging legislation, but we have shied away from this challenge for far too long.

The reason for urgency is not only that, while we delay, lives continue to be ruined by online harms, sufficient though that is. It is also because we have a real opportunity and the obligation of global leadership here. The world has looked with interest at the prospectus we have set out on online harms regulation, and it now needs to see us follow through with action so that we can leverage our country’s well-deserved reputation for respecting innovation and the rule of law to set a global standard in a balanced and effective regulatory approach. We can only do that when the Government bring forward the online harms Bill for Parliament to consider and, yes, perhaps even to improve. We owe it to every preyed-upon child, every frightened parent and everyone abused, intimidated or deliberately misled online to act, and to act now.
Mr Nigel Evans
Mr Deputy Speaker
Order. There is now a three-minute limit on speeches.
Lab
Dame Diana Johnson
Kingston upon Hull North
I pay tribute to the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) and the hon. Member for Congleton (Fiona Bruce) for securing this debate. Today is World Children’s Day, when we are asked to imagine a better future for every child, and I will focus my remarks on an online harm that the Government could act on quickly to protect our children. Commercial pornography websites are profiteering from exposing children in the UK to hardcore violent pornography—pornography that it would be illegal to sell to children offline and that it would be illegal to sell even to adults, unless purchased in a licensed sex shop.

Three years ago, Parliament passed legislation to close this disastrous regulation gap. Three years on, the Government have still not implemented it. Assurances that the regulation gap will be filled by the forthcoming online harms legislation do not stand up to objective scrutiny. This is a child protection disaster happening now, and the Government could and, I hope, will act now.

Children are being exposed to online pornography on an alarming scale, and during the covid-19 pandemic, there is no doubt that the figures will have increased even more with children more often having unsupervised online access. The issue is the widespread availability and severity of online pornography accessible at home. It is no longer about adult magazines on the top shelf in the newsagent. Contemporary pornography is also overwhelmingly violent and misogynistic, and it feeds and fuels the toxic attitudes that we see particularly towards women and girls.

Back in 2017, Parliament passed part 3 of the Digital Economy Act. Enacted, it would prohibit commercial pornography websites from making their content available to anyone under the age of 18 and create a regulator and an enforcement mechanism. It was backed by the leading children’s charities, including the National Society for the Prevention of Cruelty to Children and Barnardo’s, as well as the majority of parents. However, in 2019 the Government announced that they would not be implementing part 3 of the 2017 Act. In their initial response to the online harms White Paper consultation, published in February, the Government said that any verification

“will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions”.

That is not good enough. Parliament has already spoken. We have said what we want to happen. I expect the Government to build on part 3 of the 2017 Act. It is set out and is ready to go. They should act on it now.
Con
Damian Collins
Folkestone and Hythe
I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) on his excellent speech introducing this debate. We need to be clear that the online harms White Paper response from the Government is urgently needed, as is the draft Bill. We have been discussing this for several years now. When I was Chair of the Digital, Culture, Media and Sport Committee, we published a report in the summer of 2018 asking for intervention on online harms and calling for a regulatory system based on a duty of care placed on the social media companies to act against harmful content.

There are difficult decisions to be made in assessing what harmful content is and assessing what needs to be done, but I do not believe those decisions should be made solely by the chief executives of the social media companies. There should be a legal framework that they have to work within, just as people in so many other industries do. It is not enough to have an online harms regulatory system based just on the terms and conditions of the companies themselves, in which all Parliament and the regulator can do is observe whether those companies are administering their own policies.

We must have a regulatory body that has an auditing function and can look at what is going on inside these companies and the decisions they make to try to remove and eliminate harmful hate speech, medical conspiracy theories and other more extreme forms of harmful or violent content. Companies such as Facebook say that they remove 95% of harmful content. How do we know? Because Facebook tells us. Has anyone checked? No. Can anyone check? No; we are not allowed to check. Those companies have constantly refused to allow independent academic bodies to go in and scrutinise what goes on within them. That is simply not good enough.

We should be clear that we are not talking about regulating speech. We are talking about regulating a business model. It is a business model that prioritises the amplification of content that engages people, and it does not care whether or not that content is harmful. All it cares about is the engagement. So people who engage in medical conspiracy theories will see more medical conspiracy theories. A young person who engages with images of self-harm will see more images of self-harm. No one is stepping in to prevent that. How do we know that Facebook did all it could to stop the live broadcast of a terrorist attack in Christchurch, New Zealand? No one knows. We have only Facebook’s word for it, and the scale of that problem could have been a lot worse.

The tools and systems of these companies are actively directing people to harmful content. People often talk about how easy it is to search for this material. Companies such as Facebook will say, “We downgrade this material on our site to make it hard to find,” but they direct people to it. People are not searching for it—it is being pushed at them. Some 70% of what people watch on YouTube is selected for them by YouTube, not searched for by them. An internal study done by Facebook in Germany in 2016, which the company suppressed and which was leaked to the media this year, showed that 60% of people who joined Facebook groups that shared extremist material did so on the recommendation of Facebook, because they had engaged with material like that before. That is what we are trying to regulate—a business model that is broken—and we desperately need to move on with online harms legislation.
Lab
Chris Elmore
Ogmore
I thank the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) for securing the debate with the hon. Member for Congleton (Fiona Bruce). I pay particular tribute to him, because when he was Culture Secretary, he and Margot James, who is no longer in this place, spearheaded this legislation. They are a credit to the House for ensuring that it was a priority for the Government then. I know how important the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Boston and Skegness (Matt Warman), thinks this is, but some of us—me included—have been talking about this issue for more than three and a half years, and the Bill needs to come forward. The delays just are not acceptable, and too many people are at risk.

I pay tribute to the hon. Member for Folkestone and Hythe (Damian Collins) for not only his speech but his chairmanship of the DCMS Committee, which he did without fear or favour. He took on the platforms, and they did not like it. All credit to him for standing up for what he believes in and trying to take on these giants.

In the two minutes I have left, I want to talk about the inquiry of my all-party parliamentary group on social media in relation to child harm, which the right hon. and learned Member for Kenilworth and Southam touched on. The Internet Watch Foundation is a charity that works with the tech industry and is partly funded by it. It also works with law enforcement agencies and is funded by the Government and currently by the European Union. It removes self-generated images of child abuse. It removes URLs showing children who have been coerced and groomed into taking images of themselves in a way that anyone in this House would find utterly disgusting and immoral. That is its sole, core purpose.

The problem is extremely complex. The IWF has seen a 50% increase in public reports of suspected child abuse over the past year, but the take-down rate of URLs has dropped by 89%. I have pressed DCMS Ministers and Cabinet Office Ministers to ensure that IWF funding will continue, to address the fact that these URLs are not being taken down and to put more resources into purposefully tackling this abhorrent problem of self-generated harm, whether the children are groomed through platforms, live streaming or gaming.

The platforms have not gone far enough. They are not acknowledging the problem in front of them. I honestly believe that if a future Bill provides the power for the platforms to decide what is appropriate and for Ofcom to make recommendations or fine them on that basis, it is a flawed system. It is self-regulation with a regulator—it does not make any sense. The platforms themselves say that it does not work.

Will the Minister please—please—get a grip on the issues that the IWF is raising, continue its funding, and do all that he can to protect children from the harm that many of them face in their bedrooms and homes across the UK?
Con
Fiona Bruce
Congleton
The Prime Minister reminded us today that the first duty of Governments is to protect their citizens from harm. Our children need and deserve to be kept much safer from online harm, so I urge the Government not to let the best be the enemy of the good. They committed themselves to producing an online harms Bill to comprehensively address online harms and acknowledged that such a Bill was critically urgent, but they have failed to do so expeditiously. Specifically, they have failed to implement age verification, legislation on which was actually passed in part 3 of the Digital Economy Act 2017. I join colleagues in urging them to do so today. We will never make the internet safe, but we can make it safer by implementing measures quickly to give children some protection from commercial pornography sites, pending the introduction of a more comprehensive Bill.

We need to do so much more to protect children from being drawn into producing material themselves. There is growing concern about self-generated indecent images of children, made when a child is tricked or coerced into sending sexual images of themselves. I commend the work of my right hon. Friend the Member for Bromsgrove (Sajid Javid), who, with the Centre for Social Justice, has launched an investigation into child sexual abuse, and I commend his op-ed in The Sun on Sunday last week. It is not often that I commend something in The Sun, but in his op-ed he highlighted the increase in livestreamed abuse in which sex offenders hire traffickers in countries such as the Philippines to find children for them to violate via a video link. I also thank the International Justice Mission for its effective work in highlighting this despicable trade and consumption, in respect of which the UK is the world’s third largest offender. As the IJM says, we need to do more than highlight this; the Government need to improve prevention, detection and prosecution.

Yes, we have made great strides as a country in detecting and removing child sexual abuse material from UK-hosted websites, but livestreamed abuse is not being detected or reported and much more needs to be done by tech companies and social media platforms to rectify the situation. Legislation must require them to act. For example, they could adopt a safety-by-design approach so that a camera cannot be flipped to face a child. Regulation of the online space is needed to ensure that companies take swift and meaningful action to detect the online sexual exploitation of children, and there must be more accountability for offenders who commit this abuse. We should not distinguish the actions of those offenders from the actions of those who prey on children in person. Every image depicts a real child being hurt in the real world. Communities of online offenders often ask for original videos and images as their price of admission, prompting further targeting and grooming of vulnerable children.

The Government need to act urgently to help better to protect vulnerable children—indeed, all children—and to promote greater awareness, including through education. Children need to know that it is not their fault and that they can talk to someone about it, so that they do not feel, as so many teenagers who have talked to Childline have said, “I can’t deal with this any more. I want to die.”
Lab/Co-op
Stephen Doughty
Cardiff South and Penarth
Many of us took part in a debate on these issues in Westminster Hall recently. I do not want to repeat all the comments I made then, but I have seen the wide range of online harms in my constituency of Cardiff South and Penarth, and I have seen those online harms lead to real-world harm, violence and hatred on our streets.

In that Westminster Hall debate, I spoke about the range of less well-known platforms that the Government must get to grips with—the likes of Telegram, Parler, BitChute and various other platforms that are used by extremist organisations. I pay tribute to the work that HOPE not Hate and other organisations are doing. I declare an interest as a parliamentary friend of HOPE not Hate, and commend to the Minister and the Government its excellent report on online regulation which was released just this week.

I wish to give one example of why it is so crucial that the Government act, and act now, and it relates to the behaviour of some of the well-known platforms. In the past couple of weeks, I have spoken to one of those platforms: YouTube—Google. It is not the first time that I have spoken to YouTube; I have raised concerns about its content on many occasions as a member of the Home Affairs Committee. It was ironic to be asked to take part in a programme to support local schools on internet safety and staying safe online when, at the same time, YouTube has refused to remove content that I have personally reported—instances of far-right extremism, gang violence and other issues that specifically affect my constituency.

I am talking about examples of gang videos involving convicted drug dealers in my constituency; videos of young people dripping in simulated blood after simulated stabbings; videos encouraging drug dealing and violence and involving young people as actors in a local park, just hundreds of metres from my own house—but they have not been removed, on grounds of legitimate artistic expression. There are examples of extremist right-wing organisations promoting hatred against Jews, black people and the lesbian, gay, bisexual and transgender community that I have repeatedly reported, but they were still on there at the start of this debate. The only conclusion I can draw is that these companies simply do not give a damn about what the public think, what parents think, what teachers think, what those in all parts of the House think, what Governments think or what the police think, because they are failing to act, having been repeatedly warned. That is why the Government must come in and regulate, and they must do it sooner rather than later.

We need to see action taken on content relating to proscribed organisations—I cannot understand how that content is online when those organisations are proscribed by the Government—where there are clear examples of extremism, hate speech and criminality. I cannot understand why age verification is not used even as a minimum standard on some of these gang videos and violent videos, which perhaps could be justified in some parallel world, when age verification is used for other content. Some people talk about free speech. The reality is that these failures are leading to a decline in freedom online and in safety for our young people.
Con
Damian Hinds
East Hampshire
There are so many aspects to this, including misinformation on the pandemic, disinformation and foreign influence operations, harassment, engagement algorithms, the effect on our politics and public discourse, the growth in people gambling on their own, scammers and chancers, and at the very worst end, radicalisation and, as we have heard from many colleagues, sexual exploitation. I am grateful to the Backbench Business Committee for granting time for the debate, but this is not one subject for debate but about a dozen, and it needs a lot more time at these formative stages, which I hope the Government will provide. My brief comments will be specifically about children.

When I was at the Department for Education, I heard repeatedly from teenagers who were worried about the effect on their peers’ mental health of the experience of these curated perfect lives, with the constant scoring of young people’s popularity and attractiveness and the bullying that no longer stops when a young person comes through their parents’ front door but stays with them overnight. I heard from teachers about the effect of technology on sleep and concentration and on taking too much time from other things that young people should be doing in their growing up. I take a lot of what will be in this legislation as read, so what I will say is not an exclusive list, but I have three big asks of what the legislation and secondary legislation should cover for children. By children, I mean anybody up to the age of 16 or 18. Let us not have any idea that there is a separate concept of a digital age of consent that is in some way different.

First, the legislation will of course tackle the promotion of harms such as self-harm and eating disorders, but we need to go further and tackle the prevalence and normalisation of content related to those topics so that fewer young people come across it in the first place. Secondly, on compulsive design techniques such as autoplay, infinite scroll and streak rewards, I do not suggest that the Government should get in the business of designing applications, but there need to be natural breaks, just as there always were when children’s telly came to an end or they ran out of coins at the amusement arcade, to go and do something else. Actually, we need to go further, with demetrification—an ugly word but an important concept—because children should not be worrying about their follower-to-following ratio or how many likes they get when they post a photograph. We should bear in mind that Facebook managed to survive without likes up to 2009.

Thirdly, we need to have a restoration of reality, discouraging and, at the very least, clearly marking doctored photos and disclosing influencers’ product placements and not allowing the marketing of selfie facial enhancements to young children. It is not only about digital literacy and resilience, though that plays a part. The new material in schools from this term is an important step, but it will need to be developed further.

It has always been hard growing up, but it is a lot harder to do it live in the glare of social media. This generation will not get another chance at their youth. That is why, yes, it is important that we get it right, but it is also important that we get it done and we move forward now.
Lab
Catherine McKinnell
Newcastle upon Tyne North
This vital work is indeed taking far too long—so much so that the Petitions Committee has launched a new inquiry on tackling online abuse, following up our report in the last Parliament and looking at potential solutions for reducing and preventing this crime. Although the Government’s response to our previous report was positive, regrettably their online harms White Paper failed to address most of our concerns in relation to the impact on disabled people. The new inquiry will therefore continue to scrutinise the Government’s response to online abuse and press Ministers on the action that needs to be taken. We would welcome evidence from campaigners, legal professionals, social media companies and members of the public.

I want to address as well some of the most troubling material available online—material that has too often spilled over into the offline world with tragic consequences. From your internet browser today, you could access a video that shows graphic footage of real stabbings before alleging that the attacks were, in fact, a Jewish plot. If you were so inclined, you could watch a five-hour-long video that alleges a Jewish conspiracy to introduce communism around the world—10,000 people already have. I could go on. These videos and others like them are easily discoverable on some of the so-called alternative platforms that have become safe havens for terrorist propaganda, hate material and covid-19 disinformation, so it is crucial, when the Government finally bring their online harms Bill forward, for it to have teeth.

The White Paper proposes establishing a new duty of care to users, overseen by an independent regulator, making it clear that fulfilling that duty of care means following codes of practice. The Government have rightly proposed two statutory codes—on child sexual exploitation and abuse and on terrorism. Will the Minister now commit to bringing forward another code of practice on hate crime and wider harms? Without such a code, any duty of care to users will be limited to what a site’s terms and conditions allow. Terms and conditions are insufficient, as the Government acknowledge; they can be patchy and poorly applied.

The Antisemitism Policy Trust, which provides the secretariat to the all-party parliamentary group against antisemitism, which I co-chair, has produced evidence outlining how hateful online materials can lead to violent hate crime offline. A code of practice on hate crime, with systems-level advice to start-ups and minimum standards for companies, will go some way towards creating a safer world. There is much more in the Bill that needs serious consideration, but as a minimum we need to see a code of practice for hate crime drawn up and given the same status as that for child sexual exploitation and abuse and terrorism, and I hope today that the Minister can give us some reassurance that this will be taken seriously.
Con
Karen Bradley
Staffordshire Moorlands
I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) on securing this debate and I thank the Backbench Business Committee for granting time for it.

There is no doubt that the internet can be a force for good. Over the past few months, we have all enjoyed the fact that we can keep in touch with family and friends. We can work from home. Some people can even participate in certain parts of our proceedings, although clearly not this debate. But the internet can be used for harm. In the limited time I have I want to make just two points. One is about the impact on children and the other is about advertising online.

When I was the Secretary of State for Digital, Culture, Media and Sport, I initially took the idea to the then Prime Minister, my right hon. Friend the Member for Maidenhead (Mrs May), that we should have an internet safety strategy. That is what has become the online harms strategy. The internet safety strategy was born out of my work in the Home Office when I was the Minister for Preventing Abuse, Exploitation and Crime. It was clear to me through the work that I did in particular on protecting children that the internet was being used to harm children. We have some great successes. The WePROTECT initiative, for example, which has had a real impact on removing pornographic images of children and child abuse online, is a great success, but we must never rest on our laurels. I ask my hon. Friend the Minister, who knows full well about all this, because he was with me when lots of this work was happening in the Department, to deal with the issue of age verification on pornography. I know that it does not resolve every issue. It is not going to solve every problem, but there was a message given to me by children time and again. If there was one thing they wanted to see stopped, it was access to pornography because that was what was fuelling the harm that they faced.

Turning to advertising, I will share with the House that this weekend I will be cooking a beef brisket that I will be purchasing from Meakins butchers in Leek, and I will be putting on it a beef rub. Hon. Members may ask why I am telling them that. I am telling them that because I have been mithered for weeks by my 15 year-old son, who has seen such a beef rub on Instagram. He is not getting his advertising from broadcast media; he is getting his advertising from the internet, and he is desperate to try a beef rub on beef brisket, and I will therefore make sure he does so over the weekend.

We have to have a level playing field on advertising. Our broadcast media are about to face real restrictions on the way that certain products can be advertised. This will impact on our public service broadcasters in particular, but we do not see the same level of regulation applied to the internet, and I for one know that the place my children see advertising is the internet. Sadly, they are not picking it up from broadcast media in the way that I did. I ask my hon. Friend the Minister to make sure he does something on that matter as well.
Lab
Stephen Timms
East Ham
I want to raise just two points: first, the current epidemic of online frauds; and, secondly, the online sale of the illegal weapons used on our streets in gang violence.

First, the Pension Scams Industry Group has told the current Work and Pensions Committee inquiry that 40,000 people have suffered the devastation of being scammed out of their pension in five years. Much of that is online. Mark Taber told us that he has reported 380 scam adverts on Google to the Financial Conduct Authority this year. Such advertising is a crime, but after weeks or months the FCA just issues a warning. The Transparency Task Force told us of

“high-profile, known crooks…running rings around the regulators”,

and said:

“Paid keyword search is a highly efficient means for pensions & savings scammers to target their victims.”

Another witness told us that there is

“a big increase in social media scams”.

Which? said that

“we need to look at what sort of responsibilities should be given to those online platforms to protect their users from scams.”

A director at Aviva told us that it

“had to take down 27 fake domains linked to our brand... It is very difficult and it takes a very long time to engage the web domain providers to get it down.”

He called big technology companies “key enablers of fraud”, and he made a call

“to extend the Online Harms Bill to include the advertising of fraudulent investments”.

I think that should be done, and I want to ask the Minister if it will be in the legislation.

Secondly, the Criminal Justice Act 1988 bans the sale and import of a list of weapons: disguised knives, butterfly knives, flick knives, gravity knives, stealth knives, zombie knives, sword sticks, push daggers, blowpipes, telescopic truncheons and batons. But all of them are available online for delivery in the post. That is how most weapons used on the streets in London are obtained. As we debated in the Offensive Weapons Public Bill Committee in 2018, companies should not sell in the UK products that it is illegal to purchase here. The Under-Secretary of State for the Home Department, the hon. Member for Louth and Horncastle (Victoria Atkins), said that the Home Office was working with the Department for Digital, Culture, Media and Sport on these online harms, and looking at

“what more we can do to ensure…companies act responsibly and do not facilitate sales of ‘articles with a blade or point’ or ‘corrosive products’ in their platforms.”––[Official Report, Offensive Weapons Public Bill Committee, 11 September 2018; c. 280.]

What I want to ask the Minister is: will that promise be fulfilled in the coming legislation?
Con
Caroline Ansell
Eastbourne
Through this pandemic, we have seen what a saving grace the online world has proved to be to us. It is a window: it has connected us to family and friends and provided important information and services. In fact, I have worked hard to bring together different providers and trainers to close the digital divide that so disadvantages those who are not online. However, at the same time as being a saving grace, it is also a serious threat to our health and wellbeing, our security and our democracy—all of these things. I hope that, through this experience, we have now come to a place where we recognise that there is no longer a distinction between the offline and the online worlds.

That question was very much put at the trial of the man who threatened to kill me in 2017. I can assure hon. Members and all watching that it was real and it hurt. The same pain, the same suffering and the same frustration were felt by one of my constituents in 2016, when again the same question was posed: is there a difference between our online and offline experiences? She was a victim of revenge porn, a really dark and sinister crime. Her frustration and her powerlessness at not being able to bring down images that directed people from across the country to find her and rape her—and at how the law did not reach her—were extraordinary to me. I therefore hope that that distinction is very much gone. We need a levelling up of our online and offline worlds.

I want now to focus on children. I applaud the work done to date and I welcome the online harms Bill to come, but my point in this debate is unfinished business. We made a commitment to introduce statutory age verification on porn websites. We supported that in 2016 and we supported it in 2017. It is still supported now. The most recent survey suggested that 83% of parents regard it as mission critical to protecting their children. We know that early exposure to porn is harmful. I understand that there are technical issues, but surely these can be overcome. Other countries—France, for example, most recently—have shown the way where we were previously world-leading.

More must be expected of our social media giants to maintain safe online environments, but I urge the Minister: we have the legislation, let us use it.
Lab
Bambos Charalambous
Enfield, Southgate
The huge rise in online scams, hate speech and conspiracy theories has highlighted why the Government have to take action urgently, not just by passing legislation but by having a counter-narrative to challenge the fake stories we hear about.

Looking at online hate speech at a recent Home Affairs Committee session, we heard that Facebook had deleted a staggering 9.6 million hate speech posts in the first quarter of this year. Much of that hate was directed towards south and east Asian communities, fuelled in part by President Trump using his position of power to fan the flames of hate by calling covid-19 the China virus. However, those 9.6 million posts are only the tip of the iceberg. There is still hateful material that has not been taken down because it apparently falls short of the platforms’ definition of hate speech. This is an area that must concern us greatly.

It is not just the south and east Asian communities who have been targeted. Before lockdown, the blame for coronavirus was already being directed at the Muslim, Jewish, Gypsy, Roma, Traveller and LGBT+ communities. Chinese and east Asian people in the UK endured physical and verbal attacks, while Muslims were accused of ignoring lockdown and spreading the virus by visiting mosques. Conspiracy theories were abundant, and falsely linking those groups to the spread of the virus allowed those conspiracy theories to flourish.

That leads me to disinformation and conspiracy theories. The anti-vaccine conspiracy theories are particularly insidious, because casting doubt in people’s minds will result in people choosing not to be vaccinated, which in turn could lead to them catching the virus and passing it on to others. I will not give credence to any absurd anti-vaccine conspiracy theories by repeating them, but unchecked they could be damaging to the health of the nation.

Last year, I had the pleasure of visiting Ethiopia with the charity RESULTS UK to see how it has almost eradicated tuberculosis by vaccinating the majority of the country over the past decade, so I have seen the impact that a well-administered programme of vaccination can have. There needs to be a strong counter-narrative from the Government. That has been missing in countering both hate speech and anti-vaccination theories.

In conclusion, the Government have been dragging their heels on the online harms Bill, which has been talked about for the past three years. Urgent action is needed to counter hate speech, extremism and conspiracy theories to keep our communities and those who need protection safe. We need a counter-narrative to challenge those threats and we need legislative protection. We need action and we need it now, because people’s lives could be depending on it.
Con
Robert Largan
High Peak
The internet has changed the world. In the past, typical hate crime took place on the street and involved a small number of people: the perpetrator, the victim and perhaps a handful of witnesses. The internet has changed all that. Now, when hate crime takes place online, it is seen and shared by thousands within minutes. The hatred is amplified and echoed in a toxic spiral that incites others to go further and further, sometimes spilling over into real life with devastating consequences. We are seeing the impact that the amplification of hate is having in real numbers. In the first six months of this year, the Community Security Trust recorded 789 antisemitic incidents across the UK. In 2019, it recorded a record annual total of 1,813. That is just one particular kind of hate directed at one tiny minority community.

I have seen this at first hand, for reasons I can never quite fathom. Last year, one then Labour councillor decided to start bombarding me with abusive messages over several months, accusing me of eating babies, claiming I was linked to Benjamin Netanyahu, repeatedly sending me messages with images of the crucifixion and images of pigs, songs referring to the Wandering Jew, photos of himself dressed in orthodox Jewish clothing, and repeatedly changing my name to Herr Largaman or Herr Larganberg. These incidents are relatively minor compared with what others have had to face, particularly women and many Members of this House. I pay tribute to the Community Security Trust for the amazing work it does, as well as to the Jewish Leadership Council and the Antisemitism Policy Trust, but the fact that such groups have to exist underlines why this Bill is so important.

We need to grasp the nettle and update our laws to reflect the new reality of the online world, and to make certain that this legislation is sufficiently strong and effective. In particular, I urge the Government to carefully consider the issue of anonymity. Many extremists hide behind a keyboard, masking their true identity to unleash abuse and spread false information. That has been facilitated by the growth of alternative social media platforms that anyone can access and post on anonymously. As a result, we have seen them turn into hotbeds of incitement and radicalisation. Some platforms even allowed the live-streaming of atrocities such as the murder of 51 worshippers at two mosques by a white supremacist in New Zealand. It is important that we recognise that there is a place for anonymity, particularly for whistleblowers, victims of domestic abuse and people living under authoritarian regimes, but also that there is a sensible compromise, which I hope the Government will include in the Bill.

When I worked in financial services, we always had to carry out extensive “know your client” checks, as part of an effort to prevent fraud and money laundering. The same concept should apply to the online world. Firm penalties should be in place for companies breaching the duty of care—a modest fine will barely affect those companies—and there has to be individual liability for senior management in extreme cases. Again, that is not a new concept, as it already exists in financial services and in health and safety.
Con
Elliot Colburn
Carshalton and Wallington
In my short contribution, I wish to focus on two areas: the need for this legislation to have sufficient teeth, and the need for clear definitions of what constitutes an online harm, which many of my constituents have been in touch with me about. I hear the criticism and concern that an online harms Bill could be overreaching and damage freedom of expression, but that should not stop the Government going ahead and trying to make the internet a safer place.

One of the best ways the Government could do that is by providing a clearer steer as to what constitutes “harm”. As we have heard, and as I think we are all agreed on in this House, high on the agenda must be a robust set of actions and consequences for when content relating to terrorism, child abuse and equally abhorrent crimes is not taken down by social media companies. We can safely say that we, as Members of Parliament, know full well what a vile place the internet can be, given that we are sometimes on the receiving end of the most vile and horrific abuse. I was subjected to homophobic abuse during the election campaign in December last year. Any online harms Bill must therefore be sufficiently defined and powerful enough to establish how we can protect people against some of the harmful content available online.

I wish to go through some examples that have been raised with me by constituents. They include the fact that almost a quarter of children and young people who sadly lost their lives to suicide had previously searched the internet for suicide-related content; that one in five children had reported being victims of cyber-bullying; that social media companies were not just ignoring but refusing to take down content from so-called “conversion therapy” organisations, which leads so many lesbian, gay, bisexual and transgender people to consider self-harm or even suicide; that one in 14 adults were experiencing threats to share intimate images of themselves; that one in 10 women were being threatened by an ex-partner and going on to feel suicidal; that there was a higher prevalence of abuse among those with protected characteristics, be they women, religious minorities, LGBT+, black and minority ethnic or disabled people; that there was the issue of distorted body image among girls; and so much more.

We have seen the unwillingness of social media companies to act, which is why further regulation is necessary in this area, but it must be backed up not only by a regulator that has the teeth to act, but by proper education on safe and proper internet use, as regulation alone will not solve the problem. If the Government do get this right, they have the opportunity, probably a once-in-a-generation one, to make the internet a safer but no less free place to be.
DUP
Jim Shannon
Strangford
I congratulate the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright) on his introduction and on all that he said. In my intervention I referred to the need for a social media regulator, and, as the hon. Member for Carshalton and Wallington (Elliot Colburn) has just said, we need a regulator with teeth. We need a regulator that actually does what it says it is going to do. That is important.

The Conservative manifesto of 2015 was very clear that the commitment pertained not to social media platforms but to pornographic websites, and it promised to protect children from them through the provision of statutory age verification. Part 3 of the Digital Economy Act 2017 made provision for that and it should have been implemented over a year ago. I respectfully express my dismay and concern that that has not happened.

The non-implementation of part 3 of the Act is a disaster for children, as it needlessly exposes them to commercial pornographic websites, when this House has made provision for their protection from such sites. Perhaps the Minister could explain why the Government’s detailed defence in the judicial review for not proceeding with implementation seems to rest on paragraph 19, which states:

“US-based browser companies were planning on implementing DNS-over-HTTPS…a new internet standard”.

I have great concerns about that.

I am also troubled by the way in which the Government have moved from the language of requiring age verification for pornographic websites, as referred to in their manifesto, to the very different language of expectation. The Government have said:

“This includes age verification tools and we expect them to continue to play a key role in protecting children online.”

They also said:

“Our proposals will introduce higher levels of protection for children. We will expect companies to use a proportionate range of tools including age assurance and age verification technologies to prevent children from accessing age-inappropriate or harmful content.”

In their initial response to the online harms White Paper consultation, the Government also said:

“we expect companies to use a proportionate range of tools, including age assurance and age verification technologies to prevent children accessing age-inappropriate content such as online pornography and to protect them from harms.”

Quite simply, that is not enough. This should not be an expectation; it should be a requirement. We have to have that in place.

The NSPCC has highlighted some worrying statistics: Instagram removed 75% fewer suicide and self-harm images between July and September 2020; industry compliance with taking down child abuse images fell by 89%; and 50% of recorded online grooming cases between April and June this year took place on Facebook-owned platforms. What conversations have the Government had to ensure that Facebook and others design and deliver platforms that put child protection front and centre, as they should?
Con
Christian Wakeford
Bury South
I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), my hon. Friend the Member for Congleton (Fiona Bruce) and the hon. Member for Kingston upon Hull North (Dame Diana Johnson) on securing this important debate. As my right hon. and learned Friend said, there needs to be parity between online and real-world abuse. Just because hate is fuelled online does not make it any less real or any less hurtful, so there really should be parity. We are taking this seriously and that needs to be reflected in the law. People cannot hide behind a keyboard and expect to get away with it.

In the brief time I have, I want to tell two stories. The first involves a Conservative Member of this House who was in Germany some years ago, where they happened upon a far-right rally. The Member confronted the neo-Nazi group and was told to read a book about how Hitler was, in fact, a British spy—a preposterous conspiracy.

The second story is about a man named Joseph Hallett, who for some time has asserted his right to the throne of the United Kingdom, claiming he was cheated of his birthright by the illegitimate conception of King George V, a claim with no basis. He is known online as King John III, and his story has gained popularity among the QAnon movement, a conspiratorial group claiming special knowledge of satanic paedophile rings at the heart of government. Hallett, the fake king, thinks that the royal family is in hock to the Rothschilds, and anyone with an understanding of antisemitism will know where I am headed with this. He is an author, known by his second name, Greg, and he has written about his mad theories. His tome “Gifting the United Nations to Stalin” blames the Jews for 9/11. What else did he write? The book about Hitler being a British spy, recommended in person by a neo-Nazi to a Member of this House. Hallett has interacted with the QAnon community online. This conspiracy network captures the imagination of the unsuspecting, the naive or the bored, and drags them into worlds of hate.

The hatred is not limited to online spaces. QAnon accounts inspired the German faction known as Reichsbürger—citizens of the Reich—to storm the German Parliament in August. Perhaps it was one of its members that our colleague spoke to. More than 50 5G masts were burned down in Britain following another Q conspiracy. In spite of this, some elected representatives in the United States are voicing support for Q. Dealing with the type of legal but harmful content that Q represents is just one of the steps that will need to be taken through the online harms Bill.

In closing, I call on my hon. Friend the Minister to assure me that the proposed duty of care will not simply consist of a requirement for terms and conditions, which the White Paper itself acknowledged would be insufficient. Will the Government consider giving a code of practice on hate crime equal status to the two proposed statutory codes on terrorism and child sexual exploitation and abuse, as the Antisemitism Policy Trust, the Community Security Trust, the Jewish Leadership Council and the Board of Deputies have called for? And can the Minister confirm that the Government will ensure that all elements of platforms with user-generated content will be covered?
Con
Nick Fletcher
Don Valley
This is an incredibly important issue, and I agree with my hon. Friend the Member for Congleton (Fiona Bruce) that we should waste no time in introducing age verification as soon as possible to ensure that our children can use the internet in a safe way and do not come across material that they are far too young to see. Not only would that uphold the law, which is clear in setting out the illegality of under-18s viewing such content, but it would ensure that our young people’s development is not threatened and that children are allowed to be children.

Furthermore, we Conservatives should not forget that a year ago, we stood on a manifesto commitment to introduce statutory age verification checks for pornographic websites. This really matters, and the public seem to believe so as well. Research carried out for the British Board of Film Classification in 2019 concluded that 83% of parents believe there should be robust age verification controls in place to stop children seeing commercial pornography online. If we are to respect the views of the public and uphold the public’s trust in this place, the Government must commit to enacting this policy. Statutory age verification checks for pornographic websites are what we promised, and there should be no doubt that, as Conservatives, that is what we must deliver.

Equally, it is crucial that the Government do not broaden this subject out to include other issues, such as access to pornography on social media. Having read the debate on 7 October, I think it is really important that the Minister does not try to change the subject today to pornography on social media. Although that is an important issue, it is not what our manifesto commitment in 2019 referred to. Of course, I would be more than happy if the Department also brought forward something to protect children from pornography on Twitter, but that specific issue can be looked at later. There is no reason not to press ahead and deliver part 3 as soon as possible.

In business questions last month, the Leader of the House laid out the Government’s reasons not to implement part 3, but while I appreciated the time he spent answering my question, I did not wholly buy into his argument. I therefore appeal to the Government to give the matter more thought after this debate. This is, after all, in the interests of protecting children from pornography between now and the implementation of any online harms Bill. As that is likely to be several years away, it is crucial that the Government reconsider their decision and act on the wishes of the electorate.

Having spoken to stakeholders, I understand that the Government could redesignate the regulator and bring forward an implementation date at any time and that we could move to full-blown implementation of part 3 within a matter of months. As a family man and a committed Christian, I urge the Government to enact part 3. This will protect our children and ensure that the Government hold true to their election promise.
SNP
Gavin Newlands
Paisley and Renfrewshire North
We have had another excellent, if curtailed, debate today. I thank the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright), the hon. Member for Kingston upon Hull North (Dame Diana Johnson) and the hon. Member for Congleton (Fiona Bruce) for securing it and the Backbench Business Committee for facilitating it. I do not have time to discuss and praise the various speeches that we have heard, but I particularly praise the right hon. and learned Member for Kenilworth and Southam, who opened the debate. I thought his speech was fantastic and immensely powerful; nobody could ignore what he said. Take note, Minister: if an SNP Member and a Tory Member can agree so wholeheartedly, actions surely must follow.

We spend more and more of our time online, whether we are interacting with others or are passive consumers of content—the growth of Netflix is testimony to the latter. As we spend more time online, the harms that were historically the preserve of the physical world are shifting to the online world. We have seen the growth in online financial scams and their increasing sophistication.

I have a number of constituents, as I am sure do other hon. Members, who have been scammed out of tens of thousands of pounds and lost everything, in part because the scammers were able to manipulate Google keywords advertising to drive traffic to their site and begin the scamming process. The pandemic and lockdown have seen an increase in those scams, as the perpetrators know people are spending more time online than normal.

Since the start of the pandemic, the level of disinformation around vaccination and healthcare has grown exponentially. Anti-vaxxers have already targeted the newly developed vaccines that we all hope will get us out of this situation. Such disinformation campaigns have always been dangerous, particularly for young people, who are usually the main recipients of vaccines, but they now present an even bigger danger to public health.

These lies—that is what they are—are propagated via the platforms of social media companies, which should have a responsibility to tackle such anti-science, anti-reason and anti-fact campaigns quickly and directly. It is not good enough for Mark Zuckerberg and the like to parrot free speech as if it were a “get out of jail free” card. Free speech comes with responsibilities; it does not give people the right to place others at risk of illness and death.

Just as children were most at risk from the anti-vaxxers until the pandemic hit, it is children who are most at risk from online harassment and abuse, in particular young women and girls. A recent report by Plan International on girls’ rights in the digital world makes extremely depressing reading. More than a fifth of girls have received abuse on a photo or status they have posted, and nearly a quarter have felt harassed by someone contacting them regularly on social media. The net result of the abuse, harassment and pressure is that nearly half of all girls are afraid to give their opinions on social media, for fear of the response, and 13% have stopped going on social media completely to avoid negative responses. Less than a week before the International Day for the Elimination of Violence against Women and Girls, those figures are shocking.

A toxic environment is stopping women and girls participating in the online world on the same basis as boys and men. It feeds into a dangerous and violent misogyny that is on the rise on social media, again largely unchecked by the big tech companies until it becomes a big PR issue. It is no surprise that so many executive positions in those companies are occupied by men and so few by women.

For most households, online communication is now a fundamental part of daily life, whether it is streaming content or keeping in touch with family and friends on social media, but too often the regulation of online activities that cause harm seems to be stuck in the last century, when the internet was something we read about in newspapers or heard about on one of our four TV channels. The world has moved on dramatically in the past two decades, but the legislative framework has not. It is especially important that the victims of online harms, whether it be abuse, harassment or financial scams, feel able to report their experiences to the police or other relevant authorities. If big tech will not act, it falls to the Government to protect our citizens.

I understand that the pressures on the Government at the moment are absolutely huge, but so are the risks for individuals and for society for as long as these harms are allowed to proliferate. I urge the Government to heed the contributions of Members right across the House and bring forward concrete plans to introduce the Bill as soon as possible.
Lab
  00:07:19
Chi Onwurah
Newcastle upon Tyne Central
I thank the Backbench Business Committee, the right hon. and learned Member for Kenilworth and Southam (Jeremy Wright), the hon. Member for Congleton (Fiona Bruce) and my hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson) for stepping in where the Government have failed by bringing forward this debate and for the excellent opening contributions. Indeed, we have heard excellent remarks from all parts of the House in this debate, and I am sorry that I do not have time to do justice to them all—I have just noted how much time I have.

As a chartered engineer, I spent 20 years building out the networks that have become the internet. I am proud of that work and of what it has become. As we have heard, we are increasingly living our lives online, and the ongoing pandemic has accelerated that. For those who are not digitally excluded, social media platforms such as Facebook, Google, YouTube, Instagram and Twitter are all now woven into the fabric of our lives and, together with the vast array of online apps for everything from video conferencing to healthy eating, they are a critical enabler of an active life for citizen, consumer and economic contributor. None the less, as Members have shown so acutely, the internet can be a dark, challenging and inhospitable place. Content is curated by tech platforms that allow the spread of disinformation, sexual exploitation, fake news, extremism, hatred and other harmful content. September saw the highest number of public reports of suspected child sexual abuse material ever received in a single month by the Internet Watch Foundation. On TikTok, the hashtag #vaccinesaredangerous has had almost 800,000 views, with almost no misinformation warnings. Incredibly, we have yet to have a debate in Government time on online harms. Hon. and right hon. Members have expressed many concerns in this place in written and oral questions over the years, but Government have done nothing. Regulation has not kept pace with technology, crime or consumers, leaving growing numbers of people increasingly exposed to significant harms, but it did not have to be this way.

In 2000, the then Labour Government saw the growth of new communications technologies and undertook a comprehensive forward-looking review. The result was the Communications Act 2003 and a new regulator, Ofcom, with the power to ensure that these issues were resolved in the public interest. That regulatory framework had a 10-year lifespan—I know because I was head of technology at Ofcom at the time. In 2012, the Conservative-led Government saw the growth of our online lives—social media and big data—and did nothing. The 2012 review of online harms may be the most important review that we never had. It was not until April 2019 that the Government finally began a consultation, since when legislation has been promised repeatedly but has still not appeared, leaving big tech in control of our online lives.

I consider myself a tech evangelist. I believe that tech is an engine of progress like no other. I believe that it can improve the lives of my constituents and enable a more equal, more productive and more sustainable skills-based economy through a fourth industrial revolution, but people need to be protected and empowered to take control of their lives online. The Government need to be on the side of the people and not tech lobbyists. This Government have failed us to a degree that is historically negligent, as this debate shows.

Members have highlighted how the Government are failing in their duty to safeguard children from child abuse. Other Members have focused on the economic harms and the existing tech giants’ business model, which means that Google and Facebook have control of the online high street, even as Amazon’s unfair competition drives the high street in our real-world towns out of existence. Ninety-seven per cent of UK consumers consult reviews when buying products online, yet investigations by Which? have repeatedly exposed fake and misleading reviews. How will the Government address these online harms in economic terms and enable real competition? We have also heard about online advertising, which is the driver of the business model. It is unregulated, leaving television companies at a disadvantage and driving more and more extreme content in front of viewers. My understanding is that the Government plan to ban all advertising of unhealthy foods on the internet. Is that the case, and why will the Government not act more broadly to address the failings of the advertising model?

As a constructive Opposition, we have proposals as well as criticisms. Self-regulation has failed—this debate has made that clear—but robust, reasonable, rational, forward-looking and principles-based regulation can succeed. It is shocking that in all this time, the Government have not established what those principles should be. Our ability to build back from covid depends on a successful vaccine, and we have had fantastic news about that recently, but, as we have heard, misinformation on vaccines, as well as on 5G, the Holocaust and so on, is freely available. That is why Labour is calling for emergency legislation on anti-vax disinformation. Will the Government commit to that?

Labour has made it clear that we need a digital bill of rights and a legal duty of care to give more powers and protection. We need a statutory regulator for online platforms to crack down on the harm, the hate and the fake. We also need a public debate on what our online future should look like, and that is why we launched the consultation “Our Digital Future” to build consensus on the underlying principles. We are now analysing the more than 600 responses that we have received, and we will publish our report soon. We are committed to eradicating the digital divide—indeed, the many new digital divides—as a result of which marginalised people have become increasingly excluded from the online world.

Many bodies, including the NSPCC, Big Brother Watch, the Carnegie UK Trust, Which? and the Institute of Alcohol Studies have contacted me and asked me to raise their concerns. I cannot do them all justice or spend time talking about algorithms, artificial intelligence, the internet of things and all the other emerging potential harms. Government must set out a clear plan to address these online harms and give people back control of their online lives, if our lives are to flourish online without fear or favour.
  00:01:51
Matt Warman
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport
I thank my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) and the hon. Member for Kingston upon Hull North (Dame Diana Johnson) for securing this debate. [Interruption.] Wait for it; I entirely sympathise with the point made by my hon. Friend the Member for Congleton (Fiona Bruce) in business questions. The little time that the House has spent on this enormous subject today could never have done it justice, and it certainly does not reflect the huge importance that the Government ascribe to better protecting children and adults from online harm, while, of course, balancing that against the precious freedom of expression that we all hold so dear.

We know that we can and must do more in this vital area. Covid-19 has emphasised how much we rely on the web and on social media, and how vital it is for firms to apply to their users as soon as possible the duty of care that has been discussed this afternoon. Platforms can and must do more to protect those users, and particularly children, from the worst of the internet, which is sadly all too common today. The Government will ensure that firms set out clearly what legal content is acceptable on their platforms and ensure, via a powerful and independent regulator, that they enforce that consistently and effectively. Codes of practice will set out what is acceptable, on topics from hate crime to eating disorders, so that the networks themselves no longer make the rules.

I pay tribute to the many fine contributions that we have heard today, and I pay particular tribute to the work of my right hon. and learned Friend the Member for Kenilworth and Southam, the former Secretary of State responsible for the White Paper. I reassure him that the Government’s forthcoming online harms legislation will establish that new duty of care; that platforms will be held to account for the content that appears on their services; and that legislation will establish a systemic approach that is resilient in the face of a host of challenges, from online bullying to predatory behaviour.

Earlier this year, as my right hon. and learned Friend mentioned, we published the initial response, making clear the direction of travel. We will publish the full Government response to the online harms White Paper this year. We will set out further details of our proposals, and alongside that we will publish interim voluntary codes of practice on terrorist content and child sexual exploitation and abuse. The full Government response will be followed by legislation, which will be ready early next year. I know that there is huge concern about the time that this is taking, but we also know that it is critical that we get this right, and we will do that early in the new year. Covid emphasises the need to get on with this. We want to introduce effective legislation that makes platforms more responsible for the safety of their users and underpins the continued growth of the digital sector, because, as my right hon. and learned Friend said, responsible business is good for business.

The White Paper also set out the prevalence of illegal content and activity online, with a particular focus on the most serious of those offences, namely child sexual exploitation and abuse. Protecting children online from CSEA is crucial. Alongside the full Government response, we will publish interim codes on tackling the use of the internet by terrorists and those engaged in child sexual exploitation and abuse. We want to ensure that companies take action now to tackle content that threatens our national security and the physical safety of children, and that is what we will do.

I am sure that many Members here today have been the target of online abuse or know someone who has. We have heard powerful stories. Close to half of adults in the UK say that they have seen hateful content online in the past year. I want to make it clear today that online abuse targeted towards anyone is unacceptable; as with so many other areas, what is illegal offline is also illegal online.

Online abuse can have a huge impact on people’s lives, and is often targeted at the most vulnerable in our society. Our approach to tackling online harms will help more users to participate in online discussions by reducing the risk of bullying or of being attacked on the basis of their identity. All in-scope companies will be expected to tackle illegal content and activity, including offences targeted at users on the basis of their sex, and to have effective systems in place to deal with such content. My Department is working closely with the Law Commission, which is leading a review of the law related to abusive and offensive online communications. The commission will issue final recommendations in 2021, which we will consider carefully.

It is important, though, to note that the aim of this regime is not to tackle individual pieces of content. We will not prevent adults from accessing or posting legal content, or require companies to remove specific pieces of legal content. Instead, the regulatory regime will be focused on the systems and processes implemented by companies to address harmful content. That is why it will have the extensive effect that so many Members have called for today.

I will deal briefly with anti-vaccination content. As we have heard today, many Members are concerned about this issue. As the Prime Minister made clear in the House yesterday, as we move into the next phase of vaccine roll-out, we have secured a major commitment from Facebook, Twitter and Google to the principle that no company should profit from or promote any anti-vaccine disinformation, and that they will respond to flagged content more swiftly. The platforms have also agreed to work with health authorities to promote scientifically accurate messages, and we will continue to engage with them. We know that anti-vaccination content could cost lives, and we will not do anything that could allow it to proliferate. We will also continue work on the media literacy strategy to allow people better to understand what they see online.

Let me briefly address a few points that were raised in the debate. On product safety, the Office for Product Safety and Standards has a clear remit to lead the Government’s efforts to tackle the sale of unsafe goods online, and my officials are working with their counterparts in other Departments to deliver a coherent, pro-innovation approach to governing digital technologies, and they will continue to do so. The Home Office is engaging with the IWF, including on funding. On age verification, the Government are committed to ensuring that children are protected from accessing inappropriate and harmful content online, including online pornography. The judicial review mentioned by my right hon. and learned Friend the Member for Kenilworth and Southam prevents me from saying more, but the Queen’s Speech of 19 December 2019 included a commitment to improve internet safety for all and to make the UK the safest place in the world to go online.

Tackling online harms is a key priority for this Government to make the internet a safer place for us all. I close by reiterating how vital it is that we get this legislation right. This Government will not shy away from ensuring that we do, and that we do so quickly.
  00:03:18
Jeremy Wright
I warmly thank all Members who have contributed to this debate, and congratulate all of them on saying so much in so little time. I hope that we have come together this afternoon to send a clear message about how much support there is across the Chamber for identifying not just the problem of online harms, but also the solutions.

I am grateful to my hon. Friend the Minister for what he has said this afternoon. I am even more grateful for what I know he is going to say after this debate to his colleagues in Government. I do not doubt for a moment his personal commitment to this agenda, but I hope that he will be able to say to others in Government that there has probably never been a piece of legislation more eagerly anticipated by everyone, on both sides of the House. Although the Government will not get a blank cheque on this legislation—no Government could and no Government should—they will, I think, get a commitment from all parties to a proper analysis and a proper supporting examination of how we might do this effectively. With that encouragement, I hope that the Minister will make sure that this happens very soon.

Question put and agreed to.

Resolved,

That this House recognises the need to take urgent action to reduce and prevent online harms; and urges the Government to bring forward the Online Harms Bill as soon as possible.
Dame Eleanor Laing
Madam Deputy Speaker
Moving very swiftly on, I am going to suspend the House for two minutes in order to do the necessary—only two minutes, because time is of the essence.
Sitting suspended.

Contains Parliamentary information licensed under the Open Parliament Licence v3.0.