PARLIAMENTARY DEBATE
Online Harms Consultation - 15 December 2020 (Commons/Commons Chamber)

Oliver Dowden
The Secretary of State for Digital, Culture, Media and Sport
With permission, Mr Speaker, I will make a statement on our online harms consultation. We now conduct a huge proportion of our lives online. People in the UK spend an average of four hours and two minutes on the internet every day, and we know that for children it is even longer. That technology has improved our lives in countless ways but, as hon. Members on both sides of the House know, too many people are still exposed to the worst elements of the web: illegal content, racist and misogynistic abuse, and dangerous disinformation.

Those interactions may be virtual, but they are causing real harm. More than three quarters of UK adults express concerns about logging on, while a declining number of parents believe the benefits for their children of being online outweigh the risks. Trust in tech is falling. That is bad for the public and bad for the tech companies, so today the Government are taking decisive action to protect people online.

Through our full response to the online harms White Paper, we are proposing groundbreaking regulations that will make tech companies legally responsible for the online safety of their users. That world-leading regime will rebuild public trust and restore public confidence in the tech that has not only powered us through the pandemic, but will power us into the recovery.

I know that this legislation is keenly anticipated on both sides of the House. I want to reassure hon. Members that, when drafting our proposals, I sought to strike a very important balance between shielding people, particularly children, from harm and ensuring a proportionate regime that preserves one of the cornerstones of our democracy—freedom of expression. I am confident that our response strikes that balance.

Under our proposals, online companies will face a new and binding duty of care to their users, overseen by Ofcom. If those platforms fail in that duty of care, they will face steep fines of up to £18 million or 10% of annual global turnover. A number of people, including Ian Russell, the father of Molly Russell, have expressed concerns about that point; I want to reassure him and Members of this House that the maximum fine will be the higher of those two numbers, and platforms will no longer be able to mark their own homework.

To hold major platforms to their responsibilities, I can also announce to the House that they will be required to publish annual transparency reports to track their progress, which could include the number of reports of harmful content received and the action taken as a result. This will be a robust regime, requiring those at the top to take responsibility. I can therefore confirm that we will legislate to introduce criminal sanctions for senior managers, with Parliament taking the final decision on whether to introduce them. Of course, we hope not to use those powers, and for tech companies to engineer the harm out of their platforms from the outset, but people should have no doubt that they remain an option and we will use them if we need to.

Together, those measures make this the toughest and most comprehensive online safety regime in the world. They will have a clear and immediate effect: a 13-year-old should no longer be able to access pornographic images on Twitter; YouTube will not be allowed to recommend videos promoting terrorist ideologies; and antisemitic hate crime will need to be removed without delay. Those are just a few examples, but the House will take a keen interest in the details of the legislation, so I shall lay out a few key areas of action.

Our first focus is on illegal content, including child sexual abuse, terrorism and posts that incite violence and hatred. Sadly, many Members present today have been the target of online abuse, some of which might have been illegal, such as threats of violence. Unfortunately, that is particularly true for female Members of the House. This is not a problem suffered only by people in the public eye; close to half of adults in the United Kingdom say that they have been exposed to hateful content online in the past year.

Under the new laws, all companies in scope will need to take swift and effective action to remove criminal posts—if it is illegal offline, it is illegal online. Users will be better able to report this abhorrent content and can expect to receive more support from platforms. Crucially, the duty of care will apply even when communications are end-to-end encrypted. Encryption cannot serve as a protection blanket for criminals. Given the severity of certain threats, Ofcom will also be given powers to require companies to use technology proactively to identify and remove illegal content involving child sexual abuse or terrorism—that is a power of last resort.

Of course, not all harmful content is illegal. Every day, people are exposed to posts, images and videos that do not break any laws, but still cause a significant amount of harm. We all know that cyber-bullying can ruin a child’s life, but I want first to address one particularly horrific form of legal content. Sadly, too many Members present will be aware of cases in which children are drawn into watching videos that can encourage self-harm. Some find themselves bombarded with that content, sometimes ending ultimately in tragedy. It is unforgivable that that sort of content should be circulating unchecked on social media. Given the severity of its consequences, I believe that there is a strong case for making it illegal.

I can therefore announce that the Government have asked the Law Commission to examine how the criminal law will address the encouragement or assistance of self-harm. This is an incredibly sensitive area. We need to take careful steps to ensure that we do not inadvertently punish vulnerable people, but we need to act now to prevent future tragedies.

Many Members are particularly concerned about the effect online harm has on children. We have reserved our strongest and toughest protections for them. All companies will need to consider seriously the risks their platforms may pose to children and to take action. They will no longer be able to abdicate responsibility by claiming that children do not use their services when that is manifestly untrue—we all know examples of that—and we also expect them to prevent children from accessing services that pose the highest risk of harm, including online pornography. Cutting-edge age assurance or verification technologies will be a vital part of keeping children safe online.

At the same time, we are going further than any other country to tackle other categories of legal but harmful content accessed by adults. Major platforms will face additional obligations to enforce their own terms and conditions against things such as dangerous vaccine misinformation and cyber-bullying. Where the platforms fall short, they will face the legal consequences.

I know that some hon. Members are worried that the regulations may impose undue burdens on smaller, low-risk companies, so I can reassure them that we have included exemptions for such companies. As a result, less than 3% of UK businesses will fall within the scope of the legislation.

In this House we have always ardently championed freedom of expression. Robust and free debate is what gives our democracy its historic strength. So let me be clear: the purpose of the proposed regime is not to stop adults accessing content with which they disagree. It is not our job to protect people against being offended. I will not allow this legislation to become a weapon against free debate. Therefore, we will not prevent adults from accessing or posting legal content. Companies will not be able arbitrarily to remove controversial viewpoints, and users will be able to seek redress if they feel that content has been removed unfairly.

Nor will I allow this legislation to stifle media freedoms or become a charter to impose our world view and suppress that of others. I can confirm that news publishers’ own content on their sites is not in scope, nor are the comments of users on that content. This legislation is targeted exactly where it needs to be and tightly focused on delivering our core manifesto pledge to empower adult users to stay safe online while ensuring that children are protected.

We have engaged extensively to get to this point and this process is by no means over. We want all parliamentarians to feed into this significant piece of work and will continue to listen to their concerns as we go through pre-legislative scrutiny and beyond. However, I am confident that today’s measures mark a significant step in the continual evolution of our approach to life online, and it is fitting that this should be a step that our country takes. The world wide web was, of course, invented by a Brit, and now the UK is setting a safety standard for the rest of the world to follow. I commend this statement to the House.
Lab
Jo Stevens
Cardiff Central
I thank the Secretary of State for advance sight of his statement. Let me start by saying that the Opposition welcome any moves to protect children and the vulnerable online. There are plenty of questions about gaps in the Government’s response relating to protecting children online, but the emphasis on children in this statement is very welcome.

We have been calling on the Government to introduce this legislation for almost two years. The publication of the online harms White Paper seems almost a lifetime ago. The legislation is long overdue, and I would like the Secretary of State to tell us when in 2021 the House can expect to see the Bill, because until it is on the statute book, the real harm that he has just described, which has been able to flourish online through a lack of regulation, will continue. Ireland has already published its legislation. France has produced legislation dealing with hate speech. Germany has had legislation in place since 2018, and the European Commission is expected to publish its proposed Digital Services Act today.

The Secretary of State has said that the UK will lead the way with this legislation, but I am afraid that the response today is lacking in ambition. It feels like a missed opportunity. This is a once-in-a-generation chance to legislate for the kind of internet we want to see that keeps both children and adult citizens safe and allows people to control what kind of content they see online. Instead, the Government have been timid, or maybe the Secretary of State was persuaded by Sheryl Sandberg and Nick Clegg in his meeting with them last month to water down the original proposals. Social media platforms have failed for years to self-regulate. The Secretary of State knows that, everyone in this House knows that, and the public know that.

On legal but harmful material, why are companies being left to set their own terms and conditions and then judged on their own enforcement of those terms and conditions? It is exactly the wrong incentive. It will actively encourage less strict terms and conditions, so the platforms can more easily say that they are being properly enforced. When the Secretary of State says that companies will no longer be marking their own homework, I am afraid that he is wrong, because that is exactly what they will be doing.

The financial penalties described are welcome, but the Government have given in to big tech lobbying on criminal liability for senior executives for repeated breaches. That liability should have been properly built into the forthcoming legislation and implemented straight away; instead, it will be left hanging to a possible future date through additional secondary legislation. Ireland’s legislation will include criminal sanctions rather than the vague threat that the Secretary of State has decided on. Will he explain what is to be gained by waiting? Never mind one last chance—repeat offenders have had chance after chance after chance.

The Secretary of State has referred to the novel concept of age assurance. Is that the same as age verification—the age verification that has been accepted by both the platforms and users as being unenforceable—or is it something different?

We know that online harms can easily become real harm. Encouragement and assistance of self-harm is one example, as the Secretary of State has mentioned. Harmful anti-vaccination disinformation impacting on public health is another. The Government have said today that they are asking the Law Commission to examine how criminal law will address the issue of encouragement or assistance of self-harm, but the Government could have asked the Law Commission to do that nearly two years ago when the White Paper was published. They have not done the hard work of deciding what should perhaps be illegal, which would have made their response today a better one.

There are other notable absences from the response, including reference to financial harm and online scams. This is a growing area of concern for millions of people across the United Kingdom, so why has it been ignored in the response? The Secretary of State has referred to failing public trust in tech. He says that he wants to rebuild it, but, sadly, today’s statement does not live up to that aspiration.
Oliver Dowden
I am rather sorry that the hon. Lady seems intent on seeing the negative in everything. This is a groundbreaking piece of legislation. Let me go through some of the points that she raises. She talks about our being timid in the face of tech lobbying. First of all, I can assure her that, although I have discussed end-to-end encryption in respect of national security issues, I have not discussed with Sheryl Sandberg or Nick Clegg any online harm provisions. That is simply not the case. Indeed I think that she will find from the reaction of some tech firms that they are struck by the scale of the fines that we are proposing. These would be some of the largest fines ever imposed—up to 10% of the global revenue of a company such as Facebook, which shows how enormous the maximum fine could be.

On criminal liability, I want tech firms to comply with this, and if they do not do so, they will face steep fines. If they still do not comply, Members should be in no doubt that their senior managers will face criminal sanction. We will take the power in this Bill—we will not have to come back to the House for primary legislation—and enact it through secondary legislation.

The hon. Lady asks about what we have been doing so far. We have taken many steps already to protect people online. For example, just a couple of months ago, the Information Commissioner’s age appropriate design code was put before Parliament. Today, alongside this full response to the White Paper, we are publishing, through the Home Office, an interim code of practice on online child sexual exploitation and abuse, and we will do so similarly in relation to terrorist content and activity online. We will expect tech firms to start complying with that now. It is clear what the Government’s intent is and if those firms fail to comply, we will have the powers through this legislation to ensure that that happens.

The hon. Lady asks about letting tech firms mark their own homework. We are empowering Ofcom to hold these tech firms to account. First of all, we will make sure that the terms and conditions are robust, and if they are not, those firms will face consequences. If they do not enforce those terms and conditions, they will face consequences, and the House will set out what those legal but harmful things are through secondary legislation. We will propose the sort of harms that those tech firms should guard against. Members will be able to vote on them, and those firms will have to take action appropriately. I believe that this marks a significant step forward, and Opposition Members should welcome this important step in protecting children, particularly online.
Mr Speaker
I call the Chair of the Digital, Culture, Media and Sport Committee, Julian Knight.
Con
Julian Knight
Solihull
It has been two long years since the Digital, Culture, Media and Sport Committee’s report on fake news, and it is welcome that, at long last, the Government have moved to appoint a regulator, to impose a duty of care and to put in place a substantial fines regime. However, there are still areas of concern. Can the Secretary of State outline his thinking on these? Does he accept that the number of priority categories defined as online harm needs to be broadened from what is currently envisaged to include things such as misinformation? The Secretary of State rightly focused on children, but this is about more than children; it is about the very status of our society and about looking after adults.

The Secretary of State mentioned transparency reports. How can we ensure that these transparency reports do not become another exercise in public relations for the tech firms? Will there be independent outside academic oversight? When it comes to news publishing exemptions, will that also apply to video sharing?

Finally, does the Secretary of State recognise that a system of dynamic, ongoing enforcement through a financial services-style compliance regime in tier 1 social media companies provides a good belt and braces for retrospective enforcement action, and what prelegislative scrutiny is planned?
Oliver Dowden
My hon. Friend the Chairman of the Select Committee asks about the involvement of the Committee; we will of course seek to involve the Committee extensively in the prelegislative scrutiny. He has already made an important suggestion about dynamic monitoring, which we will of course consider as we firm up the legislation.

My hon. Friend talks about video sharing; the exemption for news publishers to protect freedom of speech will apply to all their output and will include that.

My hon. Friend asks about disinformation; if disinformation—for example, anti-vax content—causes harm to individuals, it will be covered by the legislation, and I very much expect to set that out as one of the priority areas that would have to be addressed in secondary legislation.
SNP [V]
John Nicolson
Ochil and South Perthshire
I thank the Secretary of State for the advance copy of his speech, much of which we SNP Members agree with.

At a time when anti-vax disinformation floods social media, when hate is spouted at minority groups under the cowardly veil of anonymity, often without consequence for the perpetrators, and when more children than ever before are using the internet and need to be shielded from harmful content, the proposed online harms Bill is welcome.

We welcome, too, the requirement that companies must accept a duty of care, and the fact that Ofcom will be the independent regulator—but it must be a regulator with teeth. As Dame Melanie Dawes, Ofcom’s boss, told the Digital, Culture, Media and Sport Committee a short while ago, Ofcom needs much-enhanced powers to be effective; what additional powers will she have?

To enjoy maximum support in the House, the Bill must, while balancing the right to free expression, tackle illegal content as well as content that is potentially harmful but not illegal. In particular, companies must protect all children from harm, and the Government are right to recognise that.

The covid epidemic and lockdown have seen a surge in homophobia and transphobia online. The TIE—Time for Inclusive Education—campaign stated that 72% of LGBT+ young people had reported attacks or cyber-bullying, with organisations such as the so-called LGB Alliance leading the onslaught. In that context, surely there is a case for looking again at social media anonymity. Noms de plume are fine, but we believe that users’ identities should be known to the social media publishers—they should not be completely anonymous in all circumstances. Does the Secretary of State agree with that?

Social media disinformation has been especially pernicious during the covid pandemic. Experts tell us that the disinformation during this crisis is unparalleled in the internet era, and the consequences of online harm can be catastrophic, undermining public trust, faith in health officials and acceptance of the value of the vaccine now being rolled out.

In principle, we welcome much in the proposals. Of course, the proof of the pudding will be in the eating—exactly how tough the Government are prepared to be in reality, how hard they will be on the social media companies, and whether they will enforce some of the proposals—but we welcome it.
Oliver Dowden
I am grateful for the hon. Gentleman’s welcome for the legislation. He raised some important points. We have not taken powers to remove anonymity, because it is very important for some people—for example, victims fleeing domestic violence and children who have questions about their sexuality that they do not want their families to know they are exploring. There are many reasons to protect that anonymity.

The hon. Gentleman talked about Ofcom; over the years, we have seen Ofcom rise to the challenge of increased responsibilities, and I am confident that it will continue to do so. We will of course look to Ofcom to bring in independent expertise to help it in that process. It will clearly require a step change from Ofcom, but Dame Melanie Dawes and others are very much alert to that.

The hon. Gentleman talked about misinformation and disinformation. There are three things that we have to do to address those. First, we have to rely on trusted sources. We are so fortunate in this country to have well-established newspapers and broadcasters that are committed to public service information. We have seen that through the covid crisis, which is why we have supported them through this period. Secondly, we have to rebut false information. Through the Cabinet Office, we are working 24/7 to do that. Finally, we have to work with the tech companies themselves. For example, the Health Secretary and I have recently secured commitments to remove misinformation and disinformation within 48 hours and, crucially, not to profit from it.

As for the hon. Gentleman’s central concern, I think these measures really do mark a step change in our approach to tech firms. The old certainties are changing, and we are taking decisive action.
Con
Jeremy Wright
Kenilworth and Southam
I welcome the progress that the Government are making in this area, and my right hon. Friend’s personal commitment and determination to deliver it, but, as he said, there is further progress to be made. That progress will only really be made when we see legislation, which I urge him again to introduce as soon as possible. In the meantime, I understand the Government’s focus on the larger platforms where the greatest harms are likely to be concentrated, but may I urge him, in the design and architecture of the regulatory system that he is putting in place, to ensure that it can deal with smaller platforms that grow fast or host particularly damaging material, and, of course, that it can deal with the ever-changing nature of the harms themselves?
Oliver Dowden
I pay tribute to my right hon. and learned Friend and other former Culture Secretaries who are present, all of whom have played a decisive role in helping to shape this important legislation. My right hon. and learned Friend rightly raises the point about smaller platforms. What we have sought to do with these proposals is to exclude very small enterprises—for example, a cheese retailer that allows its customers to leave comments on its site. Strictly speaking, that is user-generated material, but I think we would all agree that we would not want that to be within scope. However, at the same time, some smaller sites can be used as a back route—for example, for paedophiles to exchange information. We will design the legislation proportionately so that we can upscale the regulation in cases of that sort.
Lab [V]
Clive Efford
Eltham
I welcome the legislation as far as it goes, and agree with the Secretary of State that it is landmark legislation, rather like the Gambling Act 2005, which was passed by the Labour Government. I remind him that it was largely the things that were not covered by that legislation that came back to be the most challenging issues to confront us all. Given that, let me ask the Secretary of State about the scope of the legislation: will it cover online harms such as the advertising of gambling targeted at young people, gambling through social media or even loot boxes in online gaming, whereby young people are asked to pay for boxes of which they do not know the content?
Oliver Dowden
As the hon. Gentleman may know, we have already issued a call for evidence in respect of loot boxes, and will take appropriate action in response. Many of the issues that he has raised are covered by our call for evidence on gambling. The scope of this legislation will cover any platform that allows user-generated content to be on it; to the extent that gambling websites have user-generated content on them, they will potentially fall within the scope of this legislation.
Con
Sajid Javid
Bromsgrove
I welcome my right hon. Friend’s statement. He has said that at the heart of these measures is the protection of our children—something with which the whole House will agree. He may know that I am leading an investigation with the Centre for Social Justice on the epidemic of child sexual abuse and exploitation that is taking place in our country. I therefore particularly welcome what he said today about the publication of the interim code of conduct on online child sexual abuse. But for it to have any effect, it must have teeth; it must be legally binding. Will he assure the House that when the online safety Bill becomes an Act, this code will be a statutory obligation?
Oliver Dowden
I pay tribute to the work that my right hon. Friend is doing, both on this and through the important work of the Centre for Social Justice. Yes, I can certainly give him that assurance. As I said, I would expect tech firms to abide by these codes of practice now—they have been published in interim form—because it is in the interests of tech firms to clean up their act, and this gives them a way of doing so. That has been the point across our approach. Of course, if they fail to do so, we will take the power in legislation to make it binding regardless, but I hope that the firms will abide by the codes of practice and that I do not have to use those powers.
LD
Jamie Stone
Caithness, Sutherland and Easter Ross
Clearly, regulations alone will not be strong enough to tackle the challenges of the internet. I am sure every single one of us in this place regards the safety of our children as paramount, so may I suggest to the Secretary of State that the education of our children might empower them to take down or zap harmful stuff online? What consideration is he giving to improving the education of children to give them that ability? Will he also have discussions with his colleague the Secretary of State for Education to that end, and might he further extend those discussions to the equivalent Ministers in the devolved Administrations?
Oliver Dowden
The hon. Gentleman makes an important point, and of course I will be happy to extend that discussion. I am already doing so with my right hon. Friend the Education Secretary, but I would be happy to do so with representatives of the devolved Governments. The hon. Gentleman is absolutely right to highlight the importance of education, and that applies not just to children but to parents. The more that parents, particularly those who have not grown up with the internet, understand the risks involved for their children, the better equipped they are to take action. Probably the single most important thing that parents can do is better understand the risks. That is why, in respect of children, we will be publishing the online media literacy strategy in the spring to address exactly that.
Con
Damian Collins
Folkestone and Hythe
I thank the Secretary of State for his important and long-awaited statement on this piece of legislation. I have a few questions, though. He mentioned that social media companies would be required to produce transparency reports on their effectiveness in dealing with harmful content. Will Ofcom be able to audit those reports and request data and information from the companies? Otherwise, those reports will not be very transparent at all. He also said that there would be a carve-out exemption for news providers. I agree with that, but how is he defining a news provider? Some of the most egregious spreaders of disinformation pretend to be news providers but are actually fake news websites. It is important that we know that. He also said that if companies’ terms and conditions did not come up to standard and they did not meet their duty of care obligations, they would “face the legal consequences”. Can he say what those consequences will be?
Oliver Dowden
As ever, my hon. Friend raises some very pertinent questions. On the powers for Ofcom, it will be able to interrogate companies on data and equipment. The question of the definition of news publishers is a challenging one, for the reasons that he sets out. Essentially, we want to avoid the situation whereby a harmful source of information sets up as if it were a news publisher. That will be an important part of our engagement with Members through the pre-legislative scrutiny, so I hope I will be able to reassure him on those points.
DUP [V]
Carla Lockhart
Upper Bann
I welcome today’s announcement and trust that it represents progress towards making the internet a safer place for my constituents. In protecting our children, the vulnerable and wider society online, there can be no half measures. In that regard, I have a number of areas of concern. The Secretary of State referred to cutting-edge age assurance or verification technologies. Can he explain what exactly is meant by age assurance and the practicalities of that process? How does it differ from age verification? What evidence is there that it is more effective in protecting children from harmful content? Does he agree that the prevalence of online scams—and the thousands of lives across the UK impacted by such scams—makes their omission from the Government’s response significant? Will he outline how the Government will address this increasingly widespread online harm?
Oliver Dowden
On age assurance, we are looking at the sort of emerging technology whereby, for example, one can look at how children type and use artificial intelligence to see that it is a child rather than an adult. Just yesterday, I was at a company called SafeToNet, which is doing fantastic work—for example, building into social media platforms through the electronic device that a child is using, whether that is an iPad or a phone, safety features that would block pornographic images and so on. The hon. Lady also asked me about further powers that we are taking. Forgive me; I have temporarily forgotten the point that she raised, but I am happy to write to her on that point.
Con
David Johnston
Wantage
I welcome my right hon. Friend’s statement. Large tech platforms build incredibly complicated models to track our every move, profile us and suggest products that we might want to buy. They now even read our messages and suggest how we might like to reply, and yet when it comes to removing harmful content, they suggest that it is too difficult for them. Does he agree that what he is setting out is well within their capabilities, as long as they have the will?
Oliver Dowden
My hon. Friend makes a very important point. Too often, tech firms say that they cannot do such things, but strangely, when it is in their commercial interest to do so, they find a way of doing it. This legislation is setting a clear direction of travel from Government, so that they know that we will be willing to take that action to force them to take measures in the public interest.
SNP
Alison Thewliss
Glasgow Central
First Steps Nutrition Trust has launched a study this month which shows the impact of online marketing of infant formula. I am all for impartial information, but that is not what is happening. Baby clubs, carelines and online influencers have free rein, and they are undermining breastfeeding and pushing parents to buy more expensive formula than they can afford. Will the Secretary of State protect our youngest citizens and prohibit all infant formula advertising online?
Oliver Dowden
The hon. Lady raises a very important point. The purpose of this legislation is to deal with user-generated content. If that sort of thing is being promoted by users, which we can all see is a popular marketing device, it will fall within scope. It is similar to the point raised by the hon. Member for Upper Bann (Carla Lockhart) about fraud. If fraud is being promoted through user-generated content, that is a harm that can be addressed, but it does not extend to the whole scale of advertising, which is beyond the intent of the legislation.
Con
Jo Gideon
Stoke-on-Trent Central
I welcome my right hon. Friend’s statement. Earlier this year, Staffordshire police, Stoke-on-Trent City Council and Staffordshire County Council launched an operation to crack down on gangs exploiting children through county lines, drug dealing and other criminality. These children are often groomed and recruited on online platforms and messaging services. Can my right hon. Friend confirm that, under the rules outlined in the online harms consultation, technology firms will be required to build technology into their platforms that can prevent that sort of activity?
Oliver Dowden
Yes, I am happy to give my hon. Friend exactly that assurance. Companies must tackle illegal content on their platforms and protect children from harmful content and activity online. They really do need to build the right systems. As I said in answer to an earlier question, I have seen the technology; there is no excuse anymore not to use it.
Lab
Darren Jones
Bristol North West
I want to ask the Secretary of State two questions on the issue of how we understand what is harmful but perhaps legal. First, will Ofcom be given the powers that it already has for other regulated sectors to demand access to information about how a service is being used and what content is on it? Secondly, why has the Secretary of State abandoned age verification?
Oliver Dowden
On age verification, we are moving on from what we previously had, which did not deal with user-generated content. Most pornography that children access is on sites that have user-generated content. Usually, that is the way that children stumble across it by mistake. It is really important that we broaden the scope of what we are doing, and that is precisely what we are addressing through this legislation.
Con [V]
Christian Wakeford
Bury South
Earlier this year, we witnessed the Wiley scandal, which saw an antisemitic rant over numerous posts. It took 72 hours and a mass boycott of social media by the Jewish community and its supporters before any action was taken by the platforms. Does my right hon. Friend agree that the law should apply online as it does offline and that online platforms must do more to stop the spread of hate speech and illegal content?
Mr Speaker
I think the hon. Gentleman forgot to put on his tie and jacket.
Oliver Dowden
Sadly that will not be addressed by this legislation, Mr Speaker. [Interruption.] Not that I could—I believe that is a matter for the House.

My hon. Friend makes a very important point about antisemitic abuse. I have met organisations to discuss that in framing the legislation. Most antisemitism is illegal and should be addressed through the provisions made for illegality. Beyond that, we will be setting out, as a priority, harms to be addressed through this legislation.
Lab [V]
Dame Margaret Hodge
Barking
I, too, welcome the statement. In the past two months, Community Security Trust has identified 90,000 posts mentioning me. Most were hostile, antisemitic, misogynistic and ageist. Many were anonymous and, through disinformation, aimed to undermine my credibility and so silence me. I ask the Secretary of State to think again. Does he not agree that anonymity on social media can no longer be universally protected, although it should be protected for groups such as whistleblowers and victims of domestic violence? Will he not agree that when users post illegal content or harmful abuse, social media companies should be required to collect and pass on information on the identity of the user to regulatory bodies and to the police?
Oliver Dowden
The right hon. Lady raises a very important point. As a Member of Parliament who proudly represents a very large Jewish community, I know the challenges of antisemitism, and that has been at the front of my mind in framing this legislation. It is a challenging area, this point about anonymity. Of course, if there is criminal conduct that the police and law enforcement agencies are investigating, they have ways of dealing with that anonymity in order to bring criminal cases. The reluctance I have had, and the Government have had, to introduce provision across the board is about how we lift the veil of anonymity while at the same time protecting some very vulnerable people who rely on it. But of course we will continue to keep it under review.
Con
Karen Bradley
Staffordshire Moorlands
I fear that we on the Government Benches look a little like the ghost of Secretaries of State past to my right hon. Friend. I welcome this statement and the moves that the Government have made. Taking him back to the issue of age assurance and age verification, I am pleased to hear that he is looking at different types of technology to protect children, but will he please not let the perfect be the enemy of the good and do something about age verification as soon as possible?
Oliver Dowden
My right hon. Friend is absolutely correct. I should pay tribute to all her work in this area. Of course we will not allow the best to be the enemy of the good. We will not be mandating the use of specific technological approaches. We know that those technological approaches are available, and Ofcom will be holding tech companies to account to make sure that they take advantage of them in order to provide protection for children.
Lab
Liz Twist
Blaydon
As the Secretary of State will be aware, Wikipedia, while not a social network, is edited by its users. It includes highly dangerous instructional information on suicide generated by those users. How will that be covered by the forthcoming legislation, and how will he deal with the international aspect of preventing harm online?
Oliver Dowden
I thank the hon. Lady for her question; she raises an important point. We are looking to legislate to make the encouragement of self-harm illegal—to push it into that category. On international engagement, there is a coalition of nations around the world that are now moving in this direction, including the US. The hon. Member for Cardiff Central (Jo Stevens) mentioned steps taken in Ireland and elsewhere. We have constantly led this debate. We started this debate with these proposals and we are delivering them at a faster pace than other countries around the world.
Con [V]
Mrs Maria Miller
Basingstoke
I warmly welcome my right hon. Friend’s statement. However, we have to be very clear that the duty of care and the regulator that he is proposing will not look at or resolve individual complaints. What is more, we are already seeing some of the smart movers in the online world starting to change their practices so that they will evade the regulation that he is talking about. So, to be really effective, this Bill has to sit alongside stronger and clearer laws that protect the individual from dreadful online abuse, such as image-based abuse, which the Secretary of State and I have talked about and which I know he cares as deeply as I do about resolving. He cannot introduce one without the other, so can he give me an assurance today that he will put reforms, particularly with regard to online image-based abuse, on the same timescale as the Bill he is talking about today?
Oliver Dowden
My right hon. Friend, another former Culture Secretary, makes an important point. She and I have discussed this at length. It is essential that, alongside the duties of care, we specifically outlaw certain things: she has made important points about deep fakes, cyber-flashing and so on. I can confirm that, working with the Law Commission, we will be looking through this legislation specifically to outlaw that kind of activity and make it illegal.
Lab
Chris Elmore
Ogmore
As the Secretary of State will undoubtedly be aware, I really welcome this Bill; I honestly believe that it is well intended, but I fear it is rather muddled and jumbled. I would like to know when the Bill is coming to the Floor of the House—not pre-legislative scrutiny, which the Secretary of State has mentioned in answer to several other Members, but when the Bill is coming—because we have been waiting two years for just this statement. I would also like to know why culpability has been delayed; self-governance has not worked for 15 years, so why delay it? Finally, why not deal with the issues around economic crime? That is increasing, and I believe it is a mistake not to deal with the problems of economic crime in society through platforms.
Oliver Dowden
I welcome the hon. Gentleman’s overall support. He asked when this is coming; the legislation will be brought before the House in 2021. He asked about economic crime, and other Members also raised that. [Interruption.] Well, to the extent that this comes from user-generated content, of course it will fall within scope, but if we seek to make the Bill deal with every harm on the internet, it will quickly become very unwieldy. Most fraud comes as a result of activities such as online advertising. We must try to have some sort of a scope around this.

The hon. Gentleman asked why we are delaying taking powers. We are not delaying taking powers: from the get-go, these enormous fines of up to 10% of global turnover will be imposed. If that is still not effective, we will have taken the power to use criminal sanctions for senior managers, and it will simply be a case of passing secondary legislation to bring that into force. As it is such a big step to have criminal liability, if we can avoid criminal liability I would like to do so. I believe the fines will be sufficient, but if they are not, then we will have taken those powers.
Con
Tim Loughton
East Worthing and Shoreham
I welcome these robust proposals, particularly the focus on children, but they need to lead to robust legislation and robust practice. I particularly welcome the referral to the Law Commission about self-harm sites; will my right hon. Friend make sure they include so-called self-help sites on eating disorders, which are nothing of the sort and just promote those sorts of behaviour?

May I also return to the point made by the right hon. Member for Barking (Dame Margaret Hodge) about anonymity, because it is key? Whether it is hate speech, extremism, antisemitism or grooming sites, the perpetrators hide behind anonymity. When they get taken down, they reappear under a different name. Is it not possible for them to have to reveal their identity, and prove their identity to the platform providers only, so it does not involve whistleblowers revealing themselves, so that they cannot get away with it, they cannot keep reposting, and they can be referred to the police where necessary?
Oliver Dowden
I hear my hon. Friend’s points about anonymity, and, as he said, they were made very powerfully by the right hon. Member for Barking (Dame Margaret Hodge). We are seeking to get the balance right so that we protect victims of domestic violence and others who rely on anonymity; of course, there are the law enforcement powers, but we genuinely keep an open mind, and if we can find a way of doing this that is proportionate, we will continue to consider whether there are measures we can take as we go through pre-legislative scrutiny. We are grappling with that challenge.
Lab
Stephen Timms
East Ham
The Work and Pensions Committee is inquiring into pension scams. Much of that problem is online, boosting the profits of tech firms and causing immense hardship. Martin Lewis, Which?, my hon. Friend the Member for Cardiff Central (Jo Stevens) on the Front Bench and others have called for such scams to be in scope here. The right hon. Gentleman says they will be if they are “user-generated”, so can he explain how these measures will address the very serious problem of financial online harms?
Oliver Dowden
Through secondary legislation, we will set out priority harms. I will not go into every last harm, because that will be a process for scrutiny. On the broader point about financial fraud and so on, the right hon. Gentleman raises very important points, and of course we will seek to address that as a Government; I am just not convinced that this is the appropriate legislative vehicle for doing so.
Con [V]
Craig Whittaker
Calder Valley
Whether it is promoting illegal content, anti-vaccine content, covid denial or conspiracy theories in general, for far too long now social media platforms have failed to get their own houses in order, and trust in the industry has fallen. Does my right hon. Friend agree that the measures he is proposing today will ensure a new age of accountability for tech that in turn will restore trust in the industry?
Oliver Dowden
As ever, my hon. Friend is absolutely correct. This marks a watershed and introduces that new age of accountability. For too long, tech firms have considered that because of the novelty of their technology, they are not subject to the same norms as others—broadcasters and so on. This starts to redress that balance.
Lab
Navendu Mishra
Stockport
This is a global problem that requires a global response. Will the Secretary of State confirm what co-operation protocols are in place to block offending platforms across multiple countries?
Oliver Dowden
First, on blocking offending platforms, we will reserve that power in this legislation; it is a power that will be available to Ofcom. Of course, we engage on exactly those points through various international forums, and we continue to work together.
Con
Dr Kieran Mullan
Crewe and Nantwich
Even the most vigilant parents struggle to keep up with the latest apps, websites and ways to get around parental controls. While parental responsibility will always remain key, these proposals help parents to deliver that. However, I think people will be anxious to know that the proposals have teeth, especially when it comes to the very wealthy companies that are involved. Can the Secretary of State assure us that they do have teeth, and that he will be able to act in a way that means something to these companies?
Oliver Dowden
Yes, my hon. Friend is absolutely right. There seems to be a degree of complacency among some Opposition Members about the scale of the fines we are proposing. We have never before proposed fining tech firms up to 10% of global turnover. That is an enormous sum for them, and it gives real teeth and credibility to what we are doing.
SNP
Stephen Flynn
Aberdeen South
A constituent of mine in Aberdeen has been in contact to say that in recent months they have had to respond to three instances of children in primary school accessing Pornhub on mobile phones. I am sure the Secretary of State shares my profound concern about that, but we do not want warm words; we want action, so will he tell us whether and when online age verification checks will finally be put in place?
Oliver Dowden
As a father of primary-age children, I share the hon. Gentleman’s complete outrage that that is possible. This legislation will address exactly that. A site like Pornhub will fall within the scope of this legislation, because it has a large amount of user-generated content, and we will expect it to take appropriate measures to safeguard children from accessing the site. If it fails to do so, it will face severe consequences.
Con [V]
Dame Cheryl Gillan
Chesham and Amersham
I welcome the statement. Bearing in mind that the perpetrators of online harms and abuse know no international boundaries, does the Secretary of State agree that, as a member of the Council of Europe, which is a key pillar for the protection of human rights online, we have an important ally in the ECHR, which rules on applications alleging violations? What plans does he have to work with our international partners? Particularly given the speed at which technology moves, how can he be sure that his proposals will keep pace with technological advances and escalating international activity? Bearing in mind the high-profile international cases involving people with autism, can he offer better international protection for individuals caught up inadvertently in incidents?
Oliver Dowden
My right hon. Friend is absolutely right. As I make this announcement to the House, I am writing to my counterparts around the world to inform them of what the British Government are doing; it is world-leading. There is a lot of interest from my counterparts around the world and I shall be working with them because although, as we all know, the UK is a significant country in terms of market share for tech firms, we cannot operate in isolation. It is important to work with major markets, such as the US and the EU, to achieve a co-ordinated approach. We are all trying to move in this direction, so the more we can join up our approach, the more effective we can be.
Lab
Ms Angela Eagle
Wallasey
Online activity is important to extremists of all kinds in furthering their aims. Fake news—disinformation—is the currency of authoritarian forces, undermining our democracy, and under their current business models, tech companies profit from it. What action would the Bill take to defend our democratic values if it were on the statute book now? How would it solve this threat?
Oliver Dowden
This legislation is specifically aimed at harm caused to individuals, so of course, to the extent that there is harm to individuals, such material will fall within the scope of this legislation. But remember: this sits alongside other action by the Government. For example, the Cabinet Office is leading work on the cross-Government defending democracy programme, to deal with the wider challenges to our democratic values.
Con
Greg Smith
Buckingham
I very much welcome my right hon. Friend’s statement today. Organisations like the Internet Watch Foundation have over 24 years’ experience in tackling threats to children online; indeed, the foundation played a huge role in reducing the proportion of the world’s vile, indecent images of children that are hosted on UK servers from 18% down to 1% today. Can my right hon. Friend assure me that organisations like the foundation will be fully involved, so that we may harness their expertise in the regulatory framework that he brings forward?
Oliver Dowden
Yes, I can give my hon. Friend that assurance. I have, of course, met the Internet Watch Foundation. Ofcom will need to draw on expert advice, and I would expect that to include that of the foundation.
SNP
Joanna Cherry
Edinburgh South West
I was very pleased to hear the Minister mention misogynistic abuse. In October 2019, the Joint Committee on Human Rights published a report on democracy, freedom of expression and freedom of association, in which we found that in relation to its hateful conduct policy, Twitter has omitted sex from the list of protected characteristics; that means that shocking misogynistic images and violent abuse and threats against women are often found not to be in violation of Twitter’s policies.

Does the Minister agree with the Committee’s recommendation that Twitter should remedy that omission, so that the protected characteristic of sex is protected by its hateful conduct policy? Does he agree that all the protected characteristics deserve equal protection in any online harms legislation?
Oliver Dowden
The short answer is yes. I agree with the hon. and learned Lady; misogyny should and will be addressed. The point of the legislation is that Ofcom will hold tech companies to account, to make sure that they have policies that deal effectively with misogyny, that they enforce those policies, and that if they fail to do so they will face the financial consequences. We reserve criminal powers to act as well.
PC
Ben Lake
Ceredigion
Diolch, Madam Deputy Speaker. Unfortunately, hate speech and harmful content are not static entities and are constantly changing and adapting. How will the legislation be future-proofed to ensure that regulations remain effective in tackling harmful content as its nature inevitably evolves?
Oliver Dowden
The hon. Gentleman makes an important point. Several other hon. Members have raised the point about future-proofing, and I apologise for not addressing it in my answers so far. Ofcom will be given the discretion to deal with emerging threats, through codes of conduct and so on, but we will also use secondary legislation and identification of priority harms. We are not including those in primary legislation to enable us, over time, to update them as new threats emerge.
Con
Felicity Buchan
Kensington
Does my right hon. Friend agree that senior managers of social media companies must be held responsible if they fail to protect children and vulnerable people?
Oliver Dowden
Yes, I do, and I thank my hon. Friend for her question. The financial penalties we are proposing will cause all senior executives to sit up and think. The last thing one would want to do in a senior management position in such a company is to expose it to such a high level of fine, but we will still, ultimately, reserve the criminal sanction as well, in the way I have set out.
Lab
Yvette Cooper
Normanton, Pontefract and Castleford
The Select Committee on Home Affairs has spent many years being deeply frustrated by the weak responses of social media companies to our urging them to take action against hateful extremism and online child abuse, so I welcome the measures the Secretary of State has announced. The Government response states that

“the regulator will have the power to require companies to use automated technology…to identify illegal child sexual exploitation and abuse content or activity on their services, including, where proportionate, on private channels.”

Will he confirm that that means major platforms will need to use this automated technology on the end-to-end encrypted private channels? What proportionality test is he applying here, given that child sexual abuse is clearly so abhorrent and wrong in all circumstances? When will it ever be disproportionate to pursue this?
Oliver Dowden
The right hon. Lady raises important points. On private channels, companies will be expected to use emergent technology to check for this sort of thing happening. The point about proportionality is that clearly we cannot expect them to spot this kind of thing individually, through human activity; they will have to rely on artificial intelligence and so on. As the regulator becomes confident that those technologies work, it will expect the firms in question to use them. There is a slightly separate issue about end-to-end encryption, and the right hon. Lady will be familiar with the sort of conversations the Home Secretary and I are having with Facebook, for example, on that. Encryption cannot be used as an excuse to get out of being subject to this legislation, and we would expect firms that use end-to-end encryption still to take measures to protect against child abuse and exploitation, for precisely the reasons the right hon. Lady sets out.
Con
John Howell
Henley
In 2007, the Council of Europe produced a convention, which I understand we have signed, that deals with the online abuse of children. Will my right hon. Friend work with me and other members of the Council of Europe to strengthen that convention, in order to make sure that the regulators are genuinely robust and can deal with this problem?
Oliver Dowden
Yes, of course I would be delighted to do so. As Members will know, child abuse, sadly, knows no boundaries—the child abuse viewed by people in this country is often generated around the world—so it is important that we have a co-ordinated approach.
Lab
Peter Kyle
Hove
Will this Bill tackle the website craigslist, which profits from perpetrators who place adverts that sexually exploit young people? If they are acting like pimps, is it not about time we started treating them as such?
Oliver Dowden
Yes, of course the scope of this Bill covers any websites that host user-generated content. Within that, all sites that are subject to this legislation will have to take measures to protect children—this is across the board, not just the category 1 providers—so I would expect that to happen.
Con
Damian Hinds
East Hampshire
This is world-leading, and I very much welcome what the Secretary of State has had to say today. Ultimately, I suspect we will need to move towards global norms and even global institutions, but today I am particularly encouraged by what he has said about so-called “legal but harmful” material, confirming that this is not just about platforms setting their own terms and conditions. I welcome the role he outlined for Parliament in the secondary legislation. As the Government set that secondary legislation, may I encourage him to have in mind harms such as self-harm and eating disorders, which are growing so rapidly among young people? I am talking about not only the active encouragement and assistance of those things, but the prevalence of normalisation of them on the internet and therefore in young people’s lives.
Oliver Dowden
I can give my right hon. Friend that assurance, and he is right in what he says. The nub of the proposed legislation is to deal with that legal but harmful issue and ensure that those duties of care are in place. On the law and children, we would expect companies to do this already, but the legislation will ensure that they take action to enforce the law as it stands. The new area of regulation being created is in respect of “legal but harmful”, and of course we will engage extensively with hon. Members in identifying that in secondary legislation.
Lab
Olivia Blake
Sheffield, Hallam
When it comes to sharing disinformation, many problems are down to systems, not individual posts. How will the legislation deal with the systems? What responsibility will cross-posting sites have for the content propagated through their channels?
Oliver Dowden
The hon. Lady has hit on the essence of the problem and what we have sought to do through the legislation. We are imposing a duty of care precisely because we know that such things evolve over time and that each company needs to take appropriate steps. Clearly, we cannot individually identify every single harm or every single action. Instead, we are setting it out as a duty of care to ensure that flexibility.
Con [V]
Ruth Edwards
Rushcliffe
I welcome the new regulations and my right hon. Friend’s reassurance that smaller businesses and new entrants to the market will not be disadvantaged. Can he tell me what criteria will be used to determine when a business meets the threshold for the new regulations to apply?
Oliver Dowden
My hon. Friend is entirely correct to raise that point. Essentially, the criterion will be that the purpose of the website is not in any way related to user-generated content—that such content is just a small by-product. I used the example—it might be seen as slightly frivolous, but it is a way to illustrate it—of the online cheese retailer. Many small businesses, which are essentially retail or other activities, may allow reviews and so on. It is perfectly reasonable that we should say from the start that they are not subject to the regulation. In practice, they would not be anyway because they will not fall within the codes of conduct. It is my experience with regulation that the more we can exclude from the beginning, the better, because it removes that worry, which frequently comes from small businesses that have one or two people, not massive compliance departments that can deal with it.
LD [V]
Mr Alistair Carmichael
Orkney and Shetland
May I welcome the return of pre-legislative scrutiny? If ever there were an instance that required it, this is certainly it. Can I press the Secretary of State on the duty of care that he has outlined in relation to private messaging? From what he said a few minutes ago to the Chair of the Select Committee, the right hon. Member for Normanton, Pontefract and Castleford (Yvette Cooper), he seems to expect companies such as Facebook to police content and behaviour on apps such as WhatsApp. I do not see how they can do that without undermining the idea of end-to-end encryption, which is very important for people’s privacy and security. How will he do that in practice without relying on technology that has not yet been invented?
Oliver Dowden
The right hon. Gentleman makes an important point about privacy. Clearly, if it was up to individuals within those companies to identify content on private channels, that would not be acceptable—that would be a clear breach of privacy. That is why we will rely on technology and AI and so on to identify trends that can be used to spot that kind of thing. I urge him to go along to some of these tech companies and see the advances that they are making, because it is very instructive.

As I said to the Chair of the Select Committee, the right hon. Member for Normanton, Pontefract and Castleford (Yvette Cooper), end-to-end encryption takes a whole other level of challenge. The Home Secretary and I are actively engaging with Facebook, for example, to discourage it from using end-to-end encryption unless it can put appropriate protections in place. Those conversations are ongoing.
Con
Saqib Bhatti
Meriden
Earlier this year I participated in a roundtable with the Board of Deputies of British Jews, and I advocated for this legislation, so I welcome the statement, especially the immediate removal of antisemitic material. There are those who would consider that it might be a slippery slope to an attack on our freedom of speech, but does my right hon. Friend agree that instead it creates a framework to ensure that our fundamental right to freedom of speech is protected from those who seek to corrupt or even abuse it?
Oliver Dowden
My hon. Friend is absolutely right. We are taking measures to guard against things such as antisemitic abuse, but we have taken two very clear decisions. First, we are protecting press and journalistic freedom; they will not be subject to this legislation for exactly the reasons he outlines. Secondly, we will ensure when we draft the legislation that it does not create a situation whereby Government or social media companies can start putting their world-view on to their output. There must be reasonable grounds for taking content down—they cannot just take it down because it does not cohere with their world-view.

Virtual participation in proceedings concluded (Order, 4 June).

Contains Parliamentary information licensed under the Open Parliament Licence v3.0.