PARLIAMENTARY DEBATE
Online Safety Bill (Fifth sitting) - 7 June 2022 (Commons/Public Bill Committees)
Chairs: † Sir Roger Gale, Christina Rees
Members: † Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Clerks: Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 7 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.
Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.
The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.
We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.
Clause 1
Overview of Act
Question proposed, That the clause stand part of the Bill.
This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.
If you would indulge me, Sir Roger, this is the first time I have led on behalf of the Opposition in a Bill Committee of this magnitude. I am very much looking forward to getting my teeth stuck into the hours of important debate that we have ahead of us. I would also like to take this opportunity to place on record an early apology for any slight procedural errors I may inadvertently make as we proceed. However, I am very grateful to be joined by my hon. Friend the Member for Worsley and Eccles South, who is much more experienced in these matters, and I place on record my gratitude to her. Along with your guidance, Sir Roger, I expect that I will quickly pick up the correct parliamentary procedure as we make our way through this colossal legislation. After all, we can all agree that it is a very important piece of legislation that we need to get right.
I want to say clearly that the Opposition welcome the Bill in principle; the Minister knows that, as we voted in favour of it at Second Reading. However, it will come as no surprise that we have a number of concerns about areas where we feel the Bill is lacking, which we will explore further. We have many reservations about how the Bill has been drafted. Its structure and drafting push services into addressing harmful content—often reactively, rather than proactively—instead of addressing harmful systems, business models and algorithms, which would be a more lasting and systemic approach.
Despite that, we all want the Bill to work and we know that it has the potential to go far. We also recognise that the world is watching, so the Opposition look forward to working together to do the right thing, making the internet a truly safe space for all users across the UK. We will therefore not oppose clause 1.
This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.
I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate and ensure that social media companies promote media literacy, so that information that is spread widely online is understood in its context—that it is not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.
The hon. Member for Pontypridd is absolutely right to say that in many ways the world is watching what the Government are doing regarding online regulation. This will set a framework for many countries around the world, and we must get it right. We are ending the myth that social media and search engines are not responsible for their content. Their use of algorithms alone demonstrates that, while they may not publish all of the information on their sites, they are the editors at the very least and must take responsibility.
We will no doubt hear many arguments about the importance of free speech during these debates and others. I would like gently to remind people that there are many who feel that their free speech is currently undermined by the way in which the online world operates. Women are subject to harassment and worse online, and children are accessing inappropriate material. There are a number of areas that require specific further debate, particularly around the safeguarding of children, adequate support for victims, ensuring that the criminal law is future-proof within this framework, and ensuring that we pick up on the comments made in the evidence sessions regarding the importance of guidance and codes of practice. It was slightly shocking to hear from some of those giving evidence that the operators did not know what was harmful, as much has been written about the harm caused by the internet.
I will listen keenly to the Minister’s responses on guidance and codes of practice, and secondary legislation more generally, because it is critical to how the Bill works. I am sure we will have many hours of interesting and informed debate on this piece of legislation. While there has already been a great deal of scrutiny, the Committee’s role is pivotal to ensure that the Bill is as good as it can be.
Question put and agreed to.
Clause 1 accordingly ordered to stand part of the Bill.
Clause 2
Key Definitions
Question proposed, That the clause stand part of the Bill.
Clause 3 stand part.
That schedules 1 and 2 be the First and Second schedules to the Bill.
Clause 4 stand part.
The first important thing to note is the broadness in the drafting of all the definitions. A service has links to the UK if it has a significant number of users in the UK, if the UK users are a target market, or if
“there are reasonable grounds to believe there is a material risk of significant harm to individuals”
in the UK using the service. Thus, territorially, a very wide range of online services could be caught. The Government have estimated in their impact assessment that 25,100 platforms will be in scope of the new regime, which is perhaps a conservative estimate. The impact assessment also notes that approximately 180,000 platforms could potentially be considered in scope of the Bill.
The provisions on extraterritorial jurisdiction are, again, extremely broad and could lead to some international platforms seeking to block UK users in a way similar to that seen following the introduction of GDPR. Furthermore, as has been the case under GDPR, those potentially in scope through the extraterritorial provisions may vigorously resist attempts to assert jurisdiction.
Notably absent from schedule 1 is an attempt to include or define how the Bill and its definitions of exempt services may adapt to emerging future technologies. The Minister may consider that a matter for secondary legislation, but as he knows, the Opposition feel that the Bill already leaves too many important matters to be determined at a later stage via statutory instruments. Although it is good to see that the Bill has incorporated everyday internet behaviour such as a like or dislike button, as well as factoring in the use of emojis and symbols, it fails to consider how technologies such as artificial intelligence will sit within the framework as it stands.
It is quite right that there are exemptions for everyday user-to-user services such as email, SMS and MMS, and there is an all-important balance to strike between our fundamental right to privacy and keeping people safe online. That is where some difficult questions arise on platforms such as WhatsApp, which have end-to-end encryption embedded as a standard feature. Concerns have been raised about Meta’s plans to extend that feature to Instagram and Facebook Messenger.
The Opposition also have concerns about private messaging features more widely. Research from the Centre for Missing and Exploited Children highlighted the fact that a significant majority of online child abuse takes place in private messages. For example, 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Furthermore, recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people they have not met offline before. Nearly three quarters—74%—of cases of children contacted by someone they do not know initially take place by private message. We will address this issue further in new clause 20, but I wanted to highlight those exemptions early on, as they are relevant to schedule 1.
On a similar point, we remain concerned that emerging online systems such as the metaverse have had no consideration in the Bill as it stands. Only last week, colleagues will have read about a researcher from SumOfUs, a non-profit organisation that seeks to limit the power of large corporations, who claimed that she experienced sexual assault by a stranger in Meta’s virtual reality space, Horizon Worlds. The organisation’s report said:
“About an hour into using the platform, a SumOfUs researcher was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see—all while another user in the room watched and passed around a vodka bottle.”
There is currently no clear distinction about how these very real technologies will sit in the Bill more widely. Even more worryingly, there has been no consideration of how artificial intelligence systems such as Horizon Worlds, with clear user-to-user functions, fit within the exemptions in schedule 1. If we are to see exemptions for internal business services or services provided by public bodies, along with many others, as outlined in the schedule, we need to make sure that the exemptions are fit for purpose and in line with the rapidly evolving technology that is widely available overseas. Before long, I am sure that reality spaces such as Horizon Worlds will become more and more commonplace in the UK too.
I hope that the Minister can reassure us all of his plans to ensure that the Bill is adequately future-proofed to cope with the rapid expansion of the online space. Although we do not formally oppose the provisions outlined in schedule 1, I hope that the Minister will see that there is much work to be done to ensure that the current exemptions remain applicable to future technologies too.
Turning to schedule 2, the draft Bill was hugely lacking in provisions to tackle pornographic content, so it is a welcome step that we now see some attempts to tackle the rate at which pornographic content is easily accessed by children across the country. As we all know, the draft Bill only covered pornography websites that allow user-generated content such as OnlyFans. I am pleased to see that commercial pornography sites have now been brought within scope. This positive step forward has been made possible thanks to the incredible efforts of campaigning groups, of which there are far too many to mention, and from some of which we took evidence. I pay tribute to them today. Over the years, it is thanks to their persistence that the Government have been forced to take notice and take action.
Once again—I hate to repeat myself—I urge the Minister to consider how far the current definitions outlined in schedule 2, relating to regulated provider pornographic content, will go in covering virtual technologies such as those I referred to earlier. We are seeing an increase in all types of pornographic and semi-pornographic content that draws on AI or virtual technology. An obvious example is the now thankfully defunct app called DeepNude, which was making the rounds online in 2016. While available, the app used neural networks to remove clothing from images of women, making them look realistically nude. The ramifications of, and the potential for, technology like this taking over the pornographic content space are essentially limitless.
I urge the Minister carefully to keep in mind the future of the online space as we proceed. More specifically, the regulation of pornographic content in the context of keeping children safe is an area where we can all surely get on board. The Opposition have no formal objection at this stage to the provisions outlined in schedule 2.
The Opposition spokesperson, the hon. Member for Pontypridd, made some points about making sure that we are future-proofing the Bill. There are some key issues where we need to make sure that we are not going backwards. That particularly includes private messaging. We need to make sure that the ability to use AI to find content that is illegal, involving child sexual abuse for example, in private messages is still included in the way that it is currently and that the Bill does not accidentally bar those very important safeguards from continuing. That is one way in which we need to be clear on the best means to go forward with the Bill.
Future-proofing is important—I absolutely agree that we need to ensure that the Bill either takes into account the metaverse and virtual reality or ensures that provisions can be amended in future to take into account the metaverse, virtual reality and any other emerging technologies that we do not know about and cannot even foresee today. I saw a meme online the other day showing somebody taking a selfie while wearing a mask, with the caption, “Can you imagine if we had shown somebody this in 1995 and asked them what this was? They wouldn’t have had the faintest idea.” The internet changes so quickly that we need to ensure that the Bill is future-proofed, but we also need to make sure that it is today-proofed.
I still have concerns, which I raised on Second Reading, about whether the Bill adequately encompasses the online gaming world, where a huge number of children use the internet—and where they should use it—to interact with their friends in a safe way. A lot of online gaming is free from the bullying that can be seen in places such as WhatsApp, Snapchat and Instagram. We need to ensure that those safeguards are included for online gaming. Private messaging is a thing in a significant number of online games, but many people use oral communication—I am thinking of things such as Fortnite and Roblox, which is apparently a safe space, according to Roblox Corporation, but according to many researchers is a place where an awful lot of grooming takes place.
My other question for the Minister—I am not bothered if I do not get an answer today, as I would rather have a proper answer than the Minister try to come up with an answer right at this moment—is about what category the app store and the Google Play store fall into.
While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.
I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a gap in media literacy for parents. My local authority is running a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.
I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?
That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.
First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan end-to-end encryption is crucial. Will the Minister clarify if that is in scope and if the IWF will be able to continue its important work in safeguarding children?
Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs only of women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.
As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.
The shadow Minister is quite right to say that the number of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very powerful.
The shadow Minister, the hon. Member for Liverpool, Walton, and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.
The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.
The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.
The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.
The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.
I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.
Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.
Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.
The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.
I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.
Question put and agreed to.
Clause 2 accordingly ordered to stand part of the Bill.
Clause 3 ordered to stand part of the Bill.
Schedules 1 and 2 agreed to.
Clause 4 ordered to stand part of the Bill.
As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?
It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.
Clause 5
Overview of Part 3
Question proposed, That the clause stand part of the Bill.
Complexity is an issue that crops up time and again when speaking with charities, stakeholders and civil society. We all recognise that the Bill will have a huge impact in whatever form it passes, but the complexity of its drafting is a huge barrier to implementation. The same can be said for the regulation. A Bill as complex as this is likely to lead to ineffective regulation for both service users and companies, which, for the first time, will be subject to specific requirements placed on them by the regulator. That being said, we absolutely support steps to ensure that providers of regulated user-to-user services and regulated search services have to abide by a duty of care regime, which will also see the regulator able to issue codes of practice.
I would also like to place on record my gratitude—lots of gratitude today—to Professor Lorna Woods and Will Perrin, who we heard from in evidence sessions last week. Alongside many others, they have been and continue to be an incredible source of knowledge and guidance for my team and for me as we seek to unpick the detail of this overly complex Bill. Colleagues will also be aware that Professor Woods and Mr Perrin originally developed the idea of a duty of care a few years ago now; their model was based on the idea that social media providers should be,
“seen as responsible for public space they have created, much as property owners or operators are in a physical world.”
It will come as no surprise to the Minister that Members of the Opposition fully fall behind that definition and firmly believe that forcing platforms to identify and act on harms that present a reasonable chance of risk is a positive step forward.
More broadly, we welcome moves by the Government to include specific duties on providers of services likely to be accessed by children, although I have some concerns about just how far they will stretch. Similarly, although I am sure we will come to address those matters in the debates that follow, we welcome steps to require Ofcom to issue codes of practice, but have fundamental concerns about how effective they will be if Ofcom is not allowed to remain fully independent and free from Government influence.
Lastly, on subsection (7), I imagine our debate on chapter 7 will be a key focus for Members. I know that attempts to define key terms such as “priority content” will be a challenge for the Minister and his officials, but we remain concerned that there are important omissions, which we will come to later. It is vital that those key terms are broad enough to encapsulate all the harms that we face online. Ultimately, what is illegal offline must be approached in the same way online if the Bill is to have any meaningful positive impact, which is ultimately what we all want.
I wanted to make a quick comment on subsection (7). The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scottish-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which addresses Scottish and Northern Irish legislation that may differ from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit to that, I would appreciate it—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.
Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.
It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.
The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.
The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.
My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.
There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.
The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.
I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.
We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.
Question put and agreed to.
Clause 5 accordingly ordered to stand part of the Bill.
Clause 6
Providers of user-to-user services: duties of care
This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.
I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.
‘(6A) All providers of regulated user-to-user services must name an individual whom the provider considers to be a senior manager of the provider, who is designated as the provider’s illegal content safety controller, and who is responsible for the provider’s compliance with the following duties—
(a) the duties about illegal content risk assessments set out in section 8,
(b) the duties about illegal content set out in section 9.
(6B) An individual is a “senior manager” of a provider if the individual plays a significant role in—
(a) the making of decisions about how the provider’s relevant activities are to be managed or organised, or
(b) the actual managing or organising of the provider’s relevant activities.
(6C) A provider’s “relevant activities” are activities relating to the provider’s compliance with the duties of care imposed by this Act.
(6D) The Safety Controller commits an offence if the provider fails to comply with the duties set out in sections 8 and 9 which must be complied with by the provider.”
This is one of those cases where the amendment relates to a later clause. While that clause may be debated now, it will not be voted on now. If amendment 69 is negatived, amendment 70 will automatically fall later. I hope that is clear, but it will be clearer when we get to amendment 70. Having confused the issue totally, without further ado, I call Ms Davies-Jones.
Chapter 2 includes a number of welcome improvements on the draft Bill that the Opposition support. It is only right that, when it comes to addressing illegal content, all platforms, regardless of size or reach, will now be required to develop suitable and sufficient risk assessments that must be renewed before a design change is applied. Those risk assessments must be linked to safety duties, which, once again, Labour has long called for.
It was a huge oversight that, until this point, platforms have not had to perform risk assessments of that nature. During our oral evidence sessions only a few weeks ago, we heard extensive evidence about the range of harms that people face online. Yet the success of the regulatory framework relies on regulated companies carefully assessing the risk posed by their platforms and subsequently developing and implementing appropriate mitigations. Crucial to that, as we will come to later, is transparency. Platforms must be compelled to publish the risk assessments, but in the current version of the Bill, only the regulator will have access to them. Although we welcome the fact that the regulator will have the power to ensure that the risk assessments are of sufficient quality, there remain huge gaps, which I will come on to.
Despite that, the Bill should be commended for requiring platforms to document such risks. However, without making those documents public, platforms can continue to hide behind a veil of secrecy. That is why we have tabled a number of amendments to improve transparency measures in the Bill. Under the Bill as drafted, risk assessments will have to be made only to the regulator, and civil society groups, platforms and other interested participants will not have access to them. However, such groups are often at the heart of understanding and monitoring the harms that occur to users online, and they have an in-depth understanding of what mitigations may be appropriate.
We broadly welcome the Government’s inclusion of functionality in the risk assessments, which will look at not just content but how it spreads. There remains room for improvement, much of which will be discussed as we delve further into chapter 2.
Our amendment 69 would require regulated companies to designate a senior manager as a safety controller who is legally responsible for ensuring that the service meets its illegality risk assessment and content safety duties and is criminally liable for significant and egregious failures to protect users from harms. Typically, senior executives in technology companies have not taken their safeguarding responsibilities seriously, and Ofcom’s enforcement powers remain poorly targeted towards delivering child safety outcomes. The Bill is an opportunity to promote cultural change within companies and to embed compliance with online safety regulations at board level but, as it stands, it completely fails to do so.
The Bill introduces criminal liability for senior managers who fail to comply with information notice provisions, but not for actual failure to fulfil their statutory duties with regard to safety, including child safety, and yet such failures lead to the most seriously harmful outcomes. Legislation should focus the minds of those in leadership positions in services that operate online platforms.
A robust corporate and senior management liability scheme is needed to impose personal liability on directors whose actions consistently and significantly put children at risk. The Bill must learn lessons from other regulated sectors, principally financial services, where regulation imposes specific duties on directors and senior management of financial institutions. Those responsible individuals face regulatory enforcement if they act in breach of such duties. Are we really saying that the financial services sector is more important than child safety online?
The Government rejected the Joint Committee’s recommendation that each company appoint a safety controller at, or reporting to, board level. As a result, there is no direct relationship in the Bill between senior management liability and the discharge by a platform of its safety duties. Under the Bill as drafted, a platform could be wholly negligent in its approach to child safety and put children at significant risk of exposure to illegal activity, but as long as the senior manager co-operated with the regulator’s investigation, senior managers would not be held personally liable. That is a disgrace.
The Joint Committee on the draft Bill recommended that
“a senior manager at board level or reporting to the board should be designated the ‘Safety Controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users. We believe that this would be a proportionate last resort for the Regulator. Like any offence, it should only be initiated and provable at the end of an exhaustive legal process.”
Amendment 69 would make provision for regulated companies to appoint an illegal content safety controller, who would have responsibility and accountability for protecting children from illegal content and activity. We believe this measure would drive a more effective culture of online safety awareness within regulated firms by making senior management accountable for harms caused through their platforms and by embedding safety within governance structures. The amendment would require consequential amendments setting out the nature of the offences for which the safety controller may be liable and the penalties associated with them.
In financial services regulation, the Financial Conduct Authority uses a range of personal accountability regimes to deter individuals who may exhibit unwanted and harmful behaviour and as mechanisms for bringing about cultural change. The senior managers and certification regime is an overarching framework for staff across the financial services sector. It aims to
“encourage a culture of staff at all levels taking personal responsibility for their actions”,
and to
“make sure firms and staff clearly understand and can demonstrate where responsibility lies.”
Returning to the senior managers and certification regime in the financial services industry: under that regime, senior managers must be pre-approved by the regulator, have their responsibilities set out in a statement of responsibilities and be subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same of online platforms and companies when it comes to child safety.
The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failure of processes, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space to make children safe.
Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.
I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.
Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.
Speaking of compliance, that brings us to the topic of amendments 69 and 70. It is worth reminding ourselves of the current enforcement provisions in the Bill, which are pretty strong. I can reassure the hon. Member for Liverpool, Walton that the enforcement powers here are far from impotent. They are very potent. As the shadow Minister acknowledged in her remarks, we are for the first time ever introducing senior management liability, which relates to non-compliance with information notices and offences of falsifying, encrypting or destroying information. It will be punishable by a prison sentence of up to two years. That is critical, because without that information, Ofcom is unable to enforce.
We have had examples of large social media firms withholding information and simply paying a large fine. There was a Competition and Markets Authority case a year or two ago where a large social media firm did not provide information repeatedly requested over an extended period and ended up paying a £50 million fine rather than providing the information. Let me put on record now that that behaviour is completely unacceptable. We condemn it unreservedly. It is because we do not want to see that happen again that there will be senior manager criminal liability in relation to providing information, with up to two years in prison.
In addition, for the other duties in the Bill there are penalties that Ofcom can apply for non-compliance. First, there are fines of up to 10% of global revenue. For the very big American social media firms, the UK market is somewhere just below 10% of their global revenue, so 10% of their global revenue is getting on for 100% of their UK revenue. That is a very significant financial penalty, running in some cases into billions of pounds.
In extreme circumstances—if those measures are not enough to ensure compliance—there are what amount to denial of service powers in the Bill, where essentially Ofcom can require internet service providers and others, such as payment providers, to disconnect the companies in the UK so that they cannot operate here. Again, that is a very substantial measure. I hope the hon. Member for Liverpool, Walton would agree that those measures, which are in the Bill already, are all extremely potent.
The question prompted by the amendment is whether we should go further. I have considered that issue as we have been thinking about updating the Bill—as hon. Members can imagine, it is a question that I have been debating internally. The question is whether we should go further and say there is personal criminal liability for breaches of the duties that go beyond information provision. There are arguments in favour, which we have heard, but there are arguments against as well. One is that if we introduce criminal liability for those other duties, that introduces a risk that the social media firms, fearing criminal prosecution, will become over-zealous and just take everything down because they are concerned about being personally liable. That could end up having a chilling effect on content available online and goes beyond what we in Parliament would intend.
For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for information provision, with fines of up to 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK. If people worried that they might too readily go to prison, it might deter them from locating here. I fully recognise that there is a balance to strike. I feel that the balance is being struck in the right place.
I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.
A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.
Question put, That the amendment be made.
Clause 7 stand part.
Clauses 21 and 22 stand part.
My view is that the stand part debate on clause 6 has effectively already been had, but I will not be too heavy-handed about that at the moment.
As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.
On clause 21, the duty of care approach is one that the Opposition support, and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.
It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.
Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.
In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not adversely impair the experience of people in the UK.
I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.
Question put and agreed to.
Clause 6 accordingly ordered to stand part of the Bill.
Clause 7 ordered to stand part of the Bill.
Clause 8
Illegal content risk assessment duties
I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—
“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”
This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.
Amendment 14, in clause 8, page 6, line 33, at end insert—
“(4A) A duty for the illegal content risk assessment to be approved by either—
(a) the board of the entity; or, if the organisation does not have a board structure,
(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”
This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.
Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.
This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.
Amendment 19, in clause 8, page 7, line 14, at end insert—
“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—
(i) enable users to encounter illegal content on other regulated user-to-user services, and
(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”
This amendment would incorporate into the duties a requirement to consider cross-platform risk.
Clause stand part.
Amendment 20, in clause 9, page 7, line 30, at end insert
“, including by being directed while on the service towards priority illegal content hosted by a different service;”.
This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.
Amendment 26, in clause 9, page 7, line 30, at end insert—
“(aa) prevent the production of illegal content by means of the service;”.
This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.
Amendment 18, in clause 9, page 7, line 35, at end insert—
“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”
This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.
Amendment 21, in clause 9, page 7, line 35, at end insert—
“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.
This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.
Clause 9 stand part.
Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert
“the production of illegal content and”.
This amendment requires the illegal content risk assessment to consider the production of illegal content.
Clause 23 stand part.
Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.
This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.
Clause 24 stand part.
Members will note that amendments 17 and 28 form part of a separate group. I hope that is clear.
Labour is extremely concerned by the lack of transparency around the all-important illegal content risk assessments, which is why we have tabled amendment 10. The effectiveness of the entire Bill is undermined unless the Government commit to a more transparent approach more widely. As the Bill currently stands, the vital risk assessments will be made available only to the regulator, rather than being open to public scrutiny. There is a real risk—for want of a better word—in that approach, as companies could easily play down or underestimate the risks. They could see the provision of the risk assessments to Ofcom as a simple, tick-box exercise to satisfy the requirements placed on them, rather than using these important assessments as an opportunity truly to assess the likelihood of current and emerging risks.
As my hon. Friend the Member for Worsley and Eccles South will touch on in her later remarks, the current approach runs the risk of allowing businesses to shield themselves from true transparency. The Minister knows that this is a major issue, and that until service providers and platforms are legally compelled to provide data, we will be shielded from the truth, because there is no statutory requirement for them to be transparent. That is fundamentally wrong and should not be allowed to continue. If the Government are serious about their commitment to transparency, and to the protection of adults and children online, they should make this small concession and see it as a positive step forward.
Amendment 14 would ensure that regulated companies’ boards or senior staff have appropriate oversight of risk assessments related to illegal content. An obligation on boards or senior managers to approve risk assessments would hardwire the safety duties and create a culture of compliance in the regulated firms. The success of the regulatory framework relies on regulated companies carefully risk assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations.
To date, boards and top executives of the regulated companies have not taken the risks to children seriously enough. Platforms either have not considered producing risk assessments or, where they have done so, those assessments have been of limited efficacy and have demonstrably failed to adequately identify and respond to harms to children. Need I remind the Minister that the Joint Committee on the draft Bill recommended that risk assessments should be approved at board level?
Introducing a requirement on regulated companies to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making, and create accountability and responsibility at the most senior level of the organisation. That will trickle down the organisation and help embed a culture of compliance across the company. We need to see safety online as a key focus for these platforms, and putting the onus on senior managers to take responsibility is a positive step forward in that battle.
We are encouraged by the prioritisation of tackling the dissemination of child sexual exploitation and abuse. However, there is room for the Bill to go even further in strengthening child protection online, particularly in relation to the use of online platforms to generate new child sexual exploitation and abuse content. While it is a welcome step forward that the Bill is essentially encouraging a safety-by-design approach, clause 8 does not go far enough to tackle newly produced content or livestreamed content.
The Minister will be aware of the huge problems with online sexual exploitation of children. I pay tribute to the hard work of my hon. Friend the Member for Rotherham (Sarah Champion), alongside the International Justice Mission, which has been a particularly vocal champion of vulnerable young children at home and abroad.
The Philippines is a source country for livestreamed sexual exploitation of children. In its recent white paper, the IJM found that traffickers often use cheap Android smartphones with prepaid cellular data services to communicate with customers to produce and distribute explicit material. In order to reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.
One key issue with assessing the extent of online sexual exploitation of children is that we are entirely dependent on detection of the crime. Sadly, most current technologies widely used to detect various forms of online sexual exploitation of children are not designed to recognise livestreaming. Clearly, the implications of that are huge for both child sexual exploitation and human trafficking more widely. The International Justice Mission reports that file hashing and PhotoDNA, which are widely used to great effect in enabling the detection and reporting of millions of known child sexual exploitation files, do not and cannot detect newly produced child sexual exploitation material.
The livestreaming of child sexual exploitation material involves an ephemeral video stream, not a stored still image or video file. It is therefore not usually subject to screening or content review. We must consider how easy it is for platforms to host live content and how ready they are to screen that content. I need only point the Minister to the devastating mass shooting that took place in Buffalo last month. The perpetrator livestreamed the racist attack online, using a GoPro camera attached to a military-style helmet. The shooter streamed live on the site Twitch for around two minutes before the site took the livestream down, but since then the video has been posted elsewhere on the internet and on smaller platforms.
Other white supremacists have used social media to publicise gruesome attacks, including the mass shooter in Christchurch, New Zealand, in 2019. Since that shooting, social media companies have got better in some ways at combating videos of atrocities online, including stopping livestreams of attacks faster, but violent videos, such as those of mass shootings, are saved by users and then reappear across the internet on Facebook, Instagram, Twitter, TikTok and other high-harm, smaller platforms. These reuploaded videos are harder for companies to take down. Ultimately, more needs to be done at the back end in terms of design features if we are to truly make people safe.
When it comes to exploitation being livestreamed online—unlike publicised terror attacks—crimes that are not detected are not reported. Therefore, livestreaming of child sexual exploitation is a severely under-reported crime and reliable figures for its prevalence do not exist. Anecdotally, the problem in the Philippines is overwhelming, but it is not limited to the Philippines. The IJM is aware of similar child trafficking originating from other source countries in south-east Asia, south Asia, Africa and Europe. Therefore, it is essential that technology companies and online platforms are compelled to specifically consider the production of illegal content when drawing up their risk assessments.
I turn to amendment 19, which we tabled to probe the Minister on how well he believes the clause encapsulates the cross-platform risk that children may face online. Organisations such as the National Society for the Prevention of Cruelty to Children and 5Rights have raised concerns that, as the Bill is drafted, there is a gap where children are groomed on one platform, where no abuse takes place, but are then directed to another platform, where they are harmed.
Well-established grooming pathways see abusers exploit the design features of social networks to contact children before moving communication across to other platforms, including livestreaming sites and encrypted messaging services. Perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, before using direct messages to groom them and then coerce them into sending sexual images via WhatsApp.
Similarly, an abuser might groom a child through playing video games while simultaneously building that relationship further via a separate chat platform such as Discord. I want to point colleagues to the case of Frida, who was groomed at the age of 13. Her story sadly highlights the subtle ways in which abusers can groom children on social networks before migrating them to other, more harmful apps and sites.
This is Frida’s experience in her own words:
Frida was 13 years old. How many other Fridas are there?
We recognise that no online service can assemble every piece of the jigsaw. However, the Bill does not place requirements on services to consider how abuse spreads from their platform to others or vice versa, to risk-assess accordingly or to co-operate with other platforms proactively to address harm. Amendment 19 would require companies to understand when discharging their risk assessment duties how abuse spreads from their platform to others or vice versa. For example, companies should understand how their platforms are situated on abuse pathways whereby the grooming and other online sexual abuse risks start on their site before migrating to other services, or whether they inherit risks from other sites.
Companies should also know whether they are dealing with abuse cross-platform risks, which happen sequentially, as tends to be the case for grooming initiated on social networks, or simultaneously, as tends to be the case on gaming services. Lastly, they should understand which functionalities and design features allowed child sexual exploitation offences to be committed and transferred across platforms.
The NSPCC research found that four UK adults in five think that social media companies should have a legal duty to work with each other to prevent online grooming from happening across multiple platforms, so that is an area in which the Minister has widespread support, both in the House and in the public realm.
This matter is not addressed explicitly in the Bill. We are concerned that companies might be able to cite competition worries to avoid considering that aspect of online abuse. That is unacceptable. We are also concerned that forthcoming changes to the online environment, such as the metaverse, will create new risks, including the more seamless movement of abuse between different platforms.
I grew up on the internet and spent a huge amount of time speaking to people, so I am well aware that people can be anyone they want to be on the internet, and people do pretend to be lots of different people. If someone tells us their age on the internet, we cannot assume that that is in any way accurate. I am doing what I can to imprint that knowledge on my children in relation to any actions they are taking online. In terms of media literacy, which we will come on to discuss in more depth later, I hope that one of the key things that is being told to both children and adults is that it does not matter if people have pictures on their profile—they can be anybody that they want to online and could have taken those pictures from wherever.
In relation to amendment 21 on collaboration, the only reasonable concern that I have heard is about an action taken by Facebook, which employed an outside company in the US to place stories in local newspapers raising concerns about vile things that were happening on TikTok. Those stories were invented—they were made up—specifically to harm TikTok’s reputation. I am not saying for a second that collaboration is bad, but the argument that some companies may make—that it is bad because it causes them problems and their opponents may use it against them—proves the need to have a regulator. The point of having a regulator is to ensure that any information sharing or collaboration that is required happens in a way that allows the regulator to come down on any company that decides to use it with malicious intent. The regulator ensures that the collaboration that we need, so that emergent issues can be dealt with as quickly as possible, is done in a way that does not harm people. If it does harm people, the regulator is there to take action.
I want to talk about amendments 25 and 30 on the production of images and child sexual abuse content. Amendment 30 should potentially have an “or” at the end rather than an “and”. However, I am very keen to support both of those amendments, and all the amendments relating to the production of child sexual abuse content. On the issues raised by the Opposition about livestreaming, for example, we heard two weeks ago about the percentage of self-generated child sexual abuse content. The fact is that 75% of that content is self-generated. That is absolutely huge.
If the Bill does not adequately cover production of the content, whether it is by children and young people who have been coerced into producing the content and using their cameras in that way, or whether it is in some other way, then the Bill fails to adequately protect our children. Purely on the basis of that 75% stat, which is so incredibly stark, it is completely reasonable that production is included. I would be happy to support the amendments in that regard; I think they are eminently sensible. Potentially, when the Bill was first written, production was not nearly so much of an issue. However, as it has moved on, it has become a huge issue and something that needs tackling. Like Opposition Members, I do not feel like the Bill covers production in as much detail as it should, in order to provide protection for children.
“by means of the service”.
That phrase is quite important, and I will come back to it when discussing some of the amendments, because it does not necessarily mean just content on the service itself; in a cross-platform sense, it can also cover other sites that users might find themselves on via the service. That phrase matters in the context of some of the reasonable queries about cross-platform risks.
Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.
Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content, which covers some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. Companies are also required to make those risk assessments available to Ofcom on request. That raises a couple of questions, as the hon. Member for Liverpool, Walton mentioned and as some of the amendments highlight. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than wait to be asked? Also, should those risk assessments all be published—probably online?
In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.
Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. There are legitimate questions about comprehensive public disclosure and about getting to the heart of what is going on in these companies, in the way that Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.
Adjourned till this day at Two o’clock.
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.