PARLIAMENTARY DEBATE
Online Safety Bill (Ninth sitting) - 14 June 2022 (Commons/Public Bill Committees)
Chair(s) Sir Roger Gale, † Christina Rees
Members: † Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Clerks: Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 14 June 2022
(Morning)
[Christina Rees in the Chair]
Online Safety Bill
Clause 40
Secretary of State’s powers of direction
This amendment would remove the ability of the Secretary of State to modify Ofcom codes of practice ‘for reasons of public policy’.
Clause stand part.
Clause 41 stand part.
New clause 12—Secretary of State’s powers to suggest modifications to a code of practice—
“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.
(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.
(3) The Secretary of State may only write to OFCOM twice under this section for each code.
(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.
(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”
This new clause gives the Secretary of State powers to suggest modifications to a code of practice, as opposed to the powers of direction proposed in clause 40.
“for reasons of public policy”.
Of all the correspondence that I have had on the Bill—there has been quite a lot—this is the clause that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted excessive powers in the Bill, and that it threatens the independence of the independent regulator. Businesses are also wary of this power, in part due to the uncertainty that it causes.
The reduction of Ministers’ powers under the Bill was advised by the Joint Committee on the draft Bill and by the Digital, Culture, Media and Sport Committee. I am sure that the two hon. Members on the Government Benches who sat on those Committees and added their names to their reports—the hon. Members for Watford and for Wolverhampton North East—will vote for the amendment. How could they possibly have put their names to the Select Committee report and the Joint Committee report and then just a few weeks later decide that they no longer support the very proposals that they had advanced?
Could the Minister inform us which special interest groups specifically have backed the Secretary of State’s public policy powers under the Bill? I am fascinated to know. Surely, all of us believe in public policy that is informed by expert evidence. If the Secretary of State cannot produce any experts at all who believe that the powers she enjoys are appropriate, advantageous or an improvement to the legislation, then we should not be proceeding as we are. Now that I know that our proceedings are being broadcast live, I also renew my call to anyone watching who is in favour of these powers as they are to say so, because so far we have found no one who holds that position.
We should be clear about exactly what these powers do. Under clause 40, the Secretary of State can modify the draft codes of practice, giving the Government a huge amount of power over the independent communications regulator. The Government have attempted to play down these powers by stating that they would be used only in exceptional circumstances. However, the legislation does not define what “exceptional circumstances” means, and it is far too nebulous a term for us to proceed on. Instead, the clause simply provides that a direction may be given for reasons of public policy. Will the Minister also clarify the difference between “public policy” and “government policy”, which was the wording in the draft Bill?
The regulator must not be politicised in this way. Regardless of the political complexion of the Government, when they have too much influence over what people can say online, the implications for freedom of speech are grave, especially when the content that they are regulating is not illegal. I ask the Minister to consider how he would feel if the Culture Secretary were not a Conservative but came from among my friends on the Labour Benches. I would argue that that would be a significant improvement, but I imagine that the Minister would not. I see from his facial expression that that is the case.
There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach of these powers. When we are allowing the Executive powers over the communications regulator, the protections must be absolute and iron-clad. As it stands, the Bill leaves leeway for abuse of these powers. No matter how slim a chance the Minister feels that there is of that, as parliamentarians we must not allow it. That is why I urge the Government to consider amendment 84.
As somebody who is new to these proceedings, I think it would be nice if, just for once, the Government listened to arguments and were prepared to accept them, rather than us going through this Gilbert and Sullivan pantomime where we advance arguments, we vote and we always lose. The Minister often says he agrees with us, but he still rejects whatever we say.
Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice
“for reasons of public policy”.
Labour agrees with the Carnegie UK Trust’s assessment: the codes are the fulcrum of the regulatory regime, and this power is a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical process, rooted in evidence, to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” is not irrational, and that creates a vulnerability to legal challenge.
On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that
“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”
The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this
“to ensure that the code of practice reflects government policy”,
clause 40 now specifies that any code may be required to be modified
“for reasons of public policy”.
Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.
The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.
However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations it may be appropriate for a direction to be given to modify the code. A recent and very real example would be the need to reflect the latest medical advice during a public health emergency. Obviously, during covid in the last couple of years, we saw some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy for covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—on which it might have been appropriate to make sure that a code of practice was appropriately modified.
In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.
I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedures gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.
The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he go further and commit that it will be used only in exceptional circumstances, rather than simply envisioning that it will be?
The power is also limited in the sense that, in relation to matters not to do with national security, terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. It cannot be exercised at any other point, or at a time of the Secretary of State’s choosing. There is one moment, and one moment only, when that power can be exercised.
I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.
I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.
Question put, That the amendment be made.
Clauses 41 to 47 ordered to stand part of the Bill.
Question proposed, That the clause stand part of the Bill.
Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.
Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.
Question put and agreed to.
Clause 48 accordingly ordered to stand part of the Bill.
Clause 49
“Regulated user-generated content”, “user-generated content”, “news publisher content”
This amendment would remove the exemption for comments below news articles posted online.
“(2A) Subsection (2)(e) does not apply in respect of a user-to-user service which is operated by an organisation which—
(a) is a relevant publisher (as defined in section 41 of the Crime and Courts Act 2013); and
(b) has an annual UK turnover in excess of £100 million.”
This amendment removes comments sections operated by news websites where the publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content.
Below-the-line comments in newspaper articles are infamous. They are places that everybody fears to go. They are worse than Twitter. In a significant number of ways, below-the-line comments are an absolute sewer. I cannot see any reasonable excuse for them to be excluded from the Bill. We are including Twitter in the Bill; why are we not including below-the-line comments for newspapers? It does not make any sense to me; I do not see any logic.
We heard a lot of evidence relating to freedom of speech and a free press, and I absolutely, wholeheartedly agree with that. However, the amendment would not stop anyone writing a letter to the editor. It would not stop anyone engaging with newspapers in the way that they would have in the print medium. It would still allow that to happen; it would just ensure that below-the-line comments were subject to the same constraints as posts on Twitter. That is the entire point of amendment 89.
I do not think that I need to say much more, other than to add one more thing about comments directing readers to other, more radical and extreme pieces or bits of information. It is sometimes the case that the comments on a newspaper article will direct people to even more extreme views. The newspaper article itself may be just slightly derogatory, while some of the comments may have links or references to other pieces, and other places on the internet where people can find a more radical point of view. That is exactly what happens on Twitter, and is exactly some of the stuff that we are trying to avoid—sending people down an extremist rabbit hole. I do not understand how the Minister thinks that the clause, which excludes below-the-line newspaper comments, is justifiable or acceptable.
Having been contacted by a number of newspapers, I understand and accept that some newspapers have moderation policies for their comments sections, but that is not strong enough. Twitter has a moderation policy, but that does not mean that there is actually any moderation, so I do not think that subjecting below-the-line comments to the provisions of the Bill is asking too much. It is completely reasonable for us to ask for this to happen, and I am honestly baffled as to why the Minister and the Government have chosen to make this exemption.
Labour has concerns about a number of subsections of the clause, including subsections (2) and (8) to (10)—commonly known as the news publisher content exemption, which I have spoken about previously. We understand that the intention of the exemption is to shield broadcasters and traditional newspaper publishers from the Bill’s regulatory effects. Clause 50(2) defines a “recognised news publisher” as a regulated broadcaster or any other publisher that publishes news, has an office, and has a standards code and complaints process. There is no detail about the latter two requirements, enabling almost any news publishing enterprise to design its own code and complaints process, however irrational, and so benefit from the exemption. “News” is also defined broadly, and may include gossip. There remains a glaring omission, which amendment 43 addresses and which I will come to.
During an earlier sitting of the Committee, in response to comments made by my hon. Friend the Member for Liverpool, Walton as we discussed clause 2, the Minister claimed that
“The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic.”––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 204.]
Clause 49 exempts one-to-one live aural communications from the scope of regulation. Given that much interaction in virtual reality is live aural communication, including between two users, it is hard to understand how that would be covered by the Bill.
There is also an issue about what counts as content. Most standard understandings would define “content” as text, video, images and audio, but one of the worries about interactions in VR is that behaviour such as physical violence will be able to be replicated virtually, with psychologically harmful effects. It is very unclear how that would be within the scope of the current Bill, as it does not clearly involve content, so could the Minister please address that point? As he knows, Labour advocates for a systems-based approach, and for risk assessments and systems to take place in a more upstream and tech-agnostic way than under the current approach. At present, the Bill would struggle to be expanded effectively enough to cover those risks.
Amendment 43 removes comments sections operated by news websites whose publisher has a UK turnover of more than £100 million from the exemption for regulated user-generated content. If the Bill is to be effective in protecting the public from harm, the least it must accomplish is a system of accountability that covers all the largest platforms used by British citizens. Yet as drafted, the Bill would exempt some of the most popular social media platforms online: those hosted on news publisher websites, which are otherwise known as comments sections. The amendment would close that loophole and ensure that the comments sections of the largest newspaper websites are subject to the regime of regulation set out in the Bill.
Newspaper comments sections are no different from the likes of Facebook and Twitter, in that they are social media platforms that allow users to interact with one another. This is done through comments under stories, comments in response to other comments, and other interactions—for example, likes and dislikes on posts. In some ways, their capacity to cause harm to the public is even greater: for example, their reach is in many cases larger than even the biggest of social media platforms. Whereas there are estimated to be around 18 million users of Twitter in the UK, more than twice that number of British citizens access newspaper websites every month, and the harm perpetuated on those platforms is severe.
In July 2020, the rapper Wiley posted a series of antisemitic tweets, which Twitter eventually removed after an unacceptable delay of 48 hours, but under coverage of the incident in The Sun newspaper, several explicitly antisemitic comments were posted. Those comments contained Holocaust denial and alleged a global Jewish conspiracy to control the world. They remained up and accessible to The Sun’s 7 million daily readers for the best part of a week. If we exempt comments sections from the Bill’s proposed regime and the duties that the Bill sets for platforms, we will send the message that that kind of vicious, damaging and harmful racism is acceptable.
Similarly, after an antisemitic attack in the German city of Halle, racist comments followed in the comments section under the coverage in The Sun. There are more examples: Chinese people being described as locusts and attacked with other racial slurs; 5G and Bill Gates conspiracy theories under articles on the Telegraph website; and, of course, the most popular targets for online abuse, women in public life. Comments that described the Vice-President of the United States as a “rat” and “ho” appeared on MailOnline. A female union leader has faced dozens of aggressive and abusive comments about her appearance, and many such comments remain accessible on newspaper comments sections to this day. Some have been up for months, others for years.
Last week, the Committee was sent a letter from a woman who was the victim of comments section abuse, Dr Corinne Fowler. Dr Fowler said of the comments that she received:
“These comments contained scores of suggestions about how to kill or injure me. Some were general ideas, such as hanging, but many were gender specific, saying that I should be burnt at the stake like a witch. Comments focused on physical violence, one man advising that I should slapped hard enough to make my teeth chatter”.
She added:
“I am a mother: without me knowing, my son (then 12 years old) read these reader comments. He became afraid for my safety.”
Without the amendment, the Bill cannot do anything to protect women such as Dr Fowler and their families from this vile online abuse, because comments sections will be entirely out of scope of the Bill’s new regime and the duties designed to protect users.
As I understand it, two arguments have been made to support the exemption. First, it is argued that the complaints handlers for the press already deal with such content, but the handler for most national newspapers, the Independent Press Standards Organisation, will not act until a complaint is made. It then takes an average of six months for a complaint to be processed, and it cannot do anything if the comments have not been moderated. The Opposition do not feel that that is a satisfactory response to the seriousness of the harms that we know to occur, and which I have described. IPSO does not even have a code to deal with cases of antisemitic abuse such as those that appeared in the comments section of The Sun. IPSO’s record speaks for itself from the examples that I have given, and the many more, and it has proven to be no solution to the severity of harms that appear in newspaper comments sections.
The second argument for an exemption is that publishers are legally responsible for what appears on comments sections, but that is only relevant for illegal harms. For everything else, from disinformation to racial prejudice and abuse, regulation is needed. That is why it is so important that the Bill does the job that we were promised. To keep the public safe from harm online, comments sections must be covered under the Bill.
The amendment is a proportionate solution to the problem of comments section abuse. It would protect users’ freedom of expression and, given that it is subject to a turnover threshold, ensure that duties and other requirements do not place a disproportionate burden on smaller publishers such as locals, independents and blogs.
I have reams and reams and reams of examples from comments sections that all fall under incredibly harmful abuse and should be covered by the Bill. I could be here for hours reading them all out, and while I do not think that anybody in Committee would like me to, I urge Committee members to take a look for themselves at the types of comments under newspaper articles and ask themselves whether those comments should be covered by the terms of the Bill. I think they know the answer.
My concern relates to subsection (5) of clause 49, which exempts one-to-one live aural communications on user-to-user services. Specifically, I am worried about child sexual abuse and grooming: exempting those one-to-one live aural communications gives bad actors—people who are out to attack children—a loophole to do so. We know that on games such as Fortnite, one-to-one aural communication happens.
I am not entirely sure how communication happens on Roblox and whether there is an opportunity for that there. However, we also know that a number of people who play online games have communication on Discord at the same time. Discord is incredibly popular, and we know that there is an opportunity for, and a prevalence of, grooming on there. I am concerned that exempting this creates a loophole for people to attack children in a way that the Minister is trying to prevent with the Bill. I understand why the clause is there but am concerned that the loophole is created.
As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.
There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, which we are persuaded by, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case and have suggested, and the Government have accepted, that intruding into that space through legislation and regulation would represent an intrusion into the operation of the free press.
There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.
Virality is an inherent design feature in social media sites. It is not an inherent design feature of the comments we get under the news website of the BBC, The Guardian or the Daily Mail. There is no way of generating virality in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach of such comments, and their ability to grow exponentially, are orders of magnitude lower on a news website comment section than on Facebook. That is an important difference, from a risk point of view.
In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.
I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.
The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it
“is not accompanied by user-generated content of any other description”.
The actions of a physical avatar in the metaverse do constitute user-generated content of any other description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.
I am happy to provide clarification on that. It is a good question and I hope I have provided an example of how, even though the metaverse was not conceived when the Bill was conceived, it does have an effect.
“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.
Amendment 76 would insert “but not limited to” after “including”, so that the Bill is as future-proofed as it can be.
“anything communicated by means of an internet service”,
which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.
The remaining question—
What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.
Secondly, if the content were defamatory, there is civil recourse for libel—I realise that, in practice, only people like Arron Banks can sue for libel, but the route exists. I think there are powers in the civil procedure rules that allow court orders to be made requiring organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.
Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.
Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.
Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls such as Skype conversations—one-to-one conversations that are essentially low-risk.
We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.
Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.
This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.
Question put, That the amendment be made.
Clause 49 ordered to stand part of the Bill.
“is a member of an approved regulator (as defined in section 42 of the Crime and Courts Act 2013).”
This amendment expands the definition of a recognised news publisher to incorporate any entity that is a member of an approved regulator.
The primary purpose of the Bill is to protect social media users from harm, and it will have failed if it does not achieve that. Alongside that objective, the Bill must protect freedom of expression and, in particular, the freedom of the press, which I know we are all committed to upholding and defending. However, in evaluating the balance between freedom of the press and the freedom to enjoy the digital world without encountering harm, the Bill as drafted has far too many loopholes and risks granting legal protection to those who wish to spread harmful content and disinformation in the name of journalism.
Amendment 107 will address that imbalance and protect the press and us all from harm. The media exemption in the Bill is a complete exemption, which would take content posted by news publishers entirely out of the scope of platforms’ legal duties to protect their users. Such a powerful exemption must be drafted with care to ensure it is not open to abuse. However, the criteria that organisations must meet to qualify for the exemption, which are set out in clause 50, are loose and, in some cases, almost meaningless. They are open to abuse, they are ambiguous and they confer responsibility on the platforms themselves to decide which publishers meet the Bill’s criteria and which do not.
In evidence that we heard recently, it was clear that the major platforms do not believe it is a responsibility they should be expected to bear, nor do they have the confidence or feel qualified to do so. Furthermore, David Wolfe, chairman of the Press Recognition Panel, has advised that the measure represents a threat to press freedom. I agree.
Opening the gates for any organisation to declare themselves a news publisher by obtaining a UK address, jotting down a standards code on the back of an envelope and inviting readers to send an email if they have any complaints is not defending the press; it is opening the profession up to abuse and, in the long term, risks weakening its rights and protections.
Let us discuss those who may wish to exploit that loophole and receive legal protection to publish harmful content. A number of far-right websites have made white supremacist claims and praised Holocaust deniers. Those websites already meet several of the criteria for exemption and could meet the remaining criteria overnight. The internet is full of groups that describe themselves as news publishers but distribute profoundly damaging and dangerous material designed to promote extremist ideologies and stir up hatred.
We can all think of high-profile individuals who use the internet to propagate disinformation, dangerous conspiracy theories and antisemitic, Islamophobic, homophobic or other forms of abuse. They might consider themselves journalists, but the genuine professionals whose rights we want to protect beg to differ. None of those individuals should be free to publish harmful material as a result of exemptions that are designed for quite a different purpose. Is it really the Government’s intention that any organisation that meets their loose criteria, as defined in the Bill, should be afforded the sacrosanct rights and freedoms of the press that we all seek to defend?
I turn to disinformation, and to hostile state actors who wish to sow the seeds of doubt and division in our politics and our civic life. The Committee has already heard that Russia Today is among those expected to benefit from the exemption. I have a legal opinion from Tamsin Allen, a senior media lawyer at Bindmans LLP, which notes that,
“were the bill to become law in its present form, Russia Today would benefit from the media exemption. The exemption for print and online news publications is so wide that it would encompass virtually all publishers with multiple contributors, an editor and some form of complaints procedure and standards code, no matter how inadequate. I understand that RT is subject to a standards code in Russia and operates a complaints procedure. Moreover, this exemption could also apply to a publisher promoting hate or violence, providing it met the (minimal) standards set out in the bill and constituted itself as a ‘news’ or ‘gossip’ publication. The only such publications which would not be exempt are those published by organisations proscribed under the Terrorism Act.”
If hostile foreign states can exploit this loophole in the Bill to spread disinformation to social media users in the UK, that is a matter of national security and a threat to our freedom and open democracy. The requirement to have a UK address offers little by way of protection. International publishers spreading hate, disinformation or other forms of online harm could easily set up offices in the UK to qualify for this exemption and instantly make the UK the harm capital of the world. For those reasons, the criteria must change.
We heard from several individuals in evidence that the exemption should be removed entirely from the Bill, but we are committed to freedom of the press as well as providing proper protections from harm. Instead of removing the exemption, I propose a change to the qualifying criteria to ensure that credible publishers can access it while extremist and harmful publishers cannot.
My amendment would replace the convoluted list of requirements with a single and simple requirement for the platforms to follow and adhere to: that all print and online media that seeks to benefit from the exemption should be independently regulated under the royal charter provisions that this House has already legislated for. If, as the Bill already says, broadcast media should be defined in this way, why not print media too? Unlike the Government’s criteria, the likes of Russia Today, white supremacist blogs and other deeply disturbing extremist publications simply could not satisfy this requirement. If they were ever to succeed in signing up to such a regulator, they would swiftly be expelled for repeated standards breaches.
There is no simple, agreed definition of what constitutes a recognised news publisher, and even those who have given evidence on behalf of the press have conceded that, but we must find a way to navigate this challenge. As drafted, the Bill does not do that. I am open to working with colleagues from all parties to tweak and improve this amendment, and to find an acceptable and agreed way to secure the balance we all wish to see. However, so far I have not seen or heard a better way to tighten the definitions in the Bill so as to achieve this balance, and I believe this amendment is an important step in the right direction.
The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have so far collectively been able to come up with to provide a definition of a news publisher that does not infringe on press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any kind of mandatory or statutory press regulation, in any form, for reasons of freedom of the press. Despite what has been said in previous debates, we think to do that would unreasonably restrict the freedom of the press in this country.
While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.
In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.
The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think that includes a legal person as well as a natural person—who has legal responsibility for the material published, which means that, unlike with pretty much everything else that appears on the internet, there is an identified person who has legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.
Amendment, by leave, withdrawn.
“or special interest news material”.
Amendment 87, in clause 50, page 47, line 28, leave out the first “is” and insert—
“and special interest news material are”.
Amendment 88, in clause 50, page 47, line 42, at end insert—
““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”
Specialist publishers provide unparalleled insights into areas that broader news organisations will likely not analyse, and it would surely be foolish to dismiss and damage specialist publications in a world where disinformation is becoming ever more prevalent. The former Secretary of State, the right hon. Member for Maldon (Mr Whittingdale), also raised this issue on Second Reading, where he stated that specialist publishers
“deserve the same level of protection.”—[Official Report, 19 April 2022; Vol. 712, c. 109.]
Part of the rationale for having the news publishers exemption in the Bill is that it means that the press will not be double-regulated. Special interest material is already regulated, so it should benefit from the same exemptions.
That brings me on to a second point. Only a few minutes ago, the hon. Member for Batley and Spen drew the Committee’s attention to the risks inherent in the clause that a bad actor could seek to exploit. It was reasonable of her to do so. Clearly, however, the more widely we draft the clause—if we include specialist publications such as Gardeners’ World, whose circulation will no doubt soar on the back of this debate—the greater the risk of bad actors exploiting the exemption.
My third point is about undue burdens being placed on publications. To the extent that such entities count as social media platforms—in-scope services—the most onerous duties under the Bill apply only to category 1 companies, or the very biggest firms such as Facebook and so on. The “legal but harmful” duties and many of the risk assessment duties would not apply to many organisations. In fact, I think I am right to say that if the only functionality on their websites is user comments, they would in any case be outside the scope of the Bill. I have to confess that I am not intimately familiar with the functionality of the Gardeners’ World website, but there is a good chance that if all it does is to provide the opportunity to post comments and similar things, it would be outside the scope of the Bill anyway, because it does not have the requisite functionality.
Although I understand the point made by the hon. Member for Ochil and South Perthshire, we will, respectfully, resist the amendment for the many reasons I have given.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
First, the exemption is open to abuse. Almost any organisation could develop a standards code and complaints process to define itself as a news publisher and benefit from the exemption. Under those rules, as outlined eloquently by my hon. Friend the Member for Batley and Spen, Russia Today already qualifies, and various extremist publishers could easily join it. Organisations will be able to spread seriously harmful content with impunity—I referred to many in my earlier contributions, and I have paid for that online.
Secondly, the exemption is unjustified, as we heard loud and clear during the oral evidence sessions. I recall that Kyle from FairVote made that point particularly clearly. There are already rigorous safeguards in the Bill to protect freedom of expression. The fact that content is posted by a news provider should not itself be sufficient reason to treat such content differently from that which is posted by private citizens.
Furthermore, quality publications with high standards stand to miss out on the exemption. The Minister must also see the lack of parity in the broadcast media space. In order for broadcast media to benefit from the exemption, they must be regulated by Ofcom, and yet there is no parallel stipulation for non-broadcast media to be regulated in order to benefit. How is that fair? For broadcast media, the requirement to be regulated by Ofcom is simple, but for non-broadcast media, the series of requirements is not rational, excludes many independent publishers and leaves room for ambiguity.
On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that
“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—
(a) posting comments or reviews relating to provider content;
(b) sharing such comments or reviews on a different internet service”.
Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.
Question put and agreed to.
Clause 50 accordingly ordered to stand part of the Bill.
Clause 51
“Search content”, “search results” etc
Question proposed, That the clause stand part of the Bill.
However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.
Question put and agreed to.
Clause 51 accordingly ordered to stand part of the Bill.
Clause 52
“Illegal content” etc
“(4A) An offence referred to in subsection (4) is deemed to have occurred if it would be an offence under the law of the United Kingdom regardless of whether or not it did take place in the United Kingdom.”
This amendment brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.
Clause stand part.
That schedules 5 and 6 be the Fifth and Sixth schedules to the Bill.
“content that amounts to a relevant offence.”
However, as the Minister will know from representations from Carnegie UK to his Department—we share its concerns—the illegal and priority illegal regimes may not be able to operate as intended. The Bill requires companies to decide whether content “amounts to” an offence, with limited room for movement. We share concerns that that points towards decisions on an item-by-item basis; it means detecting intent for each piece of content. However, such an approach does not work at the scale on which platforms operate; it is bad regulation and poor risk management.
There seem to be two different problems relating to the definition of “illegal content” in clause 52. The first is that it is unclear whether we are talking about individual items of content or categories of content—the word “content” is ambiguous because it can be singular or plural—which is a problem for an obligation to design and run a system. Secondly, determining when an offence has taken place will be complex, especially bearing in mind mens rea and defences, so the providers are not in a position to get it right.
The use of the phrase “amounts to” in clause 52(2) seems to suggest that platforms will be required to identify accurately, in individual cases, where an offence has been committed, without any wriggle room drafted in, unlike in the draft Bill. As the definition now contains no space for error either side of the line, it could be argued that there are more incentives to avoid false negatives than false positives—providers can set higher standards than the criminal law—and that leads to a greater risk of content removal. That becomes problematic, because it seems that the obligation under clause 9(3) is then to have a system that is accurate in all cases, whereas it would be more natural to deal with categories of content. This approach seems not to be intended; support for that perspective can be drawn from clause 9(6), which recognises that there is a distinction between categories of content and individual items, and that the application of terms of service might specifically have to deal with individual instances of content. Critically, the “amounts to” approach cannot work in conjunction with a systems-based approach to harm reduction. That leaves victims highly vulnerable.
This problem is easily fixed by a combination of reverting to the draft Bill’s language, which required reasonableness, and using concepts found elsewhere in the Bill that enable a harm mitigation system to operate for illegal content. We also remind the Minister that Ofcom raised this issue in the evidence sessions. I would be grateful if the Minister confirmed whether we can expect a Government amendment to rectify this issue shortly.
More broadly, as we know, priority illegal content, which falls within illegal content, includes,
“(a) terrorism content,
(b) CSEA content, and
(c) content that amounts to an offence specified in Schedule 7”,
as set out in clause 52(7). Such content attracts a greater level of scrutiny and regulation. Situations in which user-generated content will amount to “a relevant offence” are set out in clause 52(3). Labour supports the inclusion of a definition of illegal content as outlined in the grouping; it is vital that service providers and platforms have a clear indication of the types of content that they will have a statutory duty to consider when building, or making changes to the back end of, their business models.
We have also spoken about the importance of parity between the online and offline spaces—what is illegal offline must be illegal online—so the Minister knows we have more work to do here. He also knows that we have broad concerns around the omissions in the Bill. While we welcome the inclusion of terrorism and child sexual exploitation content as priority illegal content, there remain gaps in addressing violence against women and girls content, which we all know is hugely detrimental to many online.
The UK Government stated that their intention for the Online Safety Bill was to make the UK the safest place to be online in the world, yet the Bill does not mention online gender-based violence once. More than 60,000 people have signed the Glitch and End Violence Against Women Coalition’s petition calling for women and girls to be included in the Bill, so the time to act is now. We all have a right to not just survive but thrive, engage and play online, and not have our freedom of expression curtailed or our voices silenced by perpetrators of abuse. The online space is just as real as the offline space. The Online Safety Bill is our opportunity to create safe digital spaces.
The Bill must name the problem. Violence against women and girls, particularly those who have one or multiple protected characteristics, is creating harm and inequality online. We must actively and meaningfully name this issue and take an intersectional approach to ending online abuse to ensure that the Bill brings meaningful change for all women. We also must ensure that the Bill truly covers all illegal content, whether it originated in the UK or not.
Amendment 61 brings offences committed overseas within the scope of relevant offences for the purposes of defining illegal content. The aim of the amendment is to clarify whether the Bill covers content created overseas that would be illegal if what was shown in the content took place in the UK. For example, animal abuse and cruelty content is often filmed abroad. The same can be said for dreadful human trafficking content and child sexual exploitation. The optimal protection would be if the Bill’s definition of illegal content covered matter that would be illegal in either the UK or the country it took place in, regardless of whether it originated in the UK.
The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges of the Bill. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:
“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]
That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.
The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominantly women.
The Government are also putting in place much needed and important laws on cyber-flashing, as many of us hoped they would and have campaigned very hard for, because taking pictures of male genitalia and sending them, predominantly to women, is a form of abuse, harm and violence towards women through intimidation. The Government are trying to keep up with this fast-moving environment, but this legislation will only be as good as the criminal laws contained within it. The Government need to continue to future-proof the legislation, and to demonstrate that they see these sorts of offences as a priority.
The Government commissioned the Law Commission to undertake a significant piece of professional evaluation of how fit for purpose the laws are on the online posting of intimate images without consent. The Law Commission found the situation wanting to the greatest degree, and is consulting on producing legal recommendations. Those are not in the Bill, which is an enormous shame; those recommendations are perhaps even now with the Government for consideration, but unfortunately they have not yet been published.
I am concerned that we are missing an opportunity to tackle an issue that is an overwhelming problem for many women in this country, and I hope that when the Minister responds to this part of the debate, he can clearly set out the Government’s intention to tackle the issue. We all know that parliamentary time is in short supply: the Government have many Bills that they have to get through in this Session, before the next general election. I am concerned that this particular issue, which the Law Commission itself sees as so important, may not get the rapid legislation that we, as elected representatives, need to see happen. The foundation of the Bill is a duty of care, but that duty of care is only as good as the criminal law. If the criminal law is wanting when it comes to the publication online of intimate images, that is the taking, making and sharing of intimate images without consent—if that is not adequately covered in the criminal law—this legislation will not help the many people we want it to help. Will the Minister, in responding to the debate, outline in some detail, if possible, how he will handle the issue and when he hopes to make public the Law Commission recommendations, for which many people have been waiting for many years?
A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly raised the question of where in the world a criminal offence physically takes place. She rightly said that in the case of violence against some children, for example, that may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:
“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”
What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.
The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.
The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.
The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.
As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.
My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.
Clearly, if somebody is sending an intimate image without the consent of the subject, that is likely to cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless it was their intention, meaning that the offence would be made out under clause 150.
My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, is expecting to receive a final report, I am told, over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.
Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.
Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.
I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.
This amendment would bring offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.
Question put, That the amendment be made.
Schedules 5 and 6 agreed to.
Ordered, That further consideration be now adjourned.—(Steve Double.)
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.