PARLIAMENTARY DEBATE
Online Safety Bill (Eleventh sitting) - 16 June 2022 (Commons/Public Bill Committees)
Chairs: † Sir Roger Gale, Christina Rees
Members: Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
Mishra, Navendu (Stockport) (Lab)
† Moore, Damien (Southport) (Con)
Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Clerks: Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 16 June 2022
(Morning)
[Sir Roger Gale in the Chair]
Online Safety Bill
Clause 69
OFCOM’s guidance about duties set out in section 68
“within six months of this Act being passed”.
As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.
Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.
The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.
Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes, which pornography providers or others might seek to exploit. As discussed, we will be taking a phased approach to bringing the duties under the Bill into effect. We expect the most serious harms to be prioritised as quickly as possible, and we expect the duties on illegal content to be focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.
Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.
Question put, That the amendment be made.
Question proposed, That the clause stand part of the Bill.
We welcome the “polluter pays” principle on which this and the following clauses are founded. Clause 70 establishes a duty for providers to notify Ofcom if their revenue is at or above the specified threshold designated by Ofcom and approved by the Secretary of State. It also creates duties on providers to provide timely notice and evidence of meeting the threshold. The Opposition do not oppose those duties. However, I would be grateful if the Minister could clarify what might lead to a provider or groups of providers being exempt from paying the fee. Subsection (6) establishes that
“OFCOM may provide that particular descriptions of providers of regulated services are exempt”,
subject to the Secretary of State’s approval. Our question is what kinds of services the Minister has in mind for that exemption.
Turning to clauses 71 to 76, as I mentioned, it is appropriate that the cost to Ofcom of exercising its online safety functions is paid through an annual industry fee, charged to the biggest companies with the highest revenues, and that smaller companies are exempt but still regulated. It is also welcome that under clause 71, Ofcom can make reference to factors beyond the provider’s qualifying worldwide revenue when determining the fee that a company must pay. Acknowledging the importance of other factors when computing that fee can allow for a greater burden of the fees to fall on companies whose activities may disproportionately increase Ofcom’s work on improving safety.
My hon. Friend the Member for Pontypridd has already raised our concerns about the level of funding needed for Ofcom to carry out its duties under the Bill. She asked about the creation of a new role: that of an adviser on funding for the online safety regulator. The impact assessment states that the industry fee will need to average around £35 million a year for the next 10 years to pay for operating expenditure. Last week, the Minister referred to a figure of around £88 million that has been announced to cover the first two years of the regime while the industry levy is implemented, and the same figure was used on Second Reading by the Secretary of State. Last October’s autumn Budget and spending review refers on page 115 to
“over £110 million over the SR21 period for the government’s new online safety regime through the passage and implementation of the Online Safety Bill, delivering on the government’s commitment to make the UK the safest place to be online.”
There is no reference to the £88 million figure or to Ofcom in the spending review document. Could the Minister tell us a bit more about that £88 million and the rest of the £110 million announced in the spending review, as it is relevant to how Ofcom is going to be resourced and the industry levy that is introduced by these clauses?
The Opposition feel it is critical that when the Bill comes into force, there is no gap in funding that would prevent Ofcom from carrying out its duties. The most obvious problem is that the level of funding set out in the spending review was determined when the Bill was in draft form, before more harms were brought into scope. The Department for Digital, Culture, Media and Sport has also confirmed that the figure of £34.9 million a year that is needed for Ofcom to carry out its online safety duties was based on the draft Bill.
We welcome many of the additional duties included in the Bill since its drafting, such as on fraudulent advertising, but does the Minister think the same level of funding will be adequate as when the calculation was made, when the Bill was in draft form? Will he reconsider the calculations his Department has made of the level of funding that Ofcom will need for this regime to be effective in the light of the increased workload that this latest version of the Bill introduces?
In March 2021, Ofcom put out a press release stating that 150 people would be employed in the new digital and technology hub in Manchester, but that that number would be reached in 2025. Therefore, as well as the level of resource being based on an old version of the Bill, the timeframe reveals a gap of three years until all the staff are in place. Does the Minister believe that Ofcom will have everything that is needed from the start, and in subsequent years as the levy gets up and running, in order to carry out its duties?
Of course, this will depend on how long the levy might need to be in place. My understanding of the timeframe is that first, the Secretary of State must issue guidance to Ofcom about the principles to be included in the statement of principles that Ofcom will use to determine the fees payable under clause 71. Ofcom must consult with those affected by the threshold amount to inform the final figure it recommends to the Secretary of State, and must produce a statement about what amounts comprise the provider’s qualifying worldwide revenue and the qualifying period. That figure and Ofcom’s guidance must be agreed by the Secretary of State and laid before Parliament. Based on those checks and processes, how quickly does the Minister envisage the levy coming into force?
The Minister said last week that Ofcom is resourced for this work until 2023-24. Will the levy be in place by then to fund Ofcom’s safety work into 2024-25? If not, can the Minister confirm that the Government will cover any gaps in funding? I am sure he will agree, as we all do, that the duties in the Bill must be implemented as quickly as possible, but the necessary funding must also be in place so that Ofcom as a regulator can enforce the safety duty.
“sufficient to meet, but…not exceed the annual cost to OFCOM”.
That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.
It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.
The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where
“OFCOM consider that an exemption…is appropriate”
and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.
On the £88 million and the £110 million that have been referenced, the latter amount is to cover the three-year spending review period, which is the current financial year—2022-23—2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.
The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.
Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.
Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.
Question put and agreed to.
Clause 70 accordingly ordered to stand part of the Bill.
Clauses 71 to 76 ordered to stand part of the Bill.
Clause 77
General duties of OFCOM under section 3 of the Communications Act
Question proposed, That the clause stand part of the Bill.
As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to
“the size or capacity of the provider”,
and to
“the level of risk of harm presented by the service in question, and the severity of the potential harm”.
We know that harm, and the potential of accessing harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.
Labour also supports clause 78. It is vital that Ofcom has a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.
Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.
Question put and agreed to.
Clause 77 accordingly ordered to stand part of the Bill.
Clauses 78 and 79 ordered to stand part of the Bill.
Clause 80
Meaning of threshold conditions etc
Question proposed, That the clause stand part of the Bill.
Amendment 80, in schedule 10, page 192, line 19, at end insert—
“(c) the assessed risk of harm arising from that part of the service.”
This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.
Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—
“and at least one specified condition about the assessed risk of harm”
This amendment is linked to Amendment 80.
Amendment 82, in schedule 10, page 192, line 41, at end insert—
“(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”
This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.
That schedule 10 be the Tenth schedule to the Bill.
Clause 81 stand part.
Clause 82 stand part.
I want to talk about my amendment, and I start with a quote from the Minister on Second Reading:
“A number of Members…have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered.”—[Official Report, 19 April 2022; Vol. 712, c. 133.]
I appreciate that the Minister may still be thinking about that. He might accept all of our amendments; that is entirely possible, although I am not sure there is any precedent. The possibility is there that that might happen.
Given how strong I felt that the Minister was on the issue on Second Reading, I am deeply disappointed that there are no Government amendments to this section of the Bill. I am disappointed because of the massive risk of harm caused by some very small platforms—it is not a massive number—where extreme behaviour and radicalisation is allowed to thrive. It is not just about the harm to those individuals who spend time on those platforms and who are radicalised, presented with misinformation and encouraged to go down rabbit holes and become more and more extreme in their views. It is also about the risk of harm to other people as a result of the behaviour inspired in those individuals. We are talking about Jo Cox today; she is in our memories and thoughts. Those small platforms are the ones that are most likely to encourage individuals towards extremely violent acts.
If the Bill is to fulfil its stated aims and take the action we all want to see to prevent the commission of those most heinous, awful crimes, it needs to be much stronger on small, very high-risk platforms. I will make no apologies for that. I do not care if those platforms have small amounts of profits. They are encouraging and allowing the worst behaviours to thrive on their platforms. They should be held to a higher level of accountability. It is not too much to ask to class them as category 1 platforms. It is not too much to ask them to comply with a higher level of risk assessment requirements and a higher level of oversight from Ofcom. It is not too much to ask because of the massive risk of harm they pose and the massive actual harm that they create.
Those platforms should be punished for that. It is one thing to punish and criminalise the behaviour of users on those platforms—individual users who create and propagate illegal content or radicalise other users—but the Bill does not go far enough in holding the platforms themselves to account for allowing that to take place. They know that it is happening. Those platforms are set up as an alternative place—a place where people are allowed to be far more radical than they are on Twitter, YouTube, Twitch or Discord. None of those larger platforms has much moderation, but the smaller platforms positively encourage such behaviour. Links are posted on other sites pointing to those platforms; for example, when people read vaccine misinformation, there are links posted to more radical, smaller platforms. I exclude Discord because, given its number of users, I think it would be included in one of the larger-platform categories anyway. It is not that there is no radical behaviour on Discord—there is—but I think the size of its membership excludes it, in my head certainly, from the category of the very smallest platforms that pose the highest risk.
We all know from our inboxes the number of people who contact us saying that 5G is the Government trying to take over their brains, or that the entire world is run by Jewish lizard people. We get those emails on a regular basis, and those theories are propagated on the smallest platforms. Fair enough—some people may not take any action as a result of the radicalisation they have experienced through their very extreme views. But some people will take action, and that action may be enough simply to harm their friends or family; it may be enough to exclude them and drag them away from the society or community of which they were previously members; or it might, in really tragic cases, be far more extreme. It might lead people to cause physical or mental harm to others intentionally, as a result of the beliefs that have been created and fostered in them on those platforms.
That is why we have tabled the amendments. This is the one area that the Government have most significantly failed in writing this Bill, by not ensuring that the small, very high-risk platforms are held to the highest level of accountability and are punished for allowing these behaviours to thrive on their platforms. I give the Minister fair warning that unless he chooses to accept the amendments, I intend to push them to a vote. I would appreciate it if he gave assurances, but I do not believe that any reassurance that he could give would compare to having such a measure in the Bill. As I say, for me the lack of this provision is the biggest failing of the entire Bill.
The Minister knows my feelings on the Government’s approach to categorising services; he has heard my concerns time and time again. However, it is not just me who believes that the Government have got their approach really wrong; stakeholders far and wide believe it too. In our evidence sessions, we heard from HOPE not hate and the Antisemitism Policy Trust specifically on this issue. In its current form, the categorisation process is based on size versus harm, which is a fundamentally flawed approach.
The Government’s response to the Joint Committee that scrutinised the draft Bill makes it clear that they consider that reach is a key and proportional consideration when assigning categories and that they believe that the Secretary of State’s powers to amend those categories are sufficient to protect people. Unfortunately, that leaves many alternative platforms out of category 1, even if they host large volumes of harmful material.
The duty of care approach that essentially governs the Bill is predicated on risk assessment. If size allows platforms to dodge the entry criteria for managing high risk, there is a massive hole in the regime. Some platforms have already been mentioned, including BitChute, Gab and 4chan, which host extreme racist, misogynistic, homophobic and other extreme content that radicalises people and incites harm. And the Minister knows that.
I take this opportunity to pay tribute to my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), who has campaigned heavily on the issue since the horrendous and tragic shooting in Keyham in his constituency. One of my big concerns about the lack of focus on violence against women and girls in the Bill, which we have mentioned time and time again, is the potential for the rise of incel culture online, which is very heavily reported on these alternative platforms—these high-harm, high-risk platforms.
I will just give one example. A teacher contacted me about the Bill. She talked about the rise of misogyny and about trying to educate her class on what was happening. At the end of the class, a 15-year-old boy—I appreciate that he is under 18 and is a child, so would come under a different category within the Bill, but I will still give the example—came up to her and said: “Miss, I need to chat to you. This is something I’m really concerned about. All I did was google, ‘Why can’t I get a girlfriend?’” He had been led down a rabbit hole into a warren of alternative platforms that tried to radicalise him with the most extreme content of incel culture: women are evil; women are the ones who are wrong; it is women he should hate; it is his birthright to have a girlfriend, and he should have one; and he should hate women. That is the type of content on those platforms that young, impressionable minds are being pointed towards. They are being radicalised, and it is sadly leading to incredibly tragic circumstances, so I really want to push the Minister on the subject.
We share the overarching view of many others that this crucial risk needs to be factored into the classification process that determines which companies are placed in category 1. Otherwise, the Bill risks failing to protect adults from substantial amounts of material that causes physical and psychological harm. Schedule 10 needs to be amended to reflect that.
There are therefore important public health reasons to minimise the discussion of dangerous and effective suicide methods and to keep such discussion out of the public domain. Addressing the most dangerous suicide-related content is an area where the Bill could really save lives. It is therefore inexplicable that a Bill intended to increase online safety does not seek to do that.
The Bill’s own pre-legislative scrutiny Committee recommended that the legislation should
“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model.”
The Government replied that they
“want the Bill to be targeted and proportionate for businesses and Ofcom and do not wish to impose disproportionate burdens on small companies.”
It is, though, entirely appropriate to place a major regulatory burden on small companies that facilitate the glorification of suicide and the sharing of dangerous methods through their forums. It is behaviour that is extraordinarily damaging to public health and makes no meaningful economic or social contribution.
Amendment 82 is vital to our overarching aim of having an assessed risk of harm at the heart of the Bill. The categorisation system is not fit for purpose and will fail to capture so many of the extremely harmful services that many of us have already spoken about.
“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”
A week or so before the evidence sitting,
“he targeted and killed 10 people in Buffalo. One of the things that he posted was:
‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—
which is a thread on the small 4chan platform—
‘then my motivation returns’.”
Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:
“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]
I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.
There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.
Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.
We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?
We all know that the online space moves at speed, with new technologies and new ways of functioning popping up all over, and very often. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into, so that harmful and dangerous content does not slip through the net.
Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established
“as soon as reasonably practicable”,
could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?
Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to urge Ofcom to do all it can, and to make these vital changes.
It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.
It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.
I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.
I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.
We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.
Question put and agreed to.
Clause 80 accordingly ordered to stand part of the Bill.
Schedule 10
Categories of regulated user-to-user services and regulated search services: regulations
Amendment proposed: 80, in schedule 10, page 192, line 19, at end insert—
“(c) the assessed risk of harm arising from that part of the service.”—(Kirsty Blackman.)
This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.
Schedule 10 agreed to.
Clauses 81 and 82 ordered to stand part of the Bill.
Clause 83
OFCOM’s register of risks, and risk profiles, of Part 3
“(d) the risk of harm posed by individuals in the United Kingdom in relation to adults and children in the UK or elsewhere through the production, publication and dissemination of illegal content.”
This amendment requires Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content.
Labour welcomes clause 83, which places a duty on Ofcom to carry out risk assessments to identify and assess a range of potential risks of harm presented by part 3 services. However, we are concerned about subsection (9), which says:
“OFCOM must from time to time review and revise the risk assessments and risk profiles so as to keep them up to date”
That seems a fairly woolly concept even for the Minister to try to defend, so I would be grateful if he clarified exactly what demands will be placed on Ofcom to review those risk assessments and risk profiles. He will know that those are absolutely central to the Bill, so some clarification is required here. Despite that, Labour agrees that it will be a significant advantage for Ofcom to oversee the risk of harm presented by the regulated services.
However, harm should not be limited to those in the UK. Amendment 34 would therefore require Ofcom’s risk assessment to consider risks to adults and children through the production, publication and dissemination of illegal content. I have already spoken on this issue, in the debate on amendment 25 to clause 8, so I will keep my comments brief. As the Minister knows, online harms are global in nature, and amendment 34 seeks to ensure that the risk of harm presented by regulated services is not just limited to those in the UK. As we have mentioned previously, research shows us that there is some very damaging, often sexually violent, content being streamed abroad. Labour fears that the current provisions in the legislation will not be far-reaching enough to capture the true essence of the risk of harm that people may face when online.
Labour supports the intentions of clause 84, which outlines that Ofcom must produce guidance to assist providers in complying with their duties to carry out illegal content risk assessments
“As soon as reasonably practicable”.
Of course, the Minister will not be surprised that Labour has slight reservations about the timing around those important duties, so I would appreciate an update from the Minister on the conversations he has had with Ofcom about the practicalities of its duties.
First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning
“read, view, hear or otherwise experience content”.
As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.
Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.
As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.
Amendment, by leave, withdrawn.
Clause 83 ordered to stand part of the Bill.
Clause 84 ordered to stand part of the Bill.
Clause 85
Power to require information
Clauses 86 to 91 stand part.
Schedule 11 stand part.
Labour also supports clause 86, and we particularly welcome the clarification that Ofcom may require the provision of information in any form. If we are to truly give Ofcom the power to regulate and, where necessary, investigate service providers, we must ensure that it has sufficient legislative tools to rely on.
The Bill gives some strong powers to Ofcom. We support the requirement in clause 87 to name a senior manager, but again, we feel those provisions should go further. Both users and Ofcom must have access to the full range of tools they need to hold the tech giants to account. As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator, and even then, those measures might not come in until two years after the Bill is in place. Surely the top bosses at social media companies should be held criminally liable for systemic and repeated failures to ensure online safety as soon as the Bill comes into force, so can the Minister explain the reasons for the delay?
The Minister will be happy to hear that Labour supports clause 88. It is important to have an outline on the face of the Bill of the circumstances in which Ofcom can require a report from a skilled person. It is also important that Ofcom has the power to appoint, or give notice to a provider requiring them to appoint, a skilled person, as Labour fears that without those provisions in subsections (3) and (4), the ambiguity around defining a so-called skilled person could be detrimental. We therefore support the clause, and have not sought to amend it at this stage.
Again, Labour supports all the intentions of clause 89 in the interests of online safety more widely. Of course, Ofcom must have the power to force a company to co-operate with an investigation.
Again, we support the need for clause 90, which gives Ofcom the power to require an individual to attend an interview. That is particularly important in the instances outlined in subsection (1), whereby Ofcom is carrying out an investigation into the failure or possible failure of a provider of a regulated service to comply with a relevant requirement. Labour has repeatedly called for such personal responsibility, so we are pleased that the Government are ensuring that the Bill includes sufficient powers for Ofcom to allow proper scrutiny.
Labour supports clause 91 and schedule 11, which outlines in detail Ofcom’s powers of entry, inspection and audit. I did not think we would support this much, but clearly we do. We want to work with the Government to get this right, and we see ensuring Ofcom has those important authorisation powers as central to it establishing itself as a viable regulator of the online space, both now and for generations to come. We will support and have not sought to amend the clauses or schedule 11 for the reasons set out.
On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.
On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.
To reassure the shadow Minister, on the point about when that kicks in, it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.
The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.
Question put and agreed to.
Clause 85 accordingly ordered to stand part of the Bill.
Clauses 86 to 91 ordered to stand part of the Bill.
Schedule 11
OFCOM’s powers of entry, inspection and audit
Amendment made: 4, in schedule 11, page 202, line 17, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”.—(Chris Philp.)
Schedule 11, as amended, agreed to.
Clause 92
Offences in connection with information notices
Question proposed, That the clause stand part of the Bill.
As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.
As these harms are allowed to perpetuate, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies who are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.
Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.
Labour supports clause 94 and we have not sought to amend it at this stage. It is vital that provisions are laid in the Bill, such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.
There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.
As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that going beyond the information provision into other duties for criminal liability would potentially go a little far and have a chilling effect on the companies concerned.
Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, where a company can essentially be disconnected from the internet in extreme cases; these do provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental—that is why personal criminal liability is needed. Without the information, we cannot really make any further assessment of whether the duties are being met.
The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clauses 93 to 95 ordered to stand part of the Bill.
Clause 96
Penalties for information offences
Amendment made: 2, in clause 96, page 83, line 15, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”.—(Chris Philp.)
Clause 96, as amended, ordered to stand part of the Bill.
Clause 97
Co-operation and disclosure of information: overseas regulators
Question proposed, That the clause stand part of the Bill.
However, we do have concerns about subsection (2), which states:
“The power conferred by subsection (1) applies only in relation to an overseas regulator for the time being specified in regulations made by the Secretary of State.”
Can the Minister confirm exactly how that will work in practice? He knows that Labour Members have tabled important amendments to clause 123. Amendments 50 and 51, which we will consider later, aim to ensure that Ofcom has the power to co-operate and take action through the courts where necessary. The same issue applies here: Ofcom must be compelled and have the tools available at its disposal to work internationally where required.
Labour supports clause 98, which amends section 393 of the Communications Act 2003 to include new provisions. That is obviously a vital step, and we particularly welcome subsection (2), which outlines that, subject to the specific exceptions in section 393 of the 2003 Act, Ofcom cannot disclose information with respect to a business that it has obtained by exercising its powers under this Bill without the consent of the business in question. This is once again an important step in encouraging transparency across the board.
We support clause 99, which places a duty on Ofcom to consult the relevant intelligence service before Ofcom discloses or publishes any information that it has received from that intelligence service. For reasons of national security, it is vital that the relevant intelligence service is included in Ofcom’s reasoning and approach to the Bill more widely.
We broadly support the intentions of clause 100. It is vital that Ofcom is encouraged to provide information to the Secretary of State of the day, but I would be grateful if the Minister could confirm exactly how the power will function in reality. Provision of information to assist in the formulation of policy is, as we know, a very broad power under the Communications Act. We want to make sure the powers are not abused—I know that is a concern shared on his own Back Benches—so I would be grateful for the Minister’s honest assessment of the situation.
We welcome clause 101, which amends section 26 of the Communications Act and provides for publication of information and advice for various persons, such as consumers. Labour supports the clause as it stands. We also welcome clause 102, which, importantly, sets out the circumstances in which a statement given to Ofcom can be used in evidence against that person. Again, this is an important clause in ensuring that Ofcom has the powers it needs to truly act as a world-leading regulator, which we all want it to be. Labour supports it and has chosen not to table any amendments.
The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.
It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.
The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.
Question put and agreed to.
Clause 97 accordingly ordered to stand part of the Bill.
Clauses 98 to 102 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Steve Double.)
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.