PARLIAMENTARY DEBATE
Online Safety Bill (Eighth sitting) - 9 June 2022 (Commons/Public Bill Committees)
Chairs: Sir Roger Gale, † Christina Rees
Members: † Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Clerks: Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 9 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
Amendment proposed (this day): 22, in clause 31, page 31, line 17, leave out subsection (3).—(Barbara Keeley.)
Question again proposed, That the amendment be made.
Clause 31 stand part.
Clause 32 stand part.
That schedule 3 be the Third schedule to the Bill.
Clause 33 stand part.
Question put, That the amendment be made.
Clause 32 ordered to stand part of the Bill.
Schedule 3 agreed to.
Clause 33 ordered to stand part of the Bill.
Amendment 24, in clause 35, page 34, line 34, after “service” insert “that targets users”.
New clause 5—Duty to distinguish paid-for advertisements—
“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.
(2) The systems and processes described under subsection (1)—
(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and
(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.
(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.
(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”
New clause 6—Duty to verify advertisements—
“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.
(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.
(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.
(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.
(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.
(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).
(7) Regulations under this section shall be made by statutory instrument.
(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”
It is welcome that, after much flip-flopping, the Government have finally conceded to Labour’s calls and those of many campaign groups to include a broad duty to tackle fraudulent advertising on search engines through chapter 5 of part 3 of the Bill. We know that existing laws to protect consumers in the online world have failed to keep pace with the actors attempting to exploit them, and that is particularly true of scams and fraudulent advertisements.
Statistics show a steep increase in this type of crime in the online world, although those figures are likely to be a significant underestimate and do not capture the devastating emotional impact that scams have on their victims. The scale of the problem is large and it is growing.
The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% of that fraud committed online. We know those figures are increasing. The FCA more than doubled the number of scam warnings it issued between 2019 and 2020, while UK Finance data shows that there has been a significant rise in cases across all scam types as criminals adapt to targeting victims online. The pandemic, which led to a boom in internet shopping, created an environment ripe for exploitation. Reported incidents of scams and fraud have increased by 41% since before the pandemic, with one in 10 of us now victims of fraud.
Being scammed can cause serious psychological harm. Research by the Money and Mental Health Policy Institute suggests that three in 10 online scam victims felt depressed as a result of being scammed, while four in 10 said they felt stressed. Clearly, action to tackle the profound harms that result from fraudulent advertising is long overdue.
This Bill is an important opportunity but, as with other issues the Government are seeking to address, we need to see changes if it is to be successful. Amendments 23 and 24 are small and very simple, but would have a profound impact on the ability of the Bill to prevent online fraud from taking place and to protect UK users.
As currently drafted, the duties set out in clauses 34 and 35 for category 1 and 2A services extend only to the design, operation and use of a category 1 or 2A service in the United Kingdom. Our amendments would mean that the duties extended to the design, operation and use of a category 1 or 2A service that targets users in the United Kingdom. That change would make the Bill far more effective, because it would reduce the risk of a company based overseas being able to target UK consumers without any action being taken against them—being allowed to target the public fraudulently without fear of disruption.
That would be an important change, because paid-for advertisements function by the advertiser stating where in the world, by geographical location, they wish to target consumers. For instance, a company would be able to operate from Hong Kong and take out paid-for advertisements to target consumers just in one particular part of north London. The current wording of the Bill does not acknowledge the fact that internet services can operate from anywhere in the world and use international boundaries to circumvent UK legislation.
Other legislation has been successful in tackling scams across borders. I draw the Committee’s attention to the London Olympic Games and Paralympic Games Act 2006, which made it a crime to sell an Olympics ticket on the black market anywhere in the world, rather than simply in the UK where the games took place. I suggest that we should learn from the action taken to regulate the Olympics back in 2012 and implement the same approach through amendments 23 and 24.
New clause 5 was also tabled by my hon. Friend the Member for Washington and Sunderland West, who will be getting a lot of mentions this afternoon.
Paid search results occur when companies pay a charge to have their site appear at the top of search results. This is valuable to them because it is likely to direct consumers towards their site. The new clause would stop scam websites buying their way to the top of a search result.
Let me outline some of the consequences of not distinguishing between paid-for and not-paid-for advertisements, because they can be awful. Earlier this year, anti-abortion groups targeted women who were searching online for a suitable abortion clinic. The groups paid for misleading adverts to appear at the top of those women’s search results, directing them towards an anti-abortion centre rather than a clinic. One woman who knew that she wanted to have an abortion went on researching where she could have the procedure. Her search for a clinic on Google led her to an anti-abortion centre that she went on to contact and visit. That was because she trusted the top search results on Google, which were paid for. The fact that it was an advertisement was indicated only by the two letters “AD” appearing in very small font underneath the search headline and description.
Another example was reported by The Times last year. Google had been taking advertising money from scam websites selling Premier League football tickets, even though the matches were taking place behind closed doors during lockdown. Because these advertisements appeared at the top of search results, it is entirely understandable that people looking for football tickets were deceived into believing that they would be able to attend the games, which led to them being scammed.
There have been similar problems with passport renewals. As colleagues will be very aware, people have been desperately trying to renew their passports amid long delays because of the backlog of cases. This is a target for fraudsters, who take out paid advertisements to offer people assistance with accessing passport renewal services and then scam them.
New clause 5 would end this practice by ensuring that search engines provide clear messaging to show that the user is looking at a paid-for advertisement, by stating that clearly and through other measures, such as a separate colour scheme. A duty to distinguish paid-for advertising is present in many other areas of advertising. For example, when we watch TV, there is no confusion between what is a programme and what is an advert; the same is true of radio advertising; and when someone is reading a newspaper or magazine, the line between journalism and the advertisements that fund the paper is unmistakable.
We cannot continue to have these discrepancies and be content with the internet being a wild west. Therefore, it is clear that advertising on search engines needs to be brought into line with advertising in other areas, with a requirement on search engines to distinguish clearly between paid-for and organic results.
New clause 6 is another new clause tabled by my hon. Friend the Member for Washington and Sunderland West. It would protect consumers from bad actors trying to exploit them online by placing a duty on search engines to verify adverts before they accept them. That would mean that, before their adverts were allowed to appear in a paid-for search result, companies would have to demonstrate that they were authorised by a UK regulatory body designated by the Secretary of State.
This methodology for preventing fraud is already in place for financial crime. Google accepts financial services advertisements only from companies that are authorised by the Financial Conduct Authority. This gives companies a further incentive to co-operate with regulators and it protects consumers by preventing companies that are well known for their nefarious activities from dominating search results and then misleading consumers. By extending this best practice to all advertisements, search engines would no longer be able to promote content that is fake or fraudulent after being paid to do so.
Without amending the Bill in this way, we risk missing an opportunity to tackle the many forms of scamming that people experience online, one of which is the world of online ticketing. In my role as shadow Minister for the arts and civil society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West.
In the meeting of the all-party parliamentary group on ticket abuse in April, we heard about the awful consequences of secondary ticket reselling practices. Ticket reselling websites, such as Viagogo, are rife with fraud. Large-scale ticket touts dominate the resale site, and Viagogo has a well-documented history of breaching consumer protection laws. Those breaches include a number of counts of fraud for selling non-existent tickets. Nevertheless, Viagogo continues to take out paid-for advertisements with Google and is continually able to take advantage of consumers by dominating search results and commanding false trust.
If new clause 6 is passed, then secondary ticketing websites such as Viagogo would have to be members of a regulatory body responsible for secondary ticketing, such as the Society of Ticket Agents and Retailers, or STAR. Viagogo would then have to comply with STAR standards for its business model to be successful.
I have used ticket touting as an example, but the repercussions of this change would be wider than that. Websites that sell holidays and flights, such as Skyscanner, would have to be members of the relevant regulatory group, for example the Association of British Travel Agents. People would be able to go to football matches, art galleries and music festivals without fearing that they are getting ripped off or have been issued with fake tickets.
I will describe just a few examples of the poor situation we are in at the moment, to illustrate the need for change. The most heartbreaking one is of an elderly couple who bought two tickets from a secondary ticketing website to see their favourite artist, the late Leonard Cohen, to celebrate their 70th wedding anniversary. When the day came around and they arrived at the venue, they were turned away and told they had been sold fake tickets. The disappointment they must have felt would have been very hard to bear. In another instance, a British soldier serving overseas decided to buy his daughter concert tickets because he could not be with her on her birthday. When his daughter went along to the show, she was turned away at the door and told she could not enter because the tickets had been bought through a scam site and were invalid.
For pensioners in particular, requiring adverts to be clearly different from other search results would make a positive difference. The other thing that we have to remember is that pensioners generally did not grow up online, and some of them struggle more to navigate the internet than some of us who are a bit younger.
The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.
We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.
I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.
Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or are the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.
New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.
New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.
“where the UK is a target market”,
are already in the Bill, in clause 3(5)(b), on page 3, which sets out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:
“United Kingdom users form one of the target markets for the service”.
That applies to user-to-user and to search, so it is covered already.
“the service has a significant number of United Kingdom users”.
It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. The 2006 Act dealing with the sale of Olympic tickets recognised that this was important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.
Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.
“a significant number of United Kingdom users”,
but paragraph (b) just says,
“United Kingdom users form one of the target markets”.
There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point, where the UK is a target market, there is no size qualification: the service provider is in scope even if there is only one UK user.
Question put, That the amendment be made.
Amendment 45, in clause 35, page 34, line 2, leave out subsection (1) and insert—
“(1) A provider of a Category 2A service must operate the service using proportionate systems and processes designed to—
(a) prevent individuals from encountering content consisting of fraudulent advertisements by means of the service;
(b) minimise the length of time for which any such content is present;
(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.”
This amendment brings the fraudulent advertising provisions for Category 2A services in line with those for Category 1 services.
Government amendments 91 to 94.
Clause 35 stand part.
Amendment 44, in clause 36, page 35, line 10, at end insert—
“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”
This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.
Clause 36 stand part.
All three of the duties on category 1 services introduced by clause 34 are necessary to address the harm caused by fraudulent and misleading online adverts. Service providers need to take proportionate but effective action to prevent those adverts from appearing or reappearing, and when they do appear, those service providers need to act quickly by swiftly taking them down. The duties on category 2A services were much weaker, only requiring them to minimise the risk of individuals encountering content consisting of fraudulent advertisements in or via search results of the service. There was no explicit reference to prevention, even though that is vital, or any explicit requirement to act quickly to take harmful adverts down.
That difference would have created an opportunity for fraudsters to exploit by focusing on platforms with lesser protections. It could have resulted in an increase in fraud enabled by paid-for advertising on search services, which would have undermined the aims of the Bill. I am glad that the Government have recognised this and will require the same proactive, preventative response to harmful ads from regulated search engines as is required from category 1 services.
Debt advice charities, including StepChange and the Money Advice Trust, have been working hard to tackle these impersonator ads. For instance, StepChange reported 72 adverts to the tech giants and regulators last year for misleading and harmful practices, only some of which the Advertising Standards Authority has issued rulings against. StepChange and the Money Advice Trust are keen to have the safeguards in place that are needed by the people who are most vulnerable to harm and exploitation, yet in the current drafting of the Bill harmful adverts on debt advice could slip through the net.
The conditions for an advert to be defined as fraudulent are set out in clause 34(3) for category 1 services and clause 35(3) for category 2A search services. Both clauses specify that an advert is fraudulent if it amounts to an offence set out in clause 36. Clause 36 lists a series of offences gathered from financial services legislation and the Fraud Act 2006.
Charities are concerned that fraudulent debt advice advertisements will not be captured by the offences set out in clause 36(2), which are contained in the Financial Services and Markets Act 2000 and relate to persons unauthorised by the Financial Conduct Authority carrying on an activity that is regulated under the Act. While providing debt counselling and debt adjusting are regulated activities, brokering debt solutions is not. Therefore the offences listed in the Bill would not seem to capture the unregulated advertisers behind misleading adverts, including those that impersonate debt advice charities.
Furthermore, the explanatory notes for the offences taken from the Financial Services Act 2012 show that these offences appear to be intended to address financial market abuse, and so seem somewhat at a distance from the harm consumers face from fraudulent online ads for debt help services.
Clause 36(3) lists offences under the Fraud Act 2006. This could capture harmful advertisements for debt help and debt solutions, but it is not completely clear that these provisions capture, or best capture, the nature of unfair practice caused by misleading online adverts for debt solutions. The Government’s announcement on 8 March outlined that fraudulent paid-for online adverts would be included in this Bill. However, they drew a distinction between “fraudulent adverts”, to be covered by the Bill, and “misleading adverts”, which will be considered in the online advertising consultation. In reality, this dividing line is not clear cut, even where the Bill seeks to define “fraudulent adverts” in terms of offences in other legislation.
Amendment 44 seeks to align clause 36 offences better with important existing consumer protection legislation. It would insert further offences into clause 36 to include offences that are contained in part 3 of the existing Consumer Protection from Unfair Trading Regulations 2008. Those regulations are a key piece of consumer protection legislation. Part 3 of the regulations creates offences relating to misleading or aggressive practices. Most relevant here would be the regulation 9 offence for contravening the prohibition on “misleading actions”, which states that something is a misleading practice if it fulfils one of two conditions. The first is that it both contains “false information” and is likely to cause “the average consumer” to take a decision they would not otherwise have taken. The second is that it causes “confusion” with other products or trade names.
It has been pointed out that these regulations by themselves have not stopped vulnerable consumers being exposed to adverts of misleading debt solutions, despite the best efforts of regulators and charities to stop them. Adding offences under the consumer protection regulations to the Bill would finally close the net.
There should be no objection from the Government to this amendment. Through the consumer protection regulations, they have already recognised misleading commercial practices as an offence, including promotions that mislead consumers or create confusion over trade names. We therefore have a situation where harmful debt adverts meet the criteria of offence in consumer protection regulations, but might not meet the Fraud Act 2006 provisions in the Online Safety Bill. The amendment seeks to clarify and align the treatment of misleading debt adverts, which can be so harmful to people.
I admit that these amendments can get very technical, but it is important that I finish by talking about the impact of these scams on people’s lives. I want to talk about the experience of a woman who was recommended to StepChange’s debt advice services but clicked on a copycat debt ad from a firm masquerading as StepChange in the online search results. After entering her personal information into what she thought was a genuine website, the woman was pestered by phone calls into setting up an individual voluntary arrangement, or IVA, and made a series of payments worth £650 that were meant for her creditors. Sadly, it was only after contact from her bank, four months later, that the woman realised the debt firm she had clicked on was a scam.
The Bill offers a chance to establish an important principle. People should be able to have confidence that the links they click on are for reputable regulated advice services. People should not have to be constantly on their guard against scams and other misleading promotions found on social media websites and in top-of-the-page search results. Without this amendment and the others to this chapter, we cannot be sure that those outcomes will be achieved.
I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.
The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.
If we introduce this extra offence to the list in clause 36, we would end up having a bit of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements —are different to fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA’s activities are in the scope of the debate.
The regulatory framework for financial promotions is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulation. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.
Question put and agreed to.
Clause 34 accordingly ordered to stand part of the Bill.
Clause 35
Duties about fraudulent advertising: Category 2A services
Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—
“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;
(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;
(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”
This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).
Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.
This amendment is consequential on Amendment 91.
Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.
This amendment is consequential on Amendment 91.
Amendment 94, in clause 35, page 34, line 22, leave out
“does not include a reference”
and insert “do not include references”.—(Chris Philp.)
This amendment is consequential on Amendment 91.
Clause 35, as amended, ordered to stand part of the Bill.
Clause 36
Fraud etc offences
Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—
“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)
This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.
Question put, That the amendment be made.
“(ia) organisations that campaign for the removal of animal abuse content, and”.
This amendment would add organisations campaigning for the removal of animal abuse content to the list of bodies Ofcom must consult.
Amendment 63, in schedule 4, page 176, line 29, at end insert “and
(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.
This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.
Amendment 64, in schedule 4, page 177, line 4, at end insert “and
(vii) the systems and process are appropriate to detect cruelty towards humans and animals;”.
This amendment would ensure that ensuring systems and processes are appropriate to detect cruelty towards humans and animals is one of the online safety objectives for search services.
Amendment 60, in clause 52, page 49, line 5, at end insert—
“(e) an offence, not within paragraph (a), (b) or (c), of which the subject is an animal.”
This amendment brings offences to which animals are subject within the definition of illegal content.
Amendment 59, in schedule 7, page 185, line 39, at end insert—
“Animal Welfare
22A An offence under any of the following provisions of the Animal Welfare Act 2006—
(a) section 4 (unnecessary suffering);
(b) section 5 (mutilation);
(c) section 7 (administration of poisons);
(d) section 8 (fighting);
(e) section 9 (duty of person responsible for animal to ensure welfare).
22B An offence under any of the following provisions of the Animal Health and Welfare (Scotland) Act 2006—
(a) section 19 (unnecessary suffering);
(b) section 20 (mutilation);
(c) section 21 (cruel operations);
(d) section 22 (administration of poisons);
(e) section 23 (fighting);
(f) section 24 (ensuring welfare of animals).
22C An offence under any of the following provisions of the Welfare of Animals Act (Northern Ireland) 2011—
(a) section 4 (unnecessary suffering);
(b) section 5 (prohibited procedures);
(c) section 7 (administration of poisons);
(d) section 8 (fighting);
(e) section 9 (ensuring welfare of animals).
22D For the purpose of paragraphs 22A, 22B or 22C of this Schedule, the above offences are deemed to have taken place regardless of whether the offending conduct took place within the United Kingdom, if the offending conduct would have constituted an offence under the provisions contained within those paragraphs.”
This amendment adds certain animal welfare offences to the list of priority offences in Schedule 7.
Amendment 66, in clause 140, page 121, line 8, at end insert—
“(d) causing harm to any human or animal.”
This amendment ensures groups are able to make complaints regarding animal abuse videos.
Amendment 67, in clause 140, page 121, line 20, at end insert
“, or a particular group that campaigns for the removal of harmful online content towards humans and animals”.
This amendment makes groups campaigning against harmful content eligible to make supercomplaints.
The absence of protections relating to animal abuse content is a real omission from the Bill. Colleagues will have seen the written evidence from Action for Primates, which neatly summarised the key issues on which Labour is hoping to see agreement from the Government. Given this omission, it is clear that the current draft of the Bill is not fit for tackling animal abuse, cruelty and violence, which is all too common online.
There are no explicit references to content that can be disturbing and distressing to those who view it—both children and adults. We now know that most animal cruelty content is produced specifically for sharing on social media, often for profit through the monetisation schemes offered by platforms such as YouTube. Examples include animals being beaten, set on fire, crushed or partially drowned; the mutilation and live burial of infant monkeys; a kitten intentionally being set on by a dog and another being stepped on and crushed to death; live and conscious octopuses being eaten; and animals being pitted against each other in staged fights.
Animals being deliberately placed into frightening or dangerous situations from which they cannot escape or are harmed before being “rescued” on camera is becoming increasingly popular on social media, too. For example, kittens and puppies are “rescued” from the clutches of a python. Such fake rescues not only cause immense suffering to animals, but are fraudulent because viewers are asked to donate towards the rescue and care of the animals. This cannot be allowed to continue.
Indeed, as part of its Cancel Out Cruelty campaign, the Royal Society for the Prevention of Cruelty to Animals conducted research, which found that in 2020 there were nearly 500 reports of animal cruelty on social media. That was more than twice the figure reported for 2019. The majority of these incidents appeared on Facebook. David Allen, head of prevention and education at the RSPCA, has spoken publicly about the issue, saying:
“Sadly, we have seen an increase in recent years in the number of incidents of animal cruelty being posted and shared on social media such as Facebook, Instagram, TikTok and Snapchat.”
David Allen continued:
“We’re very concerned that the use of social media has changed the landscape of abuse with videos of animal cruelty being shared for likes and kudos with this sort of content normalising—and even making light of—animal cruelty. What’s even more worrying is the level of cruelty that can be seen in these videos, particularly as so many young people are being exposed to graphic footage of animals being beaten or killed which they otherwise would never have seen.”
Although the Bill has a clear focus on protecting children, we must remember that the prevalence of cruelty to animals online has the potential to have a hugely negative impact on children who may be inadvertently seeing that content through everyday social media channels.
To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.
I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:
“I swear I’ll kill it.”
In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:
“You must be aware that others look up to you and many young people aspire to emulate you.”
What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:
“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”
There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.
It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.
I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.
I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as them taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog’s ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems like a sharp omission from the Bill.
Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.
If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the most popular topics that constituents contact me about. Today, the Minister has a choice to make about his Government's commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.
Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.
To some extent, the offences are in the Bill’s scope already. It covers, for example, extreme pornography. Given that the content described by the hon. Lady would inflict psychological harm on children, it is, to that extent, in scope.
The hon. Lady mentioned the Government’s wider activities to prevent animal cruelty. That work goes back a long time and includes the last Labour Government’s Animal Welfare Act 2006. She mentioned the more recent update to the criminal sentencing laws that increased by a factor of 10 the maximum sentence for cruelty to animals. It used to be six months and has now been increased to up to five years in prison.
In addition, just last year the Department for Environment, Food and Rural Affairs announced an action plan for animal welfare, which outlines a whole suite of activities that the Government are taking to protect animals in a number of different areas—sentience, international trade, farming, pets and wild animals. That action plan will be delivered through a broad programme of legislative and non-legislative work.
On the basis of the Government’s existing work on animal welfare, the effect that the Bill as drafted will have in this area, and the fact that we will give this issue some further thought, I hope that the shadow Minister will let the matter rest for now.
Question put, That the amendment be made.
Clause 38 stand part.
That schedule 4 be the Fourth schedule to the Bill.
New clause 20—Use of proactive technology in private messaging: report—
“(1) OFCOM must produce a report—
(a) examining the case for the use of proactive technology in private messaging where the aim is to identify CSEA content; and
(b) making recommendations as to whether or not proactive technology should be used in such cases.
(2) The report must be produced in consultation with organisations that have expertise and experience in tackling CSEA.
(3) The report must be published and laid before both Houses of Parliament within six months of this Act being passed.”
On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.
Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:
“The online safety objectives for regulated user-to-user services are as follows”.
I will move straight to paragraph 4(a)(iv), which says
“there are adequate systems and processes to support United Kingdom users”.
Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.
I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.
Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.
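As a rough illustration of the kind of hash-matching the tools mentioned above perform, the sketch below compares a candidate image’s perceptual hash against a small list of hashes of known images and reports a match when the hashes are close. The library, hash type, distance threshold and example hash are assumptions chosen for demonstration; PhotoDNA and CSAI Match themselves use proprietary, far more robust hashing.

```python
# Simplified sketch of hash-matching against a database of known image hashes.
# Illustrative only: real systems such as PhotoDNA rely on proprietary robust
# hashes supplied by bodies such as the IWF or NCMEC, not the toy value below.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known images (placeholder value).
KNOWN_HASHES = [imagehash.hex_to_hash("ffd8e0c0b0a09080")]

def matches_known_image(path: str, max_distance: int = 5) -> bool:
    """Return True if the image's hash is within max_distance of any known hash."""
    candidate = imagehash.average_hash(Image.open(path))
    return any(candidate - known <= max_distance for known in KNOWN_HASHES)
```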
As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.
Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.
If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of online grooming offences target girls, with those aged 12 to 15 the most likely to be victims; girls were victims in 83% of offences where data was recorded. Labour recognises that once again there are tensions between our fundamental right to privacy and the Bill’s intention of keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.
Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:
“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”
The second is from a boy aged 17. He said:
“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.
Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.
We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.
Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.
Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Center for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.
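A minimal sketch of the threshold idea described here might look as follows; the hash values, threshold and flagging step are hypothetical, and Apple’s actual design (safety vouchers, on-device matching and cryptographic thresholding) is considerably more sophisticated.

```python
# Illustrative threshold-based flagging: an account is flagged only once the number
# of its images matching the known-hash database exceeds a set threshold, so that
# isolated false positives are never reported. All values here are placeholders.
KNOWN_CSAM_HASHES = {"hash_a", "hash_b"}  # hypothetical digests of known images
MATCH_THRESHOLD = 30                      # assumed number of matches before flagging

def should_flag_account(uploaded_image_hashes: list[str]) -> bool:
    matches = sum(1 for h in uploaded_image_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```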
Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
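To illustrate the underlying principle of computing on data without decrypting it, the toy example below uses the python-paillier (`phe`) library, which supports adding ciphertexts and multiplying them by plain numbers. It is a sketch of the homomorphic idea only, not of encrypted image hashing, and the library choice and values are assumptions.

```python
# Toy demonstration of homomorphic computation with the Paillier scheme:
# a server can add encrypted values and scale them without ever seeing
# the plaintexts; only the key holder can decrypt the result.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Client-side: encrypt some values before sending them to the server.
encrypted_values = [public_key.encrypt(v) for v in (3, 5, 10)]

# Server-side: compute on the ciphertexts without decrypting them.
encrypted_total = (encrypted_values[0] + encrypted_values[1] + encrypted_values[2]) * 2

# Only the holder of the private key can recover the result: (3 + 5 + 10) * 2 = 36.
assert private_key.decrypt(encrypted_total) == 36
```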
To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in technology and engineering resources that will allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved but appropriate resources for and responses to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have the explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.
Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.
How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.
Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.
I want to make a few comments about new clause 20 and some of the issues it raises. The new clause is incredibly important, and we need to take seriously the concerns that have been raised with us by the groups that advocate on behalf of children. They would not raise those concerns if they did not think the Bill was deficient in this area. They do not have spare people and cannot spend time on unnecessary work, so when they raise concerns, those concerns matter and addressing them will make a big difference.
I want to go a little further than what the new clause says and ask the Minister about future-proofing the Bill and ensuring that technologies can be used as they evolve. I am pretty sure that everybody agrees that there should be no space where it is safe to share child sexual exploitation and abuse, whether physical space or online space, private messaging or a more open forum. None of those places should be safe or legal. None should enable that to happen.
My particular thought about future-proofing is about the development of technologies that are able to recognise self-generated pictures, videos, livestreams and so on that have not already been categorised, do not have a hash number and are not easy for the current technologies to find. There are lots of people out there working hard to stamp out these images and videos online, and I have faith that they are developing new technologies that are able to recognise images, videos, messages and oral communications that cannot currently be recognised.
I agree wholeheartedly with the new clause: it is important that a report be produced within six months of the Bill being passed. It would be great if the Minister would commit to thinking about whether Ofcom will be able to require companies to implement new technologies that are developed, as well as the technologies that are currently available. I am not just talking about child sexual abuse images, material or videos; I am also talking about private messaging where grooming is happening. That is a separate thing that needs to be scanned for, but it is incredibly important.
Some of the stories relayed by the shadow Minister relate to conversations and grooming that happened before the self-generated material was created. If the companies on whose platforms the direct messaging was taking place had proactively scanned for grooming behaviour, those young people would potentially have been in a safer place, because the abuse could have been stopped before that self-generated material was created. Surely, that should be the aim. It is good that we can tackle this after the event—it is good that we have something—but tackling it before it happens is far more important.
I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.
My key point is about future-proofing and ensuring that this is not just a one-off. If Ofcom makes a designation about the use of proactive technologies, it should be able to make a re-designation or a future designation should new proactive technologies come through, so that those new technologies can be required in order to identify things that we cannot identify with the current ones.
I do not think there is too much that we could do, too many codes of practice we could require or too many compliance measures we should have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that the Bill is as fit for purpose as it can be and meets the Government’s aim of making the internet a safe place for children and young people. Because of the additional risks for girls in particular, we need additional protections in place for them. That is why a number of us in this room are making that case.
I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already an obligation to prevent CSEA content, and Ofcom has to produce codes of practice explaining how that will be done. I think what is requested in new clause 20 is required already.
The hon. Member for Pontypridd mentioned the concern that Ofcom had first to prove that the CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that might have been the phrase—which implied that Ofcom had first to prove that it existed before it could take action against it. For exactly the reason she mentioned, namely that it imposed a requirement to prove CSEA was present, we have changed the wording in the new version. Clause 103(1), at the top of page 87, now states “necessary and proportionate” instead of “persistent and prevalent”. Therefore, if Ofcom simply considers something necessary, without needing to prove that it is persistent and prevalent, it can take the actions set out in that clause. For the reason that she mentioned, the change has already been made.
Secondly, on the question of the hon. Member for Aberdeen North about whether that can keep up to date with future technology moves—an important question, because this technology will change almost month to month, and certainly year to year—in that context it is worth referring to the definition of “accredited” technology. If my memory is correct, that is to be found in clause 105(9) and (10), on page 90. In essence, those two subsections state that Ofcom may update accreditation whenever it feels that to be necessary—that can be at any time; it is not one-off. Indeed, Ofcom may appoint some other person or body to do the accreditation if it feels that it does not have the expertise itself. The concept of accredited technology is live; it can be updated the whole time.
Given that we are on the topic, however, we are still thinking—this is so important, and the hon. Member for Aberdeen North has rightly raised it two or three times—about whether there are ways to strengthen clause 103 further, to provide even clearer and more powerful powers to act in this area. If we can think of ways to do that, or if anyone else can suggest one, we are receptive to that thinking. The reason—as I have said in answer to the hon. Lady two or three times—is that, as far as I am concerned, there can be no compromise when it comes to scanning for CSEA content.
We then come to the question of the risk assessments and the codes of practice, to ensure that all the relevant groups get covered and that no one gets forgotten—this brings me back to clause 37, you will be pleased to hear, Ms Rees. Subsection (3), which appears towards the bottom of page 35, states on lines 31 to 33:
What are those relevant duties? The relevant duties are, mercifully, defined at the bottom of the following page, page 36, in subsection (10), which sets out what we mean, and the most important for protecting people are paragraphs (a), (b) and (c): anything that is illegal, anything that concerns the safety of children, and matters concerning the safety of adults, respectively. There is no risk that those very important topics can somehow get forgotten.
I hope that clarifies how the Bill operates. As I said, we are giving careful thought to finding ways—which I hope we can—to strengthen those powers in clause 103.
Obviously, where there are issues particular to women, such as kinds of abuse that women suffer and men do not, or that girls suffer and boys do not, we would expect the codes of practice to address them, because the Bill states that the codes must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults, and we would expect the particular issues that my right hon. Friend mentioned to be picked up by those measures.
As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:
“OFCOM must prepare and issue one or more codes of practice”.
That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.
Question put and agreed to.
Clause 37 accordingly ordered to stand part of the Bill.
Clause 38 ordered to stand part of the Bill.
Schedule 4
Codes of practice under section 37: principles, objectives, content
Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and
(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)
This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.
Question put, That the amendment be made.
Schedule 4 agreed to.
“(A1) OFCOM must prepare the draft codes of practice required under section 37 within the period of six months beginning with the day on which this Act is passed.”
This amendment requires Ofcom to prepare draft codes of practice within six months of the passing of the Act.
Clause stand part.
Clauses 42 to 47 stand part.
Amendment 48 would require Ofcom to prepare draft codes of practice within six months of the passing of the Act. This simple amendment would require Ofcom to bring forward these important codes of practice within an established time period—six months—after the Bill receives Royal Assent. Labour recognises the challenges ahead for Ofcom in both capacity and funding.
On this note, I must raise with the Minister something that I have raised previously. I find it most curious that his Department recently sought to hire an online safety regulator funding policy adviser. The job advert listed some of the key responsibilities:
“The post holder will support ministers during passage of the Online Safety Bill; secure the necessary funding for Ofcom and DCMS in order to set up the Online Safety regulator; and help implement and deliver a funding regime which is first of its kind in the UK.”
That raises worrying questions about how prepared Ofcom is for the huge task ahead. That being said, the Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.
The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.
The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.
In December 2021, Ofcom wrote to the verification service providers and
“signalled the beginning of a new phase of supervisory engagement”.
However, in March 2022 it announced that
“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”
There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.
Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.
Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.
On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.
On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us wants to get into a position where service providers are circumventing their duties by taking the alternative measures route.
Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, lean on their codes of practice as highlighting their best practice. I would be grateful if the Minister could address our concerns.
We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?
Lastly, we know that violence against women and girls does not receive a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.
The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?
Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.
On amendment 48, which seeks to get Ofcom to produce its codes of practice within six months, obviously we are unanimous in wanting that to be done as quickly as possible. However, Ofcom has to go through a number of steps in order to produce those codes of practice. For example, first we have to designate in secondary legislation the priority categories of content that is harmful to children and content that is harmful to adults, and then Ofcom has to go through a consultation exercise before it publishes the codes. It has in the past indicated that it expects that to be a 12-month, rather than a six-month, process. I am concerned that a hard six-month deadline may either be impossible to meet or force Ofcom to rush and do the job badly. I accept the need to get this done quickly, for all the obvious reasons, but we also want to make sure that it is done right. For those reasons, a hard six-month deadline would not help us very much.
There are quite a few different codes of practice to publish, and the hon. Lady asked about that. The ones listed in clause 47 will not all come out at the same time; they will be staggered and prioritised. Obviously, the ones that are most germane to safety, such as those on illegal content and children’s safety, will be done first. We would expect them to be done as a matter of extreme urgency.
I hope I have partly answered some of the questions that the hon. Member for Aberdeen North asked. The document to be published before the summer, which she asked about, is a road map. I understand it to be a sort of timetable that will set out the plan for doing everything we have just been debating—when the consultations will happen and when the codes of practice will be published. I guess we will get the road map in the next few weeks, if “before the summer” means before the summer recess. We will have all that set out for us, and then the formal process follows Royal Assent. I hope that answers the hon. Lady’s question.
There were one or two other questions from the hon. Member for Pontypridd. She asked whether a Secretary of State might misuse the power in clause 43(2)—a shocking suggestion, obviously. The power is only to request a review; it is nothing more sinister or onerous than that.
On clause 44, the hon. Lady asked what would happen if Ofcom and the Secretary of State between them—it would require both—conspired to allow through a change claiming it is minor when in fact it is not minor. First, it would require both of them to do that. It requires Ofcom to propose it and the Secretary of State to agree it, so I hope the fact that it is not the Secretary of State acting alone gives her some assurance. She asked what the redress is if both the Secretary of State and Ofcom misbehave, as it were. Well, the redress is the same as with any mis-exercise of a public power—namely, judicial review, which, as a former Home Office Minister, I have experienced extremely frequently—so there is legal redress.
The hon. Lady then asked about the alternative measures. What if a service provider, rather than meeting its duties via the codes of practice, takes one of the alternative measures instead? Is it somehow wriggling out of what it is supposed to do? What is legally binding, and about which there is no choice, are the duties that we have been debating over the past few days. Those are the binding requirements that cannot be circumvented. The codes of practice propose a way of meeting them. If the service provider can meet the duties in a different way and can satisfy Ofcom that it has met those duties as effectively as it would have under the codes of practice, it is open to it to do that. We do not want to be unduly prescriptive. The test is: have the duties been delivered? That is non-negotiable and legally binding.
I hope I have answered all the questions, while gently resisting amendment 48 and encouraging the Committee to agree that the various other clauses stand part of the Bill.
Question put, That the amendment be made.
The Committee divided.
Ordered, That further consideration be now adjourned. —(Steve Double.)
Adjourned till Tuesday 14 June at twenty-five minutes past Nine o’clock.
Written evidence reported to the House
OSB61 Badger Trust
OSB62 Lego
OSB63 End Violence Against Women Coalition (EVAW)
OSB64 Hacked Off Campaign (further submission) (re: clause 50)
OSB65 Office of the City Remembrancer, on behalf of the City of London Corporation and City of London Police
OSB66 Juul Labs
OSB67 Big Brother Watch, ARTICLE 19, Open Rights Group, Index on Censorship, and Global Partners Digital
OSB68 News Media Association (supplementary submission)
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.