PARLIAMENTARY DEBATE
Automated and Electric Vehicles Bill (Third sitting) - 2 November 2017 (Commons/Public Bill Committees)
Chairs: Mr Adrian Bailey, † Sir Edward Leigh
Members: † Argar, Edward (Charnwood) (Con)
† Brown, Alan (Kilmarnock and Loudoun) (SNP)
† Duffield, Rosie (Canterbury) (Lab)
† Efford, Clive (Eltham) (Lab)
† Foxcroft, Vicky (Lewisham, Deptford) (Lab)
† Hayes, Mr John (Minister for Transport Legislation and Maritime)
Jones, Graham P. (Hyndburn) (Lab)
† Kerr, Stephen (Stirling) (Con)
† Knight, Sir Greg (East Yorkshire) (Con)
† Letwin, Sir Oliver (West Dorset) (Con)
† Mann, Scott (North Cornwall) (Con)
Rodda, Matt (Reading East) (Lab)
† Stephenson, Andrew (Pendle) (Con)
† Stewart, Iain (Milton Keynes South) (Con)
† Tracey, Craig (North Warwickshire) (Con)
† Turner, Karl (Kingston upon Hull East) (Lab)
† Western, Matt (Warwick and Leamington) (Lab)
Clerks: Farrah Bhatti, Mike Everett, Committee Clerks
† attended the Committee
Public Bill Committee
Thursday 2 November 2017
[Sir Edward Leigh in the Chair]
Automated and Electric Vehicles Bill
At the end of the debate on a group of amendments, I shall call the Member who moved the lead amendment again. Before they sit down, they will need to indicate whether they wish to withdraw the amendment or seek a Division. If any Member wishes to press any other amendment or new clause in a group to a vote, they need to let me know.
I shall work on the assumption that the Minister wishes the Committee to reach a decision on all Government amendments if any are tabled. Please note that decisions on amendments take place not in the order that they are debated but in the order they appear on the amendment paper. In other words, debate occurs according to the selection list; decisions are taken when we come to the clause affected by the amendment. I shall use my discretion to decide whether to allow a separate stand part debate on individual clauses and schedules following the debates on the relevant amendments. I hope that this explanation is helpful.
Clause 1
Listing of automated vehicles by the Secretary of State
“(1A) The Secretary of State must consult on and publish the criteria that they will use to determine whether, in their opinion, a motor vehicle is designed or adapted to be capable, in at least some circumstances or situations, of safely driving itself without having to be monitored by an individual.
(1B) The Secretary of State may not change the criteria without consulting vehicle manufacturers, insurers and other such persons as the Secretary of State considers appropriate.”
This amendment requires the Government to consult on and publish criteria for the definition of “automated vehicles” that will be used by the Secretary of State.
“(1A) The Secretary of State may only add a vehicle to the list if the Secretary of State is satisfied that the vehicle’s software has been approved for safe use on roads or in other public places in Great Britain.”
This amendment would ensure that vehicles cannot be listed as automated vehicles by the Secretary of State unless he or she is satisfied that the vehicle’s software has been through an approval process (see NC11).
New clause 11—Approval of automated vehicle software—
“(1) The Secretary of State must set out in regulations a system for approving automated vehicle software.
(2) These regulations must, in particular, make provision for—
(a) the criteria to be used in the approval process to determine whether automated vehicle software is safe for use on roads or other public places in Great Britain, including, but not limited to the way in which the vehicle is programmed to—
(i) deal with moral judgements, and
(ii) transition between driving itself and being driven by a person.
(b) the process by which manufacturers of automated vehicles may apply for software approval, including, but not limited to, any inspection and testing that the vehicle may be required to undergo, and
(c) the process by which manufacturers of automated vehicles may appeal if their software is not approved.
(3) In this section, a “moral judgement” refers to any situation where an automated vehicle has, and makes, a choice of action during an accident while the vehicle is driving itself.
(4) In this section and section 2, the definition of transition of an automated vehicle “between driving itself and being driven by a person” may be set out by the Secretary of State in regulations.
(5) Where a statutory instrument contains the first regulations made under this section, the instrument may not be made unless a draft of it has been laid before Parliament and approved by a resolution of each House.
(6) A statutory instrument containing regulation under this section, that is not the first such regulation made under this section, is subject to annulment in pursuance of a resolution of either House of Parliament.”
This new clause would require the Government to establish a system for approving automated vehicle software. The approval process would include an opportunity for manufacturers to appeal against a failed approval process. Criteria for approval would include consideration of the way in which the vehicle was programmed to deal with moral judgements.
This is an exciting opportunity for the Committee to speak about the potential to liberate many people currently excluded from access to rural transport. The Bill also provides opportunities to improve personal transport arrangements, as well as air quality, which is crucial given the dire state of the environment and its impact on health. I begin by thanking the Minister personally for his collegiate approach to the Bill, and for his co-operation and assistance in the preparation for this sitting. He even allowed my staff access to his officials. It is genuinely appreciated.
Amendment 1 would improve the Bill, and I know that the Minister is intent on improving it. It would require the Government to consult on and publish criteria for the definition of “automated vehicles” that the Secretary of State will use. As the Committee can see, clause 1 as currently drafted puts the onus on the Secretary of State to define, in his or her opinion, what constitutes an automated vehicle, without having to consult the sector. In my view, the Bill would be vastly improved by a requirement to consult on and publish the criteria by which “automated vehicles” will be defined.
Secondly, the amendment would prevent the Secretary of State from changing the criteria without consulting vehicle manufacturers, insurers and other such persons as the Secretary of State considers appropriate. We ask for that consultation and publication of the criteria because it is crucial that manufacturers, vehicle owners and insurers know what those criteria are, whether they are making, buying, owning or insuring an automated vehicle, and whether the scope of the legislation applies to their vehicle. In the evidence session, the insurance industry welcomed the Government taking on the responsibility of saying what is an automated vehicle, but we are still concerned that the Bill as drafted leaves the Secretary of State with total discretion on what is an automated vehicle. We therefore tabled the amendment to provide greater clarity and to help the Government by ensuring that the relevant persons and organisations will be sufficiently involved, to inform the Secretary of State’s list of automated vehicles.
The Opposition believe that the additional clarity provided by the amendment would help to create a more reassuring environment and to encourage the development and uptake of automated vehicles. As I said, the amendment would also prevent the Secretary of State from changing the criteria without further consultation, and guarantee that the criteria used will be up to date and as practical as possible in a very fast-moving sector. We have rehearsed these matters previously, so I do not want to keep the Committee on this point for too long.
I have had the opportunity to look at Hansard; in the Committee for the Vehicle Technology and Aviation Bill, the Minister promised to go away, think about it and amend the Bill appropriately to tighten the definition, but that does not seem to have happened. I do not mean to criticise the Minister personally, but the Government have had six months to think about that. The only change that I can see is in clause 1(b) but that is just semantic. We intend to press the amendment to a Division.
When we boil it down, we are legislating for vehicles that are driven by computer software, as we heard in the evidence. We heard from the witnesses on Tuesday that we are legislating exclusively for tier 4 and tier 5 of the five tiers. The tiers start with driver-assistance systems such as automated braking, steering and parking; move through tier 3, at which the vehicle can switch between being driven by a human and by software and which overlaps into tier 4; and end with tier 5, which covers purely automated vehicles. The legislation really challenges us as legislators, because by simplifying the insurance system we are being asked to enable our roads to become laboratories to sharpen that technology. We heard clearly in the evidence that there were different attitudes to what is taking place. When asked about tier 5 technology, Mr Wong, from the Society of Motor Manufacturers and Traders, said:
“As to when those level 5 vehicles without steering wheels are capable of performing end-to-end journeys—from my house in the village to my office in the city—that is anybody’s guess. That will probably be some time in the 2030s. It is quite complex.”––[Official Report, Automated and Electric Vehicles Public Bill Committee, 31 October 2017; c. 43, Q98.]
However, we then heard from Mr Boland of Five AI, who told us that automated vehicles would be on our roads in 2019, albeit in an experimental fashion.
This is a big challenge for us. We need to consider the software in great detail, and the Secretary of State needs to be given the power to set and oversee certain standards. Mr Wong referred to the report written by the Ethics Commission on Automated Driving for the German Federal Ministry of Transport and Digital Infrastructure. I am a bit of an anorak, so I have started reading that report, although I have not got through all of it in the last 48 hours. It makes fascinating reading. The commission’s approach is that the technology is there to improve safety, whereas our attitude seems to be that it is a technological advance to help industry, and that improving safety and social inclusion will be a by-product a long way down the line.
The operation of the software raises some ethical issues. I asked the witnesses about how the software would perform and take decisions when an accident is imminent. For instance, imagine a four-year-old toddler walking in front of a vehicle that cannot stop to prevent a collision. To the left is oncoming traffic, with the risk of a head-on collision; to the right are perfectly innocent bystanders on the pavement or at the bus stop—those are the vehicle’s options. Mr Wong noted that this was the “classic trolley problem” referred to in the German ethics commission’s report. The commission’s conclusion was that it is simple to make a decision when the choice is between property damage and human injury, but when the choice is between different types of injury to different road users or innocent pedestrians who are not part of the scenario, we move into a completely new area of morals and ethics. We have to be prepared for that; these situations will take place on our streets, and we need to legislate for them. We should give ourselves the opportunity to oversee this software before it is allowed on the streets. Amendment 8 would give the Secretary of State power over the software’s approval, and new clause 11 would set out the approval criteria.
“designed or adapted to be capable, in at least some circumstances or situations, of safely driving themselves.”
In making that decision, surely the Secretary of State would take into account the nature of the software.
We seem to have started this discussion as though it were a mechanical problem about how to develop a piece of technology that can read all the different scenarios on our roads and react accordingly, but looking at the research—on vehicles’ different speeds and on any delay in the transition between a driver and an automated vehicle—an awful lot of the issues around the software are not referred to in the Bill. I am attempting to draw attention to that and to put in the Bill that the software is the crucial area of the technology and that we should pay attention to it.
At the experiment the Minister visited down in Greenwich, where automated vehicles are being tested, there was an incident in which someone pushed a plastic chair out in front of the vehicle. The vehicle did not stop and it hit the chair. That was not a scientific test, but it demonstrated that there are circumstances in which things will happen. The vehicles will have to make choices in such circumstances and we should be legislating for that. We should at least give ourselves the power to be able to react and respond in future as the technology develops.
I am not arguing against that technology—it is something that has arrived, and its time is here. As I was discussing with the Minister the other day, that capacity exists in air transport. We could fly passenger planes and they could take off and land perfectly safely without a pilot on board. In an emergency, they could be flown remotely by someone in air traffic control. If that capacity were tested in the market, however, all the evidence suggests that people would not buy a ticket, in spite of the fact that almost the entirety of any flight that anyone undertakes today is flown by a machine—by the technology—and some of that technology now even shuts the pilot out, because having the pilot interfere with it is not safe. We do not have that capacity in our air industry, however, because of public opinion.
The House of Commons Library tells me that the air industry would save £31 billion, so there is a big incentive for it to have that capacity, but it has not. We are legislating to have it on our roads, but we are not legislating to control the key bit of the technology, which is the software. That is why I tabled my amendments.
As the Minister knows, two specific issues in the Bill concern me and led me to seek to be part of the Committee. One relates to the question of the strict liability of insurers when the vehicle is operating automatically, which of course relates to the software and its safety—the subject of this group of amendments. I have suggested to the Minister two possible approaches to resolving that problem, which was exposed in our evidence sessions. One of those relates to clause 1(1) and would probably require a somewhat different amendment from those that have been tabled, albeit broadly of the same kind. Let me first explain the problem and then try to suggest the solution.
We established clearly from the insurance industry representatives we questioned that, as the Bill is currently drafted, strict liability will attach to the car rather than to an individual, which is an entirely new phenomenon in insurance law. Let us suppose that there is not a fundamental legal problem with strict liability attaching to the insurer of a car. I make that assumption, although I do not necessarily think that it is a safe one; that may be explored further in the other place by lawyers with much deeper acquaintance with insurance law than I claim to have.
Supposing that that is a feasible arrangement, we then face the question: at what point should that strict liability kick in? That would not be a material question if the vehicle was never driven by a human being but only by the machine itself. As the hon. Member for Eltham pointed out, that was raised during the evidence session by the rather enterprising group that will create service operations on London’s streets out of what are, in effect, level 5 vehicles, way ahead of the schedule that other witnesses suggested would apply. Such vehicles clearly will never have a human being driving them; they will be automated objects that human beings will get into. As it is currently drafted, the Bill will therefore create a strict liability for the insurers. On the happy assumption that that will work legally, insurers will insure those vehicles, they will discover whether that is a very expensive proposition, and that cost will get built into the service price. I am not worried about that from a legislative point of view.
However, I think that the Minister would agree, as all our witnesses seemed to, that it is extremely likely that, in parallel with that rapid roll-out of highly automated level 5 items, for perhaps many millions of motorists there will be a gradual progression—not necessarily strictly demarcated as level 3, level 4 and so on—from vehicles that are largely driven by a driver but somewhat assisted by the machine, to vehicles that are driven by the machine under more and more circumstances but are sometimes driven by the driver.
I certainly do not think that we should legislate on the assumption that we know what the future will look like, but it is highly likely that there will be a stage at which there are vehicles that, for example, are well designed to operate on motorways on an automated basis. The nation may benefit hugely from them operating in that way, because it is safer and allows much shorter distances between vehicles and therefore much more intensive use of motorways, which diminishes capital investment in the motorway system, improves safety and prevents the environmental damage that building more motorways would occasion, so that may well in fact become compulsory at some point. However, those very same vehicles may be ill-designed to deal with country roads, city roads or other kinds of road, so they may well have a function that enables them to be switched back and forth between automated driving and being driven by the driver.
We heard rather different things from witnesses about that switchover. To tell the truth, I think that that is because nobody really knows how it is going to operate. The history of technology is littered with prophecies from experts about how future technologies will operate that have proved to be false, so the Committee would be wise to assume that we do not know, and will not know when legislating, how exactly the switchover between driver and automated vehicle will occur.
Mr Wong suggested in an evidence session that the vehicle itself will offer up to the driver the opportunity to switch over to automation in circumstances in which the vehicle is sufficiently intelligent to know that it is safe for it to take over the driving, and that it will never otherwise offer up that opportunity. It is perfectly sensible that if the vehicle offers itself to the driver to take over operation, and if the driver allows it to take over operation, the vehicle becomes the driver, and the strict liability of the insurer attaches to the vehicle and not any longer to the person. That would be fine.
However, if, as some other witnesses seemed to think was the case, it is the driver who will, at least in some circumstances, make the decision of whether to switch over to automated use, this becomes a highly material question: has the driver made that decision in a reasonable and sensible fashion? The reason is that if the driver has not made the decision in a sensible and reasonable fashion, and if the insurer of the vehicle is nevertheless bound to have strict liability for the vehicle taking over the action, insurers could be faced with enormous bills in circumstances in which what they were actually doing was facing a bad decision by a person whom they had never insured; they had insured the vehicle and not the person. That is the problem we need to address, which brings me to the question of clause 1(1).
I take the point that whether the vehicle should have been in autonomous mode may be material and I shall explore that more when I respond to the debate, but I think that it is what happens at the point of the accident that is of greatest concern. I just put that to my right hon. Friend the Member for West Dorset for further consideration.
Such a course of action is fine and would solve the problem that I have advanced, because the Minister or Secretary of State, or an expert acting on his or her behalf, would have verified in advance that the machine was capable of taking over and would take over only under safe circumstances. Before I give way to the Minister, I want to point out that that is using the law to limit the technology, and the history of the approach to that in our country’s legislation has been very bad. I will not go into all the history, but I am happy to write the Minister a memorandum about it if he wants. I once wrote an article about this. There is a very long history of Parliament trying to prejudge the technology, legislating on the assumption that it will be only that technology, mandating therefore only that technology, and discovering that there is not any of it and that people elsewhere are manufacturing things that we do not get because they do not fit our legal system. It is not the route I recommend, and I will come back to that when we get to clause 2. It is a possible route, however, and one that the Minister should at least consider.
On the specifics of his point about liability, I draw his attention to clause 3(2), which we will debate later. You will not let me debate it now for that reason, Sir Edward, but clause 3(2) specifically talks about the subject that my right hon. Friend describes, because it draws attention to the possibility of an accident being
“wholly due to the person’s negligence in allowing the vehicle to begin driving itself when it was not appropriate to do so.”
That is very much what my right hon. Friend speaks about, and it is why we put it in the Bill. He makes a separate point—a good one—about technology that kicks in of its own accord because the technology, the software, determines that it is better at that point for the vehicle to be driven autonomously. We will explore that in greater detail as we consider the legislation. I simply draw his attention at this stage to clause 3(2).
“wholly due to the person’s negligence”.
That is an almost impossible thing to establish. As currently drafted, it does almost no heavy lifting at all. I think I know why a parliamentary draftsman has nevertheless inserted the word “wholly”, because, like the Minister, I have had quite a long experience of dealing with parliamentary draftsmen on numerous Bills. I know that they think through carefully the question of what happens if we do not put in a word such as “wholly” under these circumstances.
Finally, if it were the intention of the Minister to add to clause 1(1), rather than to do something to clause 2 or clause 3, which we will come to later, it would be important to establish whether the view taken by Mr Wong—that these machines will always be designed in such a way that they decide on a safe basis whether to take over—is a consensus view across the industry in every country or a happenstance view of some particular technologist.
In clause 1(1)(b), the Secretary of State is asked to opine on whether the vehicle that is being approved and put on the list is capable of “safely driving”. An awful lot will hang on that word “safely” in what will probably be a rich jurisprudence over many decades. The hon. Member for Eltham is rightly drawing our attention to the fact that “safely” in this context could mean something technical—is the machine technically sophisticated enough to deal with circumstances—or it could mean something much deeper. It could mean the ethics and applied intelligence built into the machine so as to produce views or choices that accord with the social preferences of Parliament about, in trying to minimise the effect of an accident, who is to be sacrificed under circumstances where two different groups of persons could be sacrificed. Alternatively, it could mean any other set of very complicated ethical choices.
I of course bow to the Department’s legal advisers, parliamentary counsel and any external counsel, but my own hunch is that there is not enough jurisprudence available to guide us on whether “safely” will bear that amount of weight. I wonder whether the Minister should consider at least giving the Secretary of State the duty in due course to consider not just whether the machinery is capable of driving “safely”, but whether it is capable of driving—I do not know quite what words parliamentary counsel would want to choose—ethically or properly or in a socially desirable way. That is an odd kind of question to ask about a machine, I grant, but these are odd machines we are considering.
The hon. Member for Eltham is on to a good thing with amendment 8, even if he does not press it to a vote, because he raises an issue we will have to address. What we all do not want to get to—I think the Committee is united in this—is a sort of red flag situation where machines have been authorised because they have a large amount of technological wizardry in them that makes them highly sophisticated, but they make choices that any sane Parliament or Government, or indeed public, would regard as wholly morally objectionable, socially undesirable or both.
We need to think very hard about ensuring that the legislation at least lets our successors—whoever may be Secretary of State at the time—consider that range of issues when approving something. Otherwise, the Secretary of State will say, “Oh well, this is technically okay, but I don’t like the look of what it is going to do by way of the kinds of decisions it is going to make,” and some adviser will tell that Secretary of State, “Sorry, Secretary of State, it is ultra vires for you to refuse this vehicle on the list just because it is going to mow down young people in preference to old people”—or something—“because you are only allowed to determine safety, not ethics.” It is quite important that we get that precise wording right. I am grateful to you for your tolerance, Sir Edward.
Perhaps I am not appreciating the fine nuance of the debate, but I would have assumed that, ultimately, the liability has to be with the driver. In the event of an accident, the telematics would be able to provide data to the insurance industry to prove things one way or another.
That is our mission, and I am determined to do so in as convivial a spirit as possible; I am grateful to the hon. Member for Kingston upon Hull East for his generous remarks. Frankly, whoever was in government would face these challenges, and would be bringing a Bill of this kind to the House. It is perfectly appropriate that we should discuss it in as consensual a way as possible.
The job of the Opposition is to scrutinise such measures; indeed, it is the job of my right hon. and hon. colleagues to do so too. Those who have served on few Standing Committees and had little experience of legislation—there are some who are newer Members of the House—will not have encountered a Minister quite like me. I am one of those rare creatures who are happy to listen to debate, hear suggestions and take them on board, be guided by them and concede where we have got it wrong. I am all the more so on issues such as this, because we are charting a difficult course, as I described.
The last time that we debated these matters—this is directly relevant to the amendments, Sir Edward, just in case you were thinking it might not be—we could not proceed with that Bill, because the inconvenience of a general election stopped us doing so. We considered the issues that we are beginning to debate now. They involve the creation of a list of automated vehicles to provide the public and the industry with the kind of clarity that I have described, and the relationship between those vehicles and new insurance provisions. Essentially, inasmuch as the Bill deals with autonomous vehicles, it does so in order to create a secure insurance market to allow the further developments that I mentioned.
The Bill suggests that the Secretary of State will create such a list to give clarity about insurance by applying the definition in clause 1(1)(a) and (b). We state that automated vehicles are those “designed or adapted to be capable, in at least some circumstances or situations, of safely driving themselves”.
The answer to my hon. Friend the Member for Milton Keynes South is that that includes other vehicles. He mentioned HGVs; he will know that some R and D is being done on those kinds of vehicle. Given what we already know about the work being done in this area, private cars might not be the first vehicles to become automated. I am not making a prediction, but it could be vehicles of the kind that he described. The best example that I can think of is the shuttles at airports that one uses to get to the terminal. We do not think of them as vehicles in the same way that we think of a car that we might drive from our home, but they are vehicles. They travel on a pre-ordained route, rather like the vehicle that I saw when I went to Greenwich and that I mentioned in earlier consideration. That was a fully autonomous vehicle driving on a single road between two set points. Those might be the kind of first steps that are taken as the technology develops.
I emphasise that the technology is in its early stages—not quite in its genesis; more in its infancy. The standards by which these vehicles will be approved for safe sale and use are still being discussed internationally, so another challenge for the Government is to ensure that we—as a nation, as a polity and as a Parliament—do not jump ahead of those international standards. That is another ball that we are juggling, if I may use those terms. The international standards are developing because the research and development of the kind that I have described are happening across the world. Many countries are engaged in it; indeed, many of the businesses are pan-national, so they work in a number of different countries. This will be discussed and is being planned for by the United Nations Economic Commission for Europe, in which the UK plays a leading role.
The standards are still being developed and will form the basis of the type approval process, which is well established in the motor industry. We already talk routinely about type approval; it has been a long-standing part of how the industry works. The critical thing is that for a vehicle to pass that type approval process and be sold for safe use on roads, rather like a non-automated vehicle now, it must meet those standards. The core requirement of safety is implicit in the development of those standards, which will be international.
“May I point something out? I mentioned autonomous emergency braking. It has been demonstrated that the technology is improving all the time. Previously, autonomous emergency braking worked perfectly at 30 mph, which is urban speed, but it is becoming increasingly sophisticated. AEB can work well even at 50 mph. It would not surprise me if the technology improved in years to come”.––[Official Report, Automated and Electric Vehicles Public Bill Committee, 31 October 2017; c. 44, Q103.]
The technology is improving so rapidly and dramatically that in the scenario painted by the hon. Member for Eltham, an automated vehicle is likely to change lanes and—as in Mr Wong’s example—brake to ensure safety.
The representatives of the insurance industry stated in their evidence that the industry believes there will be fewer accidents, because the judgment of an autonomous vehicle will outpace that of a human being. I use the word “judgment” for technology with caution, as my right hon. Friend the Member for West Dorset used the word “ethics” with caution, but the judgment of the software driving the automated vehicle will be more acute and, in the end, safer. These machines are likely to be less prone to error than human beings, so there will be fewer accidents; the vehicles will be safer and therefore easier and cheaper to insure. We heard that point repeatedly in the evidence session. We can be confident that that is the direction of travel—I apologise for using that rather hackneyed phrase in this context—but we cannot be sure how quickly we will get there or exactly what it will look like. I would be a very bold man if I made such a prediction.
The Bill is drafted as if artificial intelligence were the same kind of thing as speed control. It is not, and that is a very important error underlying the Bill’s drafting. Speed control is a technical matter, and we could go much further with technical development and still be in the technical arena in which safety is the only question, because the ethical judgments are made exclusively by the human drivers. With artificial intelligence, as the hon. Member for Eltham rightly says, we are moving into a terrain in which the machine will make the kind of decisions that Parliaments and human beings make. These are questions not of safety, but of judgment about the right outcome under difficult circumstances.
I ask the Minister to go back to his Department and talk to its lawyers about whether jurisprudence will deliver to him or his successors the ability to refuse approval to a piece of artificial intelligence that, either directly or through its learning processes, will or could have the effect of producing totally dysfunctional anti-utilitarian results by making judgments that are technically perfectly safe but that just happen to take the view that, for example, wiping out a group of three-year-old schoolchildren is better than wiping out a 98-year-old crossing the road. That is a very difficult judgment for a human being to make, but it is the kind of judgment that Parliaments have to make. I think it is very clear that, at the moment, the Bill would not permit a Secretary of State to prevent type approval for a machine that was designed in such a way that there could be those very bizarre and undesirable results, and I am sure that that is not what the Department or the Minister wants to achieve.
There is a much bigger debate, which will clearly have to be dealt with in legislation, in regulations, in type approval—in a whole range of other things—about some of the other matters that the hon. Member for Eltham and my right hon. Friend the Member for West Dorset have raised. If they are both right that we will get to a point at which the machine makes what is in effect an ethical judgment—I am trying to use words very carefully; it is very obviously the machine making ethical judgments, but I do appreciate the strangeness of it—clearly that will have to be taken into account at a future point in the legislative process. I do not think this Bill is the place to do it; I just do not think it can do it, because we do not yet know enough.
We are back to my first point, about the line we are trying to tread between what we can do now with certainty and what we might do in the future in a world in which we can as yet only imagine what might occur. If my right hon. Friend will permit me to say so, perhaps the Hegelian synthesis, where we might meet between what appears to be my thesis and his antithesis, is that this Bill is a starting point—a first step along, as I have said, a long road.
There is no point in having the Secretary of State empowered to make a list unless Secretaries of State are actually going to make lists. There is no point in empowering them to make lists of automated vehicles unless those lists are going to relate to automated vehicles. Those automated vehicles will have artificial intelligence built into them; they cannot be automated otherwise. Therefore, the Secretary of State, who is making the list in the first place, which this Bill provides for—not some other Bill, but this Bill—will be constrained by the terms that the Bill sets for what basis they can use to make the list. That is why the shadow Minister has raised questions about the criteria, and why we are having this debate in the first place. Surely, therefore, we need to empower—I am not suggesting that we in any way oblige—later Secretaries of State to consider, inter alia, whether the machines that they are putting on the list are actually murderously safe or good and safe machines. At the moment, they can decide only whether it is a safe machine. If it happens to be safe in the sense in which Stalin could “safely” eliminate large sections of his population, the poor old Secretary of State would, as I construe it—the Minister has not given us any indication that he has had advice to the contrary—be prevented from—
The safe functioning criteria are more straightforward. This is about a marriage between software and the machine. The machinery certainly needs to be safe. We drive machines now with internal combustion engines that are not fundamentally different from their early ancestors. So we know that the machine needs to be safe. The existing provisions in the Bill are clear that the list can comprise at present only vehicles that can be legally used on the roads. Having reflected briefly, I will reflect more—I am in reflective mode, as the Committee can tell. Perhaps it is about what we do in regulations. There might be an opportunity to qualify or clarify through regulation how the list develops.
Let me try to get through some more of my pre-prepared notes rather than extemporising, as is necessary when we have proper dialogue and scrutiny.
As I said, I am not sure that it would be appropriate to be too precise about the criteria. The only scope that the Secretary of State will have to list a vehicle is by determining whether it meets the safety definition. If it does, it will be included on the list; if it does not, it will not. There is no discretion to make a decision outside those parameters; the power is merely administrative and is not a discretionary legislative power. That is so we can be clear about why vehicles need to be on the list.
The defined vehicles will not be covered by our current insurance framework and will therefore need new, specific insurance products. That is the point I was making about the limits to what we are trying to do now and the essence of why they matter. This is about allowing the further development of appropriate insurance products that are not out there now, because if they are not out there in the future that will inevitably limit how far we go with the further development of vehicles.
I promised to give way to the hon. Member for Warwick and Leamington and I have not done so. That was very discourteous of me, so I do so now.
My final point is that the character of the amendments and of our debate is about the Secretary of State’s interpretive powers. We have to be careful about extending the interpretive scope of this part of the Secretary of State’s responsibilities. This is yet another line to walk and not to cross. The criteria for inclusion on the list need to be sufficiently clear as not to allow any doubt in the insurance market about precisely what kind of vehicle might be on the list and therefore what kind of vehicle might or might not be insured. I am therefore doubtful about extending the interpretive scope.
We need to be clear which vehicles and which software can safely be operated in automated mode. The Secretary of State will therefore be able to transpose approved vehicles on to the list to ensure that our domestic insurance framework is based on and clear about which vehicles need which insurance products. It would not be appropriate to legislate at this early stage, as amendment 8 and new clause 11 suggest, to set an approval procedure or safety criteria until we know what the international standards are. The hon. Member for Warwick and Leamington is right; we will almost certainly need to do that further down the line as those international standards become clearer. Whether that is in other legislation or more likely in regulation—that is how I would like to go—is no doubt something we will debate over the course of the coming days.
In essence, I return to my core argument: the Bill is a starting point to creating greater clarity. It is not by any means the end of what I hope—I return to my very early words—will be a wonderful story.
Question put, That the amendment be made.
“or by an automated vehicle when transitioning between driving itself and being driven by a person,”
This amendment would ensure that the liability for accidents caused by an automated vehicle that is transitioning between driving itself and being driven by a person would be the same as the liability for accidents caused by an automated vehicle when driving itself.
“or by an automated vehicle when transitioning between driving itself and being driven by a person,”
This amendment would ensure that the liability for accidents caused by an automated vehicle that is transitioning between driving itself and being driven by a person would be the same as the liability for accidents caused by an automated vehicle when driving itself.
Various pieces of research into the issue have come to different conclusions. In the evidence sessions, we heard that Audi had carried out some research at different speeds and come to the conclusion that there should be a minimum of 10 seconds in that transition period. The Venturer research came to slightly different conclusions, but all the research points to the fact that this is a problematic area in automated vehicle technology. It can take a good deal of time for a driver to become alert. Mr Wong described to us the various alarms that alert the driver when the vehicle requests that they take back control of the car; if those alarms do not rouse the driver, the vehicle will slowly come to a halt. I am sure that we can all imagine the sort of disruption that could be caused if that happened on a motorway. He even described how the car prepared for an accident by tightening the driver’s seat belt just before the vehicle came to a halt, in case the driver had passed out or was so fast asleep that the alarms did not wake them up. There are various scenarios involving the transition that cause alarm.
Mr Gooding of the RAC Foundation felt that we should not even entertain tier 3 because it is unsafe and does not make any sense, and because the legislation is about moving straight to tiers 4 and 5. Clearly, if people giving us evidence are saying that, I suggest to the Minister that it should cause the Government some alarm, and that perhaps we should be legislating to say that we do not want to allow this on our roads. There are issues being raised about the clear dangers of tier 3 transition.
We need to consider this issue. The evidence that I read said that the Venturer experiment at the Bristol testing centre discovered that drivers, when they first took over, tended to be over-cautious and drive at slower speeds, which could increase congestion. There was also the potential for danger in vehicles suddenly slowing down, and Mr Gooding said in his answers to our questions that he felt that that issue was more important than congestion.
There are some important considerations raised by the issue of transition, particularly in tier 3. We asked witnesses, “When will the vehicle decide whether it is safe for the vehicle to drive or whether the vehicle should be handed back to the human driver?” They said that it depended on road conditions. That suggests that it will happen in the same locations on our roads: for instance, as vehicles leave motorways and enter more built-up areas, where there are more potential hazards and dangers for vehicles, it is likely that the vehicles will transition back to being driven by the driver. If that will happen regularly in the same location, it could create accident black spots. We could create a considerable new hazard on our roads.
“designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip.”
However,
“it’s important to note that this is limited to the ‘operational design domain’ of the vehicle—meaning it does not cover every driving scenario.”
I hope that the hon. Gentleman will agree that the transition question arises in relation to level 4 when vehicles move from one driving scenario to another.
I suggest to the Minister that we need to take that away and consider it. Safety must be the aspect most prevalent in our minds. There is also the moral or ethical issue of driver autonomy: will the driver be in charge of the vehicle, or will the technology be in charge of the driver? In the debate on previous amendments, he said that the technology is superior; he did not use that word, but he said that it is safer than a human in the event of an accident, even suggesting that a vehicle would make better or quicker choices than a human. That points us down a road, if Members will pardon the pun, of having roads operated in the way that our railways or underground service are controlled. Why not have fully automated vehicles of which drivers do not have control at all?
Ordered, That the debate be now adjourned.—(Andrew Stephenson.)
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.