A self-driving car, also known as an autonomous vehicle (AV), autonomous car, driver-less car, or robotic car (robo-car), is a car incorporating vehicular automation, that is, a ground vehicle that is capable of sensing its environment and moving safely with little or no human input.
Self-driving cars combine a variety of sensors to perceive their surroundings, such as thermographic cameras, radar, lidar, sonar, GPS, odometry and inertial measurement units. Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage.
Autonomy in vehicles is often categorized into six levels, according to a system developed by SAE International (SAE J3016, revised periodically). The SAE levels can be roughly understood as: Level 0 - no automation; Level 1 - hands on/shared control; Level 2 - hands off; Level 3 - eyes off; Level 4 - mind off; and Level 5 - steering wheel optional.
As of March 2022[update], vehicles operating at Level 3 and above remained a marginal portion of the market. In December 2020, Waymo became the first service provider to offer driver-less taxi rides to the general public, in a part of Phoenix, Arizona. In March 2021, Honda became the first manufacturer to provide a legally approved Level 3 car, and Toyota operated a potentially Level 4 service around the Tokyo 2020 Olympic Village. In 2021, Nuro was allowed to begin autonomous commercial delivery operations in California. In December 2021, Mercedes-Benz became the second manufacturer to receive legal approval for a Level 3 car. In February 2022, Cruise became the second service provider to offer driver-less taxi rides to the general public, in San Francisco in the US.
In China, two publicly accessible trials of robotaxis have been launched, in 2020 in Shenzhen's Pingshan District by Chinese firm AutoX and in 2021 at Shougang Park in Beijing by Baidu, a venue for the 2022 Winter Olympics.
Main article: History of self-driving cars
Experiments have been conducted on automated driving systems (ADS) since at least the 1920s; trials began in the 1950s. The first semi-automated car was developed in 1977 by Japan's Tsukuba Mechanical Engineering Laboratory; it required specially marked streets that were interpreted by two cameras on the vehicle and an analog computer. The vehicle reached speeds up to 30 kilometres per hour (19 mph) with the support of an elevated rail.
A landmark autonomous car appeared in the 1980s, with Carnegie Mellon University's Navlab and ALV projects funded by the United States' Defense Advanced Research Projects Agency (DARPA) starting in 1984 and Mercedes-Benz and Bundeswehr University Munich's EUREKA Prometheus Project in 1987. By 1985, the ALV had demonstrated self-driving speeds on two-lane roads of 31 kilometres per hour (19 mph), with obstacle avoidance added in 1986, and off-road driving in day and night conditions by 1987. A major milestone came in 1995, when CMU's NavLab 5 completed the first autonomous coast-to-coast drive of the United States. Of the 2,849 mi (4,585 km) between Pittsburgh, Pennsylvania and San Diego, California, 2,797 mi (4,501 km) were autonomous (98.2%), completed at an average speed of 63.8 mph (102.7 km/h). From the 1960s through the second DARPA Grand Challenge in 2005, automated vehicle research in the United States was primarily funded by DARPA, the US Army, and the US Navy, yielding incremental advances in speeds, driving competence in more complex conditions, controls, and sensor systems. Since then, many companies and research organizations have developed prototypes.
The US allocated US$650 million in 1991 for research on the National Automated Highway System, which demonstrated automated driving through a combination of automation embedded in the highway with automated technology in vehicles, and cooperative networking between the vehicles and with the highway infrastructure. The programme concluded with a successful demonstration in 1997 but without clear direction or funding to implement the system on a larger scale. Partly funded by the National Automated Highway System and DARPA, the Carnegie Mellon University Navlab drove 4,584 kilometres (2,848 mi) across America in 1995, 4,501 kilometres (2,797 mi) or 98% of it autonomously. Navlab's record achievement stood unmatched for two decades until 2015, when Delphi improved it by piloting an Audi, augmented with Delphi technology, over 5,472 kilometres (3,400 mi) through 15 states while remaining in self-driving mode 99% of the time. In 2015, the US states of Nevada, Florida, California, Virginia, and Michigan, together with Washington, DC, allowed the testing of automated cars on public roads.
From 2016 to 2018, the European Commission funded an innovation strategy development for connected and automated driving through the Coordination Actions CARTRE and SCOUT. Moreover, the Strategic Transport Research and Innovation Agenda (STRIA) Roadmap for Connected and Automated Transport was published in 2019.
In November 2017, Waymo announced that it had begun testing driver-less cars without a safety driver in the driver's seat; however, an employee remained in the car. An October 2017 report by the Brookings Institution found that $80 billion had been reported as invested in all facets of self-driving technology up to that point, but that it was "reasonable to presume that total global investment in autonomous vehicle technology is significantly more than this."
In October 2018, Waymo announced that its test vehicles had travelled in automated mode for over 10,000,000 miles (16,000,000 km), increasing by about 1,000,000 miles (1,600,000 kilometres) per month. In December 2018, Waymo was the first to commercialize a fully autonomous taxi service in the US, in Phoenix, Arizona. In October 2020, Waymo launched a geo-fenced driver-less ride-hailing service in Phoenix. The cars are monitored in real time by a team of remote engineers, who in some cases need to intervene.
In March 2019, ahead of the autonomous racing series Roborace, Robocar set the Guinness World Record for being the fastest autonomous car in the world. In pushing the limits of self-driving vehicles, Robocar reached 282.42 km/h (175.49 mph) – an average confirmed by the UK Timing Association at Elvington in Yorkshire, UK.
In 2020, a National Transportation Safety Board chairman stated that no self-driving cars (SAE level 3+) were available for consumers to purchase in the US in 2020:
There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you’re not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don’t own a self-driving car.
On 5 March 2021, Honda began leasing in Japan a limited edition of 100 Legend Hybrid EX sedans equipped with newly approved Level 3 automated driving equipment. The Japanese government had granted safety certification to Honda's autonomous "Traffic Jam Pilot" driving technology, which legally allows drivers to take their eyes off the road.
There is some inconsistency in the terminology used in the self-driving car industry, and various organizations have proposed definitions to establish an accurate and consistent vocabulary.
Such confusion was documented in 2014 in SAE J3016, which states that "some vernacular usages associate autonomous specifically with full driving automation (Level 5), while other usages apply it to all levels of driving automation, and some state legislation has defined it to correspond approximately to any ADS [automated driving system] at or above Level 3 (or to any vehicle equipped with such an ADS)."
Modern vehicles provide features such as keeping the car within its lane, speed control, and emergency braking. Those features alone are considered driver assistance technologies because they still require the human driver to remain in control, whereas fully automated vehicles drive themselves without human driver input.
According to Fortune, some newer vehicles' technology names, such as AutonoDrive, PilotAssist, Full Self-Driving or DrivePilot, might confuse the driver into believing that no driver input is expected, when in fact the driver needs to remain involved in the driving task. According to the BBC, confusion between these concepts leads to deaths.
For this reason, some organizations such as the AAA try to provide standardized naming conventions for features such as ALKS, which aims to manage the driving task but is not yet approved as an automated-driving system in any country. The Association of British Insurers considers the use of the word autonomous in marketing for modern cars to be dangerous, because car ads make motorists think "autonomous" and "autopilot" mean a vehicle can drive itself, when such cars still rely on the driver to ensure safety. Technology able to drive a car is still in its beta stage.
Some car makers suggest or claim that vehicles are self-driving when they cannot manage some driving situations. Despite the name Full Self-Driving, Tesla stated that its offering should not be considered a fully autonomous driving system. This risks drivers becoming excessively confident and driving while distracted, leading to crashes. In Great Britain, a fully self-driving car is only a car registered on a specific list. There have also been proposals to adopt aviation's automation safety knowledge in discussions of the safe implementation of autonomous vehicles, given the experience the aviation sector has gained on safety topics over the decades.
According to the SMMT, "There are two clear states – a vehicle is either assisted with a driver being supported by technology or automated where the technology is effectively and safely replacing the driver."
Autonomous means self-governing. Many historical projects related to vehicle automation have been automated (made automatic) subject to a heavy reliance on artificial aids in their environment, such as magnetic strips. Autonomous control implies satisfactory performance under significant uncertainties in the environment, and the ability to compensate for system failures without external intervention.
One approach is to implement communication networks both in the immediate vicinity (for collision avoidance) and farther away (for congestion management). Such outside influences in the decision process reduce an individual vehicle's autonomy, while still not requiring human intervention.
As of 2017[update], most commercial projects focused on automated vehicles that did not communicate with other vehicles or with an enveloping management regime. Euro NCAP defines autonomous in "Autonomous Emergency Braking" as: "the system acts independently of the driver to avoid or mitigate the accident", which implies the autonomous system is not the driver.
In Europe, the words automated and autonomous might be used together. For instance, Regulation (EU) 2019/2144 of the European Parliament and of the Council of 27 November 2019 on type-approval requirements for motor vehicles (...) defines "automated vehicle" and "fully automated vehicle" based on their autonomous capacity:
In British English, the word automated alone might have several meanings, as in the sentence: "Thatcham also found that the automated lane keeping systems could only meet two out of the twelve principles required to guarantee safety, going on to say they cannot, therefore, be classed as ‘automated driving’, instead it claims the tech should be classed as ‘assisted driving’." The first occurrence of "automated" refers to a UNECE automated system, while the second refers to the British legal definition of an automated vehicle. British law interprets the meaning of "automated vehicle" based on the interpretation section relating to a vehicle "driving itself" and an insured vehicle.
To enable a car to travel without any driver embedded within the vehicle, some companies use a remote driver.
According to SAE J3016,
Some driving automation systems may indeed be autonomous if they perform all of their functions independently and self-sufficiently, but if they depend on communication and/or cooperation with outside entities, they should be considered cooperative rather than autonomous.
PC Magazine defines a self-driving car as "a computer-controlled car that drives itself." The Union of Concerned Scientists states that self-driving cars are "cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as autonomous or 'driver-less' cars, they combine sensors and software to control, navigate, and drive the vehicle."
The British Automated and Electric Vehicles Act 2018 law defines a vehicle as "driving itself" if the vehicle "is operating in a mode in which it is not being controlled, and does not need to be monitored, by an individual".
A classification system with six levels – ranging from fully manual to fully automated systems – was published in 2014 by standardization body SAE International as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems; the details are revised periodically. This classification is based on the amount of driver intervention and attentiveness required, rather than the vehicle's capabilities, although these are loosely related. In 2013, the United States National Highway Traffic Safety Administration (NHTSA) released its own formal classification system. After SAE updated its classification in 2016 (J3016_201609), NHTSA adopted the SAE standard, and the SAE classification became widely accepted.
In SAE's automation level definitions, "driving mode" means "a type of driving scenario with characteristic dynamic driving task requirements (e.g., expressway merging, high speed cruising, low speed traffic jam, closed-campus operations, etc.)"
In the formal SAE definition below, an important transition is from SAE Level 2 to SAE Level 3 in which the human driver is no longer expected to monitor the environment continuously. At SAE 3, the human driver still has responsibility to intervene when asked to do so by the automated system. At SAE 4 the human driver is always relieved of that responsibility and at SAE 5 the automated system will never need to ask for an intervention.
| SAE Level | Name | Narrative definition | Execution of steering and acceleration/deceleration | Monitoring of driving environment | Fallback performance of dynamic driving task | System capability (driving modes) |
|---|---|---|---|---|---|---|
| | Human driver monitors the driving environment | | | | | |
| 0 | No Automation | The full-time performance by the human driver of all aspects of the dynamic driving task, even when "enhanced by warning or intervention systems" | Human driver | Human driver | Human driver | n/a |
| 1 | Driver Assistance | The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | Human driver and system | Human driver | Human driver | Some driving modes |
| 2 | Partial Automation | The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | System | Human driver | Human driver | Some driving modes |
| | Automated driving system monitors the driving environment | | | | | |
| 3 | Conditional Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene | System | System | Human driver | Some driving modes |
| 4 | High Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene (the car can pull over safely) | System | System | System | Many driving modes |
| 5 | Full Automation | The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver | System | System | System | All driving modes |
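The division of responsibility in the table above can be sketched as a small data structure. The class, field names, and helper function below are illustrative, not part of SAE J3016:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    level: int
    name: str
    monitors_environment: str  # who monitors the driving environment
    fallback: str              # who performs the fallback of the dynamic driving task

# One record per row of the SAE J3016 table.
SAE_LEVELS = [
    SaeLevel(0, "No Automation",          "Human driver", "Human driver"),
    SaeLevel(1, "Driver Assistance",      "Human driver", "Human driver"),
    SaeLevel(2, "Partial Automation",     "Human driver", "Human driver"),
    SaeLevel(3, "Conditional Automation", "System",       "Human driver"),
    SaeLevel(4, "High Automation",        "System",       "System"),
    SaeLevel(5, "Full Automation",        "System",       "System"),
]

def driver_may_look_away(lvl: SaeLevel) -> bool:
    """From Level 3 upward the system, not the human, monitors the environment."""
    return lvl.monitors_environment == "System"
```

Note how the important Level 2 to Level 3 transition falls exactly where `monitors_environment` flips from the human driver to the system.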
The SAE Automation Levels have been criticized for their technological focus. It has been argued that the structure of the levels suggests that automation increases linearly and that more automation is better, which may not always be the case. The SAE Levels also do not account for changes that may be required to infrastructure and road user behaviour.
To organize the broad range of technology discussions around self-driving cars, a few classifications have been proposed. Among them is a proposal to classify the technology into the following categories: car navigation, path planning, environment perception, and car control. In the 2020s, these technologies came to be recognized as far more complicated and involved than previously thought.
Main article: Hybrid navigation
Hybrid navigation is the simultaneous use of more than one navigation system for location data determination, needed for navigation.
To operate an autonomous vehicle reliably and safely, a mixture of sensors is usually used. Typical sensors include lidar (light detection and ranging), stereo vision, GPS and IMUs. Modern self-driving cars generally use Bayesian simultaneous localization and mapping (SLAM) algorithms, which fuse data from multiple sensors and an off-line map into current location estimates and map updates. Waymo has developed a variant of SLAM with detection and tracking of other moving objects (DATMO), which also handles obstacles such as cars and pedestrians. Simpler systems may use roadside real-time locating system (RTLS) technologies to aid localization.
Self-driving cars require a new class of high-definition maps (HD maps) that represent the world at up to two orders of magnitude more detail. In May 2018, researchers from the Massachusetts Institute of Technology (MIT) announced that they had built an automated car that can navigate unmapped roads. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a system called MapLite, which allows self-driving cars to drive on roads that they have never been on before, without using 3D maps. The system combines the GPS position of the vehicle, a "sparse topological map" such as OpenStreetMap (i.e., one with only 2D features of the roads), and a series of sensors that observe road conditions.
Control systems on automated cars may use sensor fusion, which is an approach that integrates information from a variety of sensors on the car to produce a more consistent, accurate, and useful view of the environment.
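The fusion idea can be illustrated with a minimal one-dimensional Kalman-style update. The numbers are made up for illustration; real systems fuse many sensors across several dimensions:

```python
def kalman_update(x, p, z, r):
    """Fuse a predicted position x (variance p) with a measurement z (variance r)."""
    k = p / (p + r)          # Kalman gain: how strongly to trust the measurement
    x_new = x + k * (z - x)  # corrected position estimate
    p_new = (1 - k) * p      # the fused estimate is more certain than either input
    return x_new, p_new

# Odometry predicts the car at 10.0 m with variance 4.0; GPS reports 12.0 m
# with variance 1.0. The fused estimate leans toward the more certain GPS fix.
x, p = kalman_update(10.0, 4.0, 12.0, 1.0)  # x ≈ 11.6, p ≈ 0.8
```

The key property is that the fused variance is smaller than either input variance, which is why combining sensors yields a more accurate view than any single sensor.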
Path planning is the computational problem of finding a sequence of valid configurations that moves the vehicle from source to destination. The large-scale path of the vehicle can be determined using a Voronoi diagram, an occupancy grid mapping, or a driving-corridors algorithm. However, these traditional approaches are not sufficient for a vehicle that is interacting with other moving objects, and several advanced approaches applying machine learning are under development.
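As a toy illustration of grid-based planning, the sketch below runs a breadth-first search over a hand-made occupancy grid; production planners use the richer methods named above, and this grid and its obstacle layout are invented for the example:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free cell, 1 = occupied).

    Returns a list of (row, col) cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent links back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# A 3x3 grid with a wall across the middle, open on the right.
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = plan_path(grid, (0, 0), (2, 0))  # routes around the wall via the right edge
```

Breadth-first search finds the shortest path in steps; occupancy-grid planners in practice weight cells by cost and use A* or similar heuristics for speed.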
Main article: Drive by wire
Drive by wire technology in the automotive industry is the use of electrical or electro-mechanical systems for performing vehicle functions traditionally achieved by mechanical linkages.
Main article: Driver monitoring system
A driver monitoring system is a vehicle safety system that assesses the driver's alertness and warns the driver if needed. Developers recognize that the role of such systems will increase as SAE Level 2 systems become more commonplace, and that at Level 3 and above it becomes more challenging to predict the driver's readiness for handover.
Main article: Vehicular communication systems
Vehicular communication is a growing field covering communication between vehicles and with roadside infrastructure. Vehicular communication systems use vehicles and roadside units as the communicating nodes in a peer-to-peer network, providing each other with information. This connectivity enables autonomous vehicles to interact with non-autonomous traffic and pedestrians to increase safety. Autonomous vehicles will also need to connect to the cloud to update their software and maps, and to feed information back to improve their manufacturer's maps and software.
See also: Over-the-air programming
Autonomous vehicles have software systems that drive the vehicle, meaning that updates through reprogramming or editing the software can enhance the benefits to the owner (e.g., an update that better distinguishes a blind pedestrian from a sighted one, so that the vehicle takes extra caution when approaching the former). A characteristic of this re-programmable part of autonomous vehicles is that updates need not come only from the supplier: through machine learning, smart autonomous vehicles can generate certain updates and install them accordingly (e.g., new navigation maps or new intersection computer systems). These reprogrammable characteristics of the digital technology, and the possibility of smart machine learning, give manufacturers of autonomous vehicles the opportunity to differentiate themselves on software.
In March 2021, a UNECE regulation on software updates and software update management systems was published.
Autonomous vehicles are more modular than conventional cars, being made up of several modules, explained hereafter through a Layered Modular Architecture. The Layered Modular Architecture extends the architecture of purely physical vehicles by incorporating four loosely coupled layers of devices, networks, services and contents into autonomous vehicles. These loosely coupled layers can interact through certain standardized interfaces.
In order for autonomous vehicles to perceive their surroundings, they have to use different techniques each with their own accompanying digital information (e.g. radar, GPS, motion sensors and computer vision). Homogenization requires that the digital information from these different sources is transmitted and stored in the same form. This means their differences are decoupled, and digital information can be transmitted, stored, and computed in a way that the vehicles and their operating system can better understand and act upon it.
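The homogenization step can be sketched as converting heterogeneous readings into one shared record type. The `Detection` record, its field names, and the converter functions below are hypothetical, chosen only to illustrate the idea:

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    """A hypothetical common form for obstacle readings from any sensor."""
    sensor: str  # which sensor produced the reading
    x_m: float   # distance ahead of the vehicle, in metres
    y_m: float   # lateral offset, in metres

def from_radar(range_m, bearing_rad):
    """Convert a radar's polar reading (range, bearing) into the shared Cartesian form."""
    return Detection("radar",
                     range_m * math.cos(bearing_rad),
                     range_m * math.sin(bearing_rad))

def from_camera(depth_m, lateral_m):
    """A camera pipeline that already estimates metric offsets maps directly."""
    return Detection("camera", depth_m, lateral_m)

# The same obstacle seen by two different sensors now has one representation,
# so downstream fusion code can compare the readings directly.
d1 = from_radar(10.0, 0.0)
d2 = from_camera(10.0, 0.0)
```

Once the differences between sources are decoupled in this way, storage, transmission, and fusion logic can be written once against the common form rather than per sensor.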
In the field of international standardization, ISO/TC 22 is in charge of in-vehicle transport information and control systems, and ISO/TC 204 is in charge of information, communication and control systems in the field of urban and rural surface transportation. International standards have been actively developed in the domains of AD/ADAS functions, connectivity, human interaction, in-vehicle systems, management/engineering, dynamic maps and positioning, and privacy and security.
The potential benefits from increased vehicle automation described may be limited by foreseeable challenges such as disputes over liability, the time needed to turn over the existing stock of vehicles from non-automated to automated, and thus a long period of humans and autonomous vehicles sharing the roads, resistance by individuals to forfeiting control of their cars, concerns about safety, and the implementation of a legal framework and consistent global government regulations for self-driving cars.
Other obstacles could include de-skilling and lower levels of driver experience for dealing with potentially dangerous situations and anomalies, ethical problems where an automated vehicle's software is forced during an unavoidable crash to choose between multiple harmful courses of action ('the trolley problem'), concerns about making large numbers of people currently employed as drivers unemployed, the potential for more intrusive mass surveillance of location, association and travel as a result of police and intelligence agency access to large data sets generated by sensors and pattern-recognition AI, and possibly insufficient understanding of verbal sounds, gestures and non-verbal cues by police, other drivers or pedestrians.
Possible technological obstacles for automated cars are:[needs update]
In the 2010s, researchers openly worried that future regulation might delay the deployment of automated cars on the road. However, as written in UNECE WP.29 GRVA, international regulation for Level 3 was smoothly established in 2020, and the uncertainty was resolved. As of 2022[update], in practice it is very difficult to gain approval as Level 3, with the Mercedes-Benz Drive Pilot being one of the few commercially available options to receive such approval.
As Tesla's "Full Self-Driving (FSD)" actually corresponds to Level 2, senators called on the Federal Trade Commission (FTC) in August 2021 to investigate the company's marketing claims. And in December 2021 in Japan, Mercedes-Benz Japan Co., Ltd. was sanctioned by the Consumer Affairs Agency for descriptions in its handouts that differed from the facts.
Earlier, in July 2016, following a fatal crash of a Tesla car operating in "Autopilot" mode, Mercedes-Benz was also criticized as misleading over its commercials for E-Class models, which had been available with "Drive Pilot". After initially rejecting the claims, Mercedes-Benz stopped the "self-driving car" ad campaign it had been running in the United States.
Companies working on the technology have a growing recruitment problem in that the available talent pool has not grown with demand. As such, education and training by third-party organizations, such as providers of online courses, and self-taught community-driven projects, such as DIY Robocars and Formula Pi, have quickly grown in popularity, while university-level extra-curricular programmes such as Formula Student Driverless have bolstered graduate experience. Industry is steadily increasing freely available information sources, such as code, datasets and glossaries, to widen the recruitment pool.
In the 2020s, given the importance of the automotive sector to each nation, self-driving cars have become a topic of national security. Concerns regarding cybersecurity and data protection are important not only for user protection but also in the context of national security. The trove of data collected by self-driving cars, paired with cybersecurity vulnerabilities, creates an appealing target for intelligence collection, so self-driving cars need to be considered in a new way when it comes to espionage risk.
In July 2018, a former Apple engineer was arrested by the Federal Bureau of Investigation (FBI) at San Jose International Airport (SJC) while preparing to board a flight to China, and charged with stealing proprietary information related to Apple's self-driving car project. In January 2019, another Apple employee was charged with stealing self-driving car project secrets. In July 2021, the United States Department of Justice (DOJ) accused Chinese security officials of coordinating a vast hacking campaign to steal sensitive and secret information from government entities, including research related to autonomous vehicles. For its part, China has already prepared "the Provisions on Management of Automotive Data Security (Trial)".
There is concern that the ability to leapfrog could be applied to autonomous car technology. Also, emerging Cellular V2X (cellular vehicle-to-everything) technologies are based on 5G wireless networks.
See also: Human factors and ergonomics
Developers of self-driving cars are already exploring the difficulties of determining the intentions of pedestrians, bicyclists, and animals; models of their behaviour must be programmed into driving algorithms. Human road users face the corresponding challenge of determining the intentions of autonomous vehicles, where there is no driver with whom to make eye contact or exchange hand signals. Drive.ai is testing a solution to this problem that involves LED signs mounted on the outside of the vehicle, announcing status such as "going now, don't cross" vs. "waiting for you to cross".
Handover and risk compensation
Two human-factor challenges are important for safety. One is the handover from automated driving to manual driving. Human factors research on automated systems has shown that people are slow to detect a problem with automation and slow to understand the problem after it is detected. When automation failures occur, unexpected transitions that require a driver to take over will occur suddenly and the driver may not be ready to take over.
The second challenge is known as risk compensation: as a system is perceived to be safer, instead of benefiting entirely from all of the increased safety, people engage in riskier behaviour and enjoy other benefits. Semi-automated cars have been shown to suffer from this problem, for example with users of Tesla Autopilot ignoring the road and using electronic devices or other activities against the advice of the company that the car is not capable of being completely autonomous. In the near future, pedestrians and bicyclists may travel in the street in a riskier fashion if they believe self-driving cars are capable of avoiding them.
In order for people to buy self-driving cars and vote for governments to allow them on roads, the technology must be trusted as safe. Self-driving elevators were invented in 1900, but the high number of people refusing to use them slowed adoption for several decades, until operator strikes increased demand and trust was built with advertising and features like the emergency stop button. There are three types of trust between human and automation: dispositional trust, the trust between the driver and the company's product; situational trust, which arises from different scenarios; and learned trust, which is built up across similar events.
See also: Machine ethics
Rationale for liability
There are different opinions on who should be held liable in case of a crash, especially when people are hurt. One study suggests requiring the owners of self-driving cars to sign end-user license agreements (EULAs) assigning them accountability for any accidents. Other studies suggest introducing a tax or insurance schemes that would protect owners and users of automated vehicles from claims made by victims of an accident. Other parties that could be held responsible in case of a technical failure include the software engineers who programmed the code for the automated operation of the vehicles, and the suppliers of components of the AV.
Implications from the Trolley Problem
A moral dilemma that a software engineer or car manufacturer might face in programming the operating software of a self-driving vehicle is captured in a variation of the traditional ethical thought experiment, the trolley problem: an AV is driving with passengers when suddenly a person appears in its way, and the car must choose between two options, either run the person over or avoid hitting the person by swerving into a wall, killing the passengers. Researchers have suggested two ethical theories in particular as applicable to the behaviour of automated vehicles in cases of emergency: deontology and utilitarianism. Deontological theory suggests that an automated car must follow strict, written-out rules in any situation. Utilitarianism, on the other hand, promotes maximizing the number of people surviving a crash. Critics suggest that automated vehicles should adopt a mix of multiple theories to be able to respond in a morally right way in the instance of a crash. Recently, specific ethical frameworks (utilitarianism, deontology, relativism, absolutism (monism), and pluralism) have been investigated empirically with respect to the acceptance of self-driving cars in unavoidable accidents.
According to research, people overwhelmingly say they want autonomous vehicles to be programmed with utilitarian ideas, that is, to generate the least harm and minimize driving casualties. This presents a paradox: people prefer that others drive utilitarian vehicles designed to maximize the number of lives preserved in a fatal situation, but themselves want to ride in cars that prioritize the safety of their passengers at all costs. Accordingly, people disapprove of regulations that mandate utilitarian behaviour and would be less willing to purchase a self-driving car that might sacrifice its passengers for the greater good.
Bonnefon et al. concluded that regulating the ethical prescriptions of autonomous vehicles may be counterproductive to societal safety: if the government mandates utilitarian ethics while people prefer to ride in self-protective cars, the mandate could prevent the large-scale adoption of self-driving cars. Delaying that adoption undermines the safety of society as a whole, because the technology is projected to save many lives.
Privacy-related issues arise mainly from the interconnectivity of automated cars, which makes the car just another mobile device able to gather information about an individual (see data mining). This information gathering ranges from tracking the routes taken and recording voice and video, to logging in-car media preferences and behavioural patterns, among many other streams of information. The data and communications infrastructure needed to support these vehicles may also be capable of surveillance, especially if coupled with other data sets and advanced analytics.
The testing of vehicles with varying degrees of automation can be carried out physically, in a closed environment or, where permitted, on public roads (typically requiring a license or permit, or adherence to a specific set of operating principles), or virtually, using computer simulations. When driven on public roads, automated vehicles require a person to monitor their proper operation and "take over" when needed. For example, New York state has strict requirements for the test driver, such that the vehicle must be correctable at all times by a licensed operator, as highlighted by Cardian Cube Company's application and discussions with New York State officials and the NYS DMV.
In California, self-driving car manufacturers are required to submit annual reports disclosing how often their vehicles disengaged from autonomous mode during tests. The frequency of these "disengagements" has been treated as a measure of how reliable the vehicles are becoming.
In 2017, Waymo reported 63 disengagements over 352,545 mi (567,366 km) of testing, an average distance of 5,596 mi (9,006 km) between disengagements, the highest among companies reporting such figures. Waymo also travelled a greater total distance than any of the other companies. Their 2017 rate of 0.18 disengagements per 1,000 mi (1,600 km) was an improvement over the 0.2 disengagements per 1,000 mi (1,600 km) in 2016, and 0.8 in 2015. In March 2017, Uber reported an average of just 0.67 mi (1.08 km) per disengagement. In the final three months of 2017, Cruise (now owned by GM) averaged 5,224 mi (8,407 km) per disengagement over a total distance of 62,689 mi (100,888 km). In July 2018, the first electric driver-less racing car, "Robocar", completed a 1.8-kilometer track, using its navigation system and artificial intelligence.
|Car maker||California, 2016||California, 2018||California, 2019|
||Distance between disengagements||Total distance traveled||Distance between disengagements||Total distance traveled||Distance between disengagements||Total distance traveled|
|Waymo||5,128 mi (8,253 km)||635,868 mi (1,023,330 km)||11,154 mi (17,951 km)||1,271,587 mi (2,046,421 km)||11,017 mi (17,730 km)||1,450,000 mi (2,330,000 km)|
|BMW||638 mi (1,027 km)||638 mi (1,027 km)|
|Nissan||263 mi (423 km)||6,056 mi (9,746 km)||210 mi (340 km)||5,473 mi (8,808 km)|
|Ford||197 mi (317 km)||590 mi (950 km)|
|General Motors||55 mi (89 km)||8,156 mi (13,126 km)||5,205 mi (8,377 km)||447,621 mi (720,376 km)||12,221 mi (19,668 km)||831,040 mi (1,337,430 km)|
|Aptiv||15 mi (24 km)||2,658 mi (4,278 km)|
|Tesla||3 mi (4.8 km)||550 mi (890 km)|
|Mercedes-Benz||2 mi (3.2 km)||673 mi (1,083 km)||1.5 mi (2.4 km)||1,749 mi (2,815 km)|
|Bosch||7 mi (11 km)||983 mi (1,582 km)|
|Zoox||1,923 mi (3,095 km)||30,764 mi (49,510 km)||1,595 mi (2,567 km)||67,015 mi (107,850 km)|
|Nuro||1,028 mi (1,654 km)||24,680 mi (39,720 km)||2,022 mi (3,254 km)||68,762 mi (110,662 km)|
|Pony.ai||1,022 mi (1,645 km)||16,356 mi (26,322 km)||6,476 mi (10,422 km)||174,845 mi (281,386 km)|
|Baidu (Apolong)||206 mi (332 km)||18,093 mi (29,118 km)||18,050 mi (29,050 km)||108,300 mi (174,300 km)|
|Aurora||100 mi (160 km)||32,858 mi (52,880 km)||280 mi (450 km)||39,729 mi (63,938 km)|
|Apple||1.1 mi (1.8 km)||79,745 mi (128,337 km)||118 mi (190 km)||7,544 mi (12,141 km)|
|Uber||0.4 mi (0.64 km)||26,899 mi (43,290 km)||0 mi (0 km)|
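The per-mile figures reported above follow from a simple ratio between total miles driven and the number of disengagements. A minimal sketch, using Waymo's 2017 numbers from the text:

```python
def disengagement_stats(total_miles: float, disengagements: int):
    """Return (miles between disengagements, disengagements per 1,000 miles)."""
    miles_between = total_miles / disengagements
    per_1000_mi = disengagements * 1000 / total_miles
    return miles_between, per_1000_mi

# Waymo, California, 2017: 63 disengagements over 352,545 mi of testing.
miles_between, rate = disengagement_stats(352_545, 63)
print(f"{miles_between:,.0f} mi between disengagements")  # → 5,596 mi
print(f"{rate:.2f} disengagements per 1,000 mi")          # → 0.18
```

Note that, as the controversy discussed below illustrates, the ratio is only as comparable as the reporting companies' definitions of a "disengagement".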
In April 2021, WP.29 GRVA issued the master document on the "New Assessment/Test Method for Automated Driving (NATM)".
In October 2021, Europe's comprehensive pilot test of automated driving on public roads, L3Pilot, demonstrated automated systems for cars in Hamburg, Germany, in conjunction with the ITS World Congress 2021. SAE Level 3 and 4 functions were tested on ordinary roads. At the end of February 2022, the final results of the L3Pilot project were published.
In November 2021, the California Department of Motor Vehicles (DMV) notified Pony.ai that it was suspending its driverless testing permit following a reported collision in Fremont on October 28. The incident stands out because the vehicle was in autonomous mode and no other vehicle was involved. In May 2022, the DMV revoked Pony.ai's permit for failing to monitor the driving records of the safety drivers on its testing permit.
As of 2022, "disengagements" are at the center of controversy: reporting companies use varying definitions of what qualifies as a disengagement, and those definitions can change over time.
In April 2022, it was reported that a Cruise test vehicle had blocked a fire engine responding to an emergency call, sparking questions about autonomous vehicles' ability to handle unexpected roadway situations.
Main article: Autonomous truck
Further information: Online food ordering
Companies such as Otto and Starsky Robotics have focused on autonomous trucks. Automation of trucks is important not only for the improved safety of these very heavy vehicles, but also for the potential fuel savings from platooning. Autonomous vans are being developed for use by online grocers such as Ocado.
Research has also indicated that goods distribution at the macro level (urban distribution) and micro level (last-mile delivery) could be made more efficient with autonomous vehicles, thanks to the possibility of smaller vehicle sizes.
China trialled its first automated public bus in Henan province in 2015, on a highway linking Zhengzhou and Kaifeng. Baidu and King Long produce an automated minibus with 14 seats but no driver's seat. With 100 vehicles produced, 2018 was expected to be the first year of commercial automated service in China.
In Europe, cities in Belgium, France, Italy and the UK are planning to operate transport systems for automated cars, and Germany, the Netherlands, and Spain have allowed public testing in traffic. In 2015, the UK launched public trials of the LUTZ Pathfinder automated pod in Milton Keynes. Beginning in summer 2015, the French government allowed PSA Peugeot-Citroen to conduct trials in real conditions in the Paris area, with the experiments planned to be extended to other cities such as Bordeaux and Strasbourg by 2016. The alliance between the French companies Thales and Valeo (provider of the first self-parking system, which equips premium Audi and Mercedes models) is testing its own system. New Zealand is planning to use automated vehicles for public transport in Tauranga and Christchurch.
See also: Tesla Autopilot § Fatal crashes
As of November 2021, Tesla's advanced driver-assistance system (ADAS) Autopilot is classified as Level 2.
On 20 January 2016, the first of five known fatal crashes of a Tesla with Autopilot occurred, in China's Hubei province. According to China's 163.com news channel, this marked "China's first accidental death due to Tesla's automatic driving (system)". Initially, Tesla pointed out that the vehicle was so badly damaged from the impact that its recorder could not conclusively prove that the car had been on Autopilot at the time; however, 163.com pointed out that other factors, such as the car's failure to take any evasive action prior to the high-speed crash and the driver's otherwise good driving record, seemed to indicate a strong likelihood that the car was on Autopilot at the time. A similar fatal crash occurred four months later in Florida. In 2018, in a subsequent civil suit between the father of the driver killed and Tesla, Tesla did not deny that the car had been on Autopilot at the time of the accident, and sent evidence to the victim's father documenting that fact.
The second known fatal accident involving a self-driven vehicle took place in Williston, Florida on 7 May 2016 while a Tesla Model S electric car was engaged in Autopilot mode. The occupant was killed in a crash with an 18-wheel tractor-trailer. On 28 June 2016, the US National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident, working with the Florida Highway Patrol. According to NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled-access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer. NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, covering a population of an estimated 25,000 Model S cars. On 8 July 2016, NHTSA requested that Tesla Motors provide the agency with detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned update schedule for the next four months.
According to Tesla, "neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied". The car attempted to drive at full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S". Tesla also stated that this was its first known Autopilot death in over 130 million miles (210 million kilometers) driven by its customers with Autopilot engaged; in making this claim, however, Tesla apparently declined to acknowledge claims that the January 2016 fatality in Hubei, China had also resulted from an Autopilot error. According to Tesla, there is a fatality every 94 million miles (151 million kilometers) among all types of vehicles in the US; however, that figure also includes fatalities from crashes involving, for instance, motorcyclists and pedestrians.
In July 2016, the US National Transportation Safety Board (NTSB) opened a formal investigation into the fatal accident that occurred while Autopilot was engaged. The NTSB is an investigative body that has the power to make only policy recommendations. An agency spokesman said, "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible." In January 2017, NHTSA released a report concluding that Tesla was not at fault; the investigation found that for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed.
In 2021, the NTSB Chair called on Tesla to change the design of its Autopilot to ensure it cannot be misused by drivers, in a letter sent to the company's CEO.
See also: Waymo § Crashes
Waymo originated as a self-driving car project within Google. In August 2012, Google announced that their vehicles had completed over 300,000 automated-driving miles (500,000 km) accident-free, typically involving about a dozen cars on the road at any given time, and that they were starting to test with single drivers instead of in pairs. In late May 2014, Google revealed a new prototype that had no steering wheel, gas pedal, or brake pedal, and was fully automated. As of March 2016, Google had test-driven their fleet in automated mode a total of 1,500,000 mi (2,400,000 km). In December 2016, Google announced that its technology would be spun off to a new company called Waymo, with both Google and Waymo becoming subsidiaries of the parent company Alphabet.
According to Google's accident reports as of early 2016, their test cars had been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car's software caused a crash.
In June 2015, Sergey Brin confirmed that 12 vehicles had suffered collisions as of that date. Eight involved rear-end collisions at a stop sign or traffic light, two were side-swipes by another driver, one involved another driver rolling through a stop sign, and one occurred while a Google employee was controlling the car manually. In July 2015, three Google employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake at a traffic light. This was the first time that a collision resulted in injuries. On 14 February 2016, a Google vehicle attempted to avoid sandbags blocking its path. During the maneuver it struck a bus. Google stated, "In this case, we clearly bear some responsibility, because if our car hadn't moved, there wouldn't have been a collision." Google characterized the crash as a misunderstanding and a learning experience. No injuries were reported in the crash.
See also: Uber § Former operations
In March 2018, Elaine Herzberg became the first pedestrian known to have been killed by a self-driving car in the United States, after being hit by an Uber vehicle in Tempe, Arizona. Herzberg was crossing outside of a crosswalk, approximately 400 feet from an intersection. The incident raised questions about regulation of the burgeoning self-driving car industry, and some experts say a human driver could have avoided the fatal crash. Arizona Governor Doug Ducey later suspended the company's ability to test and operate its automated cars on public roadways, citing an "unquestionable failure" of the expectation that Uber make public safety its top priority. Uber pulled out of all self-driving-car testing in California as a result of the accident. On 24 May 2018, the US National Transportation Safety Board issued a preliminary report.
In September 2020, the backup driver was charged with negligent homicide, because she had not watched the road for several seconds while streaming The Voice on Hulu. Uber did not face criminal charges, because in the US there was no basis for criminal liability of the corporation. The driver was considered responsible for the accident, because she was in the driver's seat and in a position to avoid it (as in a Level 3 system). The trial was scheduled for February 2021.
On 9 November 2017, a Navya Arma automated self-driving bus with passengers was involved in a crash with a truck. The truck was found to be at fault, having reversed into the stationary automated bus. The automated bus did not take evasive action or apply defensive driving techniques such as flashing its headlights or sounding the horn. As one passenger commented, "The shuttle didn't have the ability to move back. The shuttle just stayed still."
On 12 August 2021, a 31-year-old Chinese man was killed after his NIO ES8 collided with a construction vehicle. NIO's self-driving feature was still in beta and could not yet deal with static obstacles. Though the vehicle's manual clearly states that the driver must take over when nearing construction sites, the issue is whether the feature was improperly marketed and unsafe. Lawyers for the deceased's family have also called into question NIO's private access to the vehicle data, which they argue could allow the data to be tampered with.
On 26 August 2021, a Toyota e-Palette, a mobility vehicle used to support transport within the Athletes' Village at the Tokyo 2020 Olympic and Paralympic Games, collided with a visually impaired pedestrian about to use a pedestrian crossing. Operations were suspended after the accident and resumed on 31 August with improved safety measures.
In a 2011 online survey of 2,006 US and UK consumers by Accenture, 49% said they would be comfortable using a "driverless car".
A 2012 survey of 17,400 vehicle owners by J.D. Power and Associates found 37% initially said they would be interested in purchasing a "fully autonomous car". However, that figure dropped to 20% if told the technology would cost US$3,000 more.
In a 2012 survey of about 1,000 German drivers by automotive researcher Puls, 22% of the respondents had a positive attitude towards these cars, 10% were undecided, 44% were sceptical and 24% were hostile.
A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% "stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver", with Brazil, India and China the most willing to trust automated technology.
In a 2014 US telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an automated car was available instead.
In a February 2015 survey of top auto journalists, 46% predicted that either Tesla or Daimler would be the first to the market with a fully autonomous vehicle, while (at 38%) Daimler was predicted to be the most functional, safe, and in-demand autonomous vehicle.
In 2015, a questionnaire survey by Delft University of Technology explored the opinions of 5,000 people from 109 countries on automated driving. Results showed that respondents, on average, found manual driving the most enjoyable mode of driving. 22% of the respondents did not want to spend any money on a fully automated driving system. Respondents were most concerned about software hacking/misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicle transmitting data. The survey also found that 37% of surveyed current owners were "definitely" or "probably" interested in purchasing an automated car.
In 2016, a survey in Germany examined the opinion of 1,603 people, who were representative in terms of age, gender, and education for the German population, towards partially, highly, and fully automated cars. Results showed that men and women differ in their willingness to use them. Men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. The gender difference towards anxiety was especially pronounced between young men and women but decreased with participants' age.
In 2016, a PwC survey of 1,584 people in the United States found that "66 percent of respondents said they think autonomous cars are probably smarter than the average human driver". People were still worried about safety, particularly about the possibility of the car being hacked. Nevertheless, only 13% of the interviewees saw no advantages in this new kind of car.
In 2017, Pew Research Center surveyed 4,135 US adults from 1–15 May and found that many Americans anticipate significant impacts from various automation technologies in the course of their lifetimes—from the widespread adoption of automated vehicles to the replacement of entire job categories with robot workers.
In 2019, results from two opinion surveys of 54 and 187 US adults respectively were published. A new standardised questionnaire, the autonomous vehicle acceptance model (AVAM) was developed, including additional description to help respondents better understand the implications of different automation levels. Results showed that users were less accepting of high autonomy levels and displayed significantly lower intention to use highly autonomous vehicles. Additionally, partial autonomy (regardless of level) was perceived as requiring uniformly higher driver engagement (usage of hands, feet and eyes) than full autonomy.
See also: Regulation of algorithms
The Geneva Convention on Road Traffic, subscribed to by over 100 countries worldwide, requires the driver to be 18 years old.
The 1968 Vienna Convention on Road Traffic, subscribed to by 83 countries worldwide, establishes principles to govern traffic laws. One of the fundamental principles of the convention had been the concept that a driver is always fully in control of and responsible for the behaviour of a vehicle in traffic. In 2016, a reform of the convention opened possibilities for automated features in ratified countries.
In January 2021, a proposed amendment to Article 1 and a new Article 34 bis of the 1968 Vienna Convention on Road Traffic were transmitted for a one-year acceptance period. By 14 January 2022, the amendment had been accepted, and it was due to enter into force on 14 July 2022.
In February 2018, UNECE's Inland Transport Committee (ITC) acknowledged the importance of WP.29 activities related to automated, autonomous and connected vehicles and requested WP.29 to consider establishing a dedicated Working Party. Following the request, WP.29, at its June 2018 session, decided to convert the Working Party on Brakes and Running Gear (GRRF) into a new Working Party on Automated/Autonomous and Connected Vehicles (GRVA).
In June 2020, a WP.29 virtual meeting approved reports from GRVA on its fifth session, on "automated/autonomous and connected vehicles", and its sixth session, on "cyber security and software updates". The new regulation on cyber security was allocated as Regulation 155, and the new regulation on software updates as Regulation 156. In this way, a UN regulatory basis for Level 3 was established.
In March 2021, related UNECE regulations were published. On 22 January 2022, these regulations came into effect in some countries.
In June 2022, the 187th session of WP.29 was held, and several amendments to GRVA regulations were agreed, reflecting Japan's continued leadership as chair and as a proposing national body. In the case of Regulation 157 on Automated Lane Keeping Systems (ALKS), the speed limit of 60 km/h is to be raised to 130 km/h, and a lane-changing function is to be added for passenger cars.
Japan is a non-signatory country to the Vienna Convention. In 2019, Japan amended two laws, the "Road Traffic Act" and the "Road Transport Vehicle Act", which came into effect in April 2020. The former allows Level 3 self-driving cars on public roads; the latter legally defines the type-designation and safety-certification process for the Level 3 self-driving functions of an Automated Driving System (ADS). Throughout the amendment process, the achievements of the national project "SIP-adus", led by the Cabinet Office since 2014, were fully considered and accepted.
In May 2020, the "Road Act" was also amended, to include a definition of automatic operation control equipment as part of road infrastructure, and came into effect. In July 2020, the next-stage national roadmap, which considered the social deployment and acceptability of Level 4, was officially issued. At the end of 2020, the Ministry of Land, Infrastructure, Transport and Tourism (MLIT) amended its "Safety Regulation for Road Transport Vehicle" to promptly and consistently reflect the finalized UNECE WP.29 GRVA regulations.
In April 2021, the National Police Agency (NPA) published its expert committee's FY 2020 report summarizing the issues in research toward realizing Level 4 mobility services, including the legal amendments required. During the summer of 2021, the Ministry of Economy, Trade and Industry (METI) prepared with MLIT to launch the "RoAD to the L4" project, covering both R&D and social deployment to realize acceptable Level 4 mobility services, and updated its public information in September. As part of this project, civil-law liability reflecting the changed roles is to be clarified.
Regarding misleading representation in marketing, Article 5 of the "Act against Unjustifiable Premiums and Misleading Representations" applies.
At the end of 2021, the NPA proposed an amendment bill to the "Road Traffic Act" to include an approval scheme for Level 4 services. In March 2022, the Japanese government adopted the bill to amend the act. In April 2022, the bill was deliberated at the ordinary National Diet session and passed. Under the amended act, a license system will be introduced for operators of transport services using unmanned Level 4 vehicles, which require no driver in the remotely monitored vehicle within a limited area. Such vehicles are expected to serve residents in depopulated areas.
In the United States, a non-signatory country to the Vienna Convention, state vehicle codes generally did not envisage (but did not necessarily prohibit) highly automated vehicles as of 2012. To clarify the legal status of such vehicles and otherwise regulate them, several states have enacted or are considering specific laws. By 2016, seven states (Nevada, California, Florida, Michigan, Hawaii, Washington, and Tennessee), along with the District of Columbia, had enacted laws for automated vehicles. Incidents such as the first fatal accident involving Tesla's Autopilot system have led to discussion about revising laws and standards for automated cars.
In 2017, the Republican-controlled House of Representatives unanimously passed "SELF DRIVE Act" which would speed the adoption of self-driving cars and bar states from setting performance standards. However, a complementary bill in the Senate, "AV START", failed to pass after Democrats raised objections that it didn't do enough to address safety and liability concerns. A comprehensive regulatory structure has not yet emerged at either the federal or state level in the United States.
In September 2016, the US National Economic Council and US Department of Transportation (USDOT) released the Federal Automated Vehicles Policy, a set of standards describing how automated vehicles should react if their technology fails, how passenger privacy should be protected, and how riders should be protected in the event of an accident. The new federal guidelines are meant to avoid a patchwork of state laws, while avoiding being so overbearing as to stifle innovation. Since then, USDOT has released multiple updates.
The National Highway Traffic Safety Administration (NHTSA) released for public comment the Occupant Protection for Automated Driving System on 30 March 2020, followed by the Framework for Automated Driving System Safety on 3 December 2020. Occupant Protection is intended to modernize the Federal Motor Vehicle Safety Standards considering the removal of manual controls with automated driving systems, while the Framework document is intended to provide an objective way to define and assess automated driving system competence to ensure motor vehicle safety while also remaining flexible to accommodate the development of features to improve safety.
Historically, a vehicle without driving controls such as a steering wheel, accelerator pedal, and brake pedal would not be in compliance with the Federal Motor Vehicle Safety Standards (FMVSS), the minimum safety equipment needed to legally sell a vehicle to the public. On 10 March 2022, NHTSA updated and finalized the rule on safety requirements for the Occupant Protection which now allows a vehicle without driving controls to comply with US regulations.
As of April 2022, 38 states have laws or executive orders related to autonomous vehicles.
In June 2011, the Nevada Legislature passed a law to authorize the use of automated cars. Nevada thus became the first jurisdiction in the world where automated vehicles might be legally operated on public roads. According to the law, the Nevada Department of Motor Vehicles is responsible for setting safety and performance standards and the agency is responsible for designating areas where automated cars may be tested. This legislation was supported by Google in an effort to legally conduct further testing of its Google driver-less car. The Nevada law defines an automated vehicle to be "a motor vehicle that uses artificial intelligence, sensors and global positioning system coordinates to drive itself without the active intervention of a human operator". The law also acknowledges that the operator will not need to pay attention while the car is operating itself. Google had further lobbied for an exemption from a ban on distracted driving to permit occupants to send text messages while sitting behind the wheel, but this did not become law. Furthermore, Nevada's regulations require a person behind the wheel and one in the passenger's seat during tests.
In April 2012, Florida became the second state to allow the testing of automated cars on public roads.
California became the third state to allow automated car testing when Governor Jerry Brown signed SB 1298 into law in September 2012 at Google Headquarters in Mountain View.
On 19 February 2016, Assembly Bill 2866 was introduced in California; it would allow automated vehicles to operate on public roads, including those without a driver, steering wheel, accelerator pedal, or brake pedal. The bill states that the California Department of Motor Vehicles would need to comply with these regulations by 1 July 2018 for the rules to take effect. As of November 2016, the bill had yet to pass its house of origin. California published discussions on the proposed federal automated vehicles policy in October 2016.
In December 2016, the California Department of Motor Vehicles ordered Uber to remove its self-driving vehicles from the road in response to two red-light violations. Uber immediately blamed the violations on human error and suspended the drivers.
In Washington, DC's district code:
"Autonomous vehicle" means a vehicle capable of navigating District roadways and interpreting traffic-control devices without a driver actively operating any of the vehicle's control systems. The term "autonomous vehicle" excludes a motor vehicle enabled with active safety systems or driver-assistance systems, including systems to provide electronic blind-spot assistance, crash avoidance, emergency braking, parking assistance, adaptive cruise control, lane-keep assistance, lane-departure warning, or traffic-jam and queuing assistance, unless the system alone or in combination with other systems enables the vehicle on which the technology is installed to drive without active control or monitoring by a human operator.
In the same district code, it is considered that:
An autonomous vehicle may operate on a public roadway; provided, that the vehicle:
- (1) Has a manual override feature that allows a driver to assume control of the autonomous vehicle at any time;
- (2) Has a driver seated in the control seat of the vehicle while in operation who is prepared to take control of the autonomous vehicle at any moment; and
- (3) Is capable of operating in compliance with the District's applicable traffic laws and motor vehicle laws and traffic control devices.
Michigan and others
In December 2013, Michigan became the fourth state to allow testing of driver-less cars on public roads. In July 2014, the city of Coeur d'Alene, Idaho adopted a robotics ordinance that includes provisions to allow for self-driving cars.
In 2013, the government of the United Kingdom permitted the testing of automated cars on public roads. Before this, all testing of robotic vehicles in the UK had been conducted on private property.
In July 2018, "The Automated and Electric Vehicles Act 2018" received royal assent.
In March 2019, the UK became a signatory country to the Vienna Convention.
In 2021, the UK worked on a bill to allow self-driving automated lane keeping systems (ALKS) up to 37 mph (60 km/h), after a mixed reaction from experts during the consultation launched in summer 2020. The system would be allowed to hand control back to the driver when "unplanned events" such as road construction or inclement weather occur. The Centre for Connected and Autonomous Vehicles (CCAV) asked the Law Commission of England and Wales and the Scottish Law Commission to undertake a far-reaching review of the legal framework for "automated" vehicles and their use as part of public transport networks and on-demand passenger services. The teams developed policy, and the full analysis report was published in January 2022.
To address misleading representation in marketing, the Society of Motor Manufacturers and Traders (SMMT) published a set of guiding principles.
In April 2022, the UK government confirmed planned changes to "The Highway Code" in response to a public consultation. The changes clarify drivers' responsibilities in self-driving vehicles, including when a driver must be ready to take back control.
In November 2019, Regulation (EU) 2019/2144 of the European Parliament and of the Council on motor vehicle type approval requirements defined specific requirements relating to automated vehicles and fully automated vehicles. This law is applicable from 2022 and is based on uniform procedures and technical specifications for the systems and other items.
In April 2022, the EU released a draft version of its legislation for vehicles with automated driving systems (ADS).
France is a signatory country to the Vienna Convention. In 2014, the government of France announced that testing of automated cars on public roads would be allowed in 2015; 2,000 km of road would be opened across the national territory, notably in Bordeaux, Isère, Île-de-France and Strasbourg. At the 2015 ITS World Congress, a conference dedicated to intelligent transport systems, the first demonstration of automated vehicles on open roads in France was carried out in Bordeaux in early October 2015.
In May 2018, the government published the first version of the French strategy for the development of automated road mobility to set up the legislative framework, which resulted in "The Mobility Orientation Law" of December 2019.
In December 2020, the government updated the strategy to make France the preferred location in Europe for the deployment of automated road mobility services.
The legislative and regulatory framework for the deployment of automated vehicles and transport systems was established through an ordinance in April 2021 and a subsequent decree in June 2021. The legislative and regulatory framework for the operation of automated vehicles resulting from article 31 of "the Mobility Orientation Law" is scheduled to be finalized in the first quarter of 2022.
Germany is a signatory country to the Vienna Convention. In July 2021, the "Federal Act Amending the Road Traffic Act and the Compulsory Insurance Act" came into effect. The Act allows motor vehicles with autonomous driving capabilities, meaning vehicles that can perform driving tasks independently without a person driving, in specified operating areas on public roads. Provisions about autonomous driving in appropriate operating areas correspond to Level 4. The new German legislation also has major implications for dilemmatic situations, including a non-discrimination principle that applies to unavoidable crash situations. In addition, the act elaborates on the technical requirements of autonomous vehicles: the vehicle must have a software system that can operate without permanent supervision by the technical oversight or driver, contain an accident mitigation and reduction system, and be able to initiate a "minimal-risk state".
In February 2022, the Federal Ministry for Digital and Transport (BMDV) submitted the "Ordinance on the Approval and Operation of Motor Vehicles with Autonomous Driving Functions in Specified Operating Areas - Autonomous Vehicles Approval and Operation Ordinance (AFGBV)" to the German Bundesrat for approval.
Canada is a non-signatory country to the Vienna Convention. At the federal level, motor vehicles are regulated by "The Motor Vehicle Safety Act", which was last amended in February 2020.
In August 2021, Transport Canada released the "Guidelines for Testing Automated Driving Systems in Canada" Version 2.0.
For historical reasons, China is not a signatory country to the 1949 Geneva Convention, although it is a signatory country to the 1968 Vienna Convention. Legislation is enacted by the National People's Congress and its Standing Committee, which are under the control of the Chinese Communist Party.
In 2018, China introduced testing regulations to regulate autonomous cars, for conditional automation, high-level automation and full automation (roughly corresponding to Level 3, Level 4 and Level 5). The rules lay out requirements that vehicles must first be tested in non-public zones, that road tests can only be on designated streets and that a qualified person must always sit in the driver's position, ready to take over control.
In February 2020, eleven constituent departments, led by the National Development and Reform Commission (NDRC), jointly issued the "Strategy for Innovation and Development of Intelligent Vehicles", which lays out a roadmap through 2025. The plan states the need to revise the "Road Traffic Safety Law" and the surveying and mapping law for intelligent vehicles.
In March 2020, the Ministry of Industry and Information Technology (MIIT) published a draft GB/T national standard defining a six-level classification framework for driving automation, broadly corresponding to the SAE levels. In April 2020, MIIT announced its goal for the year: completing the formulation of a framework for driving-assist functions and low-level autonomous driving (Level 3). In January 2021, MIIT planned to add highways to the list of roads where provincial and city-level authorities can authorize automated cars.
In March 2021, the Ministry of Public Security (MPS) published draft proposed amendments to the "Road Traffic Safety Law". In August 2021, the Cyberspace Administration of China (CAC) and MIIT issued "the Provisions on Management of Automotive Data Security (Trial)". In September 2021, the "Data Security Law" came into effect, broadly expanding the extraterritorial reach of China's existing data rules.
In February 2022, MIIT issued the second draft of the "Administrative Measures for Data Security in the Industry and Information Technology Fields". In March 2022, MIIT issued "the Guidelines for the Construction of the Internet of Vehicles Cybersecurity and Data Security Standard System".
Australia is a non-signatory country to the Vienna Convention. The National Transport Commission (NTC) is in charge of reforming current laws while maintaining national-level consistency. In February 2022, the NTC published a policy paper presenting proposals for an end-to-end regulatory framework for the commercial deployment of automated vehicles.
The Department of Infrastructure, Transport, Regional Development and Communications (DITRDC) is in charge of developing policies and bills on first-supply and in-service automated vehicle law.
New Zealand is a non-signatory country to the Vienna Convention. New Zealand legislation does not specifically require a driver to be present for a vehicle to be legally operated on a public road. However, most regulations and relevant international frameworks strongly imply the presence of a driver in the vehicle given that ‘automation’ was not a consideration at the time of drafting the legislation.
As of April 2022[update], the Ministry of Transport is working on an "Autonomous Vehicles Work Programme" with a "Long-term Insights Briefing (LTIB)" that will include legislative issues.
Israel is a signatory country to the Vienna Convention. As of April 2022[update], the Israel Innovation Authority (OCS) is working on a regulatory framework for trials and use of autonomous vehicles with the Ministry of Transport and Road Safety (MOT) and the Ministry of Justice.
In March 2022, the Knesset passed legislation that will allow companies to pilot autonomous shared transportation with passengers in the vehicle but without a safety driver on Israeli roads. The legislation allows companies and vehicle operators to obtain special licenses from MOT and to conduct trials with autonomous cars including for the purpose of transporting paying passengers and where an independent driving system replaces the driver.
Main article: Self-driving car liability
Self-driving car liability is a developing area of law and policy that will determine who is liable when an automated car causes physical damage to persons or breaks road rules. When automated cars shift the control of driving from humans to automated car technology, the driver will need to consent to share operational responsibility, which will require a legal framework. Existing liability laws may need to evolve in order to fairly identify the parties responsible for damage and injury, and to address the potential for conflicts of interest between human occupants, system operators, insurers, and the public purse.
Between manually driven vehicles (SAE Level 0) and fully autonomous vehicles (SAE Level 5), there are a variety of vehicle types that can be described to have some degree of automation. These are collectively known as semi-automated vehicles. As it could be a while before the technology and infrastructure are developed for full automation, it is likely that vehicles will have increasing levels of automation. These semi-automated vehicles could potentially harness many of the advantages of fully automated vehicles, while still keeping the driver in charge of the vehicle.
Tesla vehicles are equipped with hardware that Tesla claims will allow full self-driving in the future. In October 2020, Tesla released a "beta" version of its "Full Self-Driving" software to a small group of testers in the United States; however, this "Full Self-Driving" corresponds to Level 2 autonomy.
In 2017, BMW aimed to make the 7 Series an automated car on public urban motorways of the United States, Germany and Israel before commercializing it in 2021. Although this was not realized, BMW is still preparing the 7 Series to become the next manufacturer's car to reach Level 3, in the second half of 2022.
In September 2021, Stellantis presented findings from a pilot programme testing Level 3 autonomous vehicles on public Italian highways. Stellantis's Highway Chauffeur, which claims Level 3 capability, was tested on Maserati Ghibli and Fiat 500X prototypes. Stellantis plans to roll out Level 3 capability in its cars in 2024.
In January 2022, Polestar, a Volvo Cars brand, indicated its plan to offer a Level 3 autonomous driving system in the Polestar 3 SUV, the Volvo XC90 successor, with technologies from Luminar Technologies, Nvidia, and Zenseact.
As of February 2022[update], Hyundai Motor Company is enhancing the cybersecurity of its connected cars in order to put the Level 3 self-driving Genesis G90 on Korean roads.
In July 2020, Toyota started public demonstration rides on the Lexus LS (fifth generation)-based TRI-P4 with Level 4 capability. In August 2021, Toyota operated a potentially Level 4 service using the e-Palette around the Tokyo 2020 Olympic Village.
In September 2020, Mercedes-Benz introduced the world's first commercial Level 4 Automated Valet Parking (AVP) system, named Intelligent Park Pilot, for its new S-Class. The system can be pre-installed but is conditional on future national legal approval.
In September 2021, Honda started a testing programme toward the launch of a Level 4 mobility service business in Japan in collaboration with Cruise and General Motors, using the Cruise AV. In October 2021, at the World Congress on Intelligent Transport Systems, Honda announced that it was already testing Level 4 technology on a modified Legend Hybrid EX. At the end of the month, Honda explained that it was conducting a verification project on Level 4 technology on a test course in Tochigi prefecture. Honda plans to test on public roads in early 2022.
In February 2022, General Motors and Cruise petitioned NHTSA for permission to build and deploy the Cruise Origin, a self-driving vehicle without human controls such as steering wheels or brake pedals. The car was developed with GM and Cruise investor Honda, and its production is expected to begin in late 2022 in Detroit at GM's Factory Zero. As of April 2022[update], the petition is pending.
In April 2022, Honda unveiled its Level 4 mobility service partners, with plans to roll out the service in central Tokyo in the mid-2020s using the Cruise Origin.
Also in April 2022, Volkswagen started testing its autonomous ID. Buzz AD prototype with Argo AI on public roads. In May 2022, Argo AI started testing on public roads in Austin and Miami using a modified Ford Escape Hybrid.
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.