A self-driving car, also known as an autonomous car, driver-less car, or robotic car (robo-car),[1][2][3] is a car that is capable of traveling without human input.[4][5] Self-driving cars use sensors such as optical and thermographic cameras, radar, lidar, ultrasound/sonar, GPS, odometry and inertial measurement units to perceive their surroundings.[6] Control systems interpret this sensory information to create a three-dimensional model of the vehicle's surroundings. Based on the model, the car then identifies an appropriate navigation path and strategies for managing traffic controls (stop signs, etc.) and obstacles.[7][8][9][10][11]
Once the technology matures, autonomous vehicles are predicted to impact the automotive industry, health, welfare, urban planning, traffic, insurance, labor market, and other fields.
Autonomy in vehicles is often divided into six levels,[12] according to a system developed by SAE International (SAE J3016).[13] The SAE levels can be roughly understood as Level 0 – no automation; Level 1 – hands on/shared control; Level 2 – hands off; Level 3 – eyes off; Level 4 – mind off, and Level 5 – steering wheel optional.
As of December 2022[update], vehicles operating at Level 3 and above were an insignificant market factor. In December 2020, Waymo became the first service provider to offer driver-less taxi rides to the general public, in a part of Phoenix, Arizona. In March 2021, Honda was the first manufacturer to sell a legally approved Level 3 car.[14][15][16] Nuro began autonomous commercial delivery operations in California in 2021.[17] In December 2021, Mercedes-Benz received approval for a Level 3 car.[18] In February 2022, Cruise became the second service provider to offer driver-less taxi rides to the general public, in San Francisco.[19]
As of December 2022[update], several manufacturers had scaled back plans for self-driving technology, including Ford and Volkswagen.[20]
Waymo undergoing testing in the San Francisco Bay Area
Roborace autonomous racing car on display at the 2017 New York City ePrix
Main article: History of self-driving cars
Experiments have been conducted on automated driving systems (ADS) since at least the 1920s;[21] trials began in the 1950s. The first semi-automated car was developed in 1977 by Japan's Tsukuba Mechanical Engineering Laboratory; it required specially marked streets that were interpreted by two cameras on the vehicle and an analog computer. The vehicle reached speeds up to 30 kilometres per hour (19 mph) with the support of an elevated rail.[22][23]
A landmark autonomous car appeared in the 1980s, with Carnegie Mellon University's Navlab[24] and ALV[25][26] projects funded by the United States' Defense Advanced Research Projects Agency (DARPA) starting in 1984 and Mercedes-Benz and Bundeswehr University Munich's EUREKA Prometheus Project in 1987.[27] By 1985, the ALV had demonstrated self-driving speeds on two-lane roads of 31 kilometres per hour (19 mph), with obstacle avoidance added in 1986, and off-road driving in day and night time conditions by 1987.[28] A major milestone was achieved in 1995, with Carnegie Mellon University's Navlab 5 completing the first autonomous coast-to-coast drive of the United States. Of the 2,849 mi (4,585 km) between Pittsburgh, Pennsylvania and San Diego, California, 2,797 mi (4,501 km) were autonomous (98.2%), completed with an average speed of 63.8 mph (102.7 km/h).[29][30][31][32] From the 1960s through the second DARPA Grand Challenge in 2005, automated vehicle research in the United States was primarily funded by DARPA, the US Army, and the US Navy, yielding incremental advances in speeds, driving competence in more complex conditions, controls, and sensor systems.[33] Companies and research organizations have developed prototypes.[27][34][35][36][37][38][39][40][41]
The US allocated US$650 million in 1991 for research on the National Automated Highway System, which demonstrated automated driving through a combination of automation embedded in the highway with automated technology in vehicles, and cooperative networking between the vehicles and with the highway infrastructure. The programme concluded with a successful demonstration in 1997 but without clear direction or funding to implement the system on a larger scale.[42] Partly funded by the National Automated Highway System and DARPA, the Carnegie Mellon University Navlab drove 4,584 kilometres (2,848 mi) across America in 1995, 4,501 kilometres (2,797 mi) or 98% of it autonomously.[43] Navlab's record achievement stood unmatched for two decades until 2015, when Delphi improved it by piloting an Audi, augmented with Delphi technology, over 5,472 kilometres (3,400 mi) through 15 states while remaining in self-driving mode 99% of the time.[44] In 2015, the US states of Nevada, Florida, California, Virginia, and Michigan, together with Washington, DC, allowed the testing of automated cars on public roads.[45]
From 2016 to 2018, the European Commission funded an innovation strategy development for connected and automated driving through the Coordination Actions CARTRE and SCOUT.[46] Moreover, the Strategic Transport Research and Innovation Agenda (STRIA) Roadmap for Connected and Automated Transport was published in 2019.[47]
In November 2017, Waymo announced that it had begun testing driver-less cars without a safety driver in the driver position;[48] however, there was still an employee in the car.[49] An October 2017 report by the Brookings Institution found that $80 billion had been reported as invested in all facets of self driving technology up to that point, but that it was "reasonable to presume that total global investment in autonomous vehicle technology is significantly more than this".[50]
In October 2018, Waymo announced that its test vehicles had traveled in automated mode for over 10,000,000 miles (16,000,000 km), increasing by about 1,000,000 miles (1,600,000 kilometres) per month.[51] In December 2018, Waymo was the first to commercialize a fully autonomous taxi service in the US, in Phoenix, Arizona.[52] In October 2020, Waymo launched a geo-fenced driver-less ride hailing service in Phoenix.[53][54] The cars are being monitored in real-time by a team of remote engineers, and there are cases where the remote engineers need to intervene.[55][54]
In March 2019, ahead of the autonomous racing series Roborace, Robocar set the Guinness World Record for being the fastest autonomous car in the world. In pushing the limits of self-driving vehicles, Robocar reached 282.42 km/h (175.49 mph) – an average confirmed by the UK Timing Association at Elvington in Yorkshire, UK.[56]
In 2020, a National Transportation Safety Board chairman stated that no self-driving cars (SAE level 3+) were available for consumers to purchase in the US in 2020:
There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you're not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don't own a self-driving car.[57]
On 5 March 2021, Honda began leasing in Japan a limited edition of 100 Legend Hybrid EX sedans equipped with newly approved Level 3 automated driving equipment, whose "Traffic Jam Pilot" driving technology had been granted safety certification by the Japanese government, legally allowing drivers to take their eyes off the road.[14][15][58][16]
There is some inconsistency in the terminology used in the self-driving car industry. Various organizations have proposed to define an accurate and consistent vocabulary.
In 2014, such confusion was documented in SAE J3016 which states that "some vernacular usages associate autonomous specifically with full driving automation (Level 5), while other usages apply it to all levels of driving automation, and some state legislation has defined it to correspond approximately to any ADS [automated driving system] at or above Level 3 (or to any vehicle equipped with such an ADS)."
Modern vehicles provide features such as keeping the car within its lane, speed control, and emergency braking. On their own, these features are considered driver assistance technologies because they still require human driver control, whereas fully automated vehicles drive themselves without human driver input.
According to Fortune, some newer vehicles' technology names—such as AutonoDrive, PilotAssist, Full Self-Driving or DrivePilot—might confuse the driver, who may believe no driver input is expected when in fact the driver needs to remain involved in the driving task.[59] According to the BBC, confusion between those concepts leads to deaths.[60]
For this reason, some organizations such as the AAA try to provide standardized naming conventions for features such as ALKS (Automated Lane Keeping Systems), which are designed to manage the driving task but are not yet approved as automated vehicles in any country. The Association of British Insurers considers the use of the word autonomous in marketing for modern cars to be dangerous, because car ads make motorists think "autonomous" and "autopilot" mean a vehicle can drive itself, when such cars still rely on the driver to ensure safety. Technology able to drive a car is still in its beta stage.
Some car makers suggest or claim that vehicles are self-driving when they cannot manage some driving situations. Despite being called Full Self-Driving, Tesla has stated that its offering should not be considered a fully autonomous driving system.[61] This risks drivers becoming excessively confident and driving while distracted, leading to crashes. In Great Britain, a fully self-driving car is only a car registered on a specific list.[62] There have also been proposals to apply aviation automation safety knowledge to discussions of the safe implementation of autonomous vehicles, given the experience the aviation sector has gained on safety topics over the decades.[63]
According to the SMMT, "There are two clear states – a vehicle is either assisted with a driver being supported by technology or automated where the technology is effectively and safely replacing the driver."[64]
Autonomous means self-governing.[65] Many historical projects related to vehicle automation have been automated (made automatic) only with a heavy reliance on artificial aids in their environment, such as magnetic strips. Autonomous control implies satisfactory performance under significant uncertainties in the environment, and the ability to compensate for system failures without external intervention.[65]
One approach is to implement communication networks both in the immediate vicinity (for collision avoidance) and farther away (for congestion management). Such outside influences in the decision process reduce an individual vehicle's autonomy, while still not requiring human intervention.
As of 2017[update], most commercial projects focused on automated vehicles that did not communicate with other vehicles or with an enveloping management regime. Euro NCAP defines autonomous in "Autonomous Emergency Braking" as: "the system acts independently of the driver to avoid or mitigate the accident", which implies the autonomous system is not the driver.[66]
In Europe, the words automated and autonomous might be used together. For instance, Regulation (EU) 2019/2144 of the European Parliament and of the Council of 27 November 2019 on type-approval requirements for motor vehicles (...) defines "automated vehicle" and "fully automated vehicle" based on their autonomous capacity.[67]
In British English, the word automated alone might have several meanings, as in the sentence: "Thatcham also found that the automated lane keeping systems could only meet two out of the twelve principles required to guarantee safety, going on to say they cannot, therefore, be classed as 'automated driving', instead it claims the tech should be classed as 'assisted driving'."[68] The first occurrence of the word "automated" refers to a UNECE automated system, while the second occurrence refers to the British legal definition of an automated vehicle. British law interprets the meaning of "automated vehicle" based on the interpretation section relating to a vehicle "driving itself" and an insured vehicle.[69]
To enable a car to travel without any driver embedded within the vehicle, some companies use a remote driver.[70]
According to SAE J3016,
Some driving automation systems may indeed be autonomous if they perform all of their functions independently and self-sufficiently, but if they depend on communication and/or cooperation with outside entities, they should be considered cooperative rather than autonomous.
PC Magazine defines a self-driving car as "a computer-controlled car that drives itself".[71] The Union of Concerned Scientists states that self-driving cars are "cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as autonomous or "driver-less" cars, they combine sensors and software to control, navigate, and drive the vehicle."[72]
The British Automated and Electric Vehicles Act 2018 defines a vehicle as "driving itself" if the vehicle "is operating in a mode in which it is not being controlled, and does not need to be monitored, by an individual".[73]
Another British definition states: "Self-driving vehicles are vehicles that can safely and lawfully drive themselves."[74]
A classification system with six levels – ranging from fully manual to fully automated systems – was published in 2014 by standardization body SAE International as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems; the details are revised periodically.[13] This classification is based on the amount of driver intervention and attentiveness required, rather than the vehicle's capabilities, although these are loosely related. In the United States, the National Highway Traffic Safety Administration (NHTSA) released its own formal classification system in 2013. After SAE updated its classification in 2016, called J3016_201609,[77] NHTSA adopted the SAE standard,[78] and the SAE classification became widely accepted.[79] The SAE standard plays a major role but it has certain limitations.[80][81]
In SAE's automation level definitions, "driving mode" means "a type of driving scenario with characteristic dynamic driving task requirements (e.g., expressway merging, high speed cruising, low speed traffic jam, closed-campus operations, etc.)"[1][82]
In the formal SAE definition below, an important transition is from SAE Level 2 to SAE Level 3 in which the human driver is no longer expected to monitor the environment continuously. At SAE 3, the human driver still has responsibility to intervene when asked to do so by the automated system. At SAE 4 the human driver is always relieved of that responsibility and at SAE 5 the automated system will never need to ask for an intervention.
| SAE Level | Name | Narrative definition | Execution of steering and acceleration/deceleration | Monitoring of driving environment | Fallback performance of dynamic driving task | System capability (driving modes) |
|---|---|---|---|---|---|---|
| Human driver monitors the driving environment | | | | | | |
| 0 | No Automation | The full-time performance by the human driver of all aspects of the dynamic driving task, even when "enhanced by warning or intervention systems" | Human driver | Human driver | Human driver | n/a |
| 1 | Driver Assistance | The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | Human driver and system | Human driver | Human driver | Some driving modes |
| 2 | Partial Automation | The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task | System | Human driver | Human driver | Some driving modes |
| Automated driving system monitors the driving environment | | | | | | |
| 3 | Conditional Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene | System | System | Human driver | Some driving modes |
| 4 | High Automation | The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene (the car can pull over safely) | System | System | System | Many driving modes |
| 5 | Full Automation | The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver | System | System | System | All driving modes |
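The responsibility split in the table above can be summarized, purely for illustration, as a small lookup structure. The following Python sketch is not an SAE-published artifact; the class and function names are invented for this example.

```python
# Illustrative sketch: the SAE J3016 responsibility split from the table above,
# encoded as a simple lookup. Names and structure are this example's own.
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    name: str
    steering_and_speed: str   # execution of steering and acceleration/deceleration
    monitoring: str           # monitoring of the driving environment
    fallback: str             # fallback performance of the dynamic driving task
    driving_modes: str        # system capability

SAE_LEVELS = {
    0: SaeLevel("No Automation", "human driver", "human driver", "human driver", "n/a"),
    1: SaeLevel("Driver Assistance", "human driver and system", "human driver", "human driver", "some"),
    2: SaeLevel("Partial Automation", "system", "human driver", "human driver", "some"),
    3: SaeLevel("Conditional Automation", "system", "system", "human driver", "some"),
    4: SaeLevel("High Automation", "system", "system", "system", "many"),
    5: SaeLevel("Full Automation", "system", "system", "system", "all"),
}

def must_human_monitor(level: int) -> bool:
    """True when the human driver is still responsible for monitoring the environment."""
    return SAE_LEVELS[level].monitoring == "human driver"

print(must_human_monitor(2))  # True  - hands off, but eyes on
print(must_human_monitor(3))  # False - the system monitors; the driver is the fallback
```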
The SAE Automation Levels have been criticized for their technological focus. It has been argued that the structure of the levels suggests that automation increases linearly and that more automation is better, which may not always be the case.[87] The SAE Levels also do not account for changes that may be required to infrastructure[88] and road user behavior.[89][90]
Several classifications have been proposed to deal with the broad range of technological discussions pertaining to self-driving cars. One such proposal is to classify based on the following categories: car navigation, path planning, environment perception and car control.[91] In the 2020s, it became apparent that these technologies are far more complex than initially thought.[92][93] Even video games have been used as a platform to test autonomous vehicles.[94]
Main article: Hybrid navigation
Hybrid navigation is the simultaneous use of more than one navigation system to determine the location data needed for navigation.
Sensing
To reliably and safely operate an autonomous vehicle, usually a mixture of sensors is utilized.[93]
Typical sensors include lidar (Light Detection and Ranging), stereo vision, GPS and IMU.[95][96]
Modern self-driving cars generally use Bayesian simultaneous localization and mapping (SLAM) algorithms, which fuse data from multiple sensors and an off-line map into current location estimates and map updates.[97]
Waymo has developed a variant of SLAM with detection and tracking of other moving objects (DATMO), which also handles obstacles such as cars and pedestrians. Simpler systems may use roadside real-time locating system (RTLS) technologies to aid localization.
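The Bayesian fusion step that SLAM-style localization builds on can be illustrated with a minimal one-dimensional filter: predict the position from odometry, then correct it with a noisy GPS-like fix. This is only a sketch under simplified assumptions (one dimension, invented noise values); real systems estimate full vehicle poses together with map landmarks.

```python
# Minimal sketch of the Bayesian update at the heart of SLAM-style localization:
# predict the new position from odometry, then correct it with a noisy GPS-like fix.
# All numbers are made up for illustration.

def predict(pos, var, odom_delta, odom_var):
    """Motion update: move by the odometry estimate; uncertainty grows."""
    return pos + odom_delta, var + odom_var

def correct(pos, var, meas, meas_var):
    """Measurement update: blend prediction and measurement by their variances."""
    k = var / (var + meas_var)          # Kalman gain: trust the less uncertain source more
    return pos + k * (meas - pos), (1 - k) * var

pos, var = 0.0, 1.0                      # initial belief about position (metres)
for odom, gps in [(1.0, 1.2), (1.0, 2.1), (1.0, 2.9)]:
    pos, var = predict(pos, var, odom, odom_var=0.5)
    pos, var = correct(pos, var, gps, meas_var=2.0)
    print(f"position ~ {pos:.2f} m, variance ~ {var:.2f}")
```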
Maps
Self-driving cars require a new class of high-definition maps (HD maps) that represent the world at up to two orders of magnitude more detail.[93] In May 2018, researchers from the Massachusetts Institute of Technology (MIT) announced that they had built an automated car that can navigate unmapped roads.[98] Researchers at their Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system, called MapLite, which allows self-driving cars to drive on roads that they have never been on before, without using 3D maps. The system combines the GPS position of the vehicle, a "sparse topological map" such as OpenStreetMap (i.e. having 2D features of the roads only), and a series of sensors that observe the road conditions.[99]
Sensor fusion
Control systems on automated cars may use sensor fusion, which is an approach that integrates information from a variety of sensors on the car to produce a more consistent, accurate, and useful view of the environment.[100] Self-driving cars tend to use a combination of cameras, LiDAR sensors, and radar sensors in order to enhance performance and ensure the safety of the passenger and other drivers on the road. An increased consistency in self-driving performance prevents accidents that may occur because of one faulty sensor.[101]
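One common fusion idea can be sketched as inverse-variance weighting: independent range estimates of the same obstacle from different sensors are combined so that the noisier sensor counts for less. The sensor noise figures below are invented for the example and do not describe any particular vehicle.

```python
# Hedged illustration of sensor fusion by inverse-variance weighting.

def fuse(estimates):
    """estimates: list of (value, variance) pairs -> (fused value, fused variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    return fused, 1.0 / sum(weights)

readings = [
    (25.3, 0.04),   # lidar range to obstacle in metres (low noise)
    (24.8, 0.25),   # radar range (more noise, but works in fog)
    (26.1, 1.00),   # camera-based depth estimate (noisiest)
]
distance, variance = fuse(readings)
print(f"fused distance ~ {distance:.2f} m (variance {variance:.3f})")
```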
Path planning
Path planning is the computational problem of finding a sequence of valid configurations that moves the vehicle from its source to its destination. Self-driving cars rely on path planning technology in order to follow the rules of traffic and prevent accidents from occurring. The large-scale path of the vehicle can be determined using a Voronoi diagram, occupancy grid mapping, or a driving corridors algorithm.[102] A driving corridors algorithm allows the vehicle to locate and drive within open free space that is bounded by lanes or barriers. While these algorithms work in simple situations, path planning has not been proven to be effective in complex scenarios. Two techniques used for path planning are graph-based search and variational-based optimization techniques. Graph-based techniques can make harder decisions such as how to pass another vehicle or obstacle. Variational-based optimization techniques require a higher level of planning in setting restrictions on the vehicle's driving corridor to prevent collisions.[103]
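As a toy illustration of the graph-based search family mentioned above, the following sketch runs A* over a small occupancy grid. Production planners work on continuous vehicle states with kinematic constraints; the grid, costs and heuristic here are simplifications chosen for brevity.

```python
# Toy sketch of graph-based path planning: A* search over a small occupancy grid
# (1 = blocked cell). Only meant to illustrate the idea, not a production planner.
import heapq

GRID = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]

def astar(start, goal):
    rows, cols = len(GRID), len(GRID[0])
    frontier = [(0, start, [start])]      # (estimated total cost, cell, path so far)
    seen = set()
    while frontier:
        _, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and GRID[nr][nc] == 0:
                heuristic = abs(goal[0] - nr) + abs(goal[1] - nc)   # Manhattan distance
                heapq.heappush(frontier, (len(path) + heuristic, (nr, nc), path + [(nr, nc)]))
    return None

print(astar((0, 0), (3, 3)))   # prints a cell-by-cell route around the blocked cells
```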
Main article: Drive by wire
Drive by wire technology in the automotive industry is the use of electrical or electro-mechanical systems for performing vehicle functions traditionally achieved by mechanical linkages.
Main article: Driver monitoring system
A driver monitoring system is a vehicle safety system that assesses the driver's alertness and warns the driver if needed. Developers recognize that the role of these systems will grow as SAE Level 2 systems become more commonplace, and that predicting the driver's readiness for handover becomes more challenging at Level 3 and above.[104]
Main article: Vehicular communication systems
Vehicular communication is a growing area of communication between vehicles and with roadside communication infrastructure. Vehicular communication systems use vehicles and roadside units as the communicating nodes in a peer-to-peer network, providing each other with information. This connectivity enables autonomous vehicles to interact with non-autonomous traffic and pedestrians to increase safety.[105][106] Autonomous vehicles will also need to connect to the cloud to update their software and maps, and to feed information back to improve their manufacturer's maps and software.[93]
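As a purely illustrative sketch, a status message exchanged between vehicles or roadside units in such a peer-to-peer network might look like the following. The field names and encoding are invented for this example and do not follow any specific V2X message standard.

```python
# Purely illustrative sketch of a vehicle status broadcast. Field names are invented.
import json, time
from dataclasses import dataclass, asdict

@dataclass
class VehicleStatusMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    timestamp: float

def encode(msg: VehicleStatusMessage) -> bytes:
    """Serialize the message for broadcast; real systems also sign and schedule it."""
    return json.dumps(asdict(msg)).encode("utf-8")

msg = VehicleStatusMessage("veh-42", 37.7749, -122.4194, 12.5, 90.0, time.time())
print(encode(msg))
```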
See also: Over-the-air programming
Autonomous vehicles have software systems that drive the vehicle, meaning that updates through reprogramming or editing the software can increase the benefits to the owner (e.g., an update that better distinguishes a blind pedestrian from a sighted one, so that the vehicle takes extra caution when approaching a blind person). A characteristic of this reprogrammable part of autonomous vehicles is that updates need not come only from the supplier: through machine learning, smart autonomous vehicles can generate certain updates and install them accordingly (e.g., new navigation maps or new intersection computer systems). These reprogrammable characteristics of the digital technology and the possibility of smart machine learning give manufacturers of autonomous vehicles the opportunity to differentiate themselves on software.
In March 2021, UNECE regulation on software update and software update management system was published.[107]
Autonomous vehicles are more modular, since they are made up of several modules, explained hereafter through a layered modular architecture. The layered modular architecture extends the architecture of purely physical vehicles by incorporating four loosely coupled layers of devices, networks, services and content into autonomous vehicles. These loosely coupled layers can interact through certain standardized interfaces.
In order for autonomous vehicles to perceive their surroundings, they have to use different techniques, each with its own accompanying digital information (e.g. radar, GPS, motion sensors and computer vision). Homogenization requires that the digital information from these different sources be transmitted and stored in the same form. This means their differences are decoupled, and the digital information can be transmitted, stored, and computed in a way that the vehicles and their operating systems can better understand and act upon it.
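A minimal sketch of what such homogenization might look like in practice: readings from very different sensors are normalized into one common record so the rest of the stack can treat them uniformly. The Observation type and its fields are this example's own, not an industry format.

```python
# Sketch of "homogenization": heterogeneous sensor readings normalized into one record.
from dataclasses import dataclass

@dataclass
class Observation:
    source: str          # e.g. "radar", "camera", "gps"
    timestamp: float     # seconds since start of drive
    kind: str            # what was observed: "obstacle", "lane_marking", "ego_position", ...
    position_m: tuple    # (x, y) in a common vehicle-centred frame, metres
    confidence: float    # 0..1

def from_radar(track) -> Observation:
    return Observation("radar", track["t"], "obstacle", (track["x"], track["y"]), track["snr_conf"])

def from_camera(detection) -> Observation:
    return Observation("camera", detection["t"], detection["label"], detection["xy"], detection["score"])

obs = [
    from_radar({"t": 12.0, "x": 24.0, "y": -1.5, "snr_conf": 0.9}),
    from_camera({"t": 12.0, "label": "obstacle", "xy": (24.3, -1.4), "score": 0.8}),
]
print(obs)
```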
In the international standardization field, ISO/TC 22 is in charge of in-vehicle transport information and control systems,[108] and ISO/TC 204 is in charge of information, communication and control systems in the field of urban and rural surface transportation.[109] International standards have been actively developed in the domains of AD/ADAS functions, connectivity, human interaction, in-vehicle systems, management/engineering, dynamic maps and positioning, privacy and security.[110]
In 2017, Mobileye published a mathematical model for automated vehicle safety which is called "Responsibility-Sensitive Safety (RSS)".[111] It is under standardization at IEEE Standards Association as "IEEE P2846: A Formal Model for Safety Considerations in Automated Vehicle Decision Making".[112]
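The core RSS longitudinal rule, as described in the publicly available RSS paper, can be sketched as follows: the following car must keep a gap large enough that, even if it keeps accelerating for its response time and then brakes only gently while the lead car brakes hard, no collision occurs. The parameter values in the example below are arbitrary.

```python
# Sketch of the RSS longitudinal safe-distance rule. Parameter values are arbitrary examples.

def rss_min_gap(v_rear, v_front, response_time, a_accel_max, a_brake_min, a_brake_max):
    """Minimum longitudinal gap (metres); speeds in m/s, accelerations in m/s^2."""
    v_rear_after = v_rear + response_time * a_accel_max
    gap = (v_rear * response_time
           + 0.5 * a_accel_max * response_time ** 2
           + v_rear_after ** 2 / (2 * a_brake_min)     # worst-case stopping distance of the rear car
           - v_front ** 2 / (2 * a_brake_max))         # best-case stopping distance of the lead car
    return max(gap, 0.0)

# Example: both cars at 20 m/s (~72 km/h), 0.5 s response time.
print(round(rss_min_gap(20.0, 20.0, 0.5, 2.0, 4.0, 8.0), 1), "metres")
```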
In 2022, a research group at the National Institute of Informatics (NII, Japan) expanded RSS and developed "Goal-Aware RSS", which enables RSS rules to deal with complex scenarios via program logic.[113]
The potential benefits from increased vehicle automation described may be limited by foreseeable challenges such as disputes over liability,[114][115] the time needed to turn over the existing stock of vehicles from non-automated to automated,[116] and thus a long period of humans and autonomous vehicles sharing the roads, resistance by individuals to forfeiting control of their cars,[117] concerns about safety,[118] and the implementation of a legal framework and consistent global government regulations for self-driving cars.[119] In addition, cyberattacks could be a potential threat to autonomous driving in the future.[120]
Other obstacles could include de-skilling and lower levels of driver experience for dealing with potentially dangerous situations and anomalies,[121] ethical problems where an automated vehicle's software is forced during an unavoidable crash to choose between multiple harmful courses of action (the trolley problem),[122][123] concerns about making large numbers of people currently employed as drivers unemployed, the potential for more intrusive mass surveillance of location, association and travel as a result of police and intelligence agency access to large data sets generated by sensors and pattern-recognition AI, and possibly insufficient understanding of verbal sounds, gestures and non-verbal cues by police, other drivers or pedestrians.[124]
Possible technological obstacles for automated cars are:[needs update]
Deceptive marketing
As Tesla's "Full Self-Driving (FSD)" actually corresponds to Level 2,[135] senators in August 2021 called on the Federal Trade Commission (FTC) to investigate Tesla's marketing claims.[136]
In December 2021 in Japan, Mercedes-Benz Japan Co., Ltd. was penalized by the Consumer Affairs Agency for descriptions in its handouts that differed from the facts.[137]
In July 2016, following a fatal crash by a Tesla car operating in "Autopilot" mode, Mercedes-Benz was also criticized for a misleading commercial advertising E-Class models, which had been available with "Drive Pilot".[138] At that time, Mercedes-Benz rejected the claims and stopped its "self-driving car" ad campaign, which had been running in the United States.[139][140] In August 2022, the California Department of Motor Vehicles (DMV) accused Tesla of deceptive marketing practices.[141]
Employment
Companies working on the technology have an increasing recruitment problem in that the available talent pool has not grown with demand.[142] As such, education and training by third-party organizations such as providers of online courses and self-taught community-driven projects such as DIY Robocars[143] and Formula Pi have quickly grown in popularity, while university-level extracurricular programmes such as Formula Student Driverless[144] have bolstered graduate experience. Industry is steadily increasing freely available information sources, such as code,[145] datasets[146] and glossaries,[147] to widen the recruitment pool.
National security
In the 2020s, given the importance of the automotive sector to the nation, self-driving cars have become a topic of national security. Concerns regarding cybersecurity and data protection are important not only for user protection, but also in the context of national security. The trove of data collected by self-driving cars, paired with cybersecurity vulnerabilities, creates an appealing target for intelligence collection, so self-driving cars need to be considered in a new way when it comes to espionage risk.[148]
In July 2018, a former Apple engineer was arrested by the Federal Bureau of Investigation (FBI) at San Jose International Airport (SJC) while preparing to board a flight to China, and was charged with stealing proprietary information related to Apple's self-driving car project.[149][150] In January 2019, another Apple employee was charged with stealing self-driving car project secrets.[151] In July 2021, the United States Department of Justice (DOJ) accused Chinese security officials of coordinating a vast hacking campaign to steal sensitive and secret information from government entities, including research related to autonomous vehicles.[152][153] On the Chinese side, authorities have already prepared the "Provisions on Management of Automotive Data Security (Trial)".[154][155]
There are concerns that leapfrogging ability could be applied to autonomous car technology.[156] Also, emerging Cellular V2X (Cellular Vehicle-to-Everything) technologies are based on 5G wireless networks.[157] As of November 2022[update], the US Congress is applying fresh scrutiny to the possibility that imported Chinese technology could be a Trojan horse.[158]
See also: Human factors and ergonomics
Moving obstacles
Developers of self-driving cars are already exploring the difficulties of determining the intentions of pedestrians, bicyclists, and animals, and models of their behavior must be programmed into driving algorithms.[10] Human road users also face the challenge of determining the intentions of autonomous vehicles, where there is no driver with whom to make eye contact or exchange hand signals. Drive.ai is testing a solution to this problem that involves LED signs mounted on the outside of the vehicle, announcing status such as "going now, don't cross" vs. "waiting for you to cross".[159]
Handover and risk compensation
Two human-factor challenges are important for safety. One is the handover from automated driving to manual driving.
Human factors research on automated systems has shown that people are slow to detect a problem with automation and slow to understand the problem after it is detected. When automation failures occur, unexpected transitions that require a driver to take over will occur suddenly and the driver may not be ready to take over.[160]
The second challenge is known as risk compensation: as a system is perceived to be safer, instead of benefiting entirely from the increased safety, people engage in riskier behavior and enjoy other benefits. Semi-automated cars have been shown to suffer from this problem; for example, users of Tesla Autopilot have ignored the road and used electronic devices or engaged in other activities, against the company's advice that the car is not capable of driving completely autonomously. In the near future, pedestrians and bicyclists may travel in the street in a riskier fashion if they believe self-driving cars are capable of avoiding them.
Trust
In order for people to buy self-driving cars and vote for the government to allow them on roads, the technology must be trusted as safe.[161][162] Self-driving elevators were invented in 1900, but the high number of people refusing to use them slowed adoption for several decades, until operator strikes increased demand and trust was built with advertising and features like the emergency stop button.[163][164] There are three types of trust between human and automation:[165] dispositional trust, the trust between the driver and the company's product;[165] situational trust, the trust arising from different scenarios;[165] and learned trust, where trust is built from similar past events.[165]
See also: Machine ethics
Rationale for liability
There are different opinions on who should be held liable in case of a crash, especially when people are hurt.[166] One study suggests requesting the owners of self-driving cars to sign end-user license agreements (EULAs), assigning to them accountability for any accidents.[167] Other studies suggest introducing a tax or insurance schemes that would protect owners and users of automated vehicles from claims made by victims of an accident.[166] Other possible parties that could be held responsible in case of a technical failure include the software engineers who programmed the code for the automated operation of the vehicles, and suppliers of components of the AV.[168]
Implications from the Trolley Problem
A moral dilemma that a software engineer or car manufacturer might face in programming the operating software of a self-driving vehicle is captured in a variation of the traditional ethical thought experiment, the trolley problem: an AV is driving with passengers when suddenly a person appears in its way, and the car has to choose between two options, either to run the person over or to avoid hitting the person by swerving into a wall, killing the passengers.[169] Researchers have suggested, in particular, two ethical theories to be applicable to the behavior of automated vehicles in cases of emergency: deontology and utilitarianism.[10][170] Deontological theory suggests that an automated car needs to follow strict written-out rules in any situation. Utilitarianism, on the other hand, promotes maximizing the number of people surviving in a crash. Critics suggest that automated vehicles should adopt a mix of multiple theories to be able to respond in a morally right way in the instance of a crash.[10][170] Recently, specific ethical frameworks, i.e., utilitarianism, deontology, relativism, absolutism (monism), and pluralism, have been investigated empirically with respect to the acceptance of self-driving cars in unavoidable accidents.[171]
According to research, people overwhelmingly express a preference for autonomous vehicles to be programmed with utilitarian ideas, that is, in a manner that generates the least harm and minimizes driving casualties.[172] While people want others to purchase utilitarian promoting vehicles, they themselves prefer to ride in vehicles that prioritize the lives of people inside the vehicle at all costs.[172] This presents a paradox in which people prefer that others drive utilitarian vehicles designed to maximize the lives preserved in a fatal situation but want to ride in cars that prioritize the safety of passengers at all costs.[172] People disapprove of regulations that promote utilitarian views and would be less willing to purchase a self-driving car that may opt to promote the greatest good at the expense of its passengers.[172]
Bonnefon et al. concluded that the regulation of autonomous vehicle ethical prescriptions may be counterproductive to societal safety.[172] This is because, if the government mandates utilitarian ethics and people prefer to ride in self-protective cars, it could prevent the large scale implementation of self-driving cars.[172] Delaying the adoption of autonomous cars vitiates the safety of society as a whole because this technology is projected to save so many lives.[172]
Privacy
Privacy-related issues arise mainly from the interconnectivity of automated cars, making it just another mobile device that can gather any information about an individual (see data mining). This information gathering ranges from tracking of the routes taken, voice recording, video recording, preferences in media that is consumed in the car, behavioral patterns, to many more streams of information.[173][174][175] The data and communications infrastructure needed to support these vehicles may also be capable of surveillance, especially if coupled to other data sets and advanced analytics.[173]
Robotaxi
Main article: Robotaxi
A robotaxi is an application of the self-driving car intended to be operated by a taxi company or a ridesharing company. Through massive investments by Big Tech companies in the mid-2010s, robotaxi research and development became active in the U.S.[176]
Self-driving shuttle and bus
Further information: Vehicular automation § Shuttle
A self-driving shuttle is an application of the self-driving car designed for multiple passengers, with use cases mainly in cities. Through the European Union-funded "CityMobil2" project in the mid-2010s, research and development of self-driving shuttles became active in Europe.[177] Subsequently, under the funding programme Horizon 2020, the "Avenue" project was conducted from 2018 to 2022 in four cities (Geneva, Lyon, Copenhagen and Luxembourg).[178]
Self-driving truck and van
Main article: Self-driving truck
Companies such as Otto and Starsky Robotics have focused on autonomous trucks. Automation of trucks is important not only for the improved safety of these very heavy vehicles, but also for the potential fuel savings through platooning. Autonomous vans are being developed for use by online grocers such as Ocado.[179]
Autonomous micro-mobility
Research has indicated that goods distribution on the macro (urban distribution) and micro level (last mile delivery) could be made more efficient with the use of autonomous vehicles[180] thanks to the possibility of smaller vehicle sizes.
Also, simulation studies at the MIT Media Lab indicate that ultra-lightweight vehicles using autonomous driving technologies could help remove cars from cities.[181]
In November 2022, Honda unveiled the "Honda CI Micro-mobility" machines and their core technologies. Honda started demonstration testing of the "Honda CI Micro-mobility" machines, "CiKoMa" and "WaPOCH", at two locations in Jōsō City of Ibaraki Prefecture.[182]
Autonomous work vehicle
In 2021, Honda and Black & Veatch successfully tested their second-generation prototype Autonomous Work Vehicle (AWV) at a Black & Veatch construction site in New Mexico.[183]
In December 2022, eve autonomy in Japan, a company backed by Yamaha Motor and TIER IV, launched the all-in-one autonomous transportation commercial service "eve auto", using an electric work vehicle, as the first SAE Level 4 service in Japan, at nine sites, including Yamaha Motor's three factories, Prime Polymer's Anesaki Works, Panasonic's cold chain factory in the Oizumi area, Fuji Electric's Suzuka factory, Japan Logistic Systems Corp.'s Ageo Center, and ENEOS Corp.'s Negishi refinery.[184]
The testing of vehicles with varying degrees of automation can be carried out either physically, in a closed environment[185] or, where permitted, on public roads (typically requiring a license or permit,[186] or adherence to a specific set of operating principles),[187] or in a virtual environment, i.e. using computer simulations.[188][189] When driven on public roads, automated vehicles require a person to monitor their proper operation and "take over" when needed. For example, New York has strict requirements for the test driver, such that the vehicle can be corrected at all times by a licensed operator, as highlighted by Cardian Cube Company's application and discussions with New York State officials and the NYS DMV.[190]
In California, self-driving car manufacturers are required to submit annual reports showing how often their vehicles disengaged from autonomous mode during tests.[191] It was believed that these reports would show how reliable the vehicles were becoming, based on how often "disengagements" were needed.[192]
In 2017, Waymo reported 63 disengagements over 352,545 mi (567,366 km) of testing, an average distance of 5,596 mi (9,006 km) between disengagements, the highest among companies reporting such figures. Waymo also traveled a greater total distance than any of the other companies. Their 2017 rate of 0.18 disengagements per 1,000 mi (1,600 km) was an improvement over the 0.2 disengagements per 1,000 mi (1,600 km) in 2016, and 0.8 in 2015. In March 2017, Uber reported an average of just 0.67 mi (1.08 km) per disengagement. In the final three months of 2017, Cruise (now owned by GM) averaged 5,224 mi (8,407 km) per disengagement over a total distance of 62,689 mi (100,888 km).[193] In July 2018, the first electric driver-less racing car, "Robocar", completed a 1.8-kilometer track, using its navigation system and artificial intelligence.[194]
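For clarity, the "distance between disengagements" figures quoted above and in the table below are simple ratios of autonomous miles to reported disengagements, as this small check shows using the Waymo 2017 numbers from the text.

```python
# Worked check of how disengagement metrics are derived from the reported figures.

def miles_per_disengagement(total_miles, disengagements):
    return total_miles / disengagements

def per_1000_miles(total_miles, disengagements):
    return disengagements / total_miles * 1000

print(round(miles_per_disengagement(352_545, 63)))   # ~5,596 miles between disengagements (Waymo, 2017)
print(round(per_1000_miles(352_545, 63), 2))         # ~0.18 disengagements per 1,000 miles
```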
| Car maker | Distance between disengagements (California, 2016)[193] | Total distance traveled (2016) | Distance between disengagements (California, 2018)[195] | Total distance traveled (2018) | Distance between disengagements (California, 2019)[196] | Total distance traveled (2019) |
|---|---|---|---|---|---|---|
| Waymo | 5,128 mi (8,253 km) | 635,868 mi (1,023,330 km) | 11,154 mi (17,951 km) | 1,271,587 mi (2,046,421 km) | 11,017 mi (17,730 km) | 1,450,000 mi (2,330,000 km) |
| BMW | 638 mi (1,027 km) | 638 mi (1,027 km) | | | | |
| Nissan | 263 mi (423 km) | 6,056 mi (9,746 km) | 210 mi (340 km) | 5,473 mi (8,808 km) | | |
| Ford | 197 mi (317 km) | 590 mi (950 km) | | | | |
| General Motors | 55 mi (89 km) | 8,156 mi (13,126 km) | 5,205 mi (8,377 km) | 447,621 mi (720,376 km) | 12,221 mi (19,668 km) | 831,040 mi (1,337,430 km) |
| Aptiv | 15 mi (24 km) | 2,658 mi (4,278 km) | | | | |
| Tesla | 3 mi (4.8 km) | 550 mi (890 km) | | | | |
| Mercedes-Benz | 2 mi (3.2 km) | 673 mi (1,083 km) | 1.5 mi (2.4 km) | 1,749 mi (2,815 km) | | |
| Bosch | 7 mi (11 km) | 983 mi (1,582 km) | | | | |
| Zoox | | | 1,923 mi (3,095 km) | 30,764 mi (49,510 km) | 1,595 mi (2,567 km) | 67,015 mi (107,850 km) |
| Nuro | | | 1,028 mi (1,654 km) | 24,680 mi (39,720 km) | 2,022 mi (3,254 km) | 68,762 mi (110,662 km) |
| Pony.ai | | | 1,022 mi (1,645 km) | 16,356 mi (26,322 km) | 6,476 mi (10,422 km) | 174,845 mi (281,386 km) |
| Baidu (Apolong) | | | 206 mi (332 km) | 18,093 mi (29,118 km) | 18,050 mi (29,050 km) | 108,300 mi (174,300 km) |
| Aurora | | | 100 mi (160 km) | 32,858 mi (52,880 km) | 280 mi (450 km) | 39,729 mi (63,938 km) |
| Apple | | | 1.1 mi (1.8 km) | 79,745 mi (128,337 km) | 118 mi (190 km) | 7,544 mi (12,141 km) |
| Uber | | | 0.4 mi (0.64 km) | 26,899 mi (43,290 km) | 0 mi (0 km) | |
Disengagements
As of 2022[update], "disengagements" are at the center of the controversy. The problem is that reporting companies have varying definitions of what qualifies as a disengagement, and that definition can change over time.[197][192]
Executives of self-driving car companies have criticized disengagements as a deceptive metric, because it does not take into account the higher degree of difficulty of navigating urban streets compared with interstate highways.[198]
Compliance
In April 2021, WP.29 GRVA issued the master document on "Test Method for Automated Driving (NATM)".[199]
In October 2021, Europe's comprehensive pilot test of automated driving on public roads, L3Pilot, demonstrated automated systems for cars in Hamburg, Germany, in conjunction with ITS World Congress 2021. SAE Level 3 and 4 functions were tested on ordinary roads.[200][201] At the end of February 2022, the final results of the L3Pilot project were published.[202]
In November 2022, an International Standard ISO 34502 on "Scenario based safety evaluation framework" was published.[203][204]
Collision avoidance
In April 2022, collision avoidance testing was demonstrated by Nissan.[205][206]
Also, Waymo published a document about collision avoidance testing in December 2022.[207]
Simulation and validation
In September 2022, Biprogy released the "Driving Intelligence Validation Platform (DIVP)" software system, an outcome of the subproject of the same name within the Japanese national project "SIP-adus" led by the Cabinet Office; the platform is interoperable with ASAM's Open Simulation Interface (OSI).[208][209][210]
In November 2021, the California Department of Motor Vehicles (DMV) notified Pony.ai that it was suspending its driverless testing permit following a reported collision in Fremont on 28 October. This incident stands out because the vehicle was in autonomous mode and didn't involve any other vehicle.[211]
In May 2022, DMV revoked Pony.ai's permit for failing to monitor the driving records of the safety drivers on its testing permit.[212]
In April 2022, it was reported that a Cruise test vehicle blocked a fire engine responding to an emergency call, sparking questions about autonomous vehicles' ability to handle unexpected roadway issues.[213][214]
In November 2022, Toyota demonstrated one of its GR Yaris test cars equipped with AI that had been trained on the skills and knowledge of professional rally drivers to enhance the safety of self-driving cars.[215] Toyota has been using the learnings from its collaborative activities with Microsoft in the FIA World Rally Championship since the 2017 season.[216]
See also: Tesla Autopilot § Notable crashes
As of November 2021[update], Tesla's advanced driver-assistance system (ADAS) Autopilot is classified as Level 2.[217]
On 20 January 2016, the first of five known fatal crashes of a Tesla with Autopilot occurred in China's Hubei province.[218] According to China's 163.com news channel, this marked "China's first accidental death due to Tesla's automatic driving (system)". Initially, Tesla pointed out that the vehicle was so badly damaged from the impact that their recorder was not able to conclusively prove that the car had been on autopilot at the time; however, 163.com pointed out that other factors, such as the car's absolute failure to take any evasive actions prior to the high speed crash, and the driver's otherwise good driving record, seemed to indicate a strong likelihood that the car was on autopilot at the time. A similar fatal crash occurred four months later in Florida.[219][220] In 2018, in a subsequent civil suit between the father of the driver killed and Tesla, Tesla did not deny that the car had been on autopilot at the time of the accident, and sent evidence to the victim's father documenting that fact.[221]
The second known fatal accident involving a vehicle being driven by itself took place in Williston, Florida on 7 May 2016 while a Tesla Model S electric car was engaged in Autopilot mode. The occupant was killed in a crash with an 18-wheel tractor-trailer. On 28 June 2016 the US National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident working with the Florida Highway Patrol. According to NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer.[222][223] NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involved a population of an estimated 25,000 Model S cars.[224] On 8 July 2016, NHTSA requested Tesla Motors provide the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned updates schedule for the next four months.[225]
According to Tesla, "neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car attempted to drive full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S". Tesla also claimed that this was its first known Autopilot death in over 130 million miles (210 million kilometers) driven by its customers with Autopilot engaged; by this statement, however, Tesla was apparently refusing to acknowledge claims that the January 2016 fatality in Hubei, China had also been the result of an Autopilot system error. According to Tesla, there is a fatality every 94 million miles (151 million kilometers) among all types of vehicles in the US.[222][223][226] However, this number also includes fatalities from crashes involving, for instance, motorcyclists and pedestrians.[227][228]
In July 2016, the US National Transportation Safety Board (NTSB) opened a formal investigation into the fatal accident that occurred while Autopilot was engaged. The NTSB is an investigative body that has the power to make only policy recommendations. An agency spokesman said, "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible."[229] In January 2017, NHTSA released a report that concluded Tesla was not at fault; the investigation found that for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed.[230]
In 2021, the NTSB Chair called on Tesla to change the design of its Autopilot to ensure it cannot be misused by drivers, according to a letter sent to the company's CEO.[217]
See also: Waymo § Crashes
Waymo originated as a self-driving car project within Google. In August 2012, Google announced that their vehicles had completed over 300,000 automated-driving miles (500,000 km) accident-free, typically involving about a dozen cars on the road at any given time, and that they were starting to test with single drivers instead of in pairs.[231] In late May 2014, Google revealed a new prototype that had no steering wheel, gas pedal, or brake pedal, and was fully automated.[232] As of March 2016[update], Google had test-driven their fleet in automated mode a total of 1,500,000 mi (2,400,000 km).[233] In December 2016, Google announced that its self-driving technology would be spun off into a new company called Waymo, with both Google and Waymo becoming subsidiaries of the parent company Alphabet.[234][235]
According to Google's accident reports as of early 2016, their test cars had been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car's software caused a crash.[236]
In June 2015, Google co-founder Sergey Brin confirmed that 12 vehicles had suffered collisions as of that date. Eight involved being rear-ended at a stop sign or traffic light, in two the vehicle was side-swiped by another driver, in one another driver rolled through a stop sign, and in one a Google employee was controlling the car manually.[237] In July 2015, three Google employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake at a traffic light. This was the first time that a collision resulted in injuries.[238] On 14 February 2016, a Google vehicle attempted to avoid sandbags blocking its path. During the maneuver it struck a bus. Google stated, "In this case, we clearly bear some responsibility, because if our car hadn't moved, there wouldn't have been a collision."[239][240] Google characterized the crash as a misunderstanding and a learning experience. No injuries were reported in the crash.[236]
In March 2018, Elaine Herzberg died after being hit by a self-driving car being tested by Uber's Advanced Technologies Group (ATG) in the US state of Arizona. There was a safety driver in the car. Herzberg was crossing the road about 400 feet from an intersection.[241] This marked the first time an individual is known to have been killed by an autonomous vehicle, and the incident raised questions about the regulation of the self-driving car industry.[242] Some experts said a human driver could have avoided the fatal crash.[243] Arizona governor Doug Ducey suspended the company's ability to test and operate its automated cars on public roadways, citing an "unquestionable failure" of the expectation that Uber make public safety its top priority.[244] Uber then stopped self-driving tests in California until it was issued a new permit in 2020.[245][246]
In May 2018, the US National Transportation Safety Board (NTSB) issued a preliminary report.[247] The final report 18 months later determined that the immediate cause of the accident was the safety driver's failure to monitor the road because she was distracted by her phone. However, Uber ATG's "inadequate safety culture" contributed to the crash. The report noted from the post-mortem that the victim had "a very high level" of methamphetamine in her body.[248] The board also called on federal regulators to carry out a review before allowing automated test vehicles to operate on public roads.[249][250]
In September 2020, the backup driver, Rafael Vasquez, was charged with negligent homicide, because she did not look at the road for several seconds while her phone was streaming The Voice via Hulu. She pleaded not guilty and was released to await trial. Uber did not face criminal charges because in the US there is no basis for criminal liability for the corporation. The safety driver is assumed to be responsible for the accident, because she was in the driving seat with the capacity to avoid an accident (as at Level 3). The trial was planned for February 2021.[251][needs update]
On 9 November 2017, a Navya Arma automated self-driving bus with passengers was involved in a crash with a truck. The truck was found to be at fault for the crash, having reversed into the stationary automated bus. The automated bus did not take evasive action or apply defensive driving techniques such as flashing its headlights or sounding the horn. As one passenger commented, "The shuttle didn't have the ability to move back. The shuttle just stayed still."[252]
On 12 August 2021, a 31-year-old Chinese man was killed after his NIO ES8 collided with a construction vehicle.[253] NIO's self-driving feature is still in beta and cannot yet deal with static obstacles.[254] Though the vehicle's manual clearly states that the driver must take over when nearing construction sites, the issue is whether the feature was improperly marketed and unsafe. Lawyers for the deceased's family have also called into question NIO's private access to the vehicle, which they argue could allow the data to be tampered with.[255]
On 26 August 2021, a Toyota e-Palette, a mobility vehicle used to support mobility within the Athletes' Village at the Olympic and Paralympic Games Tokyo 2020, collided with a visually impaired pedestrian about to cross a pedestrian crossing.[256] Operation was suspended after the accident and restarted on 31 August with improved safety measures.[257]
In a 2011 online survey of 2,006 US and UK consumers by Accenture, 49% said they would be comfortable using a "driverless car".[258]
A 2012 survey of 17,400 vehicle owners by J.D. Power and Associates found 37% initially said they would be interested in purchasing a "fully autonomous car". However, that figure dropped to 20% if told the technology would cost US$3,000 more.[259]
In a 2012 survey of about 1,000 German drivers by automotive researcher Puls, 22% of the respondents had a positive attitude towards these cars, 10% were undecided, 44% were skeptical and 24% were hostile.[260]
A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% "stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver", with Brazil, India and China the most willing to trust automated technology.[261]
In a 2014 US telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an automated car was available instead.[262]
In a February 2015 survey of top auto journalists, 46% predicted that either Tesla or Daimler would be the first to the market with a fully autonomous vehicle, while (at 38%) Daimler was predicted to be the most functional, safe, and in-demand autonomous vehicle.[263] In 2015, a questionnaire survey by Delft University of Technology explored the opinion of 5,000 people from 109 countries on automated driving. Results showed that respondents, on average, found manual driving the most enjoyable mode of driving. 22% of the respondents did not want to spend any money for a fully automated driving system. Respondents were found to be most concerned about software hacking/misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicle transmitting data.[264] The survey also gave results on potential consumer opinion on interest of purchasing an automated car, stating that 37% of surveyed current owners were either "definitely" or "probably" interested in purchasing an automated car.[264]
In 2016, a survey in Germany examined the opinion of 1,603 people, who were representative in terms of age, gender, and education for the German population, towards partially, highly, and fully automated cars. Results showed that men and women differ in their willingness to use them. Men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. The gender difference towards anxiety was especially pronounced between young men and women but decreased with participants' age.[265]
A 2016 PwC survey of 1,584 people in the United States found that "66 percent of respondents said they think autonomous cars are probably smarter than the average human driver". Respondents remained worried about safety, above all the possibility of the car being hacked. Nevertheless, only 13% of those interviewed saw no advantages in this new kind of car.[266]
In 2017, Pew Research Center surveyed 4,135 US adults from 1–15 May and found that many Americans anticipate significant impacts from various automation technologies in the course of their lifetimes—from the widespread adoption of automated vehicles to the replacement of entire job categories with robot workers.[267]
In 2019, results from two opinion surveys of 54 and 187 US adults respectively were published. A new standardized questionnaire, the autonomous vehicle acceptance model (AVAM) was developed, including additional description to help respondents better understand the implications of different automation levels. Results showed that users were less accepting of high autonomy levels and displayed significantly lower intention to use highly autonomous vehicles. Additionally, partial autonomy (regardless of level) was perceived as requiring uniformly higher driver engagement (usage of hands, feet and eyes) than full autonomy.[268]
In 2022, research by the safety charity Lloyd's Register Foundation found that only about a quarter (27%) of the world's population would feel safe in self-driving cars.[269]
Main article: Regulation of self-driving cars |
See also: Regulation of algorithms |
Regulation of self-driving cars is an increasingly important issue covering multiple subtopics, including self-driving car liability, approval regulations, and international conventions.
In the 2010s, researchers openly worried that future regulation could delay the deployment of automated cars on the road.[270] In 2020, international regulation in the form of UNECE WP.29 GRVA was established, regulating Level 3 automated driving. As of 2022[update], obtaining Level 3 approval remains very difficult in practice.
See also: Lane centering § Sample of level 2 automated cars, and List of self-driving system suppliers § Date of first public road driverless operation |
Between manually driven vehicles (SAE Level 0) and fully autonomous vehicles (SAE Level 5), there is a variety of vehicle types that can be described as having some degree of automation. These are collectively known as semi-automated vehicles. As it could be a while before the technology and infrastructure for full automation are in place, vehicles are likely to gain increasing levels of automation over time. These semi-automated vehicles could potentially harness many of the advantages of fully automated vehicles, while still keeping the driver in charge of the vehicle.[271]
Tesla vehicles are equipped with hardware that Tesla claims will allow full self-driving in the future. In October 2020, Tesla released a "beta" version of its "Full Self-Driving" software to a small group of testers in the United States;[272] however, this "Full Self-Driving" corresponds to Level 2 autonomy.[273]
In 2017, BMW was testing the 7 Series as an automated car on public urban motorways in the United States, Germany, and Israel, with the aim of commercializing the technology in 2021.[274] Although that goal was not realized, BMW still plans for the 7 Series to make it the next manufacturer to reach Level 3, in the second half of 2022.[275][276]
In September 2021, Stellantis presented its findings from a pilot programme testing Level 3 autonomous vehicles on public Italian highways. Stellantis's Highway Chauffeur system claims Level 3 capability and was tested on Maserati Ghibli and Fiat 500X prototypes.[277] Stellantis plans to roll out Level 3 capability in its cars in 2024.[278]
In January 2022, Polestar, a Volvo Cars brand, indicated its plan to offer a Level 3 autonomous driving system in the Polestar 3 SUV, the Volvo XC90 successor, using technologies from Luminar Technologies, Nvidia, and Zenseact.[279]
As of February 2022[update], Hyundai Motor Company was enhancing the cybersecurity of its connected cars in order to put the Level 3 self-driving Genesis G90 on Korean roads.[280]
In December 2022, Honda announced that it would enhance its Level 3 technology to function at any speed below legal limits on highways by 2029.[281][282]
In early 2023, Mercedes-Benz received authorization for its Level 3 Drive Pilot in Nevada[283] and planned to apply for approval in California by mid-2023.[284] Drive Pilot is planned to be available in the US market as an option on some models in the second half of 2023.
In July 2020, Toyota started public demonstration rides on the TRI-P4, a Level 4-capable test vehicle based on the fifth-generation Lexus LS.[285] In August 2021, Toyota operated a potentially Level 4 service using the e-Palette around the Tokyo 2020 Olympic Village.[286]
In September 2020, Mercedes-Benz introduced the world's first commercial Level 4 Automated Valet Parking (AVP) system, named Intelligent Park Pilot, for its new S-Class. The system can be pre-installed but is conditional on future national legal approval.[287][288]
In September 2021, Honda started a testing programme aimed at launching a Level 4 mobility service business in Japan in collaboration with Cruise and General Motors, using the Cruise AV.[289] In October 2021, at the World Congress on Intelligent Transport Systems, Honda stated that it was already testing Level 4 technology on a modified Legend Hybrid EX.[290] At the end of the month, Honda explained that it was conducting a verification project for its Level 4 technology on a test course in Tochigi Prefecture, with plans to test on public roads in early 2022.[291]
In February 2022, General Motors and Cruise petitioned NHTSA for permission to build and deploy a self-driving vehicle, the Cruise Origin, which has no human controls such as a steering wheel or brake pedals. The car was developed with GM and Cruise investor Honda, and its production is expected to begin in late 2022 in Detroit at GM's Factory Zero.[292][293] As of April 2022[update], the petition was pending.[294]
In April 2022, Honda unveiled its Level 4 mobility service partners, with a planned roll-out in central Tokyo in the mid-2020s using the Cruise Origin.[295] By September 2022, the Japanese prototype of the Cruise Origin for Tokyo had been completed and testing had begun.[296]
In January 2023, Holon, the new brand from the Benteler Group, unveiled its self-driving autonomous shuttle during the Consumer Electronics Show (CES) 2023 in Las Vegas. The company claims the vehicle is the world's first Level 4 shuttle built to automotive standards. Production of the Holon mover is scheduled to start in the US at the end of 2025.[297]