Review

Autonomous Shuttle Bus for Public Transportation: A Review

Department of Automotive Engineering and Transports, Technical University of Cluj-Napoca, 400001 Cluj-Napoca, Romania
* Author to whom correspondence should be addressed.
Energies 2020, 13(11), 2917; https://doi.org/10.3390/en13112917
Submission received: 23 April 2020 / Revised: 31 May 2020 / Accepted: 4 June 2020 / Published: 6 June 2020
(This article belongs to the Section E: Electric Vehicles)

Abstract
The rapid evolution of autonomous technology in the automotive and information technology (IT) fields has made it possible to implement autonomous vehicles (AVs) for public passenger transport. Although the shuttle buses currently in use have low transport capacities (a maximum of 15 people), the use of these transport units in large urban agglomerations is beneficial for society. This paper reviews the current implementation of AVs as shuttle buses, its implications for their scientific evolution, and its direct links to the legal and social aspects of public transportation all over the world. A critical aspect presented in the paper is the legal framework for autonomous driving, which is extremely uneven around the globe and has a direct impact on autonomous shuttle bus operation. As the legislation on AVs presents some shortcomings in the approval, registration, and public road implementation of these vehicles, many of the world’s major cities have found ways to integrate them into testing programs, establishing the basis for future comprehensive legislative measures in this highly dynamic scientific domain. The current technological solutions adopted by several autonomous shuttle bus producers are presented with an exhaustive overview of each major component. The aspects of the control algorithm, with its complicated layers of security and perturbance factors, are explained in detail. In some countries/cities, autonomous shuttle buses have been implemented on less-traveled routes where they can travel at speeds up to 25 km/h without hindering public circulation, such as university campuses, industrial areas, airports, and sports facilities. Some countries/cities use autonomous shuttle buses for pilot programs related to passenger transport, others use them for postal transport, and others for scientific purposes. In all of these situations, the first step toward autonomous driving has been taken. The paper also evaluates the social factors that are a consequence of the mass introduction of autonomous driving as a means of public transportation. Autonomous shuttle buses are becoming a part of everyday life in big cities. Their acceptance as a strategic means of transport depends on their efficiency in daily service; through this efficiency, this means of transport will become a game-changer once its benefits become not only known but experienced by a large number of users.

Graphical Abstract

1. Introduction

Public transport is one of the key issues in designing a large metropolitan area, one of the criteria that defines the quality of life in a city, and is strictly related to the social life of each inhabitant. A city without public transport would entail a random mass of people traveling in a random way without reaching their targets. There are plenty of direct, indirect, and random factors that influence public transport, but by far the most disturbing one is human interaction in urban traffic. Drivers’ decisions are influenced by many factors, and even while driving, driving is not always their only focus; drivers sometimes do not maintain a constant distance between vehicles, do not always focus on signs and traffic lights, do not follow mandatory traffic rules, and in their decisions always trust their emotions. If we eliminate the human factor from this equation, we will dramatically reduce traffic jams and increase road security. Usually, a driver spends 45 min to 1 h in traffic to reach work from home; if the same person uses conventional public transport, he or she will spend approximately the same amount of time (including line changes and waiting time). In the case of a dedicated public transportation lane, the time spent decreases by 40%–60%, according to several studies [1].
The configuration of these means of transport is highly similar to the general AV platform, yet several differences set them apart, and these aspects are pointed out in the following sections of the paper: the powertrain, the steering, the braking, the sensors (their positioning, type, and coverage), the autonomous driving algorithm, and security all differ from those of the “classical” AV.
An overview of this paper, containing its main objective and highlighting all the chapters addressed in the review, is presented in Figure 1.
The main advantage of autonomous driving is not only the replacement of the human driver, with his or her emotional and physical limitations, but also the possibility for the vehicle to make predictions and to communicate with infrastructure and other vehicles. Because of this, AVs will become a key component of the foundations of smart cities. Cities without traffic jams related to public transportation will rapidly increase their quality of life, decrease their energy consumption (in all components related to traffic), and also become more nature friendly, improving their urban air quality.
Before talking about autonomous driving history, we have to talk about autonomous driving as a means of transport. History has shown that long before automobiles started being developed in this direction, the aeronautic industry and railroad industry successfully replaced (partially or fully) the human intervention in the control of airplanes, trains, and subways.
The birth of autonomous driving was a complicated process, but it represented something exotic until only recently. This technology emerged behind the scenes without concrete applications to the economy of transport or to industry in general.
In the mid-1920s, Francis P. Houdina, an electrical engineer with a military background, equipped a Chandler vehicle with an antenna that received signals from a second vehicle in front. Control was accomplished through a small electric motor that performed tasks according to the movement of the first vehicle. The communication channel used radio waves [2].
Another visionary was Norman Bel Geddes, an industrial designer who, with support from General Motors, presented a vehicle propelled by magnetic fields generated via circuits embedded in the road in 1939 during the “Futurama” exhibit at the World Fair [3].
In 1957, RCA Labs (USA) presented to the public two vehicles provided by General Motors that had been equipped with receivers able to interpret signals coming from the road. The vehicles were able to manage automatic steering, acceleration, and braking [4].
In 1960, Dr. Robert L. Cosgriff, working in the communication and control systems laboratory at Ohio State University, presented the idea that driving automation based on information from the road would become a reality within the next 15 years [5].
On the European continent, the pioneers in the field were a team from the Transport Road Research Laboratory that, in 1960, successfully tested a Citroen DS at speeds over 130 km/h without a driver in all kinds of weather conditions. This vehicle functioned by following the magnetic field of a series of electrical wires embedded in the road [6].
The 1980s introduced a new approach developed by Mercedes-Benz, one that considered “vision” as the decisive factor for a driverless van, which was operated successfully on public roads. This project was conducted by Prof. Ernst Dickmanns, a pioneer in computer vision working at Bundeswehr University Munich.
Legislation then began to be promoted for this type of vehicle, and, in 1997, the United States Department of Transportation created the first laws related to the “demonstration of an automated vehicle and highway system” [7].
In December 1997, Schiphol Airport was the first public entity to use the “Park Shuttle”, called an “automated people mover”. This technology laid the foundation for future accomplishments in the field, representing the first time that autonomous driving capabilities were tested in common public usage [8].
Since the year 2000, a revolution of autonomous driving has taken place. Independent researchers, automotive companies, software companies, and electronics companies all seek to claim ownership of this complex means of transport that will definitely change humans’ relationships with vehicles once and for all.
From a standardization point of view, the Society of Automotive Engineers (SAE) established a standard and issued the “SAE J3016™: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems”, in part, to speed up the delivery of an initial regulatory framework and best practices to guide manufacturers and other entities in the safe design, development, testing, and deployment of highly automated vehicles.
An AV is a vehicle equipped with hardware and software that ensures driving capacity without the need for human intervention with the control mechanisms of the vehicle, with or without remote monitoring of the vehicle.
The clear advantages of autonomous driving in the short and medium term will have an obvious impact on public transportation once this solution is used on a large scale. Current shuttle buses serving public roads have reached Levels 3 and 4 of the autonomy characteristics presented in Figure 2, according to SAE J3016™ [9].
Currently, there are several companies worldwide that have already produced autonomous driving shuttle buses for the market. These companies first developed pilot programs, validated their products, and can now replicate their products on a large scale. These companies include Apollo Baidu (Baidu, China) [10], EasyMile EZ10 (EasyMile, Toulouse, France) [11], Navya Arma (Navya, Lyon, France) [12], and Olli (Local Motors, National Harbor, MD, USA) [13].
Ordinary autonomous vehicles do not have rear-axle steering for better maneuverability in the city; on autonomous shuttle buses, the sensors’ positioning, range, and versatility are especially designed for public transportation, and the security aspects pointed out in this paper are directly linked to public transport. The concept of autonomous driving is the same, its benefits are broadly similar, and some characteristics of the powertrain might be common, but, in the end, these two types of vehicles belong to the same family.
The vast majority of these companies emerged from startups, and their backgrounds are strongly related to research and development. In terms of research, several scientific reviews cover this subject.
Ahmed et al. [14] present autonomous driving from the perspective of pedestrian and bicycle recognition. Ainsalu et al. [15] present the current implications of autonomous buses based on several projects and their outcomes. Dominguez et al. [16] report research on the integration of autonomous driving in a smart city infrastructure. Rosique et al. [17] focus on sensors and object perception as a source of information for autonomous driving control algorithms. Soteropoulos et al. [18] present research on the integration of autonomous driving in modern traffic and its influence on current transport strategies. Taeihagh et al. [19], on the other hand, focus their research on the cyber security of AVs and its influence on the privacy of the users of these technologies. Zheng et al. [20] present a series of methodologies related to maps and roads as driving paths for AVs.
Such research must have a legal framework to accurately present its results and promote relevant products to the public. In the particular case of autonomous driving, there are several regulations by the United Nations (UN).
The regulatory framework for AVs at the international level (within the UN) was established on the basis of the Vienna International Convention on Road Traffic in 1968, which states in Article 8 that “every driver must constantly have control of his or her vehicle or to guide him” [21]. The amendment UNECE ECE/TRANS/WP.1/145, pertaining to Article 8 of the Vienna International Convention, ensures the possibility that a driver can be assisted in his or her driving performance tasks and in the control of his or her motor vehicle through a driving assistance system. Thus, the annex to the amendment, paragraph 5b, specifies that “from today (23 March 2016), the automated control systems will be explicitly allowed on public roads, provided they comply with the provisions of the UN Regulations on motor vehicles or so that they can be controlled or deactivated by the driver” [22].
Autonomous shuttle buses for public transport are present in many of the major metropolises of the world, where they have been implemented on urban routes, in public traffic, and under test regimes, where they are permanently monitored by a human operator (Level 3 according to SAE J3016™ (Figure 3)) [23,24].
The next level of autonomy (Level 4) is being tested in various locations around the world (e.g., Waymo, Google’s self-driving car program, Phoenix, AZ, USA) [25] and involves the use of AVs in urban traffic without being accompanied by a human operator. These AVs travel based only on the perceptions and decisions of an autonomous driving system (ADS) using machine learning or artificial intelligence (AI) technologies. For example, Amini et al. developed and implemented a machine learning methodology based on a minimum set of data collected from the real world [26].
Table 1 presents the current fleet of autonomous shuttle buses in service all over the world, including the vehicle producer, number of vehicles per route, cost of transportation, and the road type where the vehicle is in service. The scientific world has a strong focus on this topic, and its diversity in terms of research topics brings together scientists from all engineering domains. Understanding of the environment and its dynamic changes relative to AVs; the integration of AVs in current road infrastructure; and their security risks, propulsion systems, sensors, computing processing power, machine learning elements, and AI, as well as the social aspects emerging from these technologies, are all topics that must be evaluated and presented as keystones in the future development of autonomous shuttle buses for urban transport.

2. General Technical Characteristics

In recent years, autonomous driving has become not only a hypothetical driving solution but a reality. The physical vehicle used to implement these concepts is similar to pre-existing solutions, with a steering wheel and acceleration and brake pedals, but it also includes a complex network of sensors. Depending on the degree of automation, these classic control elements start to disappear and will disappear completely once the degree of automation reaches Level 5. In parallel, new decision management solutions have started to emerge (such as tablets and joysticks) for the control of AVs in cases of emergency or impermissible situations. The complexity of direct mechanical control on the driver’s side will decrease, and the degree of automated control over the dynamic behavior of the vehicle will increase until the driver has no control over the behavior of the vehicle, except by indicating the destination. This physical elimination process works in parallel with upgrading the level of autonomy. However, completely replacing a driver requires a series of upgrades to the sensors and the processing power of the autonomous driving control units. The technical aspects of each component’s role are presented in the following sections.

2.1. Powertrain

2.1.1. Driveline

The AV propulsion system is equipped with one or more asynchronous/synchronous traction motors (the producer’s choice) powered by a DC/AC converter via a complex power electronics system.
The traction motors deliver torque to the driven wheels according to the control algorithm of the electrical machine, via the power electronics, and subject to the voltage limitations imposed by the battery management system (BMS). The electrical torque and power delivered to the wheels are directly controlled by the desired speed of the AV. The electric motors also function as generators in regenerative braking mode, where they recover a maximum amount of braking energy (energy that is stored in the battery via an electronic AC/DC converter and/or supercapacitor solutions). Another solution is to route the recovered kinetic energy (when braking or cruising downhill) through the power converters (AC/DC and DC/DC) to the low-voltage electrical consumers of the AV (Figure 4, where (1) is the standard charging plug-in; (2) the main inverter; (3) the power drive unit; (4) the e-motor/generator; and (5) the power electronics).
Electric batteries are charged by charging stations connected to the plug-in terminal. The voltage/current flow into the battery from the charging stations is controlled first by the power electronics, which transform the energy (AC/DC), and then by the BMS [15,38].
The command and control of the propulsion system operation is realized by the electronic control unit (ECU), which is integrated into the ADS of the vehicle and permanently adjusts the parameter values that influence AV performance in order to optimize electricity consumption.
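To illustrate how such a supervisory loop might look, the following minimal Python sketch converts a speed error into a motor torque request while respecting BMS power limits; the controller gain and all limit values are illustrative assumptions, not the control code of any of the shuttle producers mentioned in this paper.

# Minimal, illustrative torque-request logic for the traction drive.
# All limits and gains are assumed placeholder values, not vendor data.

def torque_request(v_target, v_actual, bms_max_discharge_kw, bms_max_regen_kw,
                   wheel_speed_rad_s, kp=800.0):
    """Return a motor torque request in Nm (positive = drive, negative = regen)."""
    # Simple proportional speed controller (m/s error -> Nm).
    torque = kp * (v_target - v_actual)

    # Convert the BMS power limits into torque limits at the current wheel speed.
    w = max(wheel_speed_rad_s, 1.0)          # avoid division by zero at standstill
    t_max_drive = bms_max_discharge_kw * 1000.0 / w
    t_max_regen = -bms_max_regen_kw * 1000.0 / w

    # Clamp the request so the battery current/voltage window is respected.
    return min(max(torque, t_max_regen), t_max_drive)

# Example: shuttle cruising at 5.2 m/s with a 6 m/s set point.
print(torque_request(v_target=6.0, v_actual=5.2,
                     bms_max_discharge_kw=60.0, bms_max_regen_kw=30.0,
                     wheel_speed_rad_s=18.0))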

2.1.2. High-Voltage Battery

A high-voltage battery consists of a series of low-voltage cells (2.2 V up to 3.7 V) grouped into modules that are connected in series and in parallel in order to increase the power and voltage of the complete battery pack. The complete battery pack, depending on its size, can reach 280–390 V, with an energy capacity that can vary from 20 kWh to 35 kWh. This battery pack is continuously monitored by the BMS for balanced cell charging and discharging (state of charge, SOC), its optimal functional temperature, the optimum C-rate (the rate at which the battery is charged or discharged) of the complete pack, and the complete battery’s state of health (SOH). The battery pack is indirectly connected to the power electronics, the unit that controls the energy flow through all electrically powered systems.
Electric batteries have a capacity that ensures the operation of the AV on the selected routes within a predetermined period of time (generally the duration of a working day, or 8–10 h) at maximum passenger load, with the passenger compartment conditioning systems in operation, in all possible situations, and under all environmental conditions.
The batteries use lithium-type technology, with a high stored energy density, the minimum volume and mass needed to achieve the required autonomy, and maximum operating safety under the climatic conditions in which the AV operates. Electric batteries allow a charging regime from 0% to 99% in a well-defined time interval depending on their capacity, regardless of the ambient thermal regime.
Some AV models are equipped with solutions for extending their autonomy through an additional battery with a capacity of up to 10% of the main battery’s capacity (or a supercapacitor with the same specifications). This additional source is separated from the main electrical battery system and is used only in situations in which the capacity of the main batteries falls below 20%, thus ensuring additional autonomy. This battery/supercapacitor is permanently maintained at its maximum charge level.
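The supervision logic described above can be illustrated with a minimal sketch that mirrors the 20% switch-over threshold from the text; all other thresholds (thermal window, C-rate ceiling) and the function names are assumed for illustration only.

# Illustrative BMS supervision logic; thresholds mirror the text (20% reserve
# switch-over), all other limits are assumed example values.

def bms_supervise(soc_main, soc_reserve, cell_temps_c, c_rate):
    actions = []
    # Keep the pack inside an assumed safe thermal window.
    if max(cell_temps_c) > 45.0 or min(cell_temps_c) < 0.0:
        actions.append("limit_power_thermal")
    # Limit charge/discharge rate to protect cell ageing (assumed 2C ceiling).
    if abs(c_rate) > 2.0:
        actions.append("limit_c_rate")
    # Switch to the separate reserve battery/supercapacitor below 20% SOC.
    if soc_main < 0.20:
        actions.append("enable_reserve_source" if soc_reserve > 0.5
                       else "request_return_to_depot")
    return actions

print(bms_supervise(soc_main=0.18, soc_reserve=0.98,
                    cell_temps_c=[21.0, 23.5, 22.0], c_rate=1.1))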
The main characteristics of the high voltage batteries equipped to each of the autonomous shuttle bus models presented in this paper are shown in Table 2 [11,12,13,39].

2.1.3. Steering

When discussing the steering of AVs, one is first struck by the lack of a steering wheel, which is a key component of classical vehicles. Steering still represents a key factor in AVs, but direct control is missing. Here, steering decisions are made more subtly by a decision algorithm supervised by the ADS. The steering of an AV is a reactive action generated by a series of external factors rather than the decision of a “virtual driver” acting proactively based on the road, traffic, and surroundings. Steering in this case is the means, not the method, by which the vehicle reaches the desired target. This is why an AV operates based on a virtual/optimal trajectory.
Thus, there is no freedom to cut lanes and no possibility of encountering dangerous situations, but there is always a safe method of driving and a controlled balance between the position of the vehicle’s center of gravity and the three-axis forces that act on each steered wheel.
Autonomous shuttle buses can be equipped with two-wheel steering systems or four-wheel steering systems. Figure 5 presents an overview of the steering system (a) and the two steering configurations that can be found on autonomous shuttle buses: a steering system with two steered wheels (b) and a steering system with four steered wheels (c).
In the case of autonomous shuttle buses, the rear axle wheels turn in the opposite direction of the front axle wheels. Four-wheel steering systems have major advantages. One of these advantages is a reduction of the steering angle, which increases safety by reducing the required response time. This system also improves the maneuverability of a vehicle during tight cornering or parking [38].
Autonomous shuttle buses with four independently steered wheels are equipped with independent steering controllers on each wheel. This allows greater flexibility in controlling the vehicle’s movement via the independent control of the driving force and the rotation angle (δA, δB, δC, δD) of each wheel [40,41]. The steering system of the autonomous shuttle buses is built around a rack-and-pinion mechanism and a steering linkage mechanism controlled by a 24 V DC motor. Figure 6 shows a kinematic diagram of the steering mechanism: (0) chassis; (1) steering gear driven by the gear motor; (2) rack; (3) left link; (4) right link; (5) left wheel; (6) right wheel [23].
The steering system of the autonomous shuttle buses is powered by an electric gear motor that drives the steering gear (1), which turns and actuates the rack (2). The electric gear motor is controlled by the Motion Control block of the ADS. The rack-and-pinion gear transforms the rotational motion of the electric gear motor into a translational movement that is transmitted to the wheel hubs by means of the links (3, 4), which causes the wheels (5, 6) to turn. The technical specifications of the gear motor type are presented in Table 3 [42]. These specifications are presented as general values, taken from the technical data sheets of the manufacturers of the most commonly used components in the categories described in the paper and used to equip the autonomous shuttle bus models presented.
The complete chain of steering control includes the module for vehicle behavior, which generates a planned trajectory in the environment in which the vehicle circulates. The vehicle’s behavior is based on the use of a file that defines the road network the AV must navigate, including the streets, and a file that defines the points that the AV will cross during its travel to a certain destination. The generated trajectory is then fed into the steering controller [43].
The task of the steering controller (Figure 7) is to compare the current position of the AV with the alignment of the planned trajectory. Depending on other aspects, such as comfort and the current speed of the AV, the controller generates control over the steering system based on the vehicle’s speed. The centralized control structure is based on the characteristics of the vehicle’s dynamics, and the subsystems are controlled directly by a centralized controller. The controller’s design is generally based on linear or nonlinear models, and a multi-input multi-output (MIMO) design method is adopted to solve problems related to longitudinal and lateral vehicle dynamics [44].
In Figure 7, the notations represent the geographical coordinates of the track on which the AV runs, the steering angles of the steered wheels, the forces and moments acting on the vehicle from a dynamic point of view, and the moments at the wheel axles, together with the longitudinal and transverse speeds of the vehicle: Xdef and Ydef are the X and Y geographical coordinates of the track, Vdef is the vehicle’s reference speed on the track, ωro is the angular speed of the wheel necessary to follow the track, ∑Fx is the sum of the forces in the longitudinal direction, ∑Fy is the sum of the forces in the transverse direction, ∑Fz is the sum of the forces in the vertical direction, Fxi are the longitudinal forces on each wheel, Fyi are the transverse forces on each wheel, Mi is the torque of the driving wheels, δi is the steering angle of the wheels, vx is the longitudinal speed of the vehicle, vy is the transverse speed of the vehicle, and ωr is the angular speed of the wheel. All these quantities are analyzed and processed by the blocks in Figure 7 to ensure that steering decisions are performed optimally. The steering controller generates the desired steering angle, resulting in a torque applied to the steering system that depends on the voltage supplied to the controller.
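As an illustration of the steering controller’s task of comparing the current pose with the planned trajectory, the following minimal pure-pursuit sketch computes a steering angle from a look-ahead point on the trajectory; the wheelbase, look-ahead distance, and path are assumed example values and do not represent the controller design of any particular shuttle.

import math

# Illustrative pure-pursuit steering: compare the vehicle pose with the planned
# trajectory and return a front-axle steering angle (rad). Wheelbase and
# look-ahead distance are assumed example values for a small shuttle.

def steering_angle(x, y, heading, trajectory, wheelbase=2.8, lookahead=4.0):
    # Pick the first trajectory point at least `lookahead` metres ahead.
    target = trajectory[-1]
    for px, py in trajectory:
        if math.hypot(px - x, py - y) >= lookahead:
            target = (px, py)
            break
    # Angle of the target point expressed in the vehicle frame.
    alpha = math.atan2(target[1] - y, target[0] - x) - heading
    # Pure-pursuit law: delta = atan(2 L sin(alpha) / lookahead).
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

path = [(i * 1.0, 0.2 * i) for i in range(30)]   # gently curving planned path
print(steering_angle(x=0.0, y=0.0, heading=0.0, trajectory=path))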

2.1.4. Braking

For classical vehicles, the driver uses his or her senses, experience, and abilities to make predictions. Usually, when using the brake pedal, the vast majority of drivers are optimists; they do not brake intensively and, until the last portion of the road, rely on the engine brake. Truck and bus drivers frequently use a retarder before using the brake pedal. In the end, braking is performed partly instinctively and partly consciously. Taking into account all these variables, vehicle manufacturers do not guarantee a certain braking distance for different vehicle speeds but instead equip their vehicles with braking systems capable of ensuring a minimum deceleration measured in m/s2.
For AVs, the same legislation applies, and such vehicles will always reach the desired level of deceleration. The problem lies in how this deceleration is distributed over distance and how the occupants of the vehicle experience it, particularly those who are standing or not facing the road and therefore cannot anticipate the braking process. The algorithm that controls deceleration must be adaptable and able to distinguish between normal braking, smooth deceleration and braking, and emergency braking [38].
The analyzed AVs are powered by an electric vehicle platform that permits regenerative braking and countercurrent braking. Thus, there are three ways to induce deceleration in an AV. The physical systems are already well known, but the keys to braking in autonomous driving are establishing the moment of braking, the force of braking, and the moment when braking is no longer necessary, taking into account that in the vast majority of cases, the total deceleration force of the braking system is not needed.
According to the desired deceleration and the distance between the vehicle and the desired maneuver, the braking control algorithm can decide to use the hydraulic system, the regenerative braking system, the countercurrent braking system, or the first and second in parallel.
A possible control solution for the braking control of an AV is presented in Figure 8 [45].
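The arbitration between regenerative and hydraulic (friction) braking described above can be illustrated with a minimal sketch; the regenerative power limit, emergency threshold, and vehicle parameters are assumed values and do not reproduce the control scheme of Figure 8.

# Illustrative brake blending: split a requested deceleration between
# regenerative and hydraulic (friction) braking. Limits are assumed values.

def blend_braking(decel_request_ms2, vehicle_mass_kg, speed_ms,
                  max_regen_kw=30.0, emergency_threshold_ms2=4.0):
    force_needed = vehicle_mass_kg * decel_request_ms2          # N
    # Regenerative force available at the current speed (P = F * v).
    regen_force_avail = max_regen_kw * 1000.0 / max(speed_ms, 0.5)

    if decel_request_ms2 >= emergency_threshold_ms2:
        # Emergency stop: use full hydraulic braking, regen as a bonus.
        return {"hydraulic_N": force_needed, "regen_N": regen_force_avail}

    regen = min(force_needed, regen_force_avail)
    return {"regen_N": regen, "hydraulic_N": max(force_needed - regen, 0.0)}

# Smooth service braking of a 3000 kg shuttle at 25 km/h (~6.9 m/s).
print(blend_braking(decel_request_ms2=1.2, vehicle_mass_kg=3000.0, speed_ms=6.9))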

2.1.5. Charging System

The charging strategy of an industrial vehicle designated for public transportation has to take into consideration that there is a direct connection between the battery’s SOH and the charging solution used. The continuous use of fast charging stations (high-power DC) will, over time, accelerate aging at the cell level of the battery by creating a chemical film on the negative electrode (the solid electrolyte interface), which has a direct influence on the reliability and usability of the complete battery pack. In order to increase the reliability and lifetime of the battery so that an SOH of 80% can still be maintained after several thousand charging and discharging cycles, the ratio between fast charging (high-power DC) and slow charging (low-power AC) should be about one fast charge for every 3–5 charging cycles.
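A minimal sketch of a depot charging policy that respects this fast-to-slow ratio is given below; the counters, the assumed four-hour AC charging window, and the fallback action are illustrative assumptions only.

# Illustrative charge-mode selection preserving the ~1:(3-5) fast/slow ratio
# suggested in the text to protect battery state of health (SOH).

def choose_charging_mode(fast_count, slow_count, minutes_until_next_service,
                         min_slow_per_fast=3):
    # Prefer slow AC charging whenever the schedule allows it.
    slow_possible = minutes_until_next_service >= 240      # assumed 4 h AC charge
    if slow_possible:
        return "AC_slow"
    # Allow a DC fast charge only if enough slow cycles have accumulated.
    if slow_count >= min_slow_per_fast * (fast_count + 1):
        return "DC_fast"
    return "skip_and_alert"   # e.g., swap vehicle instead of stressing the pack

print(choose_charging_mode(fast_count=2, slow_count=9, minutes_until_next_service=45))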
Charging stations are equipment that charge autonomous shuttle bus batteries within a well-defined time interval depending on their capacity using special dedicated connectors/coupling devices [46,47]. The charging stations used for powering electric vehicles in general, and autonomous shuttle buses in particular, are characterized by the following specific terms [48]:
  • Charging levels 1–3 are used to classify the power, voltage, and rated current of the charging stations according to the specifications defined by SAE J1772 (Table 4) [49];
  • Charging modes 1–4 are used to classify the mode of the power supply, protection, and communication/control of the charging system according to the specifications defined by the international standard IEC 61851-1 [50]. Thus, autonomous shuttle buses are connected by an in-cable control and protection device (IC-CPD) cable to electric vehicle supply equipment (EVSE) compatible charging stations; and
  • Charger types are used to classify the different types of sockets used to supply power to the autonomous shuttle bus charging system according to the specifications defined by the international standard IEC 62196-2 [51]. For example, Apollo Baidu uses a CHAdeMO 50 kW DC Type 2 charging connector [52], EasyMile EZ10 uses a Type 2/Mennekes AC charging connector [53], Navya Arma uses a Type 2/Mennekes AC charging connector [12], and Olli uses a Type 2/Mennekes AC charging connector [54].

2.2. Sensor Systems

The key question in the case of autonomous shuttle buses is whether sensors are capable of replacing a human driver in urban traffic, especially for a passenger transport vehicle. Vehicle manufacturers, particularly the divisions responsible for autonomous driving, answer that they are. On what facts do such companies rely, and why are they so confident?
Replacing the decisive factor of the human brain with a series of control algorithms based on information from sensors presents a series of unknowns. The element of decision-making is the brain of the driver, which receives driving information from the senses (vision and hearing); together, these give the driver awareness of his or her surroundings, but this is not enough. There are people who can hear and see very well and yet have no ability to drive because they lack the knowledge and trained reflexes to control a vehicle. Thus, perception based on sight and hearing together with trained knowledge provides the ability to drive, but an element remains missing: the experience of driving. This means that the reactions of the driver are learned by the driver’s brain in order to improve his or her driving skills in terms of reflexes and appropriate reactions (driving style under various meteorological conditions).
This means that once a driver’s brain is trained to drive, the vast majority of functions will be performed almost automatically (similar to typing without looking at the computer keys), which means that the mental effort becomes negligible. For example, for a driver who drives along the same road for a long period of time, the driver’s brain becomes so familiar with the route that the driver is no longer able to recall the events that happened on the road unless they were exceptional.
In the development of autonomous driving, specialists have decided to train artificial brains using a similar approach to the training that a person engages in involuntarily when driving “automatically” based on prior information. Problems emerge once the routine becomes obsolete and an unpredictable event occurs, at which point the human brain’s processing power grows exponentially, and decisions can be made.
Because of its lack of large-scale processing power, the AV has to compensate with sensorial strength. Humans do not have good night vision, do not possess superior abilities of motion detection by hearing, cannot see through fog or heavy rain, etc. However, they can compensate for these deficiencies with brain processing power and interact and react to the perturbing factors of the environment. The AV algorithm, on the other hand, cannot attain the same level of processing power as the brain and compensates for this by extending its sensory abilities with a series of capabilities that humans lack and even by communicating with the environment (vehicle-to-vehicle and vehicle-to-infrastructure communication). In this way, vehicles can compensate for their lack of processing power in order to achieve a task independently of a human driving decision. The sensing capabilities of current autonomous shuttle buses designated for public transport are presented step by step for each particular case so that an exhaustive understanding of the matter can be achieved.
Sensors are technical devices of specific sizes that react to certain environmental properties. An AV, which is devoid of the perception of a human operator, must be able to perceive its surroundings in order to function safely without the intervention of a human operator. For this process, AVs are equipped with a large number of sensors (Figure 9) [55,56,57] that scan the entire environment, identifying everything that materializes around the vehicle from road markings and traffic signs to static and dynamic objects.
The signals transmitted by the sensors are managed by intelligent platforms such as the self-driving computation platform, which is integrated into the ADS. Based on these signals, the perception capabilities of AVs are becoming increasingly sophisticated, possessing the ability to identify and classify any static or dynamic objects in the proximity of the AV and to follow these objects from frame to frame [58,59].
The autonomous shuttle bus models presented in this work are equipped with sensors whose types, numbers, and positions are determined by their manufacturers. For example, the sensors equipping the Apollo Baidu autonomous shuttle bus are shown in Table 5 (according to the public data on github.com/ApolloAuto).
The characteristics of the sensors on AVs (the maximum values for field of view, range, accuracy, frame-rate, resolution, color perception, and the minimum values for weather affections, maintenance, visibility, and price) are shown in Figure 10 (where 0 is very poor, 1 is poor, 2 is very fair, 3 is fair, 4 is good, and 5 is very good) [17,20,68,69,70].

2.2.1. LIDAR

LIDAR sensors (Figure 11), based on laser beam distance measurement technology, allow an AV to generate a virtual image of the environment in which it circulates, to establish its precise position, and to detect the static or dynamic objects on the road in the form of a 2D map (LIDAR 2D) or a 3D map (LIDAR 3D) [71].
The main role of LIDAR sensors is to detect existing objects (pedestrians, vehicles, etc.) and to prompt the ADS to apply emergency braking to avoid collisions. LIDAR 3D sensors use a set of laser diodes (between 4 and 128 laser channels) mounted on a rotary device that scans the environment with a 360° horizontal and 20–45° vertical field of view. The accuracy of the images that form the 3D map is given by the number of laser beams used. LIDAR 3D sensors are mounted in a way that minimizes the areas without coverage and allows them to scan the displacement area over a circular radius to create a 3D map of the virtual environment in which the existing objects (pedestrians, vehicles, etc.) are present [17].
LIDAR 2D sensors capture information from the environment by sweeping a single laser beam over a flat surface perpendicular to the axis of rotation. LIDAR 2D sensors are generally mounted as two units to minimize the areas without coverage and to ensure continuous visibility over the entire circular surface around the AV [70].
The main technical characteristics of the LIDAR sensors used in automotive applications are presented in Table 6 (general values, taken from the technical sheets of the most used LIDAR sensor manufacturers) [61,72,73].
Sualeh et al. presented a multiple object detection and tracking (MODT) algorithm that classifies the objects detected by LIDAR sensors and then applies Bayesian filters to estimate the kinematic evolution of the objects classified over time [74].
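As an illustration of this kind of Bayesian tracking, the following minimal sketch applies a constant-velocity Kalman filter to successive LIDAR detections of a single object; the frame period and noise matrices are generic illustrative choices, not the MODT implementation of [74].

import numpy as np

# Illustrative constant-velocity Kalman filter tracking one LIDAR-detected
# object in the ground plane. State: [x, y, vx, vy]; noise values are assumed.

dt = 0.1                                    # LIDAR frame period (s), assumed
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)  # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only position is measured
Q = np.eye(4) * 0.05                        # process noise
R = np.eye(2) * 0.09                        # ~0.3 m measurement noise

x = np.array([0.0, 0.0, 0.0, 0.0])          # initial state
P = np.eye(4)

def kf_step(x, P, z):
    # Predict the object's motion, then correct with the new LIDAR centroid z.
    x = F @ x
    P = F @ P @ F.T + Q
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

for z in [np.array([0.5, 0.1]), np.array([1.0, 0.2]), np.array([1.6, 0.3])]:
    x, P = kf_step(x, P, z)
print(x)   # estimated position and velocity of the tracked object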

2.2.2. Radar

Radar sensors equipped on AVs are placed at the front and rear of the vehicle to measure the distance to detected objects, to calculate their speed, and to estimate their direction of travel (Figure 12) [75,76].
Mid-range radar (MRR) sensors are placed so that they cover the front, rear, and side corners of an AV to detect objects in the immediate vicinity of the vehicle [77]. Long-range radar (LRR) sensors provide information on the vehicles traveling in front of the AV and support adaptive cruise control (ACC) [78].
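A minimal sketch of how LRR range and closing-speed measurements could feed an ACC-style speed decision is shown below; the time gap, the time-to-collision threshold, and the returned actions are illustrative assumptions.

# Illustrative use of long-range radar data (range and relative speed) for an
# ACC-style decision. Time-gap and threshold values are assumed.

def acc_decision(range_m, closing_speed_ms, ego_speed_ms,
                 desired_time_gap_s=2.0):
    # Time to collision (only meaningful when the gap is closing).
    ttc = range_m / closing_speed_ms if closing_speed_ms > 0 else float("inf")
    desired_gap = ego_speed_ms * desired_time_gap_s

    if ttc < 3.0:
        return "brake_hard"
    if range_m < desired_gap:
        return "reduce_speed"
    return "hold_speed"

# 6.9 m/s (25 km/h) shuttle closing at 1.5 m/s on a vehicle 18 m ahead.
print(acc_decision(range_m=18.0, closing_speed_ms=1.5, ego_speed_ms=6.9))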
The main technical characteristics of the radar sensors used in automotive applications are presented in Table 7 (general values, taken from the technical sheets of the most used radar sensors manufacturers) [77,79].

2.2.3. Camera

An AV is equipped with a high-resolution stereo video camera system (two identical cameras that simulate human binocular vision) located at the front and rear of the vehicle to provide a circular image of the environment.
These high-resolution video cameras have the role of permanently monitoring/recording the movement of the AV, the route followed, the traffic signals, etc., and are equipped with complementary metal–oxide–semiconductor (CMOS) sensors. Monocular video cameras capture 2D images without generating precise details about the distances to the sensed objects and without estimating information that would help establish scenarios to avoid these objects. In contrast, stereo cameras have the ability to estimate the distance to sensed objects by measuring the difference between two images taken from different angles [15].
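The stereo principle can be illustrated with a minimal sketch based on the classical relation Z = f·B/d between depth, focal length, baseline, and disparity; the focal length and baseline below are assumed example values, not the parameters of a specific camera system.

# Illustrative stereo depth estimation: Z = f * B / d, where f is the focal
# length in pixels, B the camera baseline in metres, and d the disparity in
# pixels. f and B below are assumed example values, not a specific camera.

def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.30):
    if disparity_px <= 0:
        return float("inf")        # zero disparity = point at infinity
    return focal_px * baseline_m / disparity_px

# A pedestrian producing a 28-pixel disparity is roughly 15 m away.
print(round(depth_from_disparity(28.0), 1))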
The video cameras (Figure 13) are applied to multiple tasks on AVs, such as lane detection, detection of the distance to other objects or vehicles in traffic, and traffic sign detection.
The operation of a CMOS image sensor is based on light passing through the lens and being filtered by a color filter array (Bayer CFA) matrix that provides information on the light intensity in different wavelength regions. The photons are converted into photoelectrons by the CMOS sensor, and the raw image data are then converted into a color image by a demosaicing algorithm [80,81].
After the image processing stage, visual perception is used to recognize and understand the detected objects in the environment using the classification algorithm (Figure 14) [14].

2.2.4. GPS/GNSS

The global navigation satellite system (GNSS) includes all satellite navigation systems, both global and regional, and operates on the principle of detecting the receiver’s (AV’s) position against a fixed reference formed by a group of satellites orbiting more than 20,000 km above the surface of the Earth. These satellites emit signals that contain information about their position, orbital parameters, etc. Based on these signals, the reception systems extract information about their position, travel speed, and the current time [15,76,82].
The operating principle of the GNSS is based on measuring the exact transmission time of the signals emitted by at least four visible satellites, from which the receiver extracts information about its geographical coordinates and the time (x, y, z, t) [17].
The most commonly used GNSS is the global positioning system (GPS), which uses a real-time kinematic (RTK) correction tool to position the AV with maximum accuracy (Figure 15). GNSS provides precision that can deviate by a few meters, although the ADS requires a precision of several centimeters. This accuracy is achieved with the aid of a base station. The GNSS measures the distance to the satellites and the base station, thus achieving positioning with maximum accuracy for the AV. The transfer of GNSS/RTK differential corrections from the reference base stations to the AV is carried out via mobile communications or the Internet (3G/4G/5G) [83].
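The differential-correction idea behind GNSS/RTK can be illustrated with a minimal sketch in a local planar frame: the offset between the base station’s surveyed and measured positions is transferred to the rover (AV) fix. Real RTK operates on carrier-phase measurements; the coordinates below are assumed example values.

# Illustrative differential GNSS correction in a local ENU frame (metres).
# Real RTK works on carrier-phase measurements; this only shows the principle
# of transferring the base-station error to the rover. Values are assumed.

def differential_fix(rover_measured, base_measured, base_known):
    # Error observed at the base station (measured - surveyed truth).
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    # Assume the rover sees (almost) the same error and subtract it.
    return tuple(r - e for r, e in zip(rover_measured, error))

base_known     = (100.00, 200.00)    # surveyed base position
base_measured  = (101.20, 198.70)    # base position as reported by GNSS
rover_measured = (350.90, 421.10)    # shuttle position as reported by GNSS

print(differential_fix(rover_measured, base_measured, base_known))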
The GNSS with RTK (used, for example, by Navya) has a low refresh rate (1–20 Hz), which is why some manufacturers of autonomous shuttle buses (e.g., EasyMile) use the visual capabilities of LIDAR sensors and video cameras to reconstruct the displacement space and locate the vehicle within it. This technique, called simultaneous localization and mapping (SLAM), consists of generating a virtual map of the environment and simultaneously localizing the vehicle within this map [15]. The disadvantage of this technique is that there is no delimitation of the mapped space, which requires extra hardware and software resources.

2.2.5. Inertial Measurement Unit

The inertial measurement unit (IMU) sensor includes a three-axis accelerometer and a three-axis gyroscope and is intended to measure the acceleration of the AV and specify its location and orientation (Figure 16).
When a vehicle is in motion, it can perform linear and rotational movements around each of the three axes: lateral, longitudinal, and vertical (pitch, roll, and yaw). Inertial measurements, which can be made without reference to an external point, include linear acceleration, angular velocity, and angular acceleration. This information can be used to increase or improve external measurements, such as GPS coordinates [84].
An IMU is a device containing an accelerometer that measures a body-specific force vector, a gyroscope that measures the inertial angular velocity vector, and a magnetometer that measures the vector of the magnetic field around the device. These three sensors are mounted so that their measuring axes each constitute distinct orthogonal systems, resulting in six degrees of freedom. An IMU offers basic information about the vehicle on which it is mounted (acceleration, rotation, and orientation).
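A minimal planar dead-reckoning sketch shows how yaw rate and longitudinal acceleration from the IMU could propagate the vehicle pose between GNSS fixes; bias and noise handling are omitted, and all values are illustrative.

import math

# Illustrative planar dead reckoning from IMU data (yaw rate + longitudinal
# acceleration) between two GNSS fixes. Bias/noise handling is omitted.

def propagate(x, y, heading, speed, accel_ms2, yaw_rate_rads, dt=0.01):
    speed += accel_ms2 * dt
    heading += yaw_rate_rads * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# Integrate one second of gentle acceleration while turning slightly left.
state = (0.0, 0.0, 0.0, 5.0)            # x, y, heading, speed
for _ in range(100):
    state = propagate(*state, accel_ms2=0.3, yaw_rate_rads=0.05)
print(tuple(round(v, 2) for v in state))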

3. Autonomous Driving Systems

This section describes the core technological developments over the last few years related to processing power, flexibility, and—one of the most important aspects—mobility. For a computer that processes pre-deterministic calculus, the factors of processing power, memory, size, and grid power present no problem when in a lab connected to other computers, where it can borrow memory or processing power. However, once the computer is independent with limited processing power, every bit of processed information counts. AVs are simply large computers on wheels whose abilities depend on their software’s quality, bugs, flexibility, and processing power.
We no longer evaluate an AV in terms of its speed, dynamic performance, or braking distance. We instead evaluate AVs by their ability to make speed related decisions instead of asking for a decision, establish alternative routes before reaching a traffic restricted area, reduce their speed before reaching a bumpy road, reserve a charging station before reaching the desired destination, etc. These are the factors that differentiate AVs in general. The particularities of classic vehicles are immaterial (such as color selection), yet they help buyers choose their future vehicles.
Is the ADS capable of crossing the double line when a vehicle is parked in the road, or will it ask for control? Is the system capable of recognizing police signs in case of a crash or road block? Is the system able to increase its speed in case of a medical emergency, or is the code simply too “classic” to perform such tasks that a human driver can perform?
An ADS must not only perform its driving tasks using various sensors for input and actuators for control, but such systems must be resilient, redundant, flexible, reliable, versatile, and ensure cyber security. All these tasks have to be performed in a network with high-speed information exchange and multi-processing layers doubled by different types of physical data that require continuous processing power. These large volumes of information require tremendous processing speed. An AV with all its cameras can generate 2 gigapixels per second of data, which takes 250 trillion operations per second to process [85]. This is translated into large amounts of computing power, which requires tremendous amounts of electricity. Some AVs use around 2500 watts [86]. These are extreme cases, but they highlight some lesser-studied evaluation criteria for vehicles: Does the processing power influence the choice of an electric vehicle? Does the vehicle’s electronic equipment support future upgrades? For example, is the ADS capable of running software for the following 3–5 years? Is the cost of an update worthwhile in terms of benefits? Will there be “bugs” in the first update, or should one wait for the second update? These topics might become similar to current maintenance activities, such as changing the injector for better engine performance (better fuel consumption and fewer emissions).
All the presented aspects have a large impact on the debates surrounding autonomous shuttle buses designed for public transport. Ultimately, who is paying for the development of these platforms after the product is sold? What about software maintenance, electronics maintenance, upgrades, and compatibility? These are themes of interest to be analyzed in the future.

3.1. Autonomous Driving Algorithm

An ADS destined for public transportation is a hardware and software solution that has the ability to drive the vehicle without the need for humans to use the vehicle control mechanisms (depending on the level of autonomy according to the SAE J3016™ specifications), with remote vehicle monitoring accomplished by monitoring personnel through a software application called a management platform [1,9].
The degree of autonomy of an autonomous shuttle bus destined for public transportation that fits within Level 3/Level 4 allows the human operator not to intervene in the travel direction, vehicle speed, or braking (i.e., in any dynamic function) while the vehicle is in operation. After stopping in the proximity of an object, and after the object in question is removed from the proximity of the autonomous shuttle, the vehicle will resume its independent, autonomous driving function without the intervention of the human operator. This is one of the existing solutions on the market in terms of active/inactive human intervention.
In the event of a problem (a system error, an incident, a deviation from the route, etc.) for Level 4, the ADS will have sufficient control to overcome this situation, and the human operator will only intervene if he or she determines that the vehicle must deviate from the perception and planning of its autonomous management system’s actions. When a problem occurs for a Level 3 system, the human operator will be the one to intervene to reactivate the ADS after the problem is solved.
The ADS is a substitute for a real driver; once total driving control is delegated to the vehicle and the behavioral conditions are described, it performs the following independent functions [87]:
  • Avoidance of obstacles: detecting existing objects in the direction of movement (pedestrians, vehicles, etc.) and safely stopping or changing lanes as appropriate;
  • Centered travel: ensuring the vehicle moves in the center of the travel lane at the maximum travel speed;
  • Changing travel direction: for reasons other than avoiding or overtaking existing objects (pedestrians, vehicles, etc.), changing the travel direction when possible;
  • Passing over several lanes: checking for existing objects (pedestrians, vehicles, etc.), changing the travel direction, overtaking, and returning to the lane;
  • Passing over a single lane: checking for existing objects (pedestrians, vehicles, etc.), changing the travel direction, overtaking when the maneuvering space allows, and returning to the lane;
  • Abandonment of overtaking: checking for existing objects (pedestrians, vehicles, etc.), changing the travel direction, aborting the overtaking maneuver in the event of an unforeseen situation, and returning to the lane;
  • Complete overtaking: performing and completing the overtaking maneuver when the maneuvering space allows, then returning to the lane;
  • Being overtaken: when another vehicle performs an overtaking maneuver, avoiding an increase in travel speed and maintaining a safe distance from the overtaking vehicle;
  • Maintaining distance: ensuring and maintaining a maximum safe distance from existing objects (pedestrians, vehicles, etc.) in proximity to avoid any collisions;
  • Low speed operations: ensuring and maintaining a minimum safe distance from existing objects (pedestrians, vehicles, etc.) in proximity to avoid any collisions;
  • Interactions with other objects: avoiding interactions with other existing objects (pedestrians, vehicles, etc.) by keeping distance, adjusting movement speed, and passing or stopping as appropriate; and
  • Constant movement: in the travel direction when no object (pedestrians, vehicles, etc.) intervenes in the proximity of the vehicle.
The operation of the ADS uses a decision algorithm based on the following principles (Figure 17) [1,12,88] (an illustrative sketch of this loop follows the list):
  • Perception of the AV’s position and of the objects’ positions in the environment;
  • Prediction/planning of the dynamic behaviors of the vehicle resulting from its perception correlated with movement/operation rules; and
  • Motion control: command and control of the propulsion systems, steering, and braking based on the decisions established in the planning stage.
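The following minimal sketch reduces this perception/planning/motion-control loop to stub functions in order to show only how data would flow in each cycle; the function names, thresholds, and sensor frame contents are illustrative assumptions.

# Illustrative skeleton of the ADS decision loop (perception -> planning ->
# motion control). The stage implementations are stubs; only the data flow
# mirrors the principles listed above.

def perceive(sensor_frame):
    # Would fuse LIDAR/radar/camera/GNSS data into vehicle pose + object list.
    return {"pose": sensor_frame["gnss"], "objects": sensor_frame["lidar_objects"]}

def plan(world_state, route):
    # Would predict object motion and produce a target speed and trajectory.
    clear = all(obj["distance_m"] > 8.0 for obj in world_state["objects"])
    return {"target_speed_ms": 6.9 if clear else 0.0, "route": route}

def control(plan_out):
    # Would convert the plan into propulsion, steering, and braking commands.
    return {"throttle": plan_out["target_speed_ms"] > 0.0,
            "brake": plan_out["target_speed_ms"] == 0.0}

route = ["stop_A", "stop_B"]
frame = {"gnss": (46.77, 23.59),
         "lidar_objects": [{"distance_m": 12.4}]}
world = perceive(frame)
print(control(plan(world, route)))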
The main command and control parameters are collected in internal non-volatile memory as data from all sensors on the AV, namely:
  • The permanent position of the AV in real time, specifically a history of its movement;
  • Data on the main functional parameters of the propulsion, steering, and braking systems and a history of these systems’ operations (speed of movement, speed of the electric motor, consumption/electricity recovery, etc.);
  • Data on the main functional parameters of the comfort systems in the passenger compartment of the AV (lighting, air conditioning, etc.);
  • Data on the main events that took place during the entire period of the AVs operation (abnormal operations, emergency stops, etc.); and
  • Data on the data flow of the video cameras.
The main functions of the command and control unit are the following [1]:
  • Logic and control—monitoring and diagnosing the autonomous operations of the main systems: propulsion, steering, and braking;
  • Logic and control—monitoring and diagnosing the autonomous operations of the auxiliary systems of lighting, door/ramp, comfort compartments for passengers, and multimedia;
  • Logic and command of the external access interlocks to the main/auxiliary systems for the safe operation of the AV; and
  • Interconnection with the management platform—recording and transmitting data, events, and operating errors to the platform.
All the information on object detection is used to manage system priorities and integrate the AV into road traffic through a programmed control area, in order to continue its movement in an autonomous driving regime. The vehicle’s movement speed can be adjusted to safely manage detected objects on the route via prediction. The AV accelerates/decelerates depending on the trajectory being followed and the movement speed of the dynamic objects. The planning of the dynamic behaviors allows the AV to interact with other traffic participants, preventing accidents and automatically adjusting travel speed. The planning features of this dynamic behavior are synchronized with the action control characteristics, activating the dynamic functions of the propulsion, steering, or braking systems and determining the best behavior of the AV according to any static or dynamic objects on the route.
The autonomous driving algorithm used on the NVidia drive platform (Figure 18) allows for the perception of existing objects (pedestrians, vehicles, etc.), travel routes, and traffic signals based on the data received from several types of sensors (LIDAR, radar, camera, GPS/GNSS, IMU, etc.). This platform employs a neural network that, based on the deep learning method, responds to the following objectives [58,89]:
  • Permanent development of the perception algorithms for existing objects (pedestrians, vehicles, etc.), commuting routes, and traffic signs;
  • Detection of static and dynamic objects, traffic lanes, and other traffic conditions to avoid collisions;
  • Tracking detected objects (pedestrians, vehicles, etc.) from one frame to the next; and
  • Estimating the distances to detected objects.
The ADS, regardless of the technical solution adopted to implement the functional algorithm, provides decision support to achieve perception and vehicle movement and to stop under maximum safety conditions, based on the data provided by the sensors. According to these data, the system determines the characteristics that define the path of the AV’s movement and regulates its speed so that it can stop under completely safe conditions depending on the route’s characteristics, the objects (pedestrians, vehicles, etc.) encountered on the route, and the points defined as stopping stations for boarding/disembarking passengers. All these functions are characteristic of an autonomous driving solution destined for public transportation.
The functions of the ADS react to all the events and objects on the route, managing each of the scenarios to avoid an accident. These scenarios include the following:
  • The objects that appear on the route;
  • Respecting the traffic lanes;
  • The behavior of other vehicles in traffic;
  • The management of crossing priorities (interpretation of traffic light colors or when they are flashing yellow) at intersections or when changing the direction of travel; and
  • Rules of road traffic and traffic signs on the route.
The current analyzed autonomous shuttle buses (Level 3/Level 4) destined for public transportation are permanently monitored through the management platform by a human operator, who, in potentially dangerous situations, will command the autonomous shuttle bus to stop. In this situation, the ADS is automatically deactivated, at which point the autonomous shuttle bus can only be driven manually by the human operator on board through a mobile control device. Autonomous shuttle buses are also equipped with an “operator” button, which, when actuated, disables the autonomous driving mode and activates the manual driving mode. This solution is a particular design option in vehicles designed for public transportation.

3.2. The Electronic Control Unit of the Autonomous Driving Algorithm

The electronic control unit (ECU) is the central component of the ADS that receives information from sensors, actuators, etc. Based on the specifications of the autonomous system’s control algorithm, the ECU generates the command and control signals of the main drive systems: the propulsion system, the steering system, and the braking system. The sensors equipped on the AV perform a complete scan of the environment to distinguish all types of objects (pedestrians, vehicles, environment, particular roads, etc.), both static and dynamic, during both day and night, and under low-visibility conditions.
The information from the sensors is collected by the ADS of the vehicle, which analyzes these data and controls the vehicle’s movement to ensure the safest conditions for its passengers using a series of actuators belonging to the propulsion, steering, and braking systems. This information is provided by:
  • Users/passengers, who provide data on the selected (defined) destination through the control panel in the on-demand mode;
  • Location sensors (LIDAR, radar, cameras, etc.) that allow the vehicle to be framed on a defined map using the data from these sensors; and
  • Position sensors (GPS/GNSS, IMU, odometers) that provide data on the AV's position.
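To visualize how these three input groups reach the data integrator, the hypothetical data structure below bundles them into a single per-cycle record. All field names are illustrative assumptions rather than an actual interface.

```python
from dataclasses import dataclass, field

@dataclass
class UserRequest:
    destination_stop: str                               # destination selected on the control panel (on-demand mode)

@dataclass
class EnvironmentScan:
    lidar_points: list = field(default_factory=list)    # LIDAR point cloud
    radar_tracks: list = field(default_factory=list)    # radar object tracks with relative speeds
    camera_frames: list = field(default_factory=list)   # camera images used for classification

@dataclass
class PositionFix:
    latitude: float
    longitude: float
    altitude: float
    imu_heading_deg: float                              # heading from the IMU
    odometer_m: float                                   # distance travelled reported by the odometer

@dataclass
class AdsInput:
    """One cycle of information collected by the ADS (illustrative only)."""
    user: UserRequest
    environment: EnvironmentScan
    position: PositionFix
```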
The software architecture of the ECU of the ADS can be built on an open and standardized platform such as AUTOSAR, which is organized into layers and can integrate and manage, in the application layer, software components from different manufacturers that perform different functions, so that they share microcontroller resources and accept regular firmware updates (Figure 19) [60,90].
The vehicle platform is the basic model (open vehicle platform) for the ECU of the ADS. The platform contains internal drivers, which include software modules with direct access to the microcontroller and internal modules.
The hardware platform includes sensors, regardless of their location (internal/external) (human machine interface (HMI) devices, LIDAR, radar, cameras, GPS/GNSS, etc.), with the appropriate drivers. This layer provides an application programming interface (API) for access and connection to the open vehicle platform.
The software platform includes the real-time operating system (RTOS) and runtime framework, which comprise the layer that provides the specific functions for the application software (between the software components and/or sensor components). The software platform components include local maps and localization (data generation and localization), a data integrator (collection and multiplexing of the information provided from the sensors), perception of object position, the prediction of object behavior, planning (planning the optimal trajectory), motion control (AV movement control), HMI control (control of the device’s interaction with the human operator), and a V2X adapter (connection with the vehicle-to-everything units, V2X).
The cloud platform provides a suite of cloud computing services running on the same infrastructure with a series of modular services that require a large storage space with high data traffic. These services feature a local HD map (local high definition maps), big data (reference data), security (secure protocols), AI (AI algorithms), simulation (virtual model simulations for prediction and planning validation), and V2X (transmission of the information from a vehicle to any entity that could influence the vehicle and vice versa).
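The four layers described above (open vehicle platform, hardware platform, software platform, and cloud platform) can be pictured with the minimal structural sketch below. The class and method names are assumptions about how the layers might interact, not an AUTOSAR or vendor API; only the data flow between layers is shown.

```python
# Minimal structural sketch of the layered ECU architecture described above.

class VehiclePlatform:
    """Open vehicle platform: internal drivers with direct access to the microcontroller."""
    def apply_commands(self, velocity, angle, brake):
        print(f"actuators <- v={velocity} m/s, angle={angle} deg, brake={brake}")

class HardwarePlatform:
    """Sensor and HMI drivers, exposed to the upper layers through an API."""
    def read_sensors(self):
        return {"lidar": [], "radar": [], "camera": [], "gnss": (0.0, 0.0, 0.0)}

class CloudPlatform:
    """HD maps, big data, security, AI, simulation, and V2X services."""
    def local_hd_map(self, position):
        return {"route": [], "anchor": position}

class SoftwarePlatform:
    """RTOS + runtime framework hosting localization, perception, planning,
    motion control, HMI control, and the V2X adapter."""
    def __init__(self, hw, cloud, vehicle):
        self.hw, self.cloud, self.vehicle = hw, cloud, vehicle

    def run_cycle(self):
        sensor_data = self.hw.read_sensors()
        hd_map = self.cloud.local_hd_map(sensor_data["gnss"])   # fetch the local HD map for the current position
        # localization, perception, and planning would run here; only the data flow is shown
        self.vehicle.apply_commands(velocity=2.0, angle=0.0, brake=0.0)

SoftwarePlatform(HardwarePlatform(), CloudPlatform(), VehiclePlatform()).run_cycle()
```

In this reading, the software platform is the only layer that talks to all the others, which mirrors the central role attributed in the text to the RTOS and runtime framework.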
V2X technology involves the exchange of large volumes of data between AVs and all intelligent “entities” around the vehicle, and these data transfers can be performed in real time only through 5G technology. Increasing the communication capacity, by expanding the volume of transferred data and shortening the response time, will improve traffic flow, allowing AVs to travel at higher speeds and to reduce speed in a timely manner when the situation requires it. As a result, some developers/manufacturers of AVs have already integrated 5G connectivity on some vehicle models (series production or prototype), anticipating the technological evolution of the coming years [60,91]. Compared to the cloud platform and the suite of cloud computing services running on it, 5G technology allows the use of edge computing solutions [92], which offer low network latency (about 1 ms for 5G vs. about 50 ms for 4G [93]) and allow local data processing without the need for cloud uploading.
Edge computing is a computing and data storage solution for cloud computing systems that allows data to be processed at the edge of a network (using network resources), i.e., outside the cloud computing environment, thus reducing the time required to process these data. Achieving this solution requires high network communication speed and low data-processing latency, conditions that can be ensured by implementing 5G technology. This approach allows autonomous vehicles interconnected in a network (V2V) to share their data on traffic, routes, etc., in real time.
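To make the quoted latency figures tangible, the short calculation below estimates how far a shuttle moving at 25 km/h (the upper speed mentioned for these vehicles) travels during one network delay; the numbers are purely illustrative and ignore processing and actuation time.

```python
def distance_during_latency(speed_kmh: float, latency_ms: float) -> float:
    """Distance (in metres) covered by the vehicle during one network delay."""
    speed_ms = speed_kmh / 3.6              # km/h -> m/s
    return speed_ms * (latency_ms / 1000.0)

for latency in (50.0, 1.0):                 # 4G vs. 5G latency figures cited in the text
    d = distance_during_latency(25.0, latency)
    print(f"{latency:>4} ms latency at 25 km/h -> {d:.3f} m travelled")

# Approximate output:
#   50.0 ms latency at 25 km/h -> 0.347 m travelled
#    1.0 ms latency at 25 km/h -> 0.007 m travelled
```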
The operation of the command and control unit as a central component of the autonomous steering system is presented in detail below (Figure 20) [94,95,96,97].
Autonomous driving is based on the real-time position of the AV, which is generated by the localization module based on data from the sensor block; these data are collected and managed by the data integrator block. They are interpreted and compared with the reference data for the geographical coordinates (latitude, longitude, and altitude) of the selected and implemented route (local map), which are permanently updated with data from the sensor block and stored in a memory block (local map generator) [17,98].
The perception module receives the location data of the AV from the localization block and the data from the sensor block for the objects in proximity to the AV (pedestrians, vehicles, etc.). This module detects (object detector) and classifies (object classifier) static and dynamic objects in proximity to the AV. This module estimates whether the dynamic objects in proximity to the AV can intersect their path with the route selected for travel. Based on the movement speed of these objects (the speed determined by the radar sensor) relative to the movement speed of the AV, the perception module will determine if there is a risk of collision between the vehicle and the monitored object.
The prediction/planning module receives the perception data from the perception module. These data contain a map of the static and dynamic objects in proximity to the AV superimposed over the selected and implemented path (local map). Based on these data, the prediction/planning module will select the optimal trajectory according to certain predetermined criteria (the trajectory evaluator) to steer the AV in a way that allows safe travel without collisions.
The motion control module transmits commands to the propulsion (velocity), steering (angle), and braking (brake) systems based on the data received from the localization, perception, and prediction/planning modules. The braking system can also be operated in an emergency by the monitoring personnel on the management platform through a secure virtual private network (VPN).
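The module chain just described (localization, perception, prediction/planning, and motion control, plus the remote emergency brake) can be condensed into the simplified loop below. The callables passed in through `modules` are hypothetical placeholders for the corresponding blocks in Figure 20, not real implementations.

```python
def ads_control_cycle(sensor_data, local_map, modules, emergency_brake_requested=False):
    """One simplified ADS cycle; `modules` bundles the localization, perception,
    prediction/planning, and motion-control steps as callables (all hypothetical)."""
    pose = modules["localize"](sensor_data, local_map)             # localization module
    objects = modules["perceive"](sensor_data, pose)               # perception module (detection, classification, risk)
    trajectory = modules["plan"](pose, objects, local_map)         # prediction/planning module (optimal trajectory)
    velocity, angle, brake = modules["control"](pose, trajectory)  # motion control module

    # Emergency braking commanded by the monitoring personnel over the secure VPN.
    if emergency_brake_requested:
        velocity, brake = 0.0, 1.0
    return velocity, angle, brake

# Toy run with stand-in modules, shown only to illustrate the data flow:
stubs = {
    "localize": lambda sensors, m: (0.0, 0.0),
    "perceive": lambda sensors, pose: [],
    "plan": lambda pose, objs, m: [(1.0, 0.0), (2.0, 0.0)],
    "control": lambda pose, traj: (2.0, 0.0, 0.0),
}
print(ads_control_cycle({}, {}, stubs, emergency_brake_requested=True))   # -> (0.0, 0.0, 1.0)
```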
On the data bus, the ECU of the ADS records information on electricity consumption, indicating the recovered energy, the electric battery’s SOC, and other relevant data on variations in the functional parameters of the main and auxiliary systems.
The Baidu Apollo autonomous shuttle bus is equipped with an ECU of the ADS called an industrial PC (IPC), which receives data from the sensors through a control unit, called the Apollo sensor unit (ASU). The ASU collects data from the main sensors (LIDAR, radar, camera, IMU, GPS, and GNSS) via four controller area network (CAN) buses, processes the signals, and then sends them via a PCI Express interface to the IPC that manages the autonomous shuttle bus control (Figure 21) [60].
The IPC is a Nuvo-6108GC industrial computer in the following configuration: Xeon E3 V5 CPU, Intel C236 chipset, 32 GB DDR4 RAM, PCIe ×8 and PCIe ×16 ports, Gigabit Ethernet, and a PCIe CAN card. Its main computational component is an ASUS NVidia GTX1080-A8G-Gaming graphics card, which supports AI workloads such as VR, autonomous driving, and the NVidia compute unified device architecture (CUDA) [60,99,100].

3.3. Safety and Cybersecurity

The safety protocols implemented in the command and control of the ADS comply with the following criteria (a minimal watchdog sketch follows the list) [12]:
  • The AV will stop if a malfunction of any of the ADS sensors is detected;
  • The AV will stop if a communication error is detected between the components of the ADS;
  • The AV will stop if the power supply of the ADS is stopped; and
  • The AV will stop if an anomaly is detected by the ECU.
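Taken together, the four criteria above act as a watchdog in which any single failure forces a safe stop. The sketch below illustrates that fail-stop rule under that assumption; the field names are invented for the example and do not reflect any particular vendor's implementation.

```python
from dataclasses import dataclass

@dataclass
class AdsHealth:
    sensors_ok: bool   # all ADS sensors report valid data
    comms_ok: bool     # communication between ADS components is intact
    power_ok: bool     # the ADS power supply is present
    ecu_ok: bool       # the ECU reports no anomaly

def must_stop(health: AdsHealth) -> bool:
    """Fail-stop rule: any single failure triggers a safe stop of the AV."""
    return not (health.sensors_ok and health.comms_ok and health.power_ok and health.ecu_ok)

# Example: a communication error between ADS components forces a stop.
assert must_stop(AdsHealth(sensors_ok=True, comms_ok=False, power_ok=True, ecu_ok=True))
```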
The protocols implemented for the verification, evaluation, and validation of the functional safety of the ADS are based on a set of steps that pair the safety criteria with the corresponding performance criteria (Table 8) [101]. These protocols aim to balance the functional safety criteria against the performance criteria, with the functional safety criteria always taking priority. Thus, according to the defined protocols, the ADS will promptly order the AV to stop upon any deviation from these criteria.
The steps for the safety criteria, corresponding to the performance criteria, are as follows [101,102]:
  • Safety level: The ADS must recognize the limits for which the safe transfer from manual driving to autonomous driving is allowed and vice versa under minimum risk conditions;
  • Cybersecurity ensures the protection of the electronic systems, communication networks, control algorithms, and software products against computer attacks, especially unauthorized access;
  • Safety assessments: Evaluating and validating the proper functioning of all vehicle systems in order to ensure that all the general safety objectives are met;
  • Safe operation: Ensuring that the operating state of the ADS corresponds to a level of autonomy that allows the safe operation of the AV;
  • Operational domain: The operating conditions under which the management system is designed to operate, and which reflect the technological capacity of the ADS;
  • Traffic behavior: The traffic behavior of the ADS must be predictable for all road traffic participants, and the traffic rules must be understood and respected by the system;
  • Manually assisted: The ADS controls the longitudinal or lateral movement of the vehicle (not both systems simultaneously) but is automatically deactivated via the driver’s intervention;
  • Partial automation: The ADS controls the longitudinal or lateral movement of the vehicle (both systems simultaneously) but is automatically deactivated when the driver intervenes;
  • Conditional automation: The ADS is activated by the driver and independently controls the movement of the vehicle. The ADS automatically deactivates in the event of an emergency situation or at the driver’s request;
  • High automation: The ADS is activated by the driver and independently controls the movement of the vehicle. The ADS handles emergency situations by itself and is automatically deactivated at the driver's request; and
  • Fully autonomous: The ADS activates automatically according to the implemented autonomous driving algorithm and independently controls the movement of the vehicle. The ADS handles emergency situations by itself and is automatically deactivated at the driver’s request.
AVs are equipped with hundreds of sensors and dozens of ECUs, which are controlled by software code sequences and interconnected through local interconnect network (LIN), CAN, FlexRay, media-oriented system transport (MOST), and Ethernet communication networks, all of which must be protected using cybersecurity solutions. The AV is commanded and controlled by a virtual driver, i.e., an ECU running the algorithm of the ADS.
The future implementation of 5G technology in the management and control of AVs will increase the response speed of the ADS with respect to both the safety protocols and the safety criteria.
This requires a strict computer security protocol, particularly as a protective measure against unauthorized access to the AV's information and communication network: such access is possible only from the management platform through a secure network using a VPN channel over mobile data transmission technology (3G/4G/5G), and the information transmitted on this network is encrypted.
When several AVs are registered on the same management platform, communications with each autonomous shuttle bus are carried out on different VPN channels in order to ensure the security of the entire fleet in case of unauthorized access. A high level of protection against unauthorized access to critical safety systems requires that control of the actuation systems within the autonomous steering system (namely the propulsion system, the steering system, and the braking system) be completely separate from the infotainment system, regardless of any open Internet connection (Table 9) [101,103,104].
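One way to picture the per-vehicle channel separation described above is a fleet gateway that assigns each shuttle its own VPN channel and rejects commands arriving on any other channel, so that a compromised channel cannot reach the rest of the fleet. The sketch below is purely illustrative and assumes nothing about the actual management platforms.

```python
class FleetGateway:
    """Illustrative sketch: one VPN channel per shuttle; commands arriving on the
    wrong channel are rejected, so a compromised channel cannot reach other vehicles."""

    def __init__(self):
        self._channel_by_vehicle = {}   # vehicle_id -> dedicated VPN channel id

    def register(self, vehicle_id: str, vpn_channel: str) -> None:
        self._channel_by_vehicle[vehicle_id] = vpn_channel

    def accept_command(self, vehicle_id: str, vpn_channel: str) -> bool:
        return self._channel_by_vehicle.get(vehicle_id) == vpn_channel

gateway = FleetGateway()
gateway.register("shuttle-01", "vpn-ch-01")
gateway.register("shuttle-02", "vpn-ch-02")
assert gateway.accept_command("shuttle-01", "vpn-ch-01")
assert not gateway.accept_command("shuttle-02", "vpn-ch-01")   # wrong channel is rejected
```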

3.4. Route Specifications

For the operation of AVs in safe conditions, the ECU of the ADS analyzes and processes data from the sensors. Then, using the decision algorithms, the ECU creates a virtual model of the AV's moving environment. This virtual moving environment reproduces the following:
  • The conditions and nature of the road, including road signs/signals, speed limits, the widths of the roads, and the traffic conditions;
  • The geographical area, the longitudinal/transverse inclination of the road, the altitude profile of the route, etc.;
  • Climate, atmospheric conditions, ambient brightness (day/night), air temperature, and precipitation (fog, rain, snow, etc.); and
  • Operational safety measures.
The reference data on the geographical coordinates (latitude, longitude, and altitude) of the selected and implemented routes (local map) are stored in a memory block of the command and control unit and are permanently updated with data from the sensors or via periodic updates. Programming the selected and implemented route will allow the AV to travel only on this route. The vehicle is designed to determine how to travel along the selected route and to handle transitions with the least amount of risk, thereby facilitating safe stopping.
5G technology, by transmitting real-time weather data to the ADS, offers solutions for autonomous driving in the event of difficult climatic phenomena (reduced visibility due to fog, rain, snow, etc.), and improper road conditions (wet, covered with ice or snow, etc.).
Two navigation techniques are available for AV orientation: 1) SLAM, which, with the help of sensors, monitors the environment and performs perception and localization in real time, and 2) HD maps, which store a detailed representation of the environment onto which the existing objects (pedestrians, vehicles, etc.) are overlaid [105]. The most advantageous orientation solution combines real-time perception and localization (SLAM) with HD maps.
The methodology by which an extended map of the environment is generated (a full 3D map) is presented in Figure 22. Data from the LIDAR sensors (2D and 3D) generate a virtual image of the environment in which the AV moves, thereby providing an image formed by static and dynamic objects. The real-time position of the AV is determined with maximum precision using the geographical coordinates provided by the GNSS/GPS sensor.
An important aspect in digitizing and implementing the selected route is the quality of the GPS/GNSS signal, which must be at its maximum for the entire route to ensure safe movement of the AV [71,106]. This aspect can be improved through a 5G mobile data connection for the transfer of GNSS/RTK differential corrections from base stations to AVs. The two superimposed maps, both 2D and 3D, generate a local map containing all the existing objects (pedestrians, vehicles, etc.). Superimposing this map of the environment onto the map of the selected route, which is implemented in the internal memory of the command and control unit of the ADS, generates an extended map of the environment (a full 3D map) [107].
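Read as a composition, the steps above are: fuse the 2D and 3D LIDAR views into a local map of objects, anchor that map on the GNSS/GPS (RTK-corrected) position, and overlay the stored route to obtain the extended (full 3D) map. The toy sketch below expresses only that composition; each map layer is reduced to a set of labelled items, and the coordinates are arbitrary example values.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GeoPose:
    latitude: float
    longitude: float
    altitude: float

def build_extended_map(lidar_2d_objects: set, lidar_3d_objects: set,
                       gnss_pose: GeoPose, route_waypoints: set) -> dict:
    """Toy composition of the 'full 3D map': real systems register point clouds
    and HD maps; here each layer is reduced to a set of labelled items."""
    local_map = lidar_2d_objects | lidar_3d_objects   # superimpose the 2D and 3D LIDAR views
    return {
        "anchor": gnss_pose,                          # georeference with GNSS/GPS (RTK-corrected)
        "objects": local_map,                         # static and dynamic objects around the AV
        "route": route_waypoints,                     # implemented route stored in the ECU memory
    }

# Arbitrary example values (coordinates are illustrative only):
extended = build_extended_map({"pedestrian"}, {"parked car"},
                              GeoPose(46.77, 23.59, 360.0), {"stop A", "stop B"})
```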

4. Legal Framework

Ever since there have been means of transport that use external power sources, there has been a need for relevant controls and rules, even though such regulations are sometimes inappropriate and disproportionate to the harmful potential of the means of transport. For instance, the “Locomotive Acts” passed by the British Parliament in 1865 stipulated that one could drive a steam-powered carriage or motor car at no more than 2 mph inside town and 4 mph outside town only if a person walked in front of the vehicle blowing a whistle and carrying a red flag.
If we analyze the content of this legislation in the present day, it might seem to be a joke, but it was, nevertheless, outlined by the British Parliament. The need to change and update laws and their foundations based on current technology is being felt more and more strongly, as technology is dramatically changing every 3–5 years. Such legislation is sometimes made post-factum as reality fills the empty spaces with factors that have never been considered by any legislative system, sometimes leading to overregulation.
For autonomous driving, the technology is mature enough to require dedicated legislation, but in order to generate such a movement, there is a need for lobbying and persuasion so that the relevant producers can change the status of their research and development (R and D) AV products to accepted products with free access to consumers. Here, the three major players in this market (United States, Europe, and China) have not yet come to a common decision. For the moment, these products, although active on public roads, are neglected due to the small number of units.
Retrospectively, the British Parliament in 1865 was trying to protect horses and carriages against the “iron beasts”. However, in less than 100 years, there were no more horses on public roads in the UK (there is now a policy against the access of horses and carriages on certain public roads). This will also be the case for AVs. Presently, there are some fixed regulations preventing vehicles from being “driven” without a driver, but 100 years from now, vehicles driven by human drivers will not be allowed on certain roads due to a possible perturbance of the rhythmicity and traffic continuity afforded by AVs.
For now, the legal aspects related to public transportation focus on safety, public accessibility, rhythmicity, and emissions. These aspects can be naturally achieved using an autonomous shuttle bus. However, since there is no driver that is responsible for all these factors, the local/regional/national legislation has no ability to offer derogations from existing laws that connect drivers to their vehicles. Therefore, new legislation must be developed to provide a clear framework for vehicle manufacturers to obtain conformity approval for their products. A new topic based on the lack of drivers in AVs is being debated for the case of unfortunate crashes/accidents. This debate involves the responsibility and legal coverage of an autonomous driving license and coding and also the related insurance aspects. This uncharted legal territory must soon become populated with rules that are flexible enough that their validity can endure for at least several decades.
The legislative framework for regulating AV circulation on public roads is based on two essential components: the rules of behavior in traffic and the regulation of the technical equipment attached to these vehicles. According to [108], in order to harmonize the specifications of Regulation 79/2008 [109] with AV traffic conditions, it is necessary that the functions of automation, the monitoring of the driver's attention, and the emergency maneuvers needed to ensure road safety are proportional to the degree of automation. According to such regulations, the ADS can allow a vehicle to follow a defined trajectory or modify its trajectory in response to signals initiated and transmitted from outside the system. With the ADS engaged, the driver does not necessarily exercise primary control of the vehicle.

4.1. United States

In the United States (U.S.), vehicle safety standards are set at the federal level, while the registration of AVs and the rules for their operation on public roads are subject to state law. In 2017, the U.S. Department of Transportation (DOT) and the National Highway Traffic Safety Administration (NHTSA) launched the “Automated Driving Systems 2.0: A Vision for Safety” program, which was designed to promote and improve the safety, mobility, and efficiency of AVs [104]. The criteria imposed by the NHTSA regarding the establishment of the safety criteria for AVs refer in particular to the following issues [110]:
  • System safety: compliance with and the implementation of International Standards Organization (ISO) and SAE International safety standards;
  • Operational design domain (ODD): documentation on at least the following information: types of roads, geographical area, speed of travel, environmental conditions, and other constraints;
  • Object and event detection and response: the detection and reaction of the AV in the event of the appearance of a static or dynamic object, regardless of the events;
  • Fallback (minimal risk condition): ensuring the minimum risk conditions of the ADS;
  • Validation methods: validation methods of the ADS control algorithm;
  • HMI: the interaction between the AV and the driver;
  • Cybersecurity: ensuring the security protocols implemented in the control of the ADS;
  • Crashworthiness: ensuring the protection of passengers in the event of an accident (active and passive safety systems);
  • Post-crash ADS behavior: ensuring the protection of post-accident passengers (disconnection of electricity and main systems);
  • Data Recording: recording the data of the monitoring events and operating errors;
  • Consumer education and training: programs for all the personnel involved in the servicing (operating) of AVs; and
  • Federal, state, and local Laws: compliance with federal, state, and local laws/regulations applicable to the operation of AVs.
The number of the states that have adopted legislation on AVs in recent years has gradually increased. Figure 23 shows the current situation in the states that have adopted such legislation (blue), issued executive orders (green), adopted legislation and issued executive orders (orange), or have not taken any standalone action on motor vehicles (gray) [111]. For state abbreviations, we used the traditional US abbreviations [112].
According to the Constitution of the United States of America, each state has an independent group of laws that aim to promote their own interests. Consequently, the legislation concerning AVs differs significantly from one state to another. Thus, while some states (California, Connecticut, D.C., Florida, Massachusetts, New Mexico, New York, South Dakota, and Washington) clearly specify their requirements for a human operator to be on board an AV throughout its test run, other states (Arizona, Missouri, North Carolina, Tennessee, and Texas) allow the movement of AVs on public roads without being accompanied by a human operator [113].
The concepts of AVs themselves are perceived and defined differently for each of the U.S. states. Thus, for D.C., in the Code of the District of Columbia, section 23A., AVs are defined as vehicles capable of navigating district roads and interpreting traffic control devices without an active driver operating any of the vehicle control systems [114]. The State of Illinois, in Senate Bill 791/2017, defines an AV as a vehicle equipped with an ADS (hardware and software) that is capable of performing the entire dynamic driving task on a sustained basis without being limited to a particular operational area [115]. Louisiana, in House Bill 1143/2016, defines autonomous technology as technology that is installed on a vehicle and ensures one’s ability to drive a vehicle under high or complete automation without oversight by a human operator, including the ability to automatically bring the vehicle to a minimum state of risk in the event of a critical failure of the vehicle [116]. Georgia, in Senate Bill 219/2017, defines the minimum risk condition as a low-risk mode of operation in which the AV that operates without a driver provides reasonably safe conditions, namely the complete stopping of the vehicle in the event of a malfunction of the ADS—a failure that causes the vehicle to become unable to fulfill the full dynamic driving task [117].
From the perspective of traffic safety on public roads, some states have adopted more stringent criteria regarding the testing conditions for AVs. Thus, in Senate Bill AB-669/2017, the California Department of Transportation requires that a minimum distance of 100 feet be maintained between any vehicles or combinations of vehicles that are in circulation on public roads throughout the course of any tests [118], and, in Senate Bill 2005/2017, the state of New York established that tests with AVs on public roads can only be performed under the direct supervision of the state police [119].
To test AVs, most U.S. states require the operator of the tests to demonstrate his or her financial stability by proving the existence of insurance to cover any incidents that may occur during the entire period of the AVs operation.

4.2. Europe

There are no clear regulations in Europe regarding the approval, registration, and entry into service of AVs on public roads. Manufacturers of AVs (particularly those classified as Level 3/Level 4) rely on Directive 2007/46/EC of the European Parliament and of the Council of 5 September 2007, which establishes a framework for the approval of vehicles and trailers, as well as the systems, components, and separate technical units intended for such vehicles [120,121].
The approval process is different depending on the country due to the fact that AVs are special vehicles for which some states have specific regulations, while others grant exemptions or provisional authorization for experimental purposes. The states of the European Union that have adopted legislation on AVs and allow these vehicles to be used for testing are the following: Austria, Belgium, Finland, France, Germany, Greece, Italy, Luxembourg, Netherlands, Spain, and Sweden (see Table 1).

4.2.1. Austria

In Austria, vehicle traffic is regulated at the federal level by the Austrian Motor Vehicles Act of 1967, which covers the procedures for the approval and registration of vehicles, as well as their movement on public roads [122]. This law was amended in 2016 by the Automated Driving Regulation (Automatisiertes Fahren Verordnung, AutomatFahrV), which allows the testing of driverless AVs and related technologies on public roads, according to the specifications published in the Code of Practice by the Bundesministerium für Klimaschutz, Umwelt, Energie, Mobilität, Innovation und Technologie [123]. According to the Code of Practice specifications, the human operator must monitor the AV throughout its travel on public roads and must be able to deactivate the ADS at any time during the test if the situation so requires. At the same time, AVs put into circulation on public roads in Austria under test conditions must comply with all safety requirements specified by the Austrian Motor Vehicles Act 1967 and present no risk to the traffic participants [124].

4.2.2. Belgium

In Belgium, the circulation of vehicles on public roads, specifically the technical conditions that they must meet in order to be admitted into traffic, is regulated at the federal level by the Royal Decree of 1st December 1975 (Road Traffic Code) [125].
AVs are admitted for testing based on the regulations specified in the Code of practice for testing AVs in Belgium published by the Federal Department of Mobility (SPF Mobilité et Transports) in 2016 [126]. This code regulates the testing of AVs in accordance with legislation on road traffic and requires that such tests are supervised by a human operator who can take manual control of the driving and assume responsibility for the vehicle’s operational safety.
The technical requirements that the AV must meet to be admitted for testing on public roads in Belgium are specified in Directive 2007/46/EC of the European Parliament and of the Council of 5 September 2007, which establishes a framework for the approval of vehicles and their trailers, as well as their systems, components, and separate technical units, and in the Royal Decree of 15 March 1968 [120,127]. Another requirement is that the speed of the AV in test mode on public roads must not exceed 30 km/h.

4.2.3. France

France is the home country of two companies that develop autonomous shuttle buses for public transport (Navya and EasyMile). Currently, France is the country with the greatest number of autonomous public transport vehicle projects operating on public roads under test conditions (see Table 1). In accordance with European law, the adoption of a legislative framework for AV approval in public transport in one EU member country will automatically result in the recognition of that type of approval in all other member states [120].
In the period of 2015–2018, France issued a series of laws, regulations, and orders that allow AVs to be put into circulation on public roads for testing purposes. The most important, ratified by the French Parliament in May 2018, concerns the “véhicule à délégation totale ou partielle de conduite” and establishes the authorization conditions for experimental autonomous driving on public roads [128]. The main requirement is that the AV be permanently accompanied by a human operator who is present onboard.
Another important requirement that applies to all EU member states is the protection of individuals with regards to the processing of personal data and the free movement of such data (EU regulation 679/2016 [129]).

4.2.4. Germany

In Germany, due to the high costs of development and technology in the automotive industry, there is great interest from both car manufacturers and the federal government in developing the autonomous driving sector. Many of the big players in the German automotive sector have developed autonomous car models (Audi, BMW, Daimler, and VW). German car manufacturers have adopted a step-by-step approach, focusing primarily on the introduction of automated driving functions into classic car series and reserving for AVs (Level 3/Level 4, supervised by a human operator) the secondary role of testing on public roads, university campuses, pedestrian areas, etc. (see Table 1).
Accordingly, the German federal legislator (Bundesministerium der Justiz und für Verbraucherschutz) has revised and amended the road traffic law (Straßenverkehrsgesetz), introducing (Section 1a) specific regulations for the introduction of vehicles with autonomous driving functions on public roads [130]. These regulations include the following [104]:
  • The ADS must recognize and comply with all traffic rules just as they would be recognized and respected by a driver;
  • The ADS must be able to be deactivated at any time by the human operator, who can at any time assume control of the vehicle; and
  • The ADS must notify the driver of any critical situation where the driver has to assume vehicle control or where the ADS is used contrary to established conditions.
German law requires that the AVs subject to registration be approved in accordance with the specifications of the European law, namely directive 2007/46/EC, which establishes a framework for the approval of vehicles and trailers, as well as their systems, components, and separate technical units intended for those vehicles [120,121]. As AVs do not comply with all the requirements of directive 2007/46/EC in its current form (especially with respect to steering and braking systems), the German vehicle registration regime provides an exemption for vehicles that are put into circulation under a test regime [104,124].

4.2.5. Italy

In Italy, the movement of vehicles on public roads is regulated by legislative decree 285/1992 (Codice della Strada), which outlines the specific rules regarding the movement of such vehicles on public roads in the Italian regions [131]. In February 2018, the Minister of Infrastructure and Transport (Ministro delle Infrastrutture e dei Trasporti) issued Decree 18A02619, which regulates the methods for implementing the operational tools needed to test intelligent, connected, and automated vehicles on public roads [132].
This decree establishes functional standards based on new technologies being introduced to road infrastructure that can communicate with the users of the vehicles to provide real-time information on traffic, accidents, weather conditions, etc. At the same time, the decree establishes the conditions for testing AVs on public roads under safe conditions for the other participants in road traffic. The main condition imposed by this regulation is that the AV must be permanently accompanied by a human operator who is present onboard.

4.2.6. The Netherlands

In accordance with Dutch law, the movement of vehicles on public roads is mainly regulated by the Road Traffic Act 1994 (Wegenverkeerswet 1994) [133]. In 2015, the Dutch Parliament adopted the Decree on the Exemption of Exceptional Transport, which allows the testing of AVs on public roads [134].
This decree specifies that AVs can be put into circulation without type approval in order to test their autonomous driving functions in road traffic, according to the autonomy levels of the SAE J3016™ specification.

4.2.7. Spain

In Spain, the circulation of vehicles on public roads is regulated by the Code on Traffic and Road Safety (Código de Tráfico y Seguridad Vial). This code provides safety standards for the circulation of vehicles on public roads, registration procedures, regulations on public passenger transport, etc. [135]. In Spanish legislation, there is no law regulating the introduction of AVs into circulation on public roads.
In November 2015, the Ministry of the Interior, through the Directorate General of Traffic (Dirección General de Tráfico), issued Instruction 15/V-113 (authorization of tests or research trials carried out with automated driving vehicles on roads open to general traffic), which establishes a legal framework for testing AVs in road traffic [136]. In accordance with the provisions of this normative act, the AV must meet the following minimum requirements:
  • The AV must be equipped with a system to ensure it can stop in an emergency;
  • The ADS can at any time be deactivated by the human operator, who can take control of the vehicle; and
  • The AV must feature cybersecurity that provides protection against cyber-attacks.

4.2.8. Sweden

In Sweden, the “regulation on the tests of vehicles without a driver” was adopted in 2017 (Förordning om försöksverksamhet med självkörande fordon) [137]. This regulation provides the possibility to conduct tests with AVs on public roads only if the applicant demonstrates that road safety will be ensured throughout the tests and that they will not cause significant disturbances or any inconvenience to the environment.

4.3. China

In China, the government authorities have, since 2017, provided local regulations that permit the testing of AVs. China has promulgated its strategic plan, the “National Road-Testing Guide”, which aims to modernize the automotive industry by 2025 [104]. The first autonomous shuttle bus (Baidu) put into circulation for testing in Beijing, in March 2018, had access to a public road sector with a total length of 105 km on the outskirts of the city, far from high-traffic areas.
Currently, the circulation of AVs on Chinese public roads is regulated by the “Administrative Rules for Road Testing of Intelligent and Connected Vehicles (for Trial Implementation)”, which includes all administrative rules for the testing of autonomous and connected vehicles on public roads [138]. This normative act includes a series of safety requirements that AVs must meet in order to be admitted for testing on public roads, including the following [139]:
  • Autonomous, connected, and intelligent vehicles must be equipped with sensors, controllers and actuators, and network communication and connection devices, which can replace the driver and ensure safe and efficient driving;
  • AVs must be permanently accompanied by a human operator who must be present on board the vehicles;
  • Autonomous driving functions must be able to be deactivated at any time by the human operator, who can at any time take over control of the vehicle;
  • The operation of AVs must be permanently monitored in real-time from a distance by the monitoring personnel through a management platform;
  • The legal entities conducting the tests must be registered in China, have competence in the research, development, and manufacturing of AVs, and take full responsibility for any events that may occur during the tests; and
  • AVs, regardless of their operational mode (autonomous or manual), must comply with all traffic rules and regulations in road traffic.
Currently, autonomous shuttle buses for public passenger transport (Level 3/Level 4) are present in urban traffic in many locations in China, including on public roads or pedestrian areas in major cities, in software and technology parks, and in industrial areas (see Table 1). They circulate freely or on-demand, with passengers on predetermined routes, and are monitored by a human operator who can intervene at any time to take control of the ADS.

5. Social Implications

The social impact of any new technology must be evaluated in terms of its efficiency and advantages to society, as well as its direct impact on the working environment. Society has to prepare itself for these new technologies, first to accept them and then to embrace them. Those who are directly impacted by these new technologies should first be presented with a reconversion plan or an alternative to their current job or profession. Only once those directly involved are covered in terms of their job and social security should the large-scale implementation of these new technologies start [140].
Autonomous driving will have a massively positive or negative impact on society. The advantages of this technology are obvious, but since there are human beings directly involved, those human beings must learn to accept that a computer will make decisions in their place. Thus, vehicle manufacturers must present valid and transparent decision and risk scenarios to all those who are interested in understanding how the relevant decisions are made. The key element in this process of acceptance is transparency. By using this approach, fear will start to disappear. Millions of kilometers of testing will eventually become millions of kilometers of daily usage. Only then can the product be considered accepted. A general chart presenting the advantages and disadvantages of the social implications is presented below [141,142]. The advantages of replacing the classical means of transport with a fleet of autonomous shuttle buses will bring benefits in terms of the following:
  • Traffic jam reductions;
  • Operating times (24 h a day);
  • Road safety;
  • Pollution reduction by means of energy efficient operation;
  • Schedule reliability;
  • Automatic and interactive fleet management;
  • New working places; and
  • New uses/purposes for autonomous public transport during a pandemic (e.g., COVID-19). Such vehicles can be used to transport infected patients between hospitals while avoiding human interaction or to transport samples or tests between different investigation centers, among other possibilities.
The disadvantages of replacing classical means of transport with a fleet of autonomous shuttle buses include the following:
  • Driver redundancy (a large city fleet usually employs more than 1000 drivers);
  • Increased cost of maintaining and upgrading the fleet;
  • Security risks; and
  • Lack of response in case of a general malfunction.
There is no good or bad approach. There is no right or wrong. There is only a forward approach. Forward might sometimes entail staying in the same spot or moving to the right or left of that spot; as long as motion exists, technology developers will have the catalyst they need.
New technology is not always accepted by society. Sometimes the ancestral human need for safety makes an individual reluctant to change, but there are times when curiosity pushes people out of their proverbial caves and into the unknown.

6. Conclusions

Autonomous shuttle buses for public passenger transport use ever-evolving technology, technology that certainly needs years of testing to replace large-scale public transport using traditional buses. The use of autonomous shuttle buses for urban transport with transport capacities of a maximum 15 persons allows for the implementation of fixed routes, which can be established according to traffic conditions, the geometry of the route, intersections with other routes, the travel schedule, etc.
The need for a human operator to permanently accompany autonomous shuttle buses, the reduced passenger capacity (maximum 15 persons), the reduced speed of travel (maximum 20 km/h), and the high price of autonomous shuttle buses (approximately 300,000 Euros) are major barriers to the implementation of these vehicles in public passenger transport. Overcoming these barriers will require years of testing and significant technological advances in autonomous driving. On the other hand, at present, autonomous shuttle buses can be used successfully in niche areas of public transport, such as night routes (when urban traffic is at its lowest) and routes through airports, industrial parks, and commercial areas.
What the future of transportation will look like in the following 50 years, we can only predict. However, everything related to public service will certainly be autonomous, independent, and self-sustained. Public transport represents a critical form of public service that, in the following years, will face a dramatic change for the better. It will become improved, modernized, and more ecological and efficient.
The autonomy of public transportation fleets is one of the challenges in the years to come. The public authorities' policies will dictate the quality of life of inhabitants by providing them with the means and the reasons to relinquish private transport. Autonomous public transport has vast advantages in its efficiency (fixed schedule), 24-h operational time with zero stopping hours, lower maintenance (no wear caused by poor/bad driving), energy efficiency for individually optimized routes, greater safety (independent routes mean fewer accidents), and continuous monitoring (personal security improvements). These are only a few advantages that autonomous public transport will provide in the future. Clearly, present-day mobility solutions for public transport will be obsolete 10 years from now, but their ideas will not be obsolete. Instead, they will be improved upon. Ten to twenty years from now, a significant portion of people will work from home; perhaps they will not need public transport, but they will need public services supported by autonomous means of transport (e.g., for food, medicine, and grocery deliveries, garbage pick-up, clothing delivery (from shops or from the cleaners), street maintenance and cleaning, etc.).
The current paper provides a comprehensive evaluation of the current autonomous driving solutions designed for public transport. It gives the reader, independent of his/her field of specialization, the means to understand the purpose, technology, legal implications, social implications, usage, and advantages of this public transport solution in direct connection with its primary users—city inhabitants.
The powertrain solution is presented with respect to its current electrical platform, including its current advantages and limitations. The variety of solutions is large, and the development of future autonomous platforms is vast and long-term, as only the future can establish proper technological directions.
From the perspective of sensors, the current paper will be valid for perhaps 3–5 years, as the technology in this field is so dynamic that the future implementation of 5G technology will allow vehicles to externalize their sensors and use information from external sources without the need to carry all of these sensors onboard. 5G technology will have a major influence on the development of autonomous drive technology, especially for Level 5 autonomy, making AVs more connected, smarter, more secure, and faster, and turning them into true mobile data centers with real-time information, not only for themselves but also for the other vehicles with which they are connected (V2V).
From an IT perspective, the dynamics of updating will increase exponentially once machine learning and AI become an omnipresent part of the code that controls autonomous driving strategies. Presently, this type of code is too expensive for a market product. However, once this technology advances, such software solutions will become increasingly more accessible and will guarantee a competitive price for the product.
The legal framework will change along with each active generation, featuring continuous transformation through legal upgrades and corrections. The social implications will become obvious once the market share of these vehicles reaches 10% of public transportation vehicles.
Summa summarum, the current paper provides a clear scientific perspective into present autonomous driving in the public transport field, as well as a valid introspection into the benefits of this technical solution with direct links to current technology, focusing specifically on this technology’s social benefits. The future of public transport will be autonomous, or it will not be at all.

Author Contributions

B.O.V., C.I., and N.C. were responsible in the “Introduction” for the design, concepts, and drawings; in the “General Technical Characteristics” section, the technical framework, design, and graphics were supported by C.I., B.O.V., and N.C.; the content design and graphics in the “Autonomous driving systems” section were supported by C.I., N.C., and B.O.V.; the “Legal Framework” section was supported by C.I., B.O.V., and N.C.; the “Social implications” were supported by B.O.V., C.I., and N.C.; and the final conclusions were supported by B.O.V., C.I., and N.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AC	Alternating current
ACC	Adaptive cruise control
ADS	Autonomous driving system
AI	Artificial intelligence
API	Application programming interface
ASU	Apollo sensor unit
AV	Autonomous vehicle
BMS	Battery management system
CAN	Controller area network
CFA	Color filter array
CMOS	Complementary metal–oxide–semiconductor
CUDA	Compute unified device architecture
DC	Direct current
ECU	Electronic control unit
EVSE	Electric vehicle supply equipment
GNSS	Global navigation satellite system
GPS	Global positioning system
HD	High definition
HMI	Human machine interface
IC-CPD	In-cable control and protection device
IPC	Industrial PC (Baidu Apollo)
IT	Information technology
LIN	Local interconnect network
LRR	Long-range radar
MODT	Multiple object detection and tracking
MOST	Media-oriented system transport
MRR	Mid-range radar
ODD	Operational design domain
RTK	Real-time kinematic
RTOS	Real-time operating system
SAE	Society of Automotive Engineers
SLAM	Simultaneous localization and mapping
SOC	State of charge
SOH	State of health
UN	United Nations
V2X	Vehicle-to-everything
VPN	Virtual private network
VR	Virtual reality

References

  1. Wolf, I. The Interaction Between Humans and Autonomous Agents. In Autonomous Driving Technical, Legal and Social Aspects; Maurer, M., Gerdes, J.C., Lenz, B., Eds.; Springer: Berlin/Heidelberg, Germany, 2016; pp. 103–124. [Google Scholar]
  2. TIME Magazine. Available online: http://content.time.com/time/magazine/0,9263,7601250810,00.html (accessed on 11 April 2020).
  3. Magic. Available online: https://archive.org/details/magicmotorways00geddrich/page/n9/mode/2up (accessed on 11 May 2020).
  4. Wired. Available online: https://www.wired.com/2009/11/autonomous-cars/ (accessed on 11 April 2020).
  5. Transportation Research Board. Available online: https://trid.trb.org/view/1289421 (accessed on 11 April 2020).
  6. Yahoo! News. Available online: http://uk.news.yahoo.com/how-the-first--driverless-car--was-invented-in-britain-in-1960-093127757.html (accessed on 11 April 2020).
  7. Bishop, R. Intelligent Vehicle Technology and Trends; Artech House: Norwood, MA, USA, 2005; pp. 7–24. [Google Scholar]
  8. ParkShuttle. Available online: http://faculty.washington.edu/jbs/itrans/parkshut.htm (accessed on 11 April 2020).
  9. SAE International. Available online: https://www.sae.org/standards/content/j3016_201806/ (accessed on 11 April 2020).
  10. Apollo Minibus. Available online: http://apollo.auto/minibus/index.html (accessed on 11 April 2020).
  11. EasyMile. Available online: https://easymile.com/solutions-easymile/ez10-autonomous-shuttle-easymile/ (accessed on 11 April 2020).
  12. Navya Autonom Shuttle. Available online: https://navya.tech/shuttle/ (accessed on 11 April 2020).
  13. Meet Olli Local Motors. Available online: https://localmotors.com/meet-olli/ (accessed on 11 April 2020).
  14. Ahmed, S.; Huda, M.N.; Rajbhandari, S.; Saha, C.; Elshaw, M.; Kanarachos, S. Pedestrian and Cyclist Detection and Intent Estimation for Autonomous Vehicles: A Survey. Appl. Sci. 2019, 9, 2335. [Google Scholar] [CrossRef] [Green Version]
  15. Ainsalu, J.; Arffman, V.; Bellone, M.; Ellner, M.; Haapamäki, T.; Haavisto, N.; Josefson, E.; Ismailogullari, A.; Lee, B.; Madland, O.; et al. State of the Art of Automated Buses. Sustainability 2018, 10, 3118. [Google Scholar]
  16. Domínguez, J.M.L.; Sanguino, T.J.M. Review on V2X, I2X, and P2X Communications and Their Applications: A Comprehensive Analysis over Time. Sensors 2019, 19, 2756. [Google Scholar] [CrossRef] [Green Version]
  17. Rosique, F.; Navarro, P.J.; Fernández, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Soteropoulos, A.; Berger, M.; Ciari, F. Impacts of automated vehicles on travel behaviour and land use: An international review of modelling studies. Transp. Rev. 2019, 39, 29. [Google Scholar] [CrossRef]
  19. Taeihagh, A.; Lim, H.S.M. Governing autonomous vehicles: Emerging responses for safety, liability, privacy, cybersecurity, and industry risks. Transp. Rev. 2019, 39, 103. [Google Scholar] [CrossRef] [Green Version]
  20. Zheng, L.; Li, B.; Yang, B.; Song, H.; Lu, Z. Lane-Level Road Network Generation Techniques for Lane-Level Maps of Autonomous Vehicles: A Survey. Sustainability 2019, 11, 4511. [Google Scholar] [CrossRef] [Green Version]
  21. E/CONF.56/16/R.1. Available online: https://www.unece.org/fileadmin/DAM/trans/conventn/crt1968e.pdf (accessed on 11 April 2020).
  22. ECE: Report of the Sixty-Eighth Session of the Working Party on Road Traffic Safety. Available online: https://www.unece.org/fileadmin/DAM/trans/doc/2014/wp1/ECE-TRANS-WP1-145e.pdf (accessed on 11 April 2020).
  23. Navya. Providing Fluid Mobility with Autonomous Shuttles. Available online: https://navya.tech/wp-content/uploads/2017/09/NAVYA_DP_SHUTTLE_2017_GB.pdf (accessed on 11 April 2020).
  24. ERTRAC Roadmap. Available online: https://www.ertrac.org/uploads/documentsearch/id57/ERTRAC-CAD-Roadmap-2019.pdf (accessed on 11 April 2020).
  25. Waymo. Available online: https://www.thedrive.com/tech/15848/waymo-is-already-running-cars-with-no-one-behind-the-wheel (accessed on 11 April 2020).
  26. Amini, A.; Phillips, J.; Moseyko, J.; Banerjee, R.; Karaman, S. Learning Robust Control Policies for End-to-End Autonomous Driving from Data-Driven Simulation. IEEE Robot. Autom. Lett. 2020, 5, 1143. [Google Scholar] [CrossRef]
  27. Autonom Shuttle. Available online: https://navya.tech/en/autonom-shuttle/#storeLocator__bottomHalf (accessed on 12 April 2020).
  28. TCL. Available online: https://www.tcl.fr/n1-la-premiere-ligne-tcl-desservie-par-des-navettes-autonomes-et-electriques (accessed on 11 April 2020).
  29. Mobilité Intelligente Autonom. Available online: http://www.brainstorming.fr/CLIENTS/Pollutec/201811-2/docs/MIA-DossierdePresse-27-11-18.pdf (accessed on 11 April 2020).
  30. Geneva Avenue. Available online: https://h2020-avenue.eu/?portfolio=geneva (accessed on 11 April 2020).
  31. SmartShuttle. Available online: https://www.postauto.ch/en/project-smartshuttle (accessed on 11 April 2020).
  32. Wiener Linien Seestadt. Network Expansion. Passenger Information. Available online: https://www.wienerlinien.at/eportal3/ep/channelView.do/pageTypeId/66533/channelId/-4400687#4400868 (accessed on 11 April 2020).
  33. Luxembourg Avenue. Available online: https://h2020-avenue.eu/portfolio-item/luxembourg/ (accessed on 11 April 2020).
  34. Ministère des Armées. Available online: https://www.defense.gouv.fr/air/actus-air/une-navette-autonome-sur-la-base-aerienne-de-villacoublay (accessed on 11 April 2020).
  35. EasyMile Example of Use Cases. Available online: https://easymile.com/application-map-easymile/ (accessed on 12 April 2020).
  36. Apollo. Available online: http://apollo.auto/about/development_route.html (accessed on 12 April 2020).
  37. Local Motors Press. Available online: https://localmotors.com/press/ (accessed on 12 April 2020).
  38. Eskandarian, A. Handbook of Intelligent Vehicles; Springer: Berlin/Heidelberg, Germany, 2012; pp. 33–58. [Google Scholar]
  39. NTSB. Available online: https://www.ntsb.gov/investigations/AccidentReports/Reports/HAB1906.pdf (accessed on 12 April 2020).
  40. Éduscol STI. Available online: https://eduscol.education.fr/sti/concours_examens/epreuve-de-sciences-de-lingenieur-septembre-2018-nouvelle-caledonie#fichiers-liens (accessed on 16 April 2020).
  41. Chen, Y.; Yu, H.; Graaf, R.; Wang, X.; Wan, J. Robust vehicle longitudinal motion control subject to in-wheel-motor driving torque variations. In Proceedings of the 2017 American Control Conference (ACC), Seattle, WA, USA, 24–26 May 2017; pp. 4316–4321. [Google Scholar]
  42. Zhang, Z.; Pan, H.; Salman, W.; Rasim, Y.; Liu, X.; Wang, C.; Yang, Y.; Li, X. A novel steering system for a space-saving 4ws4wd electric vehicle: Design, modeling, and road tests. IEEE Trans. Intell. Transp. Syst. 2017, 18, 114–127. [Google Scholar] [CrossRef]
  43. Isa, K.B.; Jantan, A.B. An Autonomous Vehicle Driving Control System. Int. J. Eng. Educ. 2005, 21, 855–866. [Google Scholar]
  44. Zheng, H.; Yang, S. A trajectory tracking control strategy of 4WIS/4WID electric vehicle with adaptation of driving conditions. Appl. Sci. 2019, 9, 168. [Google Scholar] [CrossRef] [Green Version]
  45. Dagstuhl. Available online: https://drops.dagstuhl.de/opus/volltexte/2019/10338/ (accessed on 12 April 2020).
  46. Sun, X.; Li, Z.; Wang, X.; Li, C. Technology Development of Electric Vehicles: A Review. Energies 2020, 13, 90. [Google Scholar] [CrossRef] [Green Version]
  47. Ti.com/hev. Available online: https://www.thierry-lequeu.fr/data/SZZA058C.pdf (accessed on 12 April 2020).
  48. Electric Vehicle Charging Levels. Available online: https://www.emobilitysimplified.com/2019/10/ev-charging-levels-modes-types-explained.html (accessed on 12 April 2020).
  49. SAE International. Available online: https://www.sae.org/standards/content/j1772_201001/ (accessed on 12 April 2020).
  50. IEC Standards. Available online: https://www.iecee.org/dyn/www/f?p=106:49:0::::FSP_STD_ID:6032 (accessed on 12 April 2020).
  51. IEC Standards. Available online: https://www.iecee.org/dyn/www/f?p=106:49:0::::FSP_STD_ID:20348 (accessed on 12 April 2020).
  52. Apolong. Available online: https://www.pngkey.com/maxpic/u2e6e6i1a9o0q8a9/ (accessed on 18 April 2020).
  53. EasyMile EZ10. Available online: https://landtransportguru.net/easymile-ez10/ (accessed on 18 April 2020).
  54. Local Motors Lab. Available online: https://www.theverge.com/2016/6/17/11962776/local-motors-olli-3d-printed-autonomous-bus-photos (accessed on 18 April 2020).
  55. TI E2E Forums. Available online: https://e2e.ti.com/blogs_/b/behind_the_wheel/archive/2014/09/25/cars-are-becoming-rolling-sensor-platforms (accessed on 12 April 2020).
  56. Advanced Driver Assistance (ADAS) Solutions Guide Ti.com/adas 2015. Available online: https://uk.farnell.com/wcsstore/ExtendedSitesCatalogAssetStore/cms/asset/images/europe/common/applications/automotive/pdf/ti-adas-solution-guide.pdf (accessed on 12 April 2020).
  57. Wevolver. A Review of Autonomous Vehicle Safety and Regulations. Available online: https://www.wevolver.com/article/a.review.of.autonomous.vehicle.safety.and.regulations (accessed on 12 April 2020).
  58. Nvidia AGX. Available online: https://www.nvidia.com/es-la/deep-learning-ai/products/agx-systems/ (accessed on 12 April 2020).
  59. FTA Research. Available online: https://www.transit.dot.gov/sites/fta.dot.gov/files/docs/research-innovation/114661/strategic-transit-automation-research-report-no-0116_0.pdf (accessed on 12 April 2020).
  60. Apollo Auto GitHub. Available online: https://github.com/ApolloAuto/ (accessed on 12 April 2020).
  61. Velodyne Lidar. Available online: https://velodynelidar.com/downloads/ (accessed on 12 April 2020).
  62. Hesai Cloud. Available online: https://www.hesaitech.com/en/download?product=Pandora (accessed on 12 April 2020).
  63. Continental. Available online: https://www.conti-engineering.com/en-US/Industrial-Sensors/Sensors-Overview (accessed on 12 April 2020).
  64. Leopard Imaging. Available online: https://leopardimaging.com/product-category/usb30-cameras/ (accessed on 12 April 2020).
  65. Novatel IMU. Available online: https://www.novatel.com/assets/Documents/Papers/IMU-IGM-A1-PS.pdf (accessed on 12 April 2020).
  66. Novatel GNSS. Available online: https://www.novatel.com/assets/Documents/Papers/ProPak6-PS-D18297.pdf (accessed on 12 April 2020).
  67. Novatel Antenna. Available online: https://www.novatel.com/assets/Documents/Papers/PwrPak7D-PS.pdf (accessed on 12 April 2020).
  68. Safer Roads Automated Vehicles. Available online: https://www.itf-oecd.org/sites/default/files/docs/safer-roads-automated-vehicles.pdf (accessed on 12 April 2020).
  69. Sensor Fusion. Available online: http://umich.edu/~umtriswt/PDF/SWT-2017-12.pdf (accessed on 12 April 2020).
  70. CleanTechnica. Available online: https://cleantechnica.com/2016/07/29/tesla-google-disagree-lidar-right/ (accessed on 12 April 2020).
  71. Im, J.-H.; Im, S.-H.; Jee, G.-I. Extended Line Map-Based Precise Vehicle Localization Using 3D LIDAR. Sensors 2018, 18, 3179. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  72. Sick Sensor 2D Lidar Sensors. Available online: https://www.sick.com/kr/en/detection-and-ranging-solutions/2d-lidar-sensors/c/g91900 (accessed on 12 April 2020).
  73. Sick Sensor 3D Lidar Sensors. Available online: https://www.sick.com/kr/en/detection-and-ranging-solutions/3d-lidar-sensors/c/g282752 (accessed on 12 April 2020).
  74. Sualeh, M.; Kim, G.-W. Dynamic Multi-LiDAR Based Multiple Object Detection and Tracking. Sensors 2019, 19, 1474. [Google Scholar] [CrossRef] [Green Version]
  75. TI Lidar Sensor. Available online: http://www.ti.com/lit/wp/slyy150/slyy150.pdf (accessed on 12 April 2020).
  76. Steinhardt, N.; Leinen, S. Data Fusion for Precise Localization. In Handbook of Driver Assistance Systems; Winner, H., Hakuli, S., Lotz, F., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 154–196. [Google Scholar]
77. Bosch Radar Sensor (MRR). Available online: https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/automatic-emergency-braking/mid-range-radar-sensor-(mrr)/ (accessed on 12 April 2020).
  78. OSRAM. Available online: https://www.osram-group.com/en/media/press-releases/pr-2017/27-07-2017b (accessed on 12 April 2020).
  79. Bosch Radar Sensor (LRR). Available online: https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/left-turn-assist/long-range-radar-sensor/ (accessed on 12 April 2020).
  80. Yu, H.; Shi, W.; Alawieh, H.B.; Yan, C.; Zeng, X.; Li, X.; Yu, H. Efficient Statistical Validation of Autonomous Driving Systems. In Safe, Autonomous and Intelligent Vehicles; Yu, H., Li, X., Murray, R.M., Ramesh, S., Tomlin, C., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 5–32. [Google Scholar]
  81. Yu, F.-M.; Jwo, K.-W.; Chang, R.-S.; Tsai, C.-T. Dispensing Technology of 3D Printing Optical Lens with Its Applications. Energies 2019, 12, 3118. [Google Scholar] [CrossRef] [Green Version]
  82. Min, H.; Wu, X.; Cheng, C.; Zhao, X. Kinematic and Dynamic Vehicle Model-Assisted Global Positioning Method for Autonomous Vehicles with Low-Cost GPS/Camera/In-Vehicle Sensors. Sensors 2019, 19, 5430. [Google Scholar] [CrossRef] [Green Version]
  83. Odijk, D. Positioning Model. In Springer Handbook of Global Navigation Satellite Systems; Teunissen, P.J.G., Montenbruck, O., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 606–635. [Google Scholar]
  84. Özgüner, Ü.; Acarman, T.; Redmill, K. Autonomous Ground Vehicles; Artech House: Norwood, MA, USA, 2011; pp. 193–216. [Google Scholar]
  85. Nvidia TU Automotive. Available online: https://www.tu-auto.com/nvidia-more-computing-power-makes-for-safer-avs/ (accessed on 13 April 2020).
  86. Self-Driving Cars. Available online: https://www.wired.com/story/self-driving-cars-power-consumption-nvidia-chip/ (accessed on 13 April 2020).
  87. Kala, R. On-Road Intelligent Vehicles Motion Planning for Intelligent Transportation Systems; Butterworth-Heinemann: Cambridge, MA, USA, 2016; pp. 63–86. [Google Scholar]
  88. Jahromi, B.S.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar] [CrossRef] [Green Version]
  89. Poetter, T. Autonomous Vehicle (AV): Vehicle, Edge, and Data Center Processes and Components, Governance. 2020. Available online: https://www.researchgate.net/publication/339944384_Autonomous_Vehicles_AV_Vehicle_Edge_and_Data_Center_Processes_and_Components_Governance (accessed on 13 April 2020).
  90. Herrmann, A.; Brenner, W.; Stadler, R. Autonomous Driving: How the Driverless Revolution will Change the World; Emerald Publishing Limited: Bingley, UK, 2018; pp. 141–150. [Google Scholar]
  91. Gacha Autonomous Shuttle Bus. Available online: https://sensible4.fi/gacha/ (accessed on 18 May 2020).
  92. UrbanSense 5G Edge Computing Challenge. Available online: https://forumvirium.fi/en/ (accessed on 18 May 2020).
  93. How 5G Will Influence Autonomous Driving Systems, White Paper. Available online: https://www.keysight.com/zz/en/assets/7018-06173/white-papers/5992-2998.pdf (accessed on 18 May 2020).
  94. U.S. Patent 7979173. Available online: https://patents.google.com/patent/US7979173B2/en (accessed on 13 April 2020).
  95. U.S. Patent 10334050. Available online: https://patents.google.com/patent/US10334050B2/en (accessed on 20 April 2020).
  96. U.S. Patent 10401852. Available online: https://patents.google.com/patent/US10401852/en (accessed on 20 April 2020).
  97. U.S. Patent 10446037. Available online: https://patents.google.com/patent/US10446037/en (accessed on 20 April 2020).
  98. Betz, J.; Heilmeier, A.; Wischnewski, A.; Stahl, T.; Lienkamp, M. Autonomous Driving—A Crash Explained in Detail. Appl. Sci. 2019, 9, 5126. [Google Scholar] [CrossRef] [Green Version]
  99. Nuvo-6108GC. Available online: https://www.neousys-tech.com/Resource/Product_Document/Nuvo-6108GC/Nuvo-6108GC_Datasheet.pdf (accessed on 13 April 2020).
  100. ASUS. Available online: https://www.asus.com/Graphics-Cards/ROG-STRIX-GTX1080-A8G-GAMING/ (accessed on 13 April 2020).
  101. Safety First for Automated Driving. Available online: https://newsroom.intel.com/wp-content/uploads/sites/11/2019/07/Intel-Safety-First-for-Automated-Driving.pdf (accessed on 13 April 2020).
  102. NHTSA Website. Available online: https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/documents/13882-automateddrivingsystems_092618_v1a_tag.pdf (accessed on 13 April 2020).
  103. II Consortium. Available online: https://www.iiconsortium.org/news/joi-march-2019.htm (accessed on 20 April 2020).
  104. Norton Rose Fulbright Automotive Vehicle. Available online: https://www.automotive-iq.com/content-auto-download/5bcecf6ed1d92e3b5f3ec797 (accessed on 13 April 2020).
  105. Ilci, V.; Toth, C. High Definition 3D Map Creation Using GNSS/IMU/LiDAR Sensor Integration to Support Autonomous Vehicle Navigation. Sensors 2020, 20, 899. [Google Scholar] [CrossRef] [Green Version]
  106. U.S. Patent 9612123. Available online: https://patents.google.com/patent/US9612123/en (accessed on 13 April 2020).
  107. Dimitrievski, M.; Veelaert, P.; Philips, W. Behavioral Pedestrian Tracking Using a Camera and LiDAR Sensors on a Moving Vehicle. Sensors 2019, 19, 391. [Google Scholar] [CrossRef] [Green Version]
  108. DICOM. Available online: https://www.ecologique-solidaire.gouv.fr/sites/default/files/90p%20VDEF.pdf (accessed on 20 April 2020).
  109. Regulation 79/2018 (UN/ECE). Available online: https://eur-lex.europa.eu/eli/reg/2018/1947/oj (accessed on 20 April 2020).
  110. NHTSA ADS 2.0. Available online: https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/documents/13069a-ads2.0_090617_v9a_tag.pdf (accessed on 20 April 2020).
  111. Autonomous Vehicles. Available online: https://www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx#enacted (accessed on 20 April 2020).
  112. US State Abbreviations. Available online: https://www.bu.edu/brand/guidelines/editorial-style/us-state-abbreviations/ (accessed on 20 April 2020).
  113. Eckert Seamans The Autonomous Vehicle Legislative Survey. Available online: https://www.eckertseamans.com/app/uploads/PLAC_Eckert_Seamans_AV_Survey-1.pdf (accessed on 20 April 2020).
  114. Autonomous Vehicles. Available online: https://code.dccouncil.us/dc/council/code/titles/50/chapters/23A/ (accessed on 20 April 2020).
  115. The Illinois Vehicle Code. Available online: http://www.ilga.gov/legislation/publicacts/100/PDF/100-0352.pdf (accessed on 20 April 2020).
  116. Autonomous Technology. Available online: https://legiscan.com/LA/text/HB1143/id/1415397/Louisiana-2016-HB1143-Chaptered.pdf (accessed on 20 April 2020).
  117. Automated Driving System. Available online: https://legiscan.com/GA/text/SB219/id/1587891/Georgia-2017-SB219-Enrolled.pdf (accessed on 20 April 2020).
  118. Assembly Bill 669/2017. Available online: https://legiscan.com/CA/text/AB669/2017 (accessed on 20 April 2020).
  119. Budget Bill 3005/2017. Available online: https://nyassembly.gov/2017budget/budget_bills/A3005C.pdf (accessed on 20 April 2020).
  120. Directive 2007/46/EC. Available online: https://eur-lex.europa.eu/eli/dir/2007/46/oj (accessed on 20 April 2020).
  121. EURLex. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52018DC0283 (accessed on 20 April 2020).
  122. Baker McKenzie Global Driverless Vehicle Survey. Available online: https://www.bakermckenzie.com/-/media/files/insight/publications/2018/03/global-driverless-vehicle-survey-2018/mm_global_driverlessvehiclesurvey2018_mar2018.pdf (accessed on 20 April 2020).
  123. Austria Code of Practice. Available online: https://www.bmk.gv.at/dam/jcr:e54371b0-8f79-47a0-9342-c2accdcc36be/code_of_practice_ua.pdf (accessed on 20 April 2020).
  124. KPMG 2019. Available online: https://assets.kpmg/content/dam/kpmg/xx/pdf/2019/02/2019-autonomous-vehicles-readiness-index.pdf (accessed on 20 April 2020).
  125. Code de la Route. Available online: https://www.code-de-la-route.be/textes-legaux/sections/ar/code-de-la-route (accessed on 20 April 2020).
  126. Autonomous Vehicle. Code of Practice for Testing in Belgium. Available online: https://mobilit.belgium.be/sites/default/files/resources/files/code_of_practice_en_2016_09.pdf (accessed on 20 April 2020).
  127. Koninklijk Besluit van 1968. Available online: https://www.wegcode.be/wetteksten/secties/kb/tech (accessed on 20 April 2020).
  128. Legifrance. Available online: https://www.legifrance.gouv.fr/eli/arrete/2018/4/17/TRER1717820A/jo/texte (accessed on 20 April 2020).
  129. Regulation (EU) 2016/679. Available online: https://eur-lex.europa.eu/eli/reg/2016/679/oj (accessed on 20 April 2020).
  130. Straßenverkehrsgesetz. Available online: https://www.gesetze-im-internet.de/stvg/BJNR004370909.html (accessed on 20 April 2020).
  131. Codice Della Strada. Available online: https://www.edscuola.it/archivio/norme/decreti/dlvo285_92.htm (accessed on 20 April 2020).
  132. Smart Road. Available online: https://www.gazzettaufficiale.it/eli/gu/2018/04/18/90/sg/pdf (accessed on 20 April 2020).
  133. Wegenverkeerswet 1994. Available online: https://wetten.overheid.nl/BWBR0006622/2020-01-01 (accessed on 20 April 2020).
  134. Exceptioneel Vervoer. Available online: https://wetten.overheid.nl/BWBR0018680/2015-07-01 (accessed on 20 April 2020).
  135. Código Tráfico. Available online: https://boe.es/legislacion/codigos/codigo.php?id=20&modo=2&nota=0 (accessed on 20 April 2020).
  136. DGT Instrucción 15/V-113. Available online: http://www.dgt.es/Galerias/seguridad-vial/normativa-legislacion/otras-normas/modificaciones/15.V-113-Vehiculos-Conduccion-automatizada.pdf (accessed on 20 April 2020).
  137. Förordning 309/2017. Available online: https://www.riksdagen.se/sv/dokument-lagar/dokument/svensk-forfattningssamling/forordning-2017309-om-forsoksverksamhet-med_sfs-2017-309 (accessed on 20 April 2020).
  138. China’s Drive to Dominate Autonomous Cars. Available online: https://innovator.news/chinas-drive-to-dominate-autonomous-cars-736f4a4d66bf (accessed on 20 April 2020).
  139. China National Rules of Road. Available online: http://www.conventuslaw.com/report/china-national-administrative-rules-of-road/ (accessed on 20 April 2020).
  140. Autonomous Vehicles. Available online: https://www.sae.org/publications/books/content/pt-158/ (accessed on 20 April 2020).
141. Chehri, A.; Mouftah, H.T. Autonomous vehicles in the sustainable cities, the beginning of a green adventure. Sustain. Cities Soc. 2019, 51, 101751. [Google Scholar] [CrossRef]
  142. Pype, P.; Daalderop, G.; Schulz-Kamm, E.; Walters, E.; Westermann, S. Intelligent Transport Systems: The Trials Making Smart Mobility a Reality. In Automated Driving Safer and More Efficient Future Driving; Watzenig, D., Horn, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 595–600. [Google Scholar]
Figure 1. An overview of the paper.
Figure 2. Levels of driving automation (SAE J3016™).
Figure 3. Automated urban road transport evolution (scale-based approach).
Figure 4. Powertrain architecture: (a) steering system with two steering wheels; (b) four-wheel steering system; (c) four-wheel steering system with wheel hub motors.
Figure 5. Steering system: (a) overview of the steering system; (b) the steering system with two steering wheels; (c) the steering system with four steering wheels.
Figure 6. Kinematic diagram of the front wheel steering system (top view).
Figure 7. The steering controller.
Figure 8. Control loop structure.
Figure 9. Sensor systems (specific equipment for autonomous shuttle bus).
Figure 10. Comparison of sensor capabilities.
Figure 11. LIDAR sensors (photo by the author).
Figure 12. Radar sensors.
Figure 13. Video camera system tasks.
Figure 14. The object classification mechanism.
Figure 15. GNSS/RTK sensor.
Figure 16. IMU sensor.
Figure 17. Autonomous driving algorithm.
Figure 18. NVidia drive (deep learning) (NVidia).
Figure 19. Software architecture for ADS.
Figure 20. ECU of the ADS.
Figure 21. Apollo Baidu ECU.
Figure 22. Extended local map generator.
Figure 23. U.S. states and autonomous driving legislation.
Table 1. Autonomous shuttle buses for public transportation.
Shuttle Bus | Location | Operator | Vehicles | Launch | Track | Fee | Environment | Link
Navya Arma | Confluence Lyon, France | Keolis | 2 | 09/2016 | 1.350 km | Free | Pedestrian area 1 | [27]
Navya Arma | Parc Olympique Lyonnais, France | TCL Lyon | 2 | 11/2019 | 1.400 km | Free | Public road 2 | [28]
Navya Arma | ZAC des Gaulnes Lyon, France | Berthelet | 1 | 03/2019 | 1.200 km | Free | Public road | [29]
Navya Arma | l’Abbaye Fontevraud, France | Keolis | 1 | 05/2018 | 0.800 km | Free | Pedestrian area | [27]
Navya Arma | Rue Paul Duez, Universite de Lille, France | Keolis | 2 | 12/2018 | 1.400 km | Free | Public road | [27]
Navya Arma | Villejean, Universite de Rennes, France | Keolis | 2 | 01/2018 | 1.300 km | Free | Public road | [27]
Navya Arma | Virginio-Malnati Meyrin, Geneve, Switzerland | TPG Geneve | 2 | 09/2018 | 2.100 km | Free | Public road | [30]
Navya Arma | Place de la Planta Sion, Switzerland | CarPostal | 2 | 06/2016 | 3.540 km | Free | Public road | [31]
Navya Arma | Neuhausen am Rheinfall, Switzerland | VB/SH | 1 | 03/2017 | 1.500 km | Free | Public road | [27]
Navya Arma | l’Ancienne Marly, Fribourg, Switzerland | TPF | 2 | 08/2017 | 1.300 km | On-demand | Public road | [27]
Navya Arma | Sylt Schleswig-Holstein, Germany | SVG | 1 | 05/2019 | 2.700 km | Free | Public road | [27]
Navya Arma | Ilse-Arlt-Straße Wien, Austria | Wiener Linien | 2 | 06/2019 | 4.000 km | Free | Public road | [32]
Navya Arma | Ommelander Hospital, Groningen, Netherlands | Arriva (DB) | 1 | 08/2018 | 1.000 km | Free | Public road | [27]
Navya Arma | Contern, Luxembourg | Sales-Lentz | 1 | 09/2018 | 1.000 km | Free | Public road | [33]
Navya Arma | Oslo Waterfront Oslo, Norway | Holo | 4 | 05/2019 | 2.200 km | Free | Public road | [27]
Navya Arma | Lindholmen Science Park Gothenburg, Sweden | Autonomous | 2 | 04/2019 | 1.400 km | Free | Public road | [27]
Navya Arma | University of Metropolia Helsinki, Finland | Metropolia | 2 | 06/2019 | 2.000 km | Free | Public road | [27]
Navya Arma | St Perth Esplanade Perth, Australia | RAC | 2 | 07/2016 | 3.500 km | Paid service | Public road | [27]
Navya Arma | Flinders University Adelaide, Australia | Flinders | 1 | 06/2018 | 1.200 km | Free | Public road | [27]
Navya Arma | University of Michigan, Ann Arbor, Michigan, US | MCITY | 2 | 12/2016 | 1.600 km | Free | Public road | [27]
Navya Arma | Lake Nona Orlando, Florida, US | Beep | 2 | 09/2019 | 3.540 km | Free | Public road | [27]
Navya Arma | Las Vegas Blvd Las Vegas, Nevada, US | Keolis | 1 | 11/2017 | 1.000 km | Free | Public road | [27]
Navya Arma | Montcalm Candiac, Montreal, Canada | Keolis | 1 | 08/2018 | 2.000 km | Free | Public road | [27]
Navya Arma | Masdar City, Abu Dhabi, United Arab Emirates | Navya | 3 | 09/2018 | 0.900 km | Free | Pedestrian area | [27]
Navya Arma | Nursery Park, West Kowloon, Hong Kong | westKowloon | 1 | 07/2017 | 0.300 km | Free | Pedestrian area | [27]
EasyMile EZ10 | Airport Velizy-Villacoublay, Paris, France | RATP/SCA | 1 | 06/2018 | - | Air Force | Government property 3 | [34]
EasyMile EZ10 | First Fully Driverless Service Sorigny, France | TLD | 1 | 11/2018 | 1.500 km | Free | Public road | [35]
EasyMile EZ10 | Plateau de Satory Versailles, France | Transdev | 1 | 12/2018 | 1.000 km | Free | Public road | [35]
EasyMile EZ10 | InnoZ EUREF Campus, Berlin, Germany | BVG | 1 | 12/2017 | 0.600 km | Free | Public road | [35]
EasyMile EZ10 | Bad Birnbach, Germany | DB | 2 | 10/2017 | 1.400 km | Free | Public road | [35]
EasyMile EZ10 | Project See-Meile, Berlin, Germany | BVG | 1 | 08/2019 | 1.200 km | Free | Public road | [35]
EasyMile EZ10 | GreenTec Campus, Enge-Sande, Germany | BMVI | 1 | 06/2018 | 2.700 km | Free | Public road | [35]
EasyMile EZ10 | Bernmobil Demo Bern, Switzerland | AVOC | 1 | 06/2019 | 2.000 km | Free | Public road | [35]
EasyMile EZ10 | Koppl Salzburg Research, Koppl, Austria | Digibus | 1 | 04/2018 | 1.400 km | Free | Public road | [35]
EasyMile EZ10 | Austin Airport, Austin, Texas, US | AUS | 1 | 08/2019 | 0.700 km | Free | Pedestrian area | [35]
EasyMile EZ10 | Virginia Tech Blacksburg, Virginia, US | NRV | 1 | 05/2019 | 0.800 km | Free | Public road | [35]
EasyMile EZ10 | Calgary Zoo Calgary, Canada | PWT | 1 | 09/2018 | 0.557 km | Free | Public road | [35]
EasyMile EZ10 | Renmark Aged Care, Renmark, Australia | TAG | 1 | 08/2019 | 4.500 km | Free | Public road | [35]
EasyMile EZ10 | BusBot, Toormina, New South Wales, Australia | Busways | 1 | 06/2019 | 1.000 km | Free | Pedestrian area | [35]
EasyMile EZ10 | Southeast University, Nanjing, Jiangsu, China | NJNDTIG | 1 | 10/2018 | 1.400 km | Free | Public road | [35]
EasyMile EZ10 | National University of Singapore, Singapore | ComfortDelGro | 1 | 07/2019 | 1.600 km | Free | Public road | [35]
Baidu Apollo | Software Park Xiamen, China | Baidu | 1 | 04/2018 | - | Free | Public road | [36]
Baidu Apollo | King Long Xiamen, China | Baidu | 1 | 03/2018 | - | Free | Pedestrian area | [36]
Baidu Apollo | Haidian Park Xiamen, China | Baidu | 1 | 05/2019 | - | Registration | Pedestrian area | [36]
Baidu Apollo | Xiongan New Area, China | Baidu | 5 | 12/2017 | 4.000 km | Free | Public road | [36]
Baidu Apollo | Shenzhen, China | Shenzhen Bus | 4 | 12/2017 | 1.200 km | Free | Public road | [36]
Baidu Apollo | Yangquan, Shanxi, China | Baidu | 22 | 01/2019 | 26.000 km | Free | Public road | [36]
Baidu Apollo | Wuhan, China | Baidu | 10 | 10/2018 | 5.000 km | Free | Pedestrian area | [36]
Local Motors Olli | ITCILO Campus, Turin, Italy | ITC-ILO | 1 | 01/2020 | - | Registration 4 | Public road | [37]
Local Motors Olli | Goodyear, Colmar-Berg, Luxembourg | Sales-Lentz | 3 | 03/2019 | - | Free | Private property | [37]
Local Motors Olli | GoMentum Station, Concord, California, US | CCTA | 1 | 10/2019 | - | On-demand | Private property | [37]
Local Motors Olli | National Harbor, Maryland, US | MDOT | 1 | 10/2019 | - | Registration | Public road | [37]
Local Motors Olli | The Goodyear Tire, Akron, Ohio, US | Goodyear | 1 | 02/2019 | - | On-demand | Private property | [37]
Local Motors Olli | University of Buffalo, Buffalo New York, US | UB | 1 | 09/2019 | - | Free | Private property | [37]
Local Motors Olli | King Abdullah University, Thuwal, Saudi Arabia | SAPTCO | 3 | 12/2019 | - | Free | Public road | [37]
1 Pedestrian area: type of traffic mixed with pedestrians and bikes. 2 Public road: type of traffic mixed with pedestrians, bikes and motorized vehicles. 3 Government property: legal designation for the ownership of property by governmental legal entities. 4 Free for registered members on https://rideolli.com/.
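For readers who want to work with deployment data such as Table 1 programmatically, the short sketch below aggregates a small subset of the rows (transcribed from the table) by manufacturer and reports fleet size and average route length. The subset chosen and the variable names are purely illustrative.

```python
from collections import defaultdict

# A small subset of Table 1, transcribed as (shuttle, vehicles, route length in km).
deployments = [
    ("Navya Arma", 2, 1.350), ("Navya Arma", 2, 2.100), ("Navya Arma", 4, 2.200),
    ("EasyMile EZ10", 2, 1.400), ("EasyMile EZ10", 1, 0.557),
    ("Baidu Apollo", 5, 4.000), ("Baidu Apollo", 10, 5.000),
]

fleet = defaultdict(int)       # total vehicles per shuttle model in this subset
routes = defaultdict(list)     # route lengths per shuttle model

for shuttle, vehicles, track_km in deployments:
    fleet[shuttle] += vehicles
    routes[shuttle].append(track_km)

for shuttle in fleet:
    avg_km = sum(routes[shuttle]) / len(routes[shuttle])
    print(f"{shuttle}: {fleet[shuttle]} vehicles in this subset, average route {avg_km:.2f} km")
```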
Table 2. Shuttle bus high voltage battery type and power.
Powertrain Elements | Apollo Baidu | EasyMile EZ10 | Navya Arma | Olli
Battery Capacity | no data | 20.0 kWh | 33.0 kWh | 18.5 kWh
Battery Type | no data | LiFePO4 | LiFePO4 | Lithium
Table 3. The technical specifications of the gear motor.
Technical Specifications | Value | Unit
Nominal voltage | 24 | V
Nominal power | 320 | W
Nominal current | 13.5 | A
Maximal current | 105 | A
Nominal speed | 30 | 1/min
Nominal torque | 45 | Nm
Maximal torque | 350 | Nm
Reduction ratio | 50 | -
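As a rough plausibility check of the ratings in Table 3, the sketch below compares the electrical input power (nominal voltage times nominal current) with the rated power and computes the mechanical power implied by the nominal torque and speed. It assumes the listed torque and speed refer to the same shaft, so the ratio it prints is only indicative, not a measured efficiency.

```python
import math

# Ratings transcribed from Table 3 (gear motor).
U_nom = 24.0      # V
I_nom = 13.5      # A
P_rated = 320.0   # W
n_nom = 30.0      # 1/min
T_nom = 45.0      # Nm

P_electrical = U_nom * I_nom             # ~324 W, close to the 320 W rating
omega = 2.0 * math.pi * n_nom / 60.0     # rad/s
P_mechanical = T_nom * omega             # ~141 W at the nominal operating point

print(f"Electrical input power: {P_electrical:.0f} W (rated {P_rated:.0f} W)")
print(f"Mechanical power at nominal torque/speed: {P_mechanical:.0f} W")
print(f"Indicative power ratio: {P_mechanical / P_electrical:.0%}")
```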
Table 4. The charging levels (SAE J1772).
Level | Voltage (V) | Phases (~) | Current (A) | Power (kW) | Time
Level 1 | 120 | 1 × AC | 12–16 | 1.4–1.9 | ≤17 h
Level 1 | 200–450 | DC | ≤80 | ≤36 | ≤1.2 h
Level 2 | 240 | 1 × AC | ≤80 | ≤20 | ≤7 h
Level 2 | 200–450 | DC | ≤200 | ≤90 | ≤20 min
Level 3 | 208–600 | 3 × AC | no data | ≤20 | no data
Level 3 | 200–600 | DC | ≤400 | ≤240 | ≤10 min
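To make the figures in Tables 2 and 4 concrete, the following sketch estimates the idealized recharge time of the shuttle bus batteries from Table 2 at a few representative charging powers from Table 4. It assumes constant charging power and ignores charge taper, conversion losses and battery-management limits, so the results are lower bounds rather than real-world charge times.

```python
# Idealized charge time: time [h] = capacity [kWh] / charging power [kW].
# Capacities from Table 2; charging powers are representative values from Table 4.
batteries_kwh = {"EasyMile EZ10": 20.0, "Navya Arma": 33.0, "Olli": 18.5}
charging_kw = {"Level 1 AC (1.9 kW)": 1.9,
               "Level 2 AC (20 kW)": 20.0,
               "Level 2 DC (90 kW)": 90.0}

for bus, capacity in batteries_kwh.items():
    for level, power in charging_kw.items():
        hours = capacity / power
        print(f"{bus}: full charge at {level} ≈ {hours:.1f} h ({hours * 60:.0f} min)")
```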
Table 5. Baidu Apollo platform-supported sensors.
Type | Producer | Model | Unit | Link
Unit | Baidu | Apollo Sensor Unit (ASU) | 1 | [60]
Lidar | Velodyne | Alpha Prime VLS-128 (3D, 128 channels) | 1 | [61]
Lidar | Velodyne | HDL-64E S3 (3D, 64 channels) | 1 | [61]
Lidar | Hesai | Pandora (Lidar 3D 40 channels, camera: 1 color, 4 mono) | 1 | [62]
Radar | Continental | ARS-408-21 (LRR 77 GHz) | 1 | [63]
Camera | Leopard | LI-USB30-AR023ZWDR (1920×1080@30 fps) | 2 | [64]
IMU | NovAtel | IMU-IGM-A1 (IMU sensor 200 Hz) | 1 | [65]
RTK | NovAtel | ProPak6 (GNSS receiver, 240 channels) | 1 | [66]
GNSS | Navtech | PwrPak 7D (GNSS receiver with dual antenna) | 1 | [67]
Table 6. Technical data of LIDAR sensors.
Lidar Sensor | Lidar 2D | Lidar 3D
Aperture angle (horizontal) | 270° | 360°
Aperture angle (vertical) | - | 30°
Horizontal resolution | 0.25–0.5° | 0.1–0.4°
Vertical resolution | - | 2°
Working range | 0.50–50 m | 1–100 m
# of lines | 1 | 16
Scanning range (10% remission) | 18–30 m | -
Scanning range (90% remission) | 20–50 m | -
Scanning frequency | 25 Hz/50 Hz | 20 Hz
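As an illustration of what the Table 6 figures imply for data volume, the sketch below estimates the number of range measurements per revolution and per second for the 3D LIDAR column (360° horizontal aperture, 0.1–0.4° horizontal resolution, 16 lines, 20 Hz). Assuming one return per angular step and line is a simplification; real point rates depend on the scene and the sensor firmware.

```python
# Rough point-rate estimate for the 3D LIDAR column of Table 6.
fov_deg = 360.0           # horizontal aperture angle
lines = 16                # number of vertical channels ("lines")
scan_hz = 20.0            # scanning frequency

for h_res in (0.1, 0.4):  # horizontal resolution bounds, degrees
    points_per_rev = (fov_deg / h_res) * lines
    points_per_sec = points_per_rev * scan_hz
    print(f"h_res = {h_res}°: {points_per_rev:,.0f} points/revolution, "
          f"{points_per_sec:,.0f} points/s")
```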
Table 7. Technical data of radar sensors.
Radar Sensor | LRR | MRR | MRR Rear
Frequency range | 76–77 GHz | 76–77 GHz | 76–77 GHz
Detection range | 0.36–250 m | 0.36–160 m | 0.36–80 m
Measuring range | 6° (200 m) | 6° (160 m) | 5° (70 m)
 | 10° (100 m) | 9° (100 m) | 75° (close range)
 | 15° (30 m) | 10° (60 m) | -
 | 20° (5 m) | 25° (36 m) | -
 | - | 42° (12 m) | -
Measuring accuracy: distance | ±0.12 m | ±0.12 m | ±0.12 m
Measuring accuracy: speed | 0.11 m/s | 0.11 m/s | 0.14 m/s
Measuring accuracy: angle | ±0.3° | ±0.3° | ±0.8°
Object separation: distance | 0.72 m | 0.72 m | 0.72 m
Object separation: speed | 0.4 m/s | 0.66 m/s | 1.4 m/s
Object separation: angle | | |
Maximum of detected objects | 24 | 32 | 32
Cycle time | 60 ms | 60 ms | 60 ms
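To relate the 60 ms cycle time in Table 7 to driving situations, the short sketch below computes how far an object moves between two consecutive radar measurements for a few relative speeds. The speeds used (a pedestrian, the shuttle's roughly 25 km/h operating speed, and an oncoming 50 km/h vehicle) are illustrative assumptions rather than values from the paper.

```python
# Displacement of a tracked object between two radar cycles (Table 7: 60 ms cycle time).
cycle_s = 0.060
relative_speeds_kmh = {"pedestrian (5 km/h)": 5,
                       "shuttle at 25 km/h": 25,
                       "oncoming car at 50 km/h": 50}

for label, v_kmh in relative_speeds_kmh.items():
    v_ms = v_kmh / 3.6                      # convert km/h to m/s
    print(f"{label}: {v_ms * cycle_s:.2f} m travelled per 60 ms cycle")
```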
Table 8. Matrix of the safety criteria/performance criteria.
Stages (columns): Safety Level | Cybersecurity | Safety Assessment | Safe Operation | Operational Domain | Traffic Behavior | Manually Assisted | Partial Automation | Conditional Automation | High Automation | Full Automation
Safety criteria (rows): Locating the vehicle; Object detection/classification; Predicting the behavior of objects; Path planning; Movement control; Interactions with other vehicles.
Performance criteria (rows): Maximum performance (full automation); High performance (high automation); Low performance (conditional automation); Minimum performance (partial automation); Rated performance (assisted manual); Minimum performance (manual).
Blue color represents the active safety criteria; yellow color represents the active performance criteria.
Table 9. Cybersecurity control mechanisms.
Security Objectives | Online Security | Security at the Vehicle Level | Security at Component Level
Integrity | Management of the integrity of access rights | Secure communications, functional separation of critical systems | Data integrity control
Login | Controlling access to the platform, securing communications | Codes for authentication | Secure initialization sequences with public keys
Protection | Detection and reaction mechanisms in case of attacks and intrusions | Gateway/router data flow control | Limit network access points
Confidentiality | Control of access to databases | Data stream encryption, TLS, IP security | Data encryption, secure media
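The "codes for authentication" and "data integrity control" mechanisms in Table 9 are commonly realized with message authentication codes. The sketch below is a minimal, generic illustration using Python's standard hmac module; it is not taken from any shuttle bus implementation, and the key, message and function names are invented for the example.

```python
import hmac
import hashlib

# Shared secret provisioned to the vehicle ECU and the backend (illustrative value only).
SECRET_KEY = b"example-shared-key"

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag so the receiver can verify integrity and origin."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Constant-time comparison to resist timing attacks."""
    return hmac.compare_digest(sign(message), tag)

telemetry = b'{"vehicle_id": "shuttle-01", "speed_kmh": 18.4}'
tag = sign(telemetry)
print("authentic:", verify(telemetry, tag))        # True
print("tampered:", verify(telemetry + b"x", tag))  # False
```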
