

=Technology Roadmap Sections and Deliverables=


The first point is that each technology roadmap should have a clear and unique identifier:  
* '''2ASGT - Autonomous System for Ground Transport'''
This indicates that we are dealing with a “level 2” roadmap at the product or system level, where “level 1” would indicate a social level roadmap and “level 3” or “level 4” would indicate an individual technology roadmap.


==Roadmap Overview==


The overview and working principle of autonomous ground transport is depicted below:


[[File:Roadmap Overview.png]]


Autonomous transport has four main components: (1) Perception, (2) Localization, (3) Planning, and (4) Control.
The four components work together, as depicted in the overview, to enable the autonomous capabilities of the transport system.
 
(1) Perception
The data from sensors (radars, lidars, cameras, etc.) are integrated to build a comprehensive and detailed understanding of the vehicle’s environment.
 
(2) Localization
GPS and localization algorithms are employed to determine the location of the vehicle relative to its surroundings. Accuracy on the order of centimeters is critical to ensure that the vehicle stays on the road.
 
(3) Planning
With an understanding of the environment and the vehicle's location within it, a route to the desired destination can be planned. This involves predicting the behavior of other entities in the immediate proximity (other vehicles, pedestrians, etc.), deciding on the appropriate actions to take in response, and finally developing a route that reaches the destination while satisfying the required conditions (safety, comfort, etc.).
 
(4) Control
The planned route is passed on to the vehicle. In this execution phase, the route is translated into control commands that turn the steering wheel and apply the accelerator or the brake.
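
To make the interaction between the four components concrete, the following minimal Python sketch illustrates one pass through the perceive-localize-plan-control loop. All class, function, and variable names here are hypothetical illustrations introduced for this roadmap, not part of any actual product or library.

<syntaxhighlight lang="python">
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Pose:
    x: float        # position in map frame [m]
    y: float        # position in map frame [m]
    heading: float  # heading angle [rad]

@dataclass
class ControlCommand:
    steering: float  # steering angle [rad]
    throttle: float  # 0..1
    brake: float     # 0..1

def perceive(sensor_frames: Dict[str, object]) -> List[dict]:
    """(1) Perception: fuse radar/lidar/camera frames into tracked objects."""
    # Placeholder: a real system runs detection, tracking and sensor fusion here.
    return [{"type": "vehicle", "x": 12.0, "y": 1.5, "speed": 8.0}]

def localize(gnss_fix: Pose, objects: List[dict]) -> Pose:
    """(2) Localization: refine the GNSS fix toward centimetre-level accuracy."""
    # Placeholder for map matching / landmark-based correction.
    return Pose(gnss_fix.x + 0.02, gnss_fix.y - 0.01, gnss_fix.heading)

def plan(pose: Pose, objects: List[dict], destination: Pose) -> List[Pose]:
    """(3) Planning: predict other agents and return a short horizon of waypoints."""
    return [Pose(pose.x + float(i), pose.y, pose.heading) for i in range(1, 6)]

def control(pose: Pose, waypoints: List[Pose]) -> ControlCommand:
    """(4) Control: translate the next waypoint into steering/throttle/brake."""
    target = waypoints[0]
    steering = 0.1 * (target.y - pose.y)  # toy proportional steering law
    return ControlCommand(steering=steering, throttle=0.3, brake=0.0)

# One cycle of the loop (a real system repeats this tens of times per second):
objects = perceive({"radar": None, "lidar": None, "camera": None})
pose = localize(Pose(100.0, 50.0, 0.0), objects)
waypoints = plan(pose, objects, destination=Pose(500.0, 60.0, 0.0))
command = control(pose, waypoints)
print(command)
</syntaxhighlight>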
 
Autonomous transport technology is intended to bring about improvements mainly in safety and mobility.
The number of fatalities in motor vehicle incidents is significant each year, and autonomous vehicles could potentially reduce that number by using software that is less error-prone and less susceptible to distraction than human drivers. At the same time, autonomous technology can offer mobility to disabled or elderly individuals, where it is needed the most.
 
==History of Autonomous System for Ground Transport==
The first conceptual design was proposed by Leonardo da Vinci in 1478, almost five centuries before the pervasive adoption of automobiles. Technically, however, it was not the type of autonomous vehicle we refer to at present, because the concept focused on a self-propelled vehicle with a mechanical spring system moving on fixed, predefined routes. We argue that the first autonomous ground transport vehicle was the Houdina radio-controlled car built and tested in 1925, as no driver was required to be present in the vehicle itself, even though the car was remotely controlled by a human operator. The major milestones, together with their relevant information, are listed below.
 
[[File:HistoryASGT.png|800 px]]
 
We charted the technology readiness level of the major milestones over time below. It is worthwhile to point out that the emergence of the Internet in the 1970s and the Internet of Things in the 2010s has dramatically accelerated the development of autonomous technologies and enabled the inclusion of additional technologies in the autonomous system for ground transport.
 
[[File:TRLASGT.jpg|800 px]]


==Design Structure Matrix (DSM) Allocation==
The autonomous system for ground transport sits at the level 2 abstraction and is a specification of ground transport at level 1. It can be classified into five different levels of autonomy.


[[File:Autonomy Levels.png|600 px]]


Each level of autonomy requires a certain number of enabling technologies, such as sensors and cameras, which sit at the level 4 abstraction in the system architecture hierarchy. Going further, it could be insightful for roadmap construction to decompose the level 4 systems into more specific types of technologies.  
 
 
[[File:DSMTree.png|600 px]][[File:DSM Allocation.png|600 px]]
 
The Object-Process-Language (OPL) description complements the second level autonomous system for ground transport (2ASGT) tree.
 
[[File:DSMOPL.png|1200 px]]


==Roadmap Model using OPM==
We provide an Object-Process-Diagram (OPD) of the 2ASGT roadmap in the figure below. This diagram captures the main object of the roadmap (Autonomous System for Ground Transport), its 2-level decomposition into enabling systems (Positioning and Sensing Technologies, Data Storage and Transmission Technologies, Computation and Control Technologies), its characterization by Figures of Merit (FOMs) as well as the main processes and other objects it interacts with.
[[File:ASGTOPM.jpeg|1000 px]]


An Object-Process-Language (OPL) description of the roadmap scope is auto-generated and given below. It reflects the same content as the previous figure, but in a formal natural language.
[[File:ASGTOPL.jpg|1000 px]]


==Figures of Merit==
Four main figures of merit, covering reliability, safety, compatibility, and cost, are used to quantify the performance and utility level of the autonomous system for ground transport.  
[[File:FOM_Autonomous_Ground_Transport.png|800 px]]


For this roadmap, we narrow our focus to Reliability, measured in miles per intervention. The reliability FOM can be decomposed into component technologies, which include perception technology enabled by sensors. We will cover the governing equations for radar sensors in a later section of this roadmap.  




[[File:FOM Trends.png|800 px]]


The performance data were extracted from the disengagement reports of California's Department of Motor Vehicles. The data were reported by the indicated companies from their testing of autonomous vehicles in a bounded test environment of public roads in California. [ref: https://thelastdriverlicenseholder.com/2019/02/13/update-disengagement-reports-2018-final-results/]


Considering the average logarithmic performance values of the FOM, the average rate of improvement from 2015 to 2016 is about 8% (dFOM/dt).
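
As a worked illustration of how such a rate can be derived from the disengagement data, the calculation compares the log-averaged FOM year over year. The miles-per-disengagement values below are hypothetical placeholders chosen only so the result lands near the reported ~8%; they are not the actual DMV figures.

<syntaxhighlight lang="python">
import math

# Hypothetical miles-per-disengagement values for three companies.
# Placeholders only, NOT the actual California DMV figures.
fom_2015 = [100.0, 500.0, 1000.0]
fom_2016 = [170.0, 750.0, 1700.0]

def log_average(values):
    """Average of log10(FOM) across companies (the exponent of the geometric mean)."""
    return sum(math.log10(v) for v in values) / len(values)

improvement = (log_average(fom_2016) - log_average(fom_2015)) / log_average(fom_2015)
print(f"Year-over-year improvement of the log-averaged FOM: {improvement:.1%}")
</syntaxhighlight>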


==Alignment with Company Strategic Drivers==
The table below shows the potential strategic drivers and their respective alignment with this technology roadmap for the perception system of the autonomous system for ground transport.


[[File:StrategicDriver.png|800 px]]


As shown in the table above, this technology roadmap targets strategic drivers 1 and 2. Driver 1 calls for a perception system capable of helping vehicles accomplish full autonomy in urban and suburban operational environments. This requires the inclusion of more advanced perception technologies such as sensor fusion and neuromorphic sensing, which would also improve the reliability and accuracy of the autonomous system for ground transport. Driver 2 asks for a perception system that is customized for and compatible with the trucking mode of transportation.


==Positioning of Company vs. Competition==
Autonomous ground transport is a young and growing industry, and like many other technologies it is an ecosystem that involves various clusters and sectors of enabling technologies. Different companies take up their own market share by providing the industry with a certain set of forms and/or functions. It is not an exaggeration to say that almost any company in the technology sector has something to do with the autonomous driving ecosystem.
 
[[File:AVCompanies.png|1000 px]]
 
(Source: https://acceleratingbiz.com/proof-point/autonomous-connected-vehicle-ecosystem/)
 
Within the automobile industry, functions pertaining to autonomous driving have been gradually implemented in high-end vehicles, as shown in the exhibit below on the left, whose customers are able and willing to pay a premium for these advanced yet still-developing features. The trend is for autonomous systems to permeate a wider range of automobiles, especially trucks and buses, as well as mid- to low-end passenger vehicles.


Here we focus on the perception system configurations of products and projects related to autonomous driving development, as presented in the figure below on the right. One observation is that choices regarding sensor types have been converging mainly on vision, LIDAR, RADAR, and sonar. This pattern motivated and grounded the construction of our technical model of the perception system within the autonomous system for ground transport, and we concentrate our research and analysis on these four types of sensors.  


Our product aims to equip trucks with platooning and other autonomous driving capabilities via advanced perception and cross vehicle coordination systems.


[[File:AVFuntions.jpg|600 px]][[File:AVSensors.png|600 px]]


(Source: https://reader.elsevier.com/reader/sd/pii/S0968090X18302134?token=010ACE9CE9AEE89B64AFE4376F797FC66676FA3CF5199A0AEA0116E84EAD8203EA5138F8B80C9602C0913F07F95964B2)


One drawback of our study of company positioning and competition, unfortunately, is that most autonomous driving technology companies have not published the parameters and specifications of the sensors they employ, which makes it difficult to quantify the perception systems available. From a more academic perspective, the majority of publications and research on designing and building perception systems are based on experimental or empirical approaches, making it hard to plot a Pareto front of the state of the art of perception systems and their embedded sensor technologies.


==Technical Model==
In order to accomplish autonomous driving, the perception system is crucial: it functions as the eyes and ears of the vehicle. It is the perception system that acquires the relevant information and enables driver assistance functions such as cruise control, automatic braking, lane change assist, traffic sign recognition, and object detection. Together, these automated functions compose an autonomous driving system.
 
This technology roadmap concentrates on the different types of sensors employed in the perception system, which are its most significant enabling components. The four most common types of sensors and their respective properties are listed in the table below.
 
[[File:Sensor Tables.png]]
 
One metric that matters for all sensor types is range, which represents detection coverage.  
 
The analytical expression for radar detection range and the normalized technological sensitivity analysis are shown below:
 
[[File:Radar_Range_2.png]]
 
From the sensitivity analysis plot, we see that the various parameters in the governing equation have the same sensitivity with respect to detection range. However, to improve detection range in the context of an autonomous vehicle perception system, we may choose to focus on reducing S<sub>min</sub> (the minimum detectable signal threshold) and/or improving the effective aperture of the radar antenna. These improvements could potentially be achieved through waveform design or signal processing methods and would have minimal impact on the rest of the autonomous system components (compared to increasing the transmit power).  
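
Since the governing equation itself appears only as an image above, the sketch below uses the standard radar range equation, R<sub>max</sub> = [P<sub>t</sub> G A<sub>e</sub> σ / ((4π)<sup>2</sup> S<sub>min</sub>)]<sup>1/4</sup>, which is consistent with the parameters discussed (transmit power, antenna gain, effective aperture, target cross-section, and minimum detectable signal). The numerical values are illustrative assumptions, not measured sensor specifications.

<syntaxhighlight lang="python">
import math

def radar_max_range(p_t, gain, a_e, sigma, s_min):
    """Standard radar range equation:
    R_max = [ P_t * G * A_e * sigma / ((4*pi)^2 * S_min) ]^(1/4)
    p_t   : transmit power [W]
    gain  : antenna gain [-]
    a_e   : effective antenna aperture [m^2]
    sigma : target radar cross-section [m^2]
    s_min : minimum detectable signal power [W]
    """
    return (p_t * gain * a_e * sigma / ((4.0 * math.pi) ** 2 * s_min)) ** 0.25

# Illustrative (assumed) parameters, loosely in the range of automotive radar:
base = radar_max_range(p_t=10e-3, gain=300.0, a_e=0.01, sigma=1.0, s_min=1e-13)
better = radar_max_range(p_t=10e-3, gain=300.0, a_e=0.01, sigma=1.0, s_min=1e-14)

print(f"Baseline maximum range:         {base:6.1f} m")
print(f"Range with 10x lower S_min:     {better:6.1f} m")
print(f"Improvement factor (10^0.25 = {10 ** 0.25:.2f}): {better / base:.2f}")
</syntaxhighlight>

Because every parameter enters through the same fourth root, a tenfold change in any single parameter yields only about a 1.78x change in range, which is consistent with the equal-sensitivity observation above.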
 
The following table presents the morphological matrices of detection range for the various types of sensors:
 
[[File:Morphological matrix for detection range.png]]
 
Another important metric is range resolution, which represents the smallest measurable difference in distance and determines whether two objects can be separated or are perceived as one.
 
The analytical expression for radar range resolution and the normalized technological sensitivity analysis are shown below:
 
[[File:RadarResolution.jpg]]
 
To improve radar resolution, signal processing techniques such as pulse compression could be employed with minimal impact to the autonomous system.
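
The resolution expression is likewise shown only as an image; a minimal sketch assuming the standard pulsed/FMCW radar relation ΔR = c / (2B), where B is the (effective) signal bandwidth that pulse compression increases, is given below. The bandwidth values are illustrative assumptions.

<syntaxhighlight lang="python">
C = 299_792_458.0  # speed of light [m/s]

def range_resolution(bandwidth_hz):
    """Range resolution of a pulsed/FMCW radar: delta_R = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

# Illustrative (assumed) waveform bandwidths:
for bw in (150e6, 1e9, 4e9):
    print(f"Bandwidth {bw / 1e9:4.2f} GHz -> range resolution {range_resolution(bw) * 100:5.1f} cm")
</syntaxhighlight>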
 
The following table presents the morphological matrices of range resolution for various types of sensors
 
[[File:Morphological matrix for range resolution.png]]
 
The architectural performance index approach was adopted to quantify the perception system performance of the autonomous system for ground transport as given below
 
[[File:API.jpg]]


APIs are computed for range and resolution respectively for the performance analysis.


Making a tradeoff between range and range resolution is often a challenge when selecting sensors to form a viable perception system for the autonomous system for ground transport. The performance of the 60 sensor configurations from the morphological matrices, together with the resulting Pareto front, is shown below.


[[File:Pareto Front.png|1000 px]]


The (0,0) point represents the least favorable combination of sensors, while the two indicated sensor combinations consist of state-of-the-art perception system technologies.
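
Since the API formula and the 60 evaluated configurations are shown only graphically above, the following sketch illustrates the general approach with assumed scores: each sensor combination receives a normalized range index and a normalized resolution index (higher is better), and the non-dominated combinations form the Pareto front. The combination names and scores are hypothetical.

<syntaxhighlight lang="python">
# Hypothetical (range API, resolution API) scores for a few sensor combinations;
# the actual roadmap evaluates 60 combinations from the morphological matrices.
configs = {
    "camera + sonar":         (0.20, 0.30),
    "camera + radar":         (0.55, 0.45),
    "camera + lidar":         (0.50, 0.80),
    "radar + lidar":          (0.80, 0.70),
    "camera + radar + lidar": (0.72, 0.88),
}

def dominated(point, others):
    """True if some other point is at least as good in both APIs and better in one."""
    px, py = point
    return any(ox >= px and oy >= py and (ox > px or oy > py) for ox, oy in others)

pareto = [name for name, p in configs.items()
          if not dominated(p, [q for other, q in configs.items() if other != name])]

print("Pareto-optimal sensor combinations:", pareto)
</syntaxhighlight>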


==Financial Model==
Based on our technology assessment and market review, we have decided, as a near-term strategic milestone, to focus on the truck transportation industry. There has been increasing demand for truck drivers (a shortage of about 60,000 drivers in 2018 and an estimated 1.1 million new drivers needed over the next decade) due to the significant increase in freight volume.  
It is therefore imperative to enable new drivers to get up to speed within a short period of time. Driver assistance technology is a probable solution to bring the learning curve from 20 years of experience down to 2 years, hence lowering the threshold for bringing in new drivers. At the same time, the technology requirements for highway use cases are considerably less stringent than for urban use cases, with longer-distance sensing and higher sensing resolution being the key technology requirements.
 
Given these considerations, we developed a financial model for the near-term strategy with the following assumptions (a simplified discounted cash-flow sketch based on them follows the list):


# Existing development and manufacturing infrastructure is sufficient; no additional capital investment is required.
# R&D investment: '''$30M per year for the first 3 years''' (mainly on sensor range and resolution performance, cross-platform perception communication, and integration across a range of truck platforms)
# Ramp-up duration: '''5 years'''
# Total program duration: '''22 years''' (in line with the estimated time frame of the next evolution in autonomous technology)
# Total initial market demand: '''3,500,000 trucks''' (to be conservative, only Class 8 trucks were considered; the in-service '''Class 8''' truck fleet is estimated at 3.5 million in 2018, with annual growth of 5% from retail sales)
# Market demand growth rate: '''2%''' (conservative assumption)
# Target annual output capacity: '''200,000''' systems (initial design capacity, with flexibility to increase or decrease based on actual sales)
# Revenue per system: '''$10,000''' for sensor hardware and manpower (sensor prices, including LIDARs, are expected to drop to the range of $5,000-$7,000 or lower by 2025 due to technology development and high demand volume) [ref: https://velodynelidar.com/newsroom/velodyne-lidar-announces-new-velarray-lidar-sensor/]
# Cost per system: '''$8,000''' (assumed 25% margin)
# Recurring revenue: '''$1,000''' per system (maintenance and software upgrades)
# Discount rate: '''15%''' (based on industry norm)
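
The sketch below implements this cash-flow model. The production start after the R&D years, the linear ramp shape, the flat steady-state output, and treating the recurring revenue as annual per installed system are our own simplifying assumptions rather than stated inputs.

<syntaxhighlight lang="python">
# Simplified NPV sketch using the assumptions listed above.
DISCOUNT = 0.15               # assumption 11
YEARS = 22                    # assumption 4
RND_PER_YEAR = 30e6           # assumption 2: $30M/yr for the first 3 years
CAPACITY = 200_000            # assumption 7: systems per year at full output
REVENUE_PER_SYSTEM = 10_000   # assumption 8
COST_PER_SYSTEM = 8_000       # assumption 9
RECURRING_PER_SYSTEM = 1_000  # assumption 10 (assumed annual per installed system)

cash_flows = []
installed_base = 0
for year in range(1, YEARS + 1):
    rnd = RND_PER_YEAR if year <= 3 else 0.0
    # Assumed: production starts after the R&D years and ramps linearly
    # to full capacity over the 5-year ramp-up period (assumption 3).
    ramp = min(max(year - 3, 0) / 5.0, 1.0)
    units = int(CAPACITY * ramp)
    installed_base += units  # stays below the 3.5M-truck initial market size
    margin = units * (REVENUE_PER_SYSTEM - COST_PER_SYSTEM)
    recurring = installed_base * RECURRING_PER_SYSTEM
    cash_flows.append(margin + recurring - rnd)

npv = sum(cf / (1.0 + DISCOUNT) ** t for t, cf in enumerate(cash_flows, start=1))
print(f"NPV over {YEARS} years at {DISCOUNT:.0%}: ${npv / 1e9:.2f}B")
</syntaxhighlight>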
 
[[File:DCF.png|600 px]]
[[File:Future Cashflow Forecast.png|600 px]]


==List of R&T Projects and Prototypes==


According to our technical and financial models, several goals and milestones emerge for the future development of the autonomous system for ground transport. These development opportunities align with the figures of merit discussed above and can thus be materialized into various projects. Below is a list of the potential projects and their respective categorization, along with their individual risk level and priority. The level of priority is evaluated based on how soon the technology is required along the development timeline as well as its relative maturity level; that is to say, a relatively mature technology (requiring less development work to realize) that is needed soonest will be of high priority. The level of risk is evaluated based on how disruptive the project would be to the current technology.
 
[[File:ListofProjectsTeam11.png|800 px]]
 
Based on our near-term milestone of bringing the autonomous driver assistance product to the trucking industry, we suggest that projects number 5 and 6, which would eventually enable cross-vehicle communication and increase transport safety, have the highest priority among the listed projects.
 
Project number 2, which is to develop sensors with higher resolution in order to increase the reliability of the perception system within the autonomous system for ground transport, should also be implemented. If carried out properly with milestones achieved on schedule, this project would result in a shift of the Pareto front as shown below, which leads to an improvement in the reliability related FOM.
 
[[File:ParetoShiftTeam11.png|1000 px]]
 
The relevant R&D and R&T projects are to be completed in three phases: development, production, and operation. Sensor development would be conducted both internally and externally in order to pursue higher component performance. The cross-vehicle perception coordination system, as the core technical competency of the company, would be researched and produced mostly internally. The control system redundancy feature would be kicked off after the two aforementioned systems are verified and validated in practice.
 
[[File:ProjectGanttTeam11.png|1000 px]]


==Key Publications, Presentations and Patents==
The last three decades have witnessed the emergence of autonomous vehicle development. Despite a possible bias from more recent publications being archived digitally, the number of publications pertaining to autonomous vehicles has been increasing according to the Web of Science database, as shown in the exhibit below.
[[File:AVPublications.png|1000 px]]
 
(Source: https://reader.elsevier.com/reader/sd/pii/S0968090X18302134?token=010ACE9CE9AEE89B64AFE4376F797FC66676FA3CF5199A0AEA0116E84EAD8203EA5138F8B80C9602C0913F07F95964B2)
 
We performed a search for publications and patents relevant to the perception system of the autonomous system for ground transport. This research serves as technological guidance and an approach reference for our technology roadmap.
 
'''Publications:'''
------
Title: Autonomous driving in urban environments: Boss and the Urban Challenge
 
URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.20255
 
Key Words: Motion Planning, Perception, Mission Planning, Behavioral Reasoning
 
Description: This journal paper introduces a three-layer autonomous driving planning system, using as an example an autonomous vehicle called Boss, which won the 2007 DARPA Urban Challenge. Its on-board autonomous system comprises mission, behavior, and motion planning systems. The paper presents the mathematical foundation as well as the development process for the autonomous system, which could be generally applied to other similar systems.
------
Title: Are we ready for autonomous driving? The KITTI vision benchmark suite
 
URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6248074
 
Key Words: Visual Sensors Benchmark
 
Description: This publication proposes a benchmark for visual sensing and object recognition in autonomous driving systems via an experimental approach. A stereo camera and LIDAR are the two main types of sensors incorporated into the tested system. Data acquisition, sensor calibration, and related evaluation metrics for the system benchmark are discussed to form a suggested framework for benchmarking visual sensors.
------
Title: A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application
 
URL: https://www.mdpi.com/1424-8220/17/10/2359


Key Words: Low Cost Sensors for Localization and Autonomous Driving
 
Description: This paper presents a low-cost sensor approach for an autonomous driving system that yields accurate vehicle localization and viable performance. The proposed design is mainly based on a camera and a computer vision algorithm, which contributes to its low cost. This is an autonomous system for ground transport that excludes LIDAR and RADAR, and it could serve as an alternative to commonly used systems.
------
Title: A Review of Current Neuromorphic Approaches for Vision, Auditory, and Olfactory Sensors
 
URL: https://www.frontiersin.org/articles/10.3389/fnins.2016.00115/full
 
Key Words: Neuromorphic Approaches for Vision Sensors
 
Description: Conventional vision, auditory, and olfactory sensors generate large volumes of redundant data and as a result tend to consume excessive power. To address these shortcomings, neuromorphic sensors have been developed. This paper reviews the current state of the art in neuromorphic implementations of vision, auditory, and olfactory sensors and identifies key contributions across these fields. Bringing these contributions together, the paper suggests future research directions for the neuromorphic sensing field.
 
'''Patents:'''
------
Title: Autonomous driving sensing system and method
 
URL: https://patents.google.com/patent/US9720411B2/en
 
Key Words: Autonomous Vehicle Sensing and Control
 
Description: This patent illustrates the decision process of the perception system on an autonomous vehicle from an architectural perspective. It presents the process flow of action taken by the autonomous system based on environmental conditions detected by the sensors.
------
Title: Cross-validating sensors of an autonomous vehicle
 
URL: https://patents.google.com/patent/US9555740
 
Key Words: Cross-validating sensors
 
Description: This patent describes a cross validation system for sensors on autonomous vehicles created by Google. Data and information gathered by two sensors could be validated with each other in order to improve the functionality and reliability of the perception system. The two sensors do not have to be the same type of sensor. This is an important consideration for type and quantity of sensor adoption when designing a perception system.
------
Title: Modifying Behavior of Autonomous Vehicle Based on Advanced Predicted Behavior Analysis of Nearby Drivers
 
URL: https://patents.google.com/patent/US20180061237A1/en
 
Key Words: Modifying Behavior of Autonomous Vehicle
 
Description: This patent describes a system that assesses one or more features of drivers within a threshold distance of a self-driving vehicle using sensors. Based on the assessment, the system predicts the corresponding behavior of the respective vehicles to serve as feedback to the self-driving vehicle. Subsequent changes in the assessment can alert the self-driving vehicle to change course and the way it monitors data.


==Technology Strategy Statement==
'''Our long-term target is to develop a fully autonomous system for all ground transport systems by 2050, with a reliability performance of 1 intervention per 10<sup>15</sup> miles. This final target requires significant technological breakthroughs along all broad key FOMs (Safety, Compatibility, Reliability, and Cost). As such, we will focus on near-term development milestones as part of our long-term technology strategy.
 
In this key near-term milestone, we target the development of advanced driver-assistance systems (ADAS) for the truck transportation industry, addressing both single-truck and platoon truck transportation. This milestone has a realistic timeline with a viable business model. Through this milestone, we will be able to establish an operational test bed towards advanced autonomy while generating income for subsequent R&D. For this milestone, we will invest in two R&D projects. The first project will look at developing a perception system that works across vehicles to enable platooning movements for trucks. The second project will look at incorporating redundancies in the autonomy control units. These R&D projects involve relatively mature technology components and will enable us to reach our goal of beginning to integrate the technology onto truck fleets by 2023.
'''
 


[[File:Swoosh chart.png|800 px]]
