AI & 5G Convergence

5G And AI Convergence - The Autonomous Vehicle

The Autonomous Vehicle (AV) will not materialize out of thin air. Extensive legislative work will be needed to ensure safety and control, and international standards bodies will have to create common definitions and interoperability across countries and continents. It will be quite some time before autonomous vehicles become the norm.

What is less far away are the systems that will be used to implement autonomous driving functionality. Already today we find a number of such functions implemented in vehicles across the world, although most of them are currently labelled as "automobile safety applications". A commonly used acronym is ADAS, which stands for Advanced Driver Assistance Systems; this terminology is very often intermingled with references to the ADAS sensors themselves, which sometimes makes discussions confusing. The reality is that ADAS require numerous sensors to determine the situational status accurately and decide on the correct actions.
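The point that ADAS functions combine numerous sensors into one accurate situational estimate can be sketched with a simple fusion step. This is a minimal illustration, not any vendor's actual ADAS implementation: the sensor names, noise figures, and the inverse-variance weighting scheme are all illustrative assumptions.

```python
# Minimal sketch of ADAS-style sensor fusion: several sensors each estimate
# the range to the lead vehicle, and the estimates are combined by
# inverse-variance weighting (more precise sensors count for more).

def fuse_distance_estimates(readings):
    """Fuse (distance_m, variance) pairs by inverse-variance weighting."""
    total_weight = sum(1.0 / var for _, var in readings)
    fused = sum(d / var for d, var in readings) / total_weight
    fused_variance = 1.0 / total_weight
    return fused, fused_variance

# Hypothetical readings: radar, camera, and lidar range estimates in metres.
readings = [
    (42.3, 0.25),  # radar: accurate range measurement
    (41.0, 4.00),  # camera: noisier monocular depth estimate
    (42.1, 0.10),  # lidar: most precise of the three
]
distance, variance = fuse_distance_estimates(readings)
```

The fused variance is always smaller than that of the best individual sensor, which is why adding sensors improves situational awareness rather than merely duplicating it.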

This document will go through some of the more commonly used autonomous functions and the associated sensor use to see what future trends can be found. We will also investigate the co-existence with wireless and/or cellular technologies and see how the combination of wireless systems and ADAS sensors can make an autonomous vehicle possible.

Autonomous Vehicle Software System

Understanding the way all components need to work together also makes it easier to see how and why the various elements need to co-exist and co-operate. In the article "Perception, Planning, Control, and Coordination for Autonomous Vehicles", the basic functional blocks are described. See the block diagram below.


[Block diagram: functional blocks of the autonomous vehicle software system]

Three core categories make up the software system in an autonomous vehicle: perception, planning, and control.

ADAS sensors, together with C-V2X information, provide environmental data from which the system extracts relevant knowledge: obstacle locations, road signs, lane information, and so on. This gives the vehicle environmental perception and localization, and the knowledge can in turn be shared with other connected elements (vehicles, infrastructure, pedestrians, the cloud, etc.) via C-V2X.

This knowledge, combined with information received wirelessly from a wider area over C-V2X, allows the vehicle to make decisions about its goals (mission planning), how those goals can be achieved (behaviour planning), and how to move in order to reach them (motion planning). Sharing planned manoeuvres with receivers in close proximity allows other units to take them into account in their own planning.

The resulting action commands are sent to the control unit, which initiates the relevant actuators so that the vehicle follows the plan of action.
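The perception, planning, and control split described above can be sketched as a simple pipeline. This is an illustrative skeleton only: the class and function names are hypothetical, the planning step is reduced to a single rule, and a real AV stack is vastly more elaborate.

```python
# Sketch of the perception -> planning -> control flow: sensor frames and
# C-V2X messages feed a world model, planning turns the model into a
# behaviour decision, and control turns that decision into actuator commands.
from dataclasses import dataclass, field

@dataclass
class WorldModel:
    """Environment knowledge extracted from ADAS sensors and C-V2X messages."""
    obstacles: list = field(default_factory=list)  # e.g. obstacle positions
    speed_limit_kmh: int = 50                      # from road signs / C-V2X

def perceive(sensor_frames, v2x_messages):
    """Perception: fuse raw inputs into a world model (heavily stubbed)."""
    model = WorldModel()
    model.obstacles = [f["obstacle"] for f in sensor_frames if "obstacle" in f]
    for msg in v2x_messages:
        model.speed_limit_kmh = msg.get("speed_limit_kmh", model.speed_limit_kmh)
    return model

def plan(model):
    """Planning: mission/behaviour/motion collapsed into one decision."""
    if model.obstacles:
        return {"behaviour": "slow_down", "target_speed_kmh": 20}
    return {"behaviour": "cruise", "target_speed_kmh": model.speed_limit_kmh}

def control(plan_step):
    """Control: turn the motion plan into (toy) actuator commands."""
    return {"throttle": plan_step["target_speed_kmh"] / 100.0, "brake": 0.0}

# One tick of the loop: an obstacle ahead, plus a C-V2X speed-limit message.
model = perceive([{"obstacle": (12.0, 1.5)}], [{"speed_limit_kmh": 60}])
commands = control(plan(model))
```

The key design point the diagram makes, and the sketch preserves, is that each layer consumes only the output of the layer above it, so sensor details never leak directly into actuator commands.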
