No longer driving your car yourself, but safely and comfortably letting it drive you – many car owners have been dreaming of that for a long time, and so has the industry. From lane-keeping assistance through distance measurement and cruise control, all the way to parking assistance, many small components of future autonomous driving have already entered the mass market. Cameras and sensors have been doing all of the work so far, but are they enough to let a car drive completely on its own? To put it another way: would you take your hands off the steering wheel today? Hopefully not. Although cars already have a very good "view" of their surroundings, they still lack the farsightedness and panoramic awareness needed to drive autonomously: even in good weather, they can only "see" up to about 300 meters ahead. That is enough for emergency braking, but not for much more. How would the car learn about a traffic jam behind a blind curve? And when passing a truck, how would it detect a car ahead that suddenly changes lanes?
Autonomous driving requires real-time communication
In street traffic, decisions have to be made within a split second. Therefore, we should expect the autonomously driving car of the future to react at least as fast as we can, and ideally even faster. This is only possible if the car receives all of the information it needs in real time – and real time means just a few milliseconds.
Take the traffic jam behind a blind curve as an example: the faster the traffic information from the cars already stuck in it reaches the cars approaching it, the better. At a driving speed of 100 km/h, a car covers almost 30 meters every second. To carry the information from one car to another, you need an app. As things stand today, this app would live in the cloud – in other words, on servers in data centers. The message "Stop, there's a traffic jam here!" would therefore travel from the stationary car via the LTE mast and across the network to the data center hosting the app. After being processed there, it would travel back over the network and the LTE mast to the approaching car. The amount of data transferred in this process is smaller than in a Google search, which usually takes less than half a second. In that half second, however, the approaching car has already traveled roughly another 15 meters. This delay, known as latency, is simply too long to get the warning from the stationary car to the approaching one in real time.
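The figures above are easy to check with a bit of arithmetic. A minimal Python sketch, using only the speed and latency values from the text (the article rounds its results; the exact value for half a second at 100 km/h is just under 14 meters):

```python
# How far does a car travel while a message makes the cloud round trip?
# Figures from the text: 100 km/h driving speed, ~0.5 s cloud round trip.

def distance_during_latency(speed_kmh: float, latency_s: float) -> float:
    """Distance in meters traveled at speed_kmh during latency_s seconds."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * latency_s

per_second = distance_during_latency(100, 1.0)  # almost 30 m each second
cloud_trip = distance_during_latency(100, 0.5)  # ~14 m during the round trip

print(f"Distance per second at 100 km/h: {per_second:.1f} m")
print(f"Distance during 0.5 s cloud round trip: {cloud_trip:.1f} m")
```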
Intelligence at the mobile radio mast
So-called "mobile-edge computing" will reduce this delay significantly. The idea is to push a piece of the cloud out from the heart of the data network towards its edge – right next to the LTE mast. The information then only has to hop from the stopped car to the LTE mast, where it is processed and sent directly back to the car approaching from behind.
The first tests on the digital motorway test bed along the German A9 motorway set out to prove that this principle works in clearly defined, routine traffic situations. With mobile-edge computing, the network can react in under 20 milliseconds. By way of comparison: our car traveling at 100 km/h covers just about 60 centimeters in that time, instead of the previous 15 meters.
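The same arithmetic shows the gain from moving the processing to the edge. A quick sketch, using the two latency figures from the text (the exact value for 20 ms at 100 km/h is about 56 cm, which the article rounds to 60):

```python
# Compare how far a car at 100 km/h travels during cloud vs. edge latency.
SPEED_MS = 100 / 3.6  # 100 km/h expressed in meters per second

latencies = {
    "Cloud round trip (~500 ms)": 0.5,
    "Mobile-edge computing (20 ms)": 0.02,
}

for label, latency_s in latencies.items():
    print(f"{label}: car travels {SPEED_MS * latency_s:.2f} m")
```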
It will still be a while before the first production cars drive around with this technology, but the first field test has shown what an important role the networks will play in making autonomous driving a reality.
The following animations show the tests carried out on the A9.
Assisted braking:
Cooperative overtaking assistance system: