Consumer-viable autonomous vehicles (AVs) have evolved from far-fetched fantasy a decade ago into a real market opportunity, backed by billions of dollars of investment from the world’s top companies. Yet for all the excitement surrounding the potential to move people and goods without a driver or pilot on board, there is a serious and ongoing concern about the most critical aspect of AVs: safety. Every misstep by AV companies is scrutinized in great detail, and rightfully so. With safety concerns come trust issues: many surveys indicate that a majority of consumers do not yet trust autonomous vehicles for their own use.
Even in these early days of autonomous vehicles, it is clear that we need to make their operation as human-like as possible. For now, co-existence and seamless interaction is the goal. That requires the machines we build to emulate, and in some cases improve on, how a human would perform in specific circumstances. Biology teaches us that the evolution of organisms is driven by a survival instinct, with safety as the main priority. As we move toward replacing human drivers, we should look to biology as our guide: nature has equipped humans with one of the most evolved safety systems of all.
Mobile data centers
Better safety in AVs will require new ways to collect, analyze and process information critical to the incident-free operation of a vehicle. This includes data from outside the vehicle, including roads and infrastructure, other vehicles, and traffic and weather conditions, as well as data on the functioning of the car itself. All of it must be considered to move the vehicle along safely. Such a scenario requires an organic transformation not only of the traditional car but also of its surroundings. To become fully functional and safe, AVs will need to perceive the environment, then negotiate and act seamlessly.
But the volume of information gathered is a problem in and of itself. A typical autonomous vehicle might contain a dozen or more sensing devices, counting its cameras, lidar, sonar and radar sensors.
The AV has in essence become a mobile data center handling tens of gigabits per second (the equivalent of the data volume generated by 10,000 internet users). More, and oftentimes redundant, data for enhanced safety sounds like a good thing. However, these sensing techniques are not informative or intelligent per se. They are points of measurement triggered at fixed intervals in time, uncorrelated with the dynamics of the scene. In fact, too much data is the bottleneck, and brute-force processing of that volume is neither practical nor efficient in this context.
If that critical data cannot be processed and acted upon in real time to deliver actionable information, then safety is not possible.
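As a rough illustration of where "tens of gigabits per second" comes from, consider the raw bandwidth of conventional frame-based cameras alone. The resolution, frame rate and sensor count below are illustrative assumptions, not figures from any particular vehicle:

```python
# Back-of-the-envelope estimate of raw sensor bandwidth in an AV.
# All figures (resolution, frame rate, sensor count) are illustrative
# assumptions, not measurements from any specific vehicle.

def camera_bitrate(width, height, bytes_per_pixel, fps):
    """Raw, uncompressed bit rate of one frame-based camera, in bit/s."""
    return width * height * bytes_per_pixel * 8 * fps

# One 1080p RGB camera sampled at a fixed 30 frames/s:
one_camera = camera_bitrate(1920, 1080, 3, 30)
print(f"one camera:     {one_camera / 1e9:.2f} Gbit/s")   # ~1.49 Gbit/s

# A dozen such sensors, each triggered at fixed intervals regardless
# of whether anything in the scene has changed:
print(f"twelve sensors: {12 * one_camera / 1e9:.1f} Gbit/s")  # ~17.9 Gbit/s
```

Every one of those bits is produced on a fixed clock, whether the scene has changed or not, which is exactly the brute-force pattern the paragraph above describes.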
Taking inspiration from evolution
This is where we can take a cue from biology to improve safety. From a vision standpoint, humans have evolved a decentralized perception capability that makes integration of sensing input possible and enables fast feedback control. Our fundamental thesis is that evolution provides an outstanding guide for how machines should view and interpret the world around them.
This type of approach is built on an engineering foundation known as neuromorphic engineering, which uses cues derived from the architecture and processing strategies of our brains to build a better, biologically inspired approach to computer vision.
Underpinning this is the ability of the brain and eyes to very quickly interpret the vast amount of data in a visual scene. More specifically, to limit the amount of information actually transmitted, the photoreceptors in our eyes report back to the brain only when they detect a change in some feature of the visual scene, such as its contrast or brightness. Evolutionarily, it is far more important to be able to focus attention on movement within a scene than to take repeated, indiscriminate inventories of its every detail.
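The change-driven behavior of a photoreceptor can be sketched in a few lines. This is a deliberately simplified model in the spirit of an event-based (dynamic vision sensor) pixel, which stays silent until the local log-intensity shifts by more than a contrast threshold; the threshold value and the `EventPixel` interface are illustrative assumptions, not a real sensor API:

```python
import math

# Minimal sketch of a change-driven "photoreceptor": it produces output
# only when log-intensity moves past a contrast threshold, and is silent
# otherwise. Threshold and class interface are illustrative assumptions.

CONTRAST_THRESHOLD = 0.15  # assumed log-intensity step that triggers an event

class EventPixel:
    def __init__(self, intensity):
        self.ref = math.log(intensity)  # last reported log-intensity

    def observe(self, intensity, timestamp):
        """Return (+1 or -1, timestamp) if the change exceeds the
        threshold, else None (an unchanged pixel produces no data)."""
        delta = math.log(intensity) - self.ref
        if abs(delta) >= CONTRAST_THRESHOLD:
            self.ref += delta  # reset the reference to the new level
            return (1 if delta > 0 else -1, timestamp)
        return None

pixel = EventPixel(100.0)
print(pixel.observe(101.0, 1))  # tiny fluctuation -> None, nothing sent
print(pixel.observe(130.0, 2))  # brightening -> (+1, 2)
print(pixel.observe(100.0, 3))  # dimming -> (-1, 3)
```

The point of the sketch is the data economy: a static region of the scene generates no traffic at all, while a moving edge generates a sparse stream of timestamped events.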
Speed is the key in safety-critical situations. Recent research on humans’ ability to recognize objects suggests that we can gather useful information from a scene changing at rates of up to 1,000 times a second – a far higher rate than the 24, 30 or 60 frames/s used to represent movement on television or in movies. A huge amount of useful information is encoded in these changes, which conventional frame-rate cameras obscure because of their low sampling rates. In other words, between frames, cameras are blind.
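The "blind between frames" point can be made concrete with simple arithmetic. The exposure time below is an assumed figure for illustration; the frame rates are the standard video rates mentioned above:

```python
# How long is a conventional camera "blind" between frames?
# The 5 ms exposure time is an illustrative assumption; frame rates
# are the standard video rates (24, 30, 60 frames/s).

def blind_time_ms(fps, exposure_ms):
    """Portion of each frame interval during which nothing is captured."""
    frame_interval_ms = 1000.0 / fps
    return frame_interval_ms - exposure_ms

for fps in (24, 30, 60):
    print(f"{fps} fps: blind for {blind_time_ms(fps, 5.0):.1f} ms per frame")

# A scene changing 1,000 times per second evolves every 1 ms, so dozens
# of potentially safety-critical changes can fall inside one blind gap:
print(f"changes missed per gap at 30 fps: {int(blind_time_ms(30, 5.0))}")
```

Even at 60 frames/s, the sensor spends most of each frame interval not looking, which is exactly the window a change-driven sensor is designed to close.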
More and better cameras
The use of a new bio-inspired technology paradigm could significantly reduce the risk of missing safety-critical information between frames, adding a new first layer to the car’s safety cocoon. This new layer will have a very tangible impact on the performance of autonomous vehicles and advanced driver-assistance systems, including reflex-like reaction times for automatic emergency braking, adaptive sensing in rapidly changing lighting conditions, and the ability to use more cameras for increased redundancy and safety. All of this will help deliver on the promise of autonomous vehicles far more efficiently and safely.
Clearly, there is work to do before user-acceptance levels for safety are reached. The stakes are high given the breadth and scope of the transportation industry. But brute-force application of technology to these challenges will not suffice. A new approach that mimics the biological, evolutionary paradigm of sensing the environment is needed to create the conditions necessary to guarantee full safety, the key to user acceptance.