By Cliff De Locht, Melexis
In the near future the automotive electronics industry is set to go through some radical changes. Vehicle manufacturers will soon start to introduce models that feature semi-autonomous control. A variety of external sensor devices will capture accurate data on possible hazards that lie ahead, and this data will be fed to the vehicle’s advanced driver assistance system (ADAS). The ADAS will also be provided with information on the driver’s attention via sensors within the vehicle’s interior. From this it will be possible for the ADAS to judge whether the driver is concentrating fully on the road or whether it needs to step in and take action instead (such as automatic emergency braking or collision avoidance maneuvers).
Sensor Technologies for Enabling ADAS
There are several forms of sensor technology that are currently being assessed for ADAS implementation. These are:
1. CMOS cameras – The imaging devices currently being incorporated into vehicles can deal with a broad spectrum of interior functions – such as seat occupancy detection, head position detection, eye tracking, proximity sensing for displays and gesture recognition for human machine interfaces (HMIs). The large amounts of imaging data associated with applications of this kind demand sophisticated image processing resources. Alternative technologies are therefore being investigated that could result in less expensive deployment.
2. Low resolution FIR cameras – Day/night light level variations mean that visible light sensing is in many cases impractical. It requires active cabin illumination, which raises costs. In addition, facial hair, headwear and brightly coloured clothing can all affect the optical classification procedure. Visible sensing systems also require specification of a high performance MCU, due to the processing activity needed. In contrast, far infrared (FIR) technology, working in the 5µm to 15µm wavelength range, detects radiated heat at the wavelengths emitted by the human body. As a result it is much better at distinguishing passengers from inanimate objects. FIR sensing is not ambient light dependent, so active cabin lighting can be dispensed with. Furthermore, it requires considerably less processing power. The introduction of lower resolution FIR imaging arrays is now pushing the cost/performance envelope. These combine high sensitivity thermopiles with streamlined signal processing. As each FIR pixel is supported by an amplifier and data converter, a significant SNR improvement is witnessed.
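The occupant classification described above can be sketched in a few lines: a low resolution FIR frame is a small grid of per-pixel temperatures, and pixels falling within the surface temperature band of a human body mark a likely occupant. The array size, temperature thresholds and pixel-count limit below are illustrative assumptions, not specifications of any particular device.

```python
# Illustrative sketch: occupant detection from a low-resolution FIR frame.
# Thresholds and 8x8 resolution are assumptions for the example only.

HUMAN_TEMP_MIN_C = 28.0   # assumed lower bound for skin/clothing surface temp
HUMAN_TEMP_MAX_C = 38.0   # assumed upper bound
MIN_WARM_PIXELS = 4       # assumed minimum cluster size to count as an occupant

def seat_occupied(frame):
    """frame: 2-D list of per-pixel temperatures (deg C) from the FIR array."""
    warm = sum(
        1
        for row in frame
        for t in row
        if HUMAN_TEMP_MIN_C <= t <= HUMAN_TEMP_MAX_C
    )
    return warm >= MIN_WARM_PIXELS

# Example: a 22 C cabin with a warm 3x3 region where a passenger sits.
frame = [[22.0] * 8 for _ in range(8)]
for r in range(3, 6):
    for c in range(3, 6):
        frame[r][c] = 33.0
print(seat_occupied(frame))  # True
```

Because the decision reduces to counting pixels in a temperature band, it needs far less compute than the classifiers a visible-light camera would require, which is the point made above.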
Optically-Based Automotive HMI
3. Active light sensing – Advanced optoelectronics are just starting to be implemented in automotive HMIs, enabling proximity detection and recognition of simple hand gestures. Systems based on this technology have the capacity to differentiate between the driver and passenger, limiting access to infotainment options that could distract the driver and permitting access only from the passenger side. Moving forwards this will become tied into the ADAS – providing constantly updated information on what the driver is doing and where their hands are located, so that their ability to react at any given point can be gauged. Based on a robust multi-channel, close range optical sensing mechanism, solutions are now available that can cope with the wide variation in background lighting levels common in automotive scenarios. Each of the sensor ICs incorporated into such a system has two independent, simultaneously operating light measurement channels. These can be assigned to detect the active optical reflection from the user’s hand. Integrated ambient light suppression ensures the channels are not susceptible to light interference issues. LEDs emit short pulse trains of light which are reflected off the user’s hand, detected by the sensor ICs and then converted into a digital signal, while the background light is rejected. Only the LEDs and photo detection hardware need to be on the front of the console. Everything else can go wherever there is available space behind the console – thereby making integration much easier.
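The ambient light suppression principle described above can be illustrated with a simple differential measurement: sample the photodetector with the LED pulsed on and again with it off, then subtract the ambient-only level. The function name and sample values here are hypothetical, not the actual sensor IC interface.

```python
def reflection_signal(on_samples, off_samples):
    """Estimate the active reflection from the user's hand by subtracting
    the ambient-only (LED off) level from the LED-on level, each averaged
    over the pulse train. Steady background light cancels out."""
    ambient = sum(off_samples) / len(off_samples)
    active = sum(on_samples) / len(on_samples)
    return active - ambient

# Example (arbitrary ADC counts): ambient floor ~100 counts; with a hand
# present, the reflected LED pulses add ~40 counts on top of it.
on_samples = [141, 139, 142, 138]
off_samples = [100, 101, 99, 100]
print(reflection_signal(on_samples, off_samples))  # 40.0
```

Averaging over a train of short pulses rather than a single sample is what lets the scheme hold up under the strongly varying background lighting found in a car cabin.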
4. Time-of-Flight (ToF) camera systems – Such a system consists of an infrared (IR) light source projecting a beam that travels to obstacles and is reflected back towards the ToF sensor. The sensor detects the reflected IR signal, compares it against a reference signal and determines the phase shift introduced by the light’s travel, from which it provides output data on the distance to the obstacle. From this, good quality 3D images can be constructed. ToF sensing systems could be implemented within vehicle dashboards in order to capture vital data on driver awareness, as long as a high enough degree of sunlight robustness can be achieved for automotive environments. The arrival on the market of multi-pixel image sensors supporting extremely high dynamic ranges, combined with sensitivity levels allowing relevant scene details to be accurately detected, means the problems caused by sunlight can now be dealt with.
ToF Sensing System
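The phase-shift-to-distance conversion at the heart of ToF sensing follows directly from the description above: the emitted IR light is amplitude-modulated, and the measured phase shift over the round trip maps to distance as d = c·Δφ / (4π·f_mod). A minimal sketch, with the 20 MHz modulation frequency chosen purely as an example:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """Distance corresponding to the phase shift between the emitted and
    received modulated IR light. The light covers 2*d (out and back), so
    one full modulation period spans c / (2 * f_mod) of range, giving the
    factor of 4*pi in the denominator."""
    return (C * phase_rad) / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase shift at 20 MHz modulation (assumed frequency)
d = distance_from_phase(math.pi / 2, 20e6)
print(round(d, 3))  # 1.874 (metres)
```

Note that the phase wraps every 2π, so a single modulation frequency has an unambiguous range of c / (2·f_mod) – about 7.5 m at 20 MHz – which is ample for in-cabin driver monitoring from the dashboard.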