Stanley Kubrick’s HAL 9000 in 2001: A Space Odyssey taught us to fear and loathe the concept of artificial intelligence, but the autonomous future (we’re told) can’t really happen without lots of deep machine learning and AI. Numerous companies at CES 2019 trotted out concepts aimed at inferring the emotional and cognitive state of the driver and passengers. SAE Level 3 and 4 autonomous systems will need this information to monitor the driver’s readiness to take control of the car, but many companies propose using it to improve occupant comfort or disposition as well. We just hope and pray such systems come with an off button.
Camera-based systems proposed by Nuance, Hyundai Mobis, and Harman all aim to infer driver mood and drowsiness from facial cues while monitoring head position and eye-gaze direction to gauge attention level. Nuance and Harman also factor in voice intonation. Most companies propose using this data to alter ambient lighting or suggest music selections.
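For the technically curious, the basic recipe is to fuse a short window of per-frame cues (eye closure, gaze direction) into one coarse driver state. Here’s a rough sketch of what that fusion logic might look like; every name and threshold below is our own invention for illustration, not anything Nuance, Hyundai Mobis, or Harman has published:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical per-frame readings from an in-cabin camera pipeline;
# these field names and scales are assumptions, not a vendor API.
@dataclass
class CabinFrame:
    eye_closure: float      # 0.0 (wide open) .. 1.0 (fully closed)
    gaze_on_road: bool      # head pose + eye gaze resolved to the road region
    smile_intensity: float  # crude mood proxy, 0.0 .. 1.0

class DriverState(Enum):
    ALERT = "alert"
    DISTRACTED = "distracted"
    DROWSY = "drowsy"

def classify(frames: list[CabinFrame],
             closure_threshold: float = 0.7,
             drowsy_ratio: float = 0.5,
             distracted_ratio: float = 0.4) -> DriverState:
    """Fuse a window of frames into one coarse driver state.

    The thresholds are illustrative placeholders, not production values.
    """
    if not frames:
        return DriverState.ALERT
    closed = sum(f.eye_closure > closure_threshold for f in frames)
    off_road = sum(not f.gaze_on_road for f in frames)
    if closed / len(frames) > drowsy_ratio:
        return DriverState.DROWSY
    if off_road / len(frames) > distracted_ratio:
        return DriverState.DISTRACTED
    return DriverState.ALERT
```

Averaging over a window rather than reacting to single frames is what keeps a blink from registering as a nap.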
Nuance’s Emotion AI system proposes altering the vehicle’s speech to match the driver’s mood: chipper, chatty, and (cringingly) slangy when the driver is feeling upbeat; curt and businesslike in response to anger. When it detects drowsiness, Nuance offers games to keep the driver engaged, like name that tune or an eyes-free riff on hangman/Wheel of Fortune. It can also tell jokes or read your horoscope.
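Under the hood, this amounts to a lookup from inferred mood to an assistant persona. A minimal sketch of that mapping follows; the mood labels, persona fields, and defaults are assumptions of ours, since Nuance hasn’t published its internals:

```python
from dataclasses import dataclass

@dataclass
class Persona:
    register: str      # "chatty", "curt", or "neutral"
    slang: bool        # sprinkle in informal phrasing
    suggest_game: bool # offer name-that-tune-style engagement

# Invented mapping for illustration only.
PERSONAS = {
    "upbeat": Persona(register="chatty", slang=True, suggest_game=False),
    "angry":  Persona(register="curt", slang=False, suggest_game=False),
    "drowsy": Persona(register="neutral", slang=False, suggest_game=True),
}

def pick_persona(mood: str) -> Persona:
    # Fall back to a neutral voice when the mood estimate is ambiguous.
    return PERSONAS.get(mood, Persona("neutral", False, False))
```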
Harman dials up the driver-assist system’s alertness in response to drowsiness or distraction. Faurecia proposes dispensing scents—peppermint to increase alertness, lavender for relaxation.
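“Dialing up alertness” most plausibly means scaling how early driver-assist warnings fire. A toy version of that idea, with lead times we made up purely to show the shape of the logic:

```python
# Illustrative only: give a less-ready driver earlier warnings.
WARNING_LEAD_SECONDS = {"alert": 1.5, "distracted": 2.5, "drowsy": 3.5}

def forward_collision_lead(driver_state: str) -> float:
    """Return how many seconds of warning lead time to allow."""
    return WARNING_LEAD_SECONDS.get(driver_state, 1.5)
```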