Smart Cars: Will a smart car save your life?
When a driver is distracted or fatigued, the risk of an accident skyrockets. Many car manufacturers use technology to track driver behavior, predict it, and prevent accidents. This article looks at efforts to make cars monitor and respond to the health of their drivers as we get closer to hands-free driving.
It's nothing new to liken a car to a data center on wheels, equipped with a myriad of electronics, from the CAN bus to sensor-embedded windscreen wipers and, of course, the digital dashboard. But car manufacturers are taking it one step further, with a nod to the wearable tech market and its use of sensors to monitor the human body. Sensors, AI, and eye-tracking are now being used in myriad ways to improve driver safety and passenger wellbeing.
Tracking driver distraction and fatigue
Distracted driving and driver fatigue are both major contributors to car accidents, and many companies are using technology to track driver behavior, predict it, and prevent accidents. Affectiva has created what it calls "The first multi-modal in-cabin sensing AI that identifies, from face and voice, complex and nuanced emotional and cognitive states of drivers and passengers." The software uses in-cabin cameras and facial-analysis software to understand the driver and passengers, and can detect driver distraction due to mobile phone use.
Wearables company SmartCap is targeting truck drivers with a headband that fits into trucker caps, beanies or other headgear. The headband measures fatigue using EEG (brainwaves) and allows the earliest possible detection of fatigue. The headband connects via Bluetooth to a cabin display and app that shares real-time fatigue measurements and enables drivers to manage their alertness and energy levels.
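SmartCap's scoring model is proprietary, but the general pattern of a system like this can be sketched: map a continuous fatigue score from the headband to graded alerts on the cabin display. The 1-5 scale and thresholds below are illustrative assumptions, not SmartCap's actual API.

```python
# Hypothetical sketch: mapping a stream of EEG-derived fatigue
# scores to graded alerts. The 1-5 scale and thresholds are
# illustrative assumptions, not SmartCap's real scoring model.

def fatigue_alert(scores, warn_at=3.5, alarm_at=4.0):
    """Map fatigue scores (assumed 1-5, higher = more fatigued)
    to alert levels for a cabin display."""
    alerts = []
    for score in scores:
        if score >= alarm_at:
            alerts.append("ALARM: pull over and rest")
        elif score >= warn_at:
            alerts.append("WARNING: fatigue rising")
        else:
            alerts.append("OK")
    return alerts

print(fatigue_alert([2.1, 3.6, 4.2]))
# -> ['OK', 'WARNING: fatigue rising', 'ALARM: pull over and rest']
```

The value of continuous scoring over a binary "asleep/awake" flag is that the driver sees fatigue rising and can act before the alarm stage.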
Preparation for Level 3 Autonomous Driving
However, we're also preparing for a future where distracted drivers are the norm: once hands-free driving arrives, "non-driving related tasks, such as eating, reading emails or watching a video will become permissible in certain situations, as per driving on a freeway."
We've all seen videos of autonomous car trials in which the driver sits with their hands hovering near the steering wheel. Startup Optalert is preparing for a time when this is the norm, with software that tracks and quantifies eye and eyelid movements, detecting when the driver's eyes are not on the road ahead. It can detect drowsiness and monitor driver distraction.
However, the company sees its platform coming into its own with the arrival of Level 3 autonomous driving, which requires the driver to be able to take control of the car at any time; split-second alertness will be mandatory.
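Optalert's algorithms are proprietary, but a standard drowsiness metric built from eyelid movements is PERCLOS: the fraction of time over a window that the eyes are mostly closed. A generic sketch, with an assumed 0-1 openness signal and closure threshold:

```python
# Generic illustration of PERCLOS, a widely used drowsiness
# metric: the fraction of time the eyes are mostly closed.
# Not Optalert's actual algorithm; thresholds are assumptions.

def perclos(eyelid_openness, closed_below=0.2):
    """eyelid_openness: samples in [0, 1], where 1 = fully open.
    Returns the fraction of samples counted as 'closed'."""
    closed = sum(1 for o in eyelid_openness if o < closed_below)
    return closed / len(eyelid_openness)

samples = [0.9, 0.85, 0.1, 0.05, 0.9, 0.8, 0.15, 0.9]
print(perclos(samples))  # 3 of 8 samples closed -> 0.375
```

A rising PERCLOS value over successive windows is a common trigger for drowsiness warnings in driver-monitoring research.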
In 2016, Ford opened the Automotive Wearables Experience lab at its Research and Innovation Center in Dearborn, Michigan. The lab is working on a range of research and concepts, such as signaling the driver, via a wrist vibration or flashing lights on the dash, to take over the wheel in the event of an accident ahead.
They also suggest modalities responsive to biometric data: a lane-keeping assist function where the car takes over if it identifies that the driver is tired, or cruise control that increases the distance between vehicles, giving the driver more space when they have an elevated heart rate.
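The cruise-control idea can be sketched as a simple rule: widen the following gap as heart rate climbs above resting. The base gap, scaling factor, and cap below are illustrative assumptions, not Ford's implementation.

```python
# Hedged sketch of biometric-responsive cruise control: widen
# the following gap when heart rate is elevated. All numbers
# are illustrative assumptions, not Ford's actual design.

def following_gap_seconds(heart_rate_bpm, resting_bpm=65):
    """Return a time gap (seconds) to the vehicle ahead that
    grows with elevation above the resting heart rate."""
    base_gap = 2.0                       # common "two-second rule"
    elevation = max(0, heart_rate_bpm - resting_bpm)
    # add 0.5 s of gap per 20 bpm above resting, capped at +2 s
    return base_gap + min(2.0, 0.5 * (elevation / 20))

print(following_gap_seconds(65))   # calm driver   -> 2.0
print(following_gap_seconds(105))  # elevated rate -> 3.0
```

Capping the added gap keeps the behavior predictable: a stressed driver gets more space, but the car never drops so far back that it disrupts traffic flow.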
Innovation is cautious and considered
Can an in-car cardiac monitoring device identify a heart attack? Toyota has been working on a variety of solutions since 2011, including a heart-monitoring steering wheel that it shelved in 2015, citing its cost compared to integrating with existing wearable tech and other in-car sensors. Toyota's Collaborative Safety Research Center (CSRC) is currently partnered with the University of Michigan Center for Integrative Research in Critical Care. According to Toyota's Pujitha Gunaratne:
"A challenge for vehicle applications is having a system that can detect small changes in heart rhythms but can also separate out the noise and motion that happens inside the vehicle. In an intensive care unit, there are all types of mechanisms in place to ensure that the monitors are not experiencing electronic interference. That's not as easy inside a vehicle. We're going to need to have robust and advanced algorithms."
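The challenge Gunaratne describes, separating heart rhythm from in-vehicle noise, is an advanced filtering problem; production systems would need robust adaptive or bandpass techniques. As a toy illustration of the principle only: a sliding median filter suppresses the kind of impulsive spike a bump in the road might inject into a sensor reading, while preserving the underlying signal.

```python
# Toy illustration of noise rejection for a cabin biosignal:
# a sliding median filter removes an impulsive motion artifact.
# Real in-vehicle cardiac monitoring needs far more advanced
# algorithms, as the Toyota/CSRC team notes.
from statistics import median

def median_filter(signal, window=3):
    """Suppress impulsive spikes with a sliding-window median."""
    half = window // 2
    out = []
    for i in range(half, len(signal) - half):
        out.append(median(signal[i - half:i + half + 1]))
    return out

noisy = [1.0, 1.2, 0.9, 1.1, 5.0, 1.0, 0.9, 1.1]  # 5.0 = motion artifact
print(median_filter(noisy))  # the 5.0 spike is removed
```

A median filter handles isolated spikes well but would not remove the broadband electrical interference the quote mentions; that is where the "robust and advanced algorithms" come in.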
Researchers will continue to test and validate algorithmic and hardware options that could be placed inside the vehicle to monitor the driver's heart. The team hopes to report results this year. We're not quite at the point where a car can identify a driver having a heart attack, but we might be by the time the car can drive itself (and the driver or passenger) to the emergency room in response to an alert from the heart monitor.
When is a car not a car? When it's a mobile health clinic
Mobility-as-a-service positions a car or other vehicle as a modular space, with its interior modified according to function. A car embedded with health-gathering sensors could serve as a mobile health clinic, integrated with remote medicine, enabling people to access health services wherever they are, at their convenience.
A passenger/patient could follow a series of guided tests, such as blood pressure and heart rate, using equipment integrated into the car's interior, then access a doctor via telemedicine. The patient could then be driven to a clinic as needed, receive 3D-printed personalized pills, or have medicine delivered by drone.