Driver inattention is the leading cause of traffic accidents in the U.S., resulting in thousands of deaths per year. Inattention can be the result of driver fatigue, texting, loud music, or even daydreaming. Whatever the cause, when a driver’s focus strays, lives are put at risk. Associate Professor of Mechanical, Aerospace, and Biomedical Engineering Subhadeep Chakraborty’s work with biometric sensing could help minimize that risk.
Biometric sensing is the process of gathering physiological and behavioral information about a person, in this case the driver of a vehicle, such as where they are looking or their heart rate. Data gathered from the driver can then be cross-referenced with information about the vehicle itself and its surroundings to assess the situation and determine whether the driver is behaving normally.
“Our physiological responses are tied to what the vehicle is doing and what is going on around the vehicle at the same time,” said Chakraborty. “Looking at these three things simultaneously will allow us to determine if a driver’s behavior is normal or something that could become dangerous. We can then act upon that by sounding an alarm for a sleepy driver or vibrating the steering wheel to return a distracted driver’s attention to the road.”
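The fusion logic Chakraborty describes, watching driver state, vehicle state, and surroundings together and escalating to an alert when they disagree, can be sketched roughly in code. The outline below is purely illustrative: the data fields, thresholds, and intervention functions are hypothetical assumptions, not details of the team's actual system.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_off_road_s: float   # seconds the gaze has been away from the roadway (hypothetical field)
    heart_rate_bpm: float    # heart rate from the head-mounted sensors
    eyelid_closure: float    # 0.0 (fully open) to 1.0 (fully closed)

@dataclass
class VehicleState:
    speed_mps: float
    lane_offset_m: float        # lateral deviation from lane center
    time_to_collision_s: float  # from surroundings sensing; inf if no hazard ahead

def assess_driver(driver: DriverState, vehicle: VehicleState) -> str:
    """Cross-reference driver biometrics with vehicle and environment state.

    Returns 'normal', 'distracted', or 'drowsy'. All thresholds here are
    placeholders for illustration, not values from the research.
    """
    hazard_near = vehicle.time_to_collision_s < 4.0
    if driver.eyelid_closure > 0.7 and vehicle.speed_mps > 5.0:
        return "drowsy"
    if driver.gaze_off_road_s > 2.0 and (hazard_near or abs(vehicle.lane_offset_m) > 0.5):
        return "distracted"
    return "normal"

def respond(status: str) -> None:
    # Placeholder interventions matching the examples in the article.
    if status == "drowsy":
        print("Sounding audible alarm")        # wake a sleepy driver
    elif status == "distracted":
        print("Vibrating steering wheel")      # return attention to the road

if __name__ == "__main__":
    status = assess_driver(
        DriverState(gaze_off_road_s=3.1, heart_rate_bpm=72.0, eyelid_closure=0.1),
        VehicleState(speed_mps=25.0, lane_offset_m=0.8, time_to_collision_s=float("inf")),
    )
    respond(status)  # prints "Vibrating steering wheel"
```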
He cautions that for this technology to be effective, it must be accurate enough to avoid regular false alarms. Chakraborty’s team has developed a head-mounted unit containing a series of sensors that collect biometric data as well as information about the driver’s gaze. The next step is to integrate that unit with a driving simulator and begin building a data set, an effort that will use both UT and ORNL equipment.
“Our lab setup is drawn from high-end gaming systems and uses a head mounted display. We can control the environment and get immediate data from the simulations. We can use this to gather driving data and safely simulate distractions by asking participants to solve puzzles or play memory games,” said Chakraborty.
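One way a setup like this could record a session is to poll the simulator and the head-mounted sensors on a common clock and write timestamp-aligned rows. The sketch below assumes hypothetical `simulator` and `biometrics` objects with `read()` methods; it is not the lab's actual software.

```python
import csv
import time

def log_session(simulator, biometrics, out_path: str, hz: float = 60.0, duration_s: float = 300.0) -> None:
    """Poll the driving simulator and head-mounted sensors at a fixed rate and
    write timestamp-aligned rows to a CSV file.

    Both `simulator` and `biometrics` are assumed to expose a read() method
    returning a dict of current values; they stand in for whatever APIs the
    real simulator and sensor headset provide.
    """
    period = 1.0 / hz
    t_end = time.monotonic() + duration_s
    with open(out_path, "w", newline="") as f:
        writer = None
        while time.monotonic() < t_end:
            t0 = time.monotonic()
            row = {"t": t0}
            row.update(simulator.read())   # e.g., speed, steering angle, lane position
            row.update(biometrics.read())  # e.g., gaze direction, heart rate, eyelid closure
            if writer is None:
                writer = csv.DictWriter(f, fieldnames=list(row))
                writer.writeheader()
            writer.writerow(row)
            time.sleep(max(0.0, period - (time.monotonic() - t0)))
```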
From there, his team hopes to also gather data via a state-of-the-art simulator located in ORNL’s Connected and Autonomous Vehicle Environment (CAVE) Laboratory. This simulator mimics the experience of driving by removing a vehicle’s wheels and mounting its hubs directly to four dynamometers. Full steering capability, with torque feedback based on the simulated vehicle dynamics, makes the simulation feel more realistic. This could translate to more accurate biometric data and fewer false alarms for potentially distracted drivers.
Chakraborty previously received Science Alliance funding for his work on connected vehicle technology. That work involved several cross-disciplinary collaborations on campus that, in addition to being applied to his StART project, have created opportunities to approach his research from a holistic perspective.
“I don’t think these kinds of projects have any boundaries anymore. It’s a mechanical engineering topic, it’s a computer science topic, it’s a civil engineering topic,” said Chakraborty. “Ultimately, we are trying to address a safety issue, and that issue is multi-dimensional and needs to be looked at from a variety of perspectives. Fortunately, we have built a community capable of doing that.”