Feature Story


CLIENT: PATHPARTNER TECHNOLOGY

July 21, 2021: RTInsights

Erasing Driver’s Distraction with Head Pose Detection

Head pose detection can be a game-changer in assessing driver distraction. It can convey volumes of information about the driver.

A common notion among drivers is that they can drive safely even when distracted. Unfortunately, the data tells a different story. Head pose detection offers some help.

Distracted driving is a pressing issue. Every day, about eight people in the United States are killed in crashes reported to involve a distracted driver, according to the CDC. Non-fatal injuries are also high: in the U.S. alone in 2018, an estimated 400,000 people were injured in crashes involving a distracted driver. The World Report on Road Traffic Injury Prevention identifies distracted driving as one of the leading factors affecting traffic safety.

Tasks that take the driver's eyes off the road, reduce visual scanning, or increase cognitive load are dangerous for road safety. The increasing use of in-cabin information and infotainment systems is worsening the situation, because they introduce visual, manual, and cognitive distractions that can affect the driver's performance in various ways.

The proliferation of personal communication devices such as mobile phones has compounded the problem. The National Safety Council states that cell phone use while driving results in 1.6 million crashes each year. Secondary tasks such as eating, drinking, and tuning the radio also lead to driver inattention.

There is no standard protocol for observing a driver's inattention; the practical approach is to watch for its signs. Monitoring the driver's eye closure and head pose is among the most relevant indicators of driver distraction.
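
To make this concrete, here is a minimal sketch, assuming per-frame eye-openness estimates from some upstream eye tracker, of how eye closure can be turned into a measurable signal such as PERCLOS (the percentage of eyelid closure over a time window). The window length and thresholds are illustrative, not values from PathPartner's system.

```python
# Illustrative sketch (not PathPartner's implementation): computing PERCLOS,
# a common eye-closure metric, from per-frame eye-openness estimates.
from collections import deque

class PerclosMonitor:
    """Tracks the fraction of recent frames in which the eyes were mostly closed."""

    def __init__(self, window_frames=900, closed_threshold=0.2):
        # window_frames: roughly 30 s at 30 fps (assumed values)
        self.window = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold  # openness below this counts as "closed"

    def update(self, eye_openness: float) -> float:
        """eye_openness: 0.0 (fully closed) to 1.0 (fully open), from any eye tracker."""
        self.window.append(1.0 if eye_openness < self.closed_threshold else 0.0)
        return sum(self.window) / len(self.window)  # PERCLOS over the window

# Usage: flag possible drowsiness when PERCLOS exceeds a tuned threshold.
monitor = PerclosMonitor()
perclos = monitor.update(eye_openness=0.15)
drowsy = perclos > 0.4  # threshold chosen for illustration only
```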

How do we detect inattention from head pose?

Evaluating inattention with head pose detection

A variety of methods are used to detect head pose. Some apply classification or regression to features extracted directly from image pixels.

There are also geometric methods that model the relation between head pose and facial landmarks: the pose is estimated from the landmark locations alone, so the detected eye, nose, and mouth landmarks can be used to estimate the face pose.
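
As an illustration of this geometric approach, the sketch below recovers head rotation and translation from a handful of detected 2D landmarks with OpenCV's solvePnP. The generic 3D model points and the focal-length guess are assumptions made for demonstration, not values from any particular product.

```python
# Illustrative sketch of a geometric (landmark-based) head pose estimate using
# OpenCV's solvePnP. Model points and landmark values are placeholders.
import numpy as np
import cv2

# Rough 3D coordinates (mm) of a generic face model: nose tip, chin,
# left/right eye corners, left/right mouth corners (assumed values).
MODEL_POINTS = np.array([
    [0.0, 0.0, 0.0],
    [0.0, -330.0, -65.0],
    [-225.0, 170.0, -135.0],
    [225.0, 170.0, -135.0],
    [-150.0, -150.0, -125.0],
    [150.0, -150.0, -125.0],
], dtype=np.float64)

def head_pose_from_landmarks(landmarks_2d, image_size):
    """landmarks_2d: 6x2 array of pixel positions in the same order as MODEL_POINTS."""
    h, w = image_size
    focal = w  # crude focal-length guess when the camera is uncalibrated
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, landmarks_2d.astype(np.float64),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return ok, rvec, tvec  # rotation and translation of the head w.r.t. the camera
```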

But the most common methods for head pose estimation rely on landmark or fiducial point estimation and often require more computation than necessary. In such methods, errors in landmark estimation compound into errors in pose estimation. Relying on 2D landmarks also limits pose estimation at extreme poses, where some landmarks become invisible, so landmark-based head pose cannot cover the full range of head rotations. We instead estimate all the 3D affine parameters (3D rotation, translation, and scale) directly, using feature aggregation and regression with a compact convolutional neural network (CNN) in a landmark-free approach. The CNN is designed to inherently capture the 3D orientation and position of the face with respect to the camera. Such a method is not affected by landmark positional errors and remains effective even when more than half the face is invisible, whether due to extreme poses or occlusions such as sunglasses and masks.
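
PathPartner's network itself is not published here, but a minimal sketch in the same spirit, assuming a small PyTorch backbone with global feature aggregation and a seven-value output (three rotation angles, three translation terms, and scale), might look like this:

```python
# Minimal, illustrative landmark-free pose regressor: a compact CNN mapping a
# face crop directly to 3D rotation, translation, and scale. The architecture
# and output parameterization are assumptions, not the production model.
import torch
import torch.nn as nn

class HeadPoseNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(             # small backbone on 64x64 RGB crops
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # global feature aggregation
        )
        # 7 outputs: yaw, pitch, roll, 3 translation terms, and scale
        self.regressor = nn.Linear(64, 7)

    def forward(self, x):
        feats = self.features(x).flatten(1)
        return self.regressor(feats)

# Usage: one forward pass on a batch of face crops.
model = HeadPoseNet()
pose = model(torch.randn(1, 3, 64, 64))  # -> tensor of shape (1, 7)
```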

Emotion detection and road safety

Emotion influences various cognitive processes in humans, including perception, reasoning, and intuition. An activity such as driving adapts to the driver's risk tolerance, which in turn is shaped by the driver's motives and emotional state.

This emotional state often affects driving performance, but performance could improve if the car actively assessed and responded to the driver's emotion or facial expression.

Facial expressions can be described at different levels. The most commonly used description is the Facial Action Coding System (FACS), which is a human-observer-based system developed to capture subtle changes in facial expressions. With FACS, these expressions are decomposed into one or more action units or AUs.

Action unit recognition and detection have recently been in the spotlight. The basic emotions have been shown to have corresponding universal facial expressions across cultures and continents. Most current facial expression recognition systems aim to recognize emotions such as disgust, fear, joy, surprise, sadness, and anger, which can be very helpful in predicting the driver's behavior.
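
As a rough illustration of how AU detections can feed an emotion estimate, the sketch below scores commonly cited approximate AU combinations against the set of detected AUs. The mapping is illustrative and is not a complete or authoritative FACS specification.

```python
# Illustrative sketch: mapping detected FACS action units (AUs) to basic emotions.
# The AU combinations below are commonly cited approximations only.
EMOTION_AUS = {
    "joy":      {6, 12},
    "sadness":  {1, 4, 15},
    "surprise": {1, 2, 5, 26},
    "fear":     {1, 2, 4, 5, 7, 20, 26},
    "anger":    {4, 5, 7, 23},
    "disgust":  {9, 15},
}

def guess_emotion(active_aus: set[int]) -> str:
    """Return the emotion whose AU set overlaps most with the detected AUs."""
    scores = {emotion: len(aus & active_aus) / len(aus)
              for emotion, aus in EMOTION_AUS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

# Usage: AUs 6 (cheek raiser) and 12 (lip corner puller) suggest joy.
print(guess_emotion({6, 12}))  # -> "joy"
```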

Therefore, manufacturers are exploring face pose as a means of reading the driver's emotions in addition to drowsiness and distraction. If driver monitoring systems can understand feelings, driving could become even safer.

As discussed, emotional states such as stress and anxiety can also endanger the driver's safety. High stress may undermine self-confidence, narrow attention, and impair concentration.

This often leads to aggressive driving and makes the driver pay less attention to the surrounding traffic. Even a small disturbance can have severe consequences for the performance of a complex cognitive task such as driving.

To reduce such accidents, it's important to detect these emotions and take steps to calm the driver. In a real-world scenario, a non-intrusive system that can run in real time is needed.

A driver monitoring system can be used to detect drowsiness and distraction. It typically includes a CCD camera that tracks the driver's face, aided by infrared LED illumination.

The camera sensors can detect whether the driver is tired or stressed and can trigger alerts or, in extreme conditions, hand over control of the vehicle. They can be coupled with radar-based non-contact sensing that tracks breathing rate for greater efficacy.
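
As a hedged sketch of how such a system might fuse head pose and eye-closure signals into alert decisions, consider the following; the thresholds and escalation policy are illustrative assumptions rather than a production specification.

```python
# Illustrative sketch of fusing head pose and eye-closure signals into alerts.
# Thresholds and escalation policy are assumed values for demonstration.
from dataclasses import dataclass

@dataclass
class DriverState:
    yaw_deg: float       # head rotation away from the road, from the pose estimator
    pitch_deg: float     # nodding angle
    perclos: float       # fraction of recent frames with eyes closed (0..1)

def assess(state: DriverState) -> str:
    """Map the current driver state to an action level."""
    looking_away = abs(state.yaw_deg) > 35 or state.pitch_deg < -25
    drowsy = state.perclos > 0.4
    if drowsy and looking_away:
        return "escalate"   # e.g. hand over to an ADAS intervention
    if drowsy or looking_away:
        return "alert"      # audible or haptic warning
    return "ok"

# Usage with example readings: head turned well away from the road.
print(assess(DriverState(yaw_deg=50.0, pitch_deg=-5.0, perclos=0.1)))  # -> "alert"
```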

It's a big emerging market. The need for driver monitoring is recognized not only in the passenger vehicle segment but also in the commercial vehicle segment, where the challenges are even greater.

Since more than 90% of accidents involve human error, systems like these can reassure a stressed driver and help curb unsafe driving habits. Rising demand for road-safety features tied to driver vigilance, together with strict government regulations around DMS, will boost the market in the future.
