Date of Degree
PhD (Doctor of Philosophy)
John D. Lee
Driver distraction contributes to approximately 43% of motor-vehicle crashes and 27% of near-crashes. Rapidly developing in-vehicle technology and electronic devices place additional demands on drivers, which can lead to distraction and a diminished capacity to perform driving tasks, threatening safe driving. Technology that detects distraction and mitigates it by alerting drivers could play a central role in maintaining safety. Correctly identifying driver distraction in real time is a critical challenge in developing distraction mitigation systems, and this capability has not yet been well developed. Moreover, the greatest benefit may come from detecting distraction in real time, in advance of dangerous breakdowns in driving performance.
Based on driver performance, two types of distraction are identified: visual and cognitive. These types have very different effects on visual behavior and driving performance and therefore require different detection algorithms. Existing detection algorithms typically rely on either eye measures or driver performance measures, because the effect of distraction on the coordination between these measures has not been established. Combining eye glance and vehicle data could enhance the ability of algorithms to detect and differentiate visual and cognitive distraction.
The goal of this research is to examine whether poor coordination between visual behavior and vehicle control can identify diminished attention to driving in advance of breakdowns in lane keeping. The primary hypothesis of this dissertation is that detecting changes in the eye-steering relationship caused by distraction could provide a prospective indication of changes in vehicle state. Three specific aims are pursued to test this hypothesis. The first aim examines the effect of distracting activity on eye and steering movements to assess the degree to which the correlation parameters are indicative of distraction. The second aim applies a control-theoretic system identification approach to the eye movement and steering data to distinguish between distracted and non-distracted conditions. The third aim examines whether changes in eye-steering coordination associated with distraction provide a prospective indication of breakdowns in driver performance, i.e., lane departures.
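The system identification approach in the second aim can be illustrated with a minimal sketch. The example below fits an ARX (autoregressive with exogenous input) model that predicts steering angle from past steering and eye-position samples via least squares; the function name, the model orders, and the use of numpy's `lstsq` are illustrative assumptions, not the dissertation's actual implementation.

```python
import numpy as np

def fit_arx(eye, steer, na=2, nb=2, nk=1):
    """Illustrative least-squares fit of an ARX model:
        steer[t] = sum_{i=1..na} a_i * steer[t-i]
                 + sum_{j=0..nb-1} b_j * eye[t-nk-j] + e[t]
    `eye` and `steer` are equal-length 1-D arrays; `nk` is the input delay.
    Returns the coefficient vector [a_1..a_na, b_0..b_{nb-1}].
    """
    start = max(na, nk + nb - 1)
    rows = []
    for t in range(start, len(steer)):
        past_steer = [steer[t - i] for i in range(1, na + 1)]
        past_eye = [eye[t - nk - j] for j in range(nb)]
        rows.append(past_steer + past_eye)
    X = np.array(rows)
    y = steer[start:]
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta
```

Comparing the estimated model order and coefficients across drivers and conditions is one way such a fitted model could expose the structural differences in driving behavior that the abstract describes.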
Together, the three aims show that a combination of visual and steering behavior, i.e., an eye-steering model, can differentiate between non-distracted and distracted states. The model is sensitive to distraction associated with off-road glances. The models derived for different drivers have similar structure and fit data from other drivers reasonably well. In addition, differences in model order and model coefficients reflect the variability in driving behavior: some drivers generate more complex behavior than others. As expected, eye-steering correlation on straight roads is not as strong as on curved roads. However, eye-steering correlation, measured through the correlation coefficient and the time delay between the two movements, is sensitive to different types of distraction. Time delay mediates changes in lane position, and the eye-steering system predicts breakdowns in lane keeping. This dissertation contributes to the development of a distraction detection system that integrates visual and steering behavior. More broadly, these results suggest that integrating eye and steering data could help detect and mitigate impairments beyond distraction, such as those associated with alcohol, fatigue, and aging.
distraction detection, driver distraction, eye movement, eye-steering model, lane departures, system identification
Copyright 2010 Lora Yekhshatyan