Location

Manchester Village, Vermont

Date

June 29, 2017

Session

Session 6 — Hybrid Presentations

Abstract

Driving is performed while the driver processes various internal cues and external cues from the driving environment (e.g., subtle vibrations, lateral and longitudinal acceleration). The present study was conducted to identify how much external cues affect drivers' gaze behavior in an automated driving environment. Fifteen participants drove a commercially available vehicle with longitudinal and lateral automation on an oval test track. Participants were asked to drive the vehicle with and without automation, with or without a side task, and with their hands either on or off the wheel. Drivers' gaze behavior, hands-on-wheel status, and driving conditions were annotated from video data. The results showed that during automated driving with side-task performance, eyes-on-road time was significantly greater after entering a curve than before it, and also increased as a result of changes in speed. These differences were not observed in automated driving when no side task was performed, and gaze behavior was more sensitive to these external cues than to the hands-on versus hands-off-wheel conditions. The results also suggest that drivers may process nonvisual information (e.g., vestibular information produced by changes in lateral and longitudinal vehicle acceleration) prior to or even during the implementation of a visual resource allocation strategy.
The present study suggests that driver awareness can be aided without requiring the driver to grab the steering wheel.

Rights

Copyright © 2017 the author(s)

DC Citation

Proceedings of the Ninth International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, June 26-29, 2017, Manchester Village, Vermont. Iowa City, IA: Public Policy Center, University of Iowa, 2017: 368-367.


Title

How Do Changes in the External Environment Affect Driving Engagement in Automated Driving? – An Exploratory Study

