Exploring how drivers perceive spatial earcons in automated vehicles

Research output: Contribution to journal › Article

Automated vehicles seek to relieve the human driver of primary driving tasks, but this substantially diminishes the connection between driver and vehicle compared to manual operation. At present, automated vehicles lack any form of continual, appropriate feedback to re-establish this connection and offer a feeling of control. We suggest that auditory feedback can be used to support the driver in this context. We first undertook a preliminary field study exploring how drivers respond to existing auditory feedback in manual vehicles. We then designed a set of abstract, synthesised sounds presented spatially around the driver, known as Spatial Earcons, each representing a different primary driving sound, e.g. acceleration. To evaluate their effectiveness, we undertook a driving simulator study in an outdoor setting using a real vehicle. Spatial Earcons performed as well as Existing Vehicle Sounds during both automated and manual driving scenarios. Subjective responses suggested that Spatial Earcons produced an engaging driving experience. This paper argues that entirely new synthesised primary driving sounds, such as Spatial Earcons, can be designed for automated vehicles to replace Existing Vehicle Sounds. This creates new possibilities for presenting primary driving information in automated vehicles using auditory feedback, in order to re-establish a connection between driver and vehicle.
Original language: English
Number of pages: 24
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)
State: Accepted/In press - 14 Jul 2017

Research areas

  • human-centered computing, human-computer interaction, automated vehicles, auditory displays, existing vehicle sounds, driving simulators
