
In-vehicle air gesture design: impacts of display modality and control orientation

Original Paper · Journal on Multimodal User Interfaces

Abstract

The number of crashes caused by visual distraction highlights the need for non-visual displays in in-vehicle information systems (IVIS). Audio-supported air gesture controls can address this problem. Twenty-four young drivers participated in our driving simulator experiment using six gesture prototypes—3 display modalities (visual-only, visual/auditory, and auditory-only) × 2 control orientations (horizontal and vertical). Measures included lane departures, eye glance behavior, secondary task performance, and driver workload. The auditory-only displays yielded significantly fewer lane departures and lower perceived workload. A tradeoff between eyes-on-road time and secondary task completion time was also observed for the auditory-only display, making it the safest, but slowest, of the prototypes. Vertical controls (direct manipulation) produced significantly lower workload than horizontal controls (mouse metaphor), but did not differ on performance measures. Results are discussed in the context of multiple resource theory, along with design guidelines for future implementation.
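The six prototypes follow from fully crossing the two factors named in the abstract. A minimal sketch (the condition labels are taken from the abstract; the variable names are illustrative only) enumerates the 3 × 2 design:

```python
from itertools import product

# Factors as described in the abstract: 3 display modalities x 2 control orientations.
MODALITIES = ("visual-only", "visual/auditory", "auditory-only")
ORIENTATIONS = ("horizontal", "vertical")

# Full factorial crossing yields the six gesture prototypes tested.
prototypes = [f"{m} / {o}" for m, o in product(MODALITIES, ORIENTATIONS)]
print(len(prototypes))  # 6
```

In a within-subjects design like this one, each of the twenty-four participants would experience all six conditions.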



Author information

Correspondence to Myounghoon Jeon.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Sterkenburg, J., Landry, S., FakhrHosseini, S. et al. In-vehicle air gesture design: impacts of display modality and control orientation. J Multimodal User Interfaces 17, 215–230 (2023). https://doi.org/10.1007/s12193-023-00415-8


Keywords

Navigation