
Virtual Big Heads in Extended Reality: Estimation of Ideal Head Scales and Perceptual Thresholds for Comfort and Facial Cues

Published: 11 January 2023

Abstract

Extended reality (XR) technologies, such as virtual reality (VR) and augmented reality (AR), provide users, their avatars, and embodied agents with a shared platform for collaboration in a spatial context. Traditional face-to-face communication is limited by proximity: another person’s non-verbal embodied cues become more difficult to perceive the farther away one is from that person. Researchers and practitioners have therefore started to explore ways to accentuate or amplify such embodied cues and signals with XR technologies to counteract the effects of distance. In this article, we describe and evaluate the Big Head technique, in which a human’s head in VR/AR is scaled up relative to its distance from the observer as a mechanism for enhancing the visibility of non-verbal facial cues, such as facial expressions or eye gaze. To better understand and explore this technique, we present two complementary human-subject experiments. In the first experiment, we conducted a VR study with a head-mounted display to understand the impact of increased or decreased head scales on participants’ ability to perceive facial expressions, as well as on their sense of comfort and feeling of “uncanniness,” over distances of up to 10 m. We explored two different scaling methods and compared perceptual thresholds and user preferences. The second experiment was performed in an outdoor AR environment with an optical see-through head-mounted display. Participants were asked to estimate facial expressions and eye gaze, and to identify a virtual human, over large distances of 30, 60, and 90 m. In both experiments, our results show significant differences in the minimum, maximum, and ideal head scales for different distances and for tasks related to perceiving faces, facial expressions, and eye gaze; we also found that participants were more comfortable with slightly bigger heads at larger distances. We discuss our findings with respect to the technologies used, and we present implications and guidelines for practical applications that aim to leverage XR-enhanced facial cues.
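To make the geometry behind the technique concrete, the sketch below shows one plausible form of distance-based head scaling: the head is scaled in proportion to its distance from the observer, which keeps its visual angle constant at the value it would have at a reference distance, and the scale is clamped between minimum and maximum thresholds of the kind the experiments estimate. This is a minimal illustration, not the authors' implementation; the linear scaling law, the reference distance, and the default clamp values are assumptions made here for the example.

    import math

    def visual_angle_deg(size_m, distance_m):
        # Visual angle (degrees) subtended by an object of physical
        # size `size_m` viewed from `distance_m` away.
        return math.degrees(2.0 * math.atan(size_m / (2.0 * distance_m)))

    def big_head_scale(distance_m, reference_distance_m=1.0,
                       min_scale=1.0, max_scale=10.0):
        # Distance-proportional head scale, clamped to comfort thresholds.
        # Scaling by s = d / d_ref makes the head subtend the same visual
        # angle it would subtend unscaled at d_ref. The linear law and the
        # clamp defaults are illustrative assumptions, not the thresholds
        # estimated in the paper.
        s = distance_m / reference_distance_m
        return max(min_scale, min(s, max_scale))

    HEAD_HEIGHT_M = 0.25  # rough adult head height; an assumption
    for d in (1.0, 10.0, 30.0, 60.0, 90.0):
        s = big_head_scale(d)
        print(f"{d:5.0f} m: unscaled {visual_angle_deg(HEAD_HEIGHT_M, d):6.2f} deg, "
              f"scale {s:4.1f}x -> {visual_angle_deg(s * HEAD_HEIGHT_M, d):5.2f} deg")

Running the sketch illustrates why amplification matters at the distances studied: an unscaled head of about 0.25 m subtends roughly 0.16 degrees at 90 m, likely far too small to read a facial expression, whereas clamping at a 10x scale restores about 1.6 degrees.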



• Published in

  ACM Transactions on Applied Perception, Volume 20, Issue 1 (January 2023), 122 pages
  ISSN: 1544-3558
  EISSN: 1544-3965
  DOI: 10.1145/3584022

          ACM acknowledges that this contribution was authored or co-authored by an employee, contractor, or affiliate of the United States government. As such, the United States government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for government purposes only.

          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Publication History

          • Published: 11 January 2023
          • Online AM: 10 November 2022
          • Accepted: 2 November 2022
          • Revised: 12 September 2022
          • Received: 5 May 2022

          Qualifiers

          • research-article
          • Refereed
