DOI: 10.1145/3489849.3489867

Non-isomorphic Interaction Techniques for Controlling Avatar Facial Expressions in VR

Published: 08 December 2021

Abstract

The control of an avatar's facial expressions in virtual reality is mainly based on the automated recognition and transposition of the user's own facial expressions. These isomorphic techniques are limited to what users can convey with their own face and suffer from recognition errors. To overcome these limitations, non-isomorphic techniques instead rely on input devices to control the avatar's facial expressions. Such techniques must be designed so that users can quickly and easily select and control an expression without disrupting a primary task such as talking. We present the design of a set of new non-isomorphic interaction techniques for controlling an avatar's facial expressions in VR using a standard VR controller. These techniques were evaluated in two controlled experiments, whose results informed the design of a combined technique drawing on the strengths of each approach. This combined technique was then evaluated in a final ecological study, which showed that it can be used in contexts such as social applications.
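As a concrete illustration of what such a controller-based, non-isomorphic technique might look like, the sketch below maps thumbstick deflection to an expression on a radial menu and stick magnitude to expression intensity. This is a minimal hypothetical sketch, not the paper's actual techniques: the expression set, the deadzone value, and the function name are all assumptions made for illustration.

```python
import math

# Hypothetical expression wheel: each emotion occupies one angular
# sector of the thumbstick's circular range (labels are assumptions,
# not the paper's actual expression set).
EXPRESSIONS = ["joy", "sadness", "anger", "fear", "surprise", "disgust"]
SECTOR = 2 * math.pi / len(EXPRESSIONS)
DEADZONE = 0.2  # small deflections near the center keep a neutral face

def select_expression(stick_x: float, stick_y: float):
    """Map a thumbstick deflection (each axis in [-1, 1]) to an
    (expression, intensity) pair, with intensity in [0, 1]."""
    magnitude = min(math.hypot(stick_x, stick_y), 1.0)
    if magnitude < DEADZONE:
        return None, 0.0  # stick near center: neutral expression
    # atan2 gives the stick angle; shifting by half a sector centers
    # each expression on its direction instead of starting it there.
    angle = math.atan2(stick_y, stick_x) % (2 * math.pi)
    index = int((angle + SECTOR / 2) // SECTOR) % len(EXPRESSIONS)
    # Rescale so intensity spans the full [0, 1] outside the deadzone.
    intensity = (magnitude - DEADZONE) / (1.0 - DEADZONE)
    return EXPRESSIONS[index], intensity

# Stick pushed up-right at roughly 78% deflection; 'weight' would then
# drive the avatar's blendshape weight for the selected expression.
expr, weight = select_expression(0.55, 0.55)
print(expr, round(weight, 2))
```

In a real implementation, the returned intensity would be fed into the avatar's facial blendshape weights each frame, and a confirmation step (e.g., a trigger press or dwell time) could be added to avoid accidental expression changes while the user gestures or talks.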




Published In

VRST '21: Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology
December 2021
563 pages
ISBN: 9781450390927
DOI: 10.1145/3489849
Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Avatar
  2. Emoji
  3. Emoticons
  4. Emotion
  5. Facial expression
  6. VR

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRST '21

Acceptance Rates

Overall Acceptance Rate: 66 of 254 submissions, 26%

