DOI: 10.1145/3544793.3560325
Demonstration

OCOsense Glasses for Facial Expressions Recognition and Contextual Affective Computing in Real World and Augmented Reality

Published: 24 April 2023

Abstract

This paper presents the novel OCOSense™ smart glasses with integrated sensors: primarily non-contact optomyographic (OMG) OCO™ sensors, a 9-axis inertial measurement unit (IMU), and an altimeter. The glasses connect to a smartphone application that enables continuous, real-time measurement of facial muscle activation and head movement, allowing the user's facial expressions and activities to be detected in real time. We will demonstrate how the system is used in practice: a participant will wear the OCOSense™ glasses, which will stream sensor data to a tablet, where real-time visualizations of the sensor data and their interpretation, such as facial expressions (smile, frown, surprise) and activities, will be presented. We believe the OCOSense™ glasses are the next big thing in wearables, enabling a better understanding of the user's context, activities, emotional state, and more, which can be readily coupled with Augmented and Extended Reality environments.
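The stream-featurize-classify loop the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the OCOSense SDK: the channel layout, window size, feature set, and threshold rules are all assumptions standing in for the actual trained model running on the tablet.

```python
# Hypothetical sketch of the demo pipeline: the glasses stream OMG and IMU
# frames to the tablet, which featurizes a sliding window of frames and
# classifies the facial expression. All names and thresholds are illustrative.
from dataclasses import dataclass
from statistics import mean, stdev
from typing import Dict, List

@dataclass
class Frame:
    omg: List[float]    # non-contact OMG channels (facial muscle movement)
    accel: List[float]  # 3-axis accelerometer from the 9-axis IMU

def featurize(window: List[Frame]) -> Dict[str, float]:
    """Per-channel mean/std features over one sliding window of frames."""
    feats: Dict[str, float] = {}
    for ch in range(len(window[0].omg)):
        vals = [f.omg[ch] for f in window]
        feats[f"omg{ch}_mean"] = mean(vals)
        feats[f"omg{ch}_std"] = stdev(vals)
    return feats

def classify(feats: Dict[str, float]) -> str:
    """Toy threshold rule standing in for the trained expression model."""
    if feats["omg0_mean"] > 0.5:   # assumed cheek-region channel -> smile
        return "smile"
    if feats["omg1_mean"] > 0.5:   # assumed brow-region channel -> frown
        return "frown"
    return "neutral"

# One simulated 10-frame window with strong cheek-channel activation:
window = [Frame(omg=[0.8, 0.1], accel=[0.0, 0.0, 9.8]) for _ in range(10)]
print(classify(featurize(window)))  # smile
```

In the real system the classifier would be a model trained on labeled OMG/IMU recordings rather than hand-set thresholds; the sketch only shows where the windowing, feature extraction, and classification stages sit in the data path from glasses to tablet.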


Cited By

  • Efficient Real-time On-the-edge Facial Expression Recognition using Optomyography Smart Glasses. 2024 International Conference on Intelligent Environments (IE), pp. 49–55. DOI: 10.1109/IE61493.2024.10599896. Published: 17 June 2024.

Published In

UbiComp/ISWC '22 Adjunct: Adjunct Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2022 ACM International Symposium on Wearable Computers
September 2022
538 pages
ISBN:9781450394239
DOI:10.1145/3544793
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Activity
  2. Affective Computing
  3. Emotion Recognition
  4. Facial Expressions
  5. Glasses
  6. IMU
  7. Machine Learning
  8. OMG
  9. Valence

Qualifiers

  • Demonstration
  • Research
  • Refereed limited

Conference

UbiComp/ISWC '22

Acceptance Rates

Overall Acceptance Rate 764 of 2,912 submissions, 26%

Article Metrics

  • Downloads (Last 12 months)40
  • Downloads (Last 6 weeks)5
Reflects downloads up to 14 Jan 2025

