Lifelog

Evolution of lifelogging apparatus, including the wearable computer, camera, and viewfinder with wireless Internet connection. Early apparatus used separate transmitting and receiving antennas. Later apparatus evolved toward the appearance of ordinary eyeglasses in the late 1980s and early 1990s.

Evolution of the lifelogging lanyard camera. From left to right: Mann (1998); Microsoft (2004); Mann, Fung, Lo (2006); Memoto (2013)

A lifelog is a personal record of one's daily life in a varying amount of detail, for a variety of purposes. The record contains a comprehensive dataset of a human's activities. The data could be used to increase knowledge about how people live their lives. [2] In recent years, some lifelog data has been automatically captured by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers (or sometimes lifebloggers or lifegloggers).

The sub-field of computer vision that processes and analyses visual data captured by a wearable camera is called "egocentric vision" or egography. [3]

Examples

One well-known lifelogger was Robert Shields, who manually recorded 25 years of his life, from 1972 to 1997, at five-minute intervals. The record resulted in a 37-million-word diary, thought to be the longest ever written. [4]

Steve Mann was the first person to capture continuous physiological data along with live first-person video from a wearable camera. [5] Starting in 1994, Mann continuously transmitted his life, 24 hours a day, 7 days a week. [6] Using a wearable camera and wearable display, he invited others to see what he was looking at and to send him live feeds or messages in real time. [7] In 1998 Mann started a community of lifeloggers (also known as lifebloggers or lifegloggers), which has grown to more than 20,000 members. Throughout the 1990s Mann presented this work to the U.S. Army, including two visits to the Natick Army Research Labs. [8]

In 1996, Jennifer Ringley started JenniCam, broadcasting photographs from a webcam in her college bedroom every fifteen seconds; the site was turned off in 2003. [9]

"We Live In Public" was a 24/7 Internet conceptual art experiment created by Josh Harris in December 1999. With a format similar to TV's Big Brother , Harris placed tapped telephones, microphones and 32 robotic cameras in the home he shared with his girlfriend, Tanya Corrin. Viewers talked to Harris and Corrin in the site's chatroom. [10] Harris recently launched the online live video platform, Operator 11. [11]

In 2001, Kiyoharu Aizawa discussed the problem of handling the huge amount of video continuously captured over one's life and presented an automatic summarization method. [12]

The lifelog DotComGuy ran throughout 2000, when Mitch Maddox lived the entire year without leaving his house. [13] After Joi Ito's discussion of moblogging, which involves web publishing from a mobile device, [14] came Gordon Bell's MyLifeBits (2004), an experiment in digital storage of a person's lifetime, including full-text search, text/audio annotations, and hyperlinks.[citation needed]

In 2003, a project called LifeLog was started at the Defense Advanced Research Projects Agency (DARPA), under the supervision of Douglas Gage. The project aimed to combine several technologies to record a person's life activities and compile them into a life diary. Shortly afterwards, lifelogging was identified as a technology and cultural practice that could be exploited by governments, businesses or militaries for surveillance. [15] The DARPA project was cancelled by 2004, but it helped to popularize the idea and the use of the term lifelogging in everyday discourse, and it contributed to the growing acceptance of using technology for augmented memory. [16]

In 2003, Kiyoharu Aizawa introduced a context-based video retrieval system that was designed to handle data continuously captured from various sources, including a wearable camera, a microphone, and multiple sensors such as a GPS receiver, an acceleration sensor, a gyro sensor, and a brain-wave analyzer. By extracting contextual information from these inputs, the system can retrieve specific scenes captured by the wearable camera. [17]
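
As a rough illustration of this kind of context-based retrieval, the sketch below stores each video segment together with the sensor context captured alongside it and answers queries by matching on that context. It is a minimal example only; the segment fields, matching rules and file names are assumptions for demonstration, not the design of Aizawa's system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Segment:
    """One wearable-camera clip plus the sensor context recorded with it."""
    video_file: str
    start: datetime
    lat: float        # GPS latitude at capture time
    lon: float        # GPS longitude at capture time
    activity: str     # label derived from acceleration/gyro data, e.g. "walking"

def retrieve(segments, activity=None, near=None, radius_deg=0.01):
    """Return the segments whose recorded context matches the query."""
    matches = []
    for seg in segments:
        if activity is not None and seg.activity != activity:
            continue
        if near is not None and (abs(seg.lat - near[0]) > radius_deg
                                 or abs(seg.lon - near[1]) > radius_deg):
            continue
        matches.append(seg)
    return matches

# Example: find clips recorded while sitting near a given location.
log = [
    Segment("clip_0001.mp4", datetime(2003, 5, 1, 9, 15), 35.71, 139.76, "walking"),
    Segment("clip_0002.mp4", datetime(2003, 5, 1, 12, 30), 35.71, 139.76, "sitting"),
]
print(retrieve(log, activity="sitting", near=(35.71, 139.76)))
```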

In 2004, conceptual media artist Alberto Frigo began tracking everything his right hand (his dominant hand) used, [18] and later added further tracking and documentation projects. His tracking is done manually rather than with automated technology.

In 2004, Arin Crumley and Susan Buice met online and began a relationship. They decided to forgo verbal communication during their initial courtship and instead spoke to each other via written notes, sketches, video clips, and Myspace. They went on to create an autobiographical film about their experience, Four Eyed Monsters, which was part documentary, part narrative, with a few scripted elements added. To promote the film, they also produced a two-season podcast about its making.[citation needed]

In 2007 Justin Kan began streaming continuous live video and audio from a webcam attached to a cap, starting at midnight on March 19, 2007. He created a website, Justin.tv, for the purpose. [19] He described this practice as "lifecasting".

In recent years, with the advent of smartphones and similar devices, lifelogging has become much more accessible. For instance, UbiqLog [20] and Experience Explorer [21] employ mobile sensing to perform lifelogging, while other lifelogging devices, such as the Autographer, use a combination of visual sensors and GPS tracking to simultaneously document one's location and what one can see. [22] Lifelogging was popularized by the mobile app Foursquare, which had users "check in" as a way of sharing and saving their location; this feature later evolved into the lifelogging app Swarm.[citation needed]

Life caching

Life caching refers to the social act of storing and sharing one's life events in an open, public forum such as Facebook. [23] [24] [25] [26] [27] Modern life caching is considered a form of social networking and typically takes place on the internet. The term was introduced in 2005 by trendwatching.com, [28] in a report predicting that this would soon become a trend, given the availability of the relevant technology. However, lifelog information is privacy-sensitive, and sharing it therefore carries risks. [29]

Mobile and wearable apps

To assist in their tracking efforts, some lifeloggers use mobile devices and apps. Using the GPS and motion processors of digital devices enables lifelogging apps to easily record metadata related to daily activities. Myriad lifelogging apps are available in the App Store (iOS), Google Play and other app distribution platforms; commonly cited apps include Instant, [30] Reporter, [31] Journey, [32] Path, [33] Moves, [34] HeyDay, [35] and insight for Wear (a smartwatch app). [36]
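
As a simple illustration of the kind of record such apps keep, the sketch below appends one timestamped metadata entry per sensor sample to a local journal file. The field names and file format are assumptions for illustration only, not the schema used by any of the apps named above.

```python
import json
import time

def log_sample(path, lat, lon, steps, activity):
    """Append one timestamped metadata record to a local journal file."""
    entry = {
        "timestamp": time.time(),  # seconds since the Unix epoch
        "lat": lat,                # GPS latitude in decimal degrees
        "lon": lon,                # GPS longitude in decimal degrees
        "steps": steps,            # step count reported by the motion processor
        "activity": activity,      # e.g. "walking", "cycling", "stationary"
    }
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Example: record a single sample.
log_sample("lifelog.jsonl", 43.6629, -79.3957, 5412, "walking")
```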

Sony's Xperia devices also include a native mobile application called Lifelog. [37] The app works standalone but is enriched when used with Sony SmartBand wearables. [38]

Swarm is a lifelogging app that motivates users to check in, recording every place they have visited while encouraging them to visit new places.[citation needed]

Related Research Articles

<span class="mw-page-title-main">Wearable computer</span> Small computing device worn on the body

A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.

<span class="mw-page-title-main">Augmented reality</span> View of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated 3D content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive or destructive. As such, it is one of the key technologies in the reality-virtuality continuum.

<span class="mw-page-title-main">Steve Mann (inventor)</span> Canadian wearable tech engineer (born 1962)

William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, particularly wearable computing, and high-dynamic-range imaging. Mann has sometimes been labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada, and currently lives in Toronto, Canada, with his wife and two children. In 2023, Mann unsuccessfully ran for mayor of Toronto.

<span class="mw-page-title-main">Webcam</span> Video camera connected to a computer or network

A webcam is a video camera which is designed to record or stream to a computer or computer network. They are primarily used in video telephony, live streaming and social media, and security. Webcams can be built-in computer hardware or peripheral devices, and are commonly connected to a device using USB or wireless protocols.

<span class="mw-page-title-main">Sousveillance</span> Recording of an activity by a participant

Sousveillance is the recording of an activity by a member of the public, rather than a person or organisation in authority, typically by way of small wearable or portable personal technologies. The term, coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the means of observation down to human level, either physically or hierarchically.

<span class="mw-page-title-main">EyeTap</span> Wearable computer worn in front of the eye

An EyeTap is a concept for a wearable computing device worn in front of the eye that acts as a camera to record the scene available to the eye, as well as a display to superimpose computer-generated imagery on that scene. This structure allows the user's eye to operate as both a monitor and a camera: the EyeTap takes in the world around the user and augments the image the user sees, overlaying computer-generated data on top of the scene the user would normally perceive.

<span class="mw-page-title-main">Computer-mediated reality</span> Ability to manipulate ones perception of reality through the use of a computer

Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.

<span class="mw-page-title-main">Mobile device</span> Small, hand-held computing device

A mobile device or handheld computer is a computer small enough to hold and operate in hand. Mobile devices are typically battery-powered and possess a flat-panel display and one or more built-in input devices, such as a touchscreen or keypad. Modern mobile devices often emphasize wireless networking, to both the Internet and to other devices in their vicinity, such as headsets or in-car entertainment systems, via Wi-Fi, Bluetooth, cellular networks, or near-field communication.

<span class="mw-page-title-main">Gesture recognition</span> Topic in computer science and language technology

Gesture recognition is an area of research and development in computer science and language technology concerned with the recognition and interpretation of human gestures. A subdiscipline of computer vision, it employs mathematical algorithms to interpret gestures.

MyLifeBits is a life-logging experiment begun in 2001. It is a Microsoft Research project inspired by Vannevar Bush's hypothetical Memex computer system. The project includes full-text search, text and audio annotations, and hyperlinks. The "experimental subject" of the project is computer scientist Gordon Bell, and the project will try to collect a lifetime of storage on and about Bell. Jim Gemmell of Microsoft Research and Roger Lueder were the architects and creators of the system and its software.

<span class="mw-page-title-main">Microsoft SenseCam</span>

Microsoft's SenseCam is a lifelogging camera with a fisheye lens and trigger sensors such as accelerometers, heat sensing, and audio, invented by Lyndsay Williams, with a patent granted in 2009. Usually worn around the neck, the SenseCam is used for the MyLifeBits project, a lifetime storage database. Early developers were James Srinivasan and Trevor Taylor.

Mobile blogging is a method of publishing to a website or blog from a mobile phone or other handheld device. A moblog helps habitual bloggers to post write-ups directly from their phones even when on the move. Mobile blogging has been made possible by technological convergence, as bloggers have been able to write, record and upload different media all from a single, mobile device. At the height of its growth in 2006, mobile blogging experienced 70,000 blog creations a day and 29,100 blog posts an hour. Between 2006 and 2010, blogging among teens declined from 28% to 14%, while blogging among adults over 30 increased from 7% to 11%. However, the growing number of multi-platform blogging apps has increased the popularity of mobile blogging in recent years, creating a new market that many celebrities, regular bloggers and specialists use to widen their social reach.

<span class="mw-page-title-main">Wearable technology</span> Clothing and accessories incorporating computer and advanced electronic technologies

Wearable technology is any technology that is designed to be used while worn. Common types of wearable technology include smartwatches and smartglasses. Wearable electronic devices are often close to or on the surface of the skin, where they detect, analyze, and transmit information such as vital signs or ambient data, and in some cases provide immediate biofeedback to the wearer.

<span class="mw-page-title-main">Smartwatch</span> Wearable computer in the form of a watch

A smartwatch is a portable wearable computer that resembles a wristwatch. Most modern smartwatches are operated via a touchscreen, and rely on mobile apps that run on a connected device in order to provide core functions.

The Narrative Clip is a small wearable lifelogging camera. Its development was begun in 2012 by the Swedish company Memoto after a successful crowdfunding campaign on Kickstarter. It can automatically take a picture every 30 seconds while being worn throughout the day, a practice known as lifelogging. At the end of the day the Clip uploads the photos and videos it has taken to the vendor's cloud service, where they are processed and organized into collections called Moments, available to the user through a web client or mobile apps. The Moments or individual photos and videos can be shared through other apps or through the company's own social network.

<span class="mw-page-title-main">Activity tracker</span> Device or application for monitoring fitness

An activity tracker is an electronic device or app that measures and collects data about an individual's movements and physical responses, towards the goal of monitoring and improving their health, fitness or psychological wellness over time.

<span class="mw-page-title-main">Autographer</span> Camera model

Autographer is a hands-free, wearable digital camera developed by OMG Life. The camera uses five different sensors to determine when to automatically take photos and can take up to 2,000 pictures a day. It was released in July 2013 and is used primarily for lifelogging, entertainment and travel. As of 16 October 2016, OMG Life, the company behind Autographer, had discontinued operations.

<span class="mw-page-title-main">Cathal Gurrin</span> Irish academic and "lifelogger"

Cathal Gurrin is an Irish professor and lifelogger. He is the Head of the Adapt Centre at Dublin City University, a Funded Investigator of the Insight Centre, and the director of the Human Media Archives research group. He was previously the deputy head of the School of Computing.

<span class="mw-page-title-main">Smartglasses</span> Wearable computers glasses

Smartglasses or smart glasses are eye- or head-worn wearable computers. Many smartglasses include displays that add information alongside or on top of what the wearer sees. Alternatively, smartglasses are sometimes defined as glasses that can change their optical properties, such as smart sunglasses programmed to change tint by electronic means, or as glasses that include headphone functionality.

Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.

References

  1. IEEE Computer, pp. 25-32, Vol. 30, Iss. 2 Feb. 1997
  2. Gurrin, Cathal; Smeaton, Alan F.; Doherty, Aiden R. (2014). "LifeLogging: Personal Big Data" (PDF). Foundations and Trends in Information Retrieval. 8 (1): 1–125. doi:10.1561/1500000033. ISSN 1554-0669.
  3. An Introduction to the 3rd Workshop on Egocentric (First-person) Vision, Steve Mann, Kris M. Kitani, Yong Jae Lee, M. S. Ryoo, and Alireza Fathi, IEEE Conference on Computer Vision and Pattern Recognition Workshops 2160-7508/14, 2014, IEEE DOI 10.1109/CVPRW.2014.1338272014
  4. Martin, Douglas (29 October 2007). "Robert Shields, Wordy Diarist, Dies at 89". The New York Times. Retrieved 2 September 2019.
  5. Wearable Computing, a First Step Toward Personal Imaging, IEEE Computer, Vol. 30, No. 2, February 1997, 25-32.
  6. "Still Cool Archive: February 1995". Cool Site of the Day. Archived from the original on 2013-09-25. Retrieved 2013-09-26.
  7. Steve Mann (1995). "Wearable Wireless Webcam and Telemetry". WearTech. Retrieved 13 October 2019.
  8. Sensate Liner Quarterly Review, April 23-24th, 1997, Natick Army Labs, Dr. Eric J. Lind, Naval Command Control and Ocean Surveillance Center, Research Development Test and Evaluation Division, Navigation and Applied Sciences Department, Environmental Sciences Division, Materials, Sensors and Systems Branch Code 364
  9. "Jennicam: The first woman to stream her life on the internet". BBC News Magazine. 18 October 2016. Retrieved 13 October 2019.
  10. Charles Platt (November 2000). "Steaming video". Wired. Archived from the original on 11 January 2007. Retrieved 13 October 2019.
  11. Erica Naone (10 August 2007). "The Rise of the Net Jockey". Technology Review.
  12. Aizawa, K.; Ishijima, K.; Shiina, M. (2001). "Summarizing wearable video". Proceedings 2001 International Conference on Image Processing (Cat. No.01CH37205). Vol. 2. pp. 398–401. doi:10.1109/ICIP.2001.958135. ISBN 0-7803-6725-1.
  13. Couldry, Nick; McCarthy, Anna (2004-01-07). MediaSpace: place, scale, and culture in a media age. Psychology Press. p. 200. ISBN 978-0-415-29175-0. Retrieved 31 March 2011.
  14. "Joi Ito's Moblogging, Blogmapping and Moblogmapping related resources as of 6/10/2003". Joi Ito's Radio Outline. 2004. Archived from the original on 31 August 2004. Retrieved 13 October 2019.
  15. Pedersen, Isabel (2005). "A Semiotics of Human Actions for Wearable Augmented Reality Interfaces". Semiotica. 155 (1): 183–201. doi:10.1515/semi.2005.2005.155.1part4.183 (inactive 2024-08-17).
  16. Pedersen, Isabel (2013). Ready to Wear: A Rhetoric of Wearable Computers and Reality-Shifting Media. Anderson: Parlor Press. pp. 109–112.
  17. Hori, Tetsuro; Aizawa, Kiyoharu (2003). "Context-based video retrieval system for the life-log applications". Proceedings of the 5th ACM SIGMM international workshop on Multimedia information retrieval - MIR '03. p. 31. doi:10.1145/973264.973270. ISBN   1-58113-778-8.
  18. Bruce Sterling (April 9, 2006). "Alberto Frigo". Wired Magazine.
  19. "A Conversation with Justin Kan of Justin.tv". 10zenmonkeys.org. 2007-06-06. Retrieved 2009-09-30.
  20. Rawassizadeh, Reza; Tomitsch, Martin; Wac, Katarzyna; Tjoa, A. Min (2013). "UbiqLog: a generic mobile phone-based life-log framework". Personal and Ubiquitous Computing. 17 (4): 621–637. CiteSeerX 10.1.1.308.3749. doi:10.1007/s00779-012-0511-8. S2CID 10664069.
  21. Belimpasakis, Petros; Roimela, Kimmo; You, Yu (2009). 2009 Third International Conference on Next Generation Mobile Applications, Services and Technologies. IEEE. pp. 77–82. doi:10.1109/NGMAST.2009.49. ISBN   978-0-7695-3786-3. S2CID   26172277.
  22. "Autographer Life Logging Wearable Camera Review". 2014. Archived from the original on 2014-05-15. Retrieved 2014-05-14.
  23. Schofield, Jack (August 18, 2004). "How to save your life". The Guardian. London.
  24. Schofield, Jack (February 21, 2007). "Life caching revisited -- Gordon Bell's digital life". The Guardian. London.
  25. Beaumont, Lucy (July 18, 2004). "Life in byte-sized pieces". The Age. Melbourne.
  26. "The trend-spotters handbook". The New Zealand Herald . October 17, 2005. Retrieved October 14, 2011.
  27. "Ethics on the line as ordinary people put themselves in the picture". The Sydney Morning Herald. August 1, 2006.
  28. "LIFE CACHING | an emerging consumer trend and related new business ideas". Archived from the original on 2010-04-12. Retrieved 2010-04-10.
  29. Rawassizadeh, Reza (2012). "Towards sharing life-log information with society". Behaviour & Information Technology. 31 (11): 1057–1067. doi:10.1080/0144929X.2010.510208.
  30. "5 Lifelogging apps for 2018: Keep track of your year". Emberify Blog. 2016-01-14. Archived from the original on 2018-01-18. Retrieved 2018-01-18.
  31. Ellis Hamburger (February 6, 2014). "Reporter for iPhone tracks your whole life, one quiz at a time". The Verge.
  32. Eric Ravenscraft (January 14, 2016). "Journey Is a Journal App With Photo Support and Calendar View". Lifehacker.
  33. Nate Swanner (July 9, 2014). "These three iOS apps make life-logging fun and easy". Slash Gear.
  34. Dani Fankhauser (September 24, 2013). "9 Lifelogging Apps to Log Personal Data". Mashable .
  35. "Lifelogging / Quantified Self". Lifestream Blog.
  36. "Lifelogging tool for Android Smartwatch".
  37. "Lifelog – innovative activity tracker Android™ app from Sony" . Retrieved 10 Oct 2015.
  38. "Sony SmartBand 2" . Retrieved 10 Oct 2015.

Bibliography