A lifelog is a personal record of one's daily life in varying amounts of detail, kept for a variety of purposes. The record constitutes a comprehensive dataset of a person's activities, and the data can be used to increase knowledge about how people live their lives. [2] In recent years, some lifelog data has been captured automatically by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers (or sometimes lifebloggers or lifegloggers).
The sub-field of computer vision that processes and analyses visual data captured by a wearable camera is called "egocentric vision" or first-person vision. [3]
A known lifelogger was Robert Shields, who manually recorded 25 years of his life from 1972 to 1997, at 5-minute intervals. This record resulted in a 37-million word diary, thought to be the longest ever written. [4]
Steve Mann was the first person to capture continuous physiological data along with live first-person video from a wearable camera. [5] Starting in 1994, Mann continuously transmitted his life 24 hours a day, 7 days a week. [6] Using a wearable camera and wearable display, he invited others to see what he was looking at and to send him live feeds or messages in real time. [7] In 1998 Mann started a community of lifeloggers which has grown to more than 20,000 members. Throughout the 1990s Mann presented this work to the U.S. Army, including two visits to the Natick Army Research Labs. [8]
In 1996, Jennifer Ringley started JenniCam, broadcasting photographs from a webcam in her college bedroom every fifteen seconds; the site was turned off in 2003. [9]
"We Live In Public" was a 24/7 Internet conceptual art experiment created by Josh Harris in December 1999. In a format similar to TV's Big Brother, Harris placed tapped telephones, microphones and 32 robotic cameras in the home he shared with his girlfriend, Tanya Corrin. Viewers talked to Harris and Corrin in the site's chatroom. [10] Harris later launched the online live video platform Operator 11. [11]
In 2001, Kiyoharu Aizawa discussed the problem of how to handle a huge amount of videos continuously captured in one's life and presented an automatic summarization. [12]
The lifelog DotComGuy ran throughout 2000, during which Mitch Maddox lived the entire year without leaving his house. [13] After Joi Ito's discussion of moblogging, web publishing from a mobile device, [14] came Gordon Bell's MyLifeBits (2004), an experiment in digital storage of a person's lifetime, including full-text search, text/audio annotations, and hyperlinks.[ citation needed ]
In 2003, a project called LifeLog was started at the Defense Advanced Research Projects Agency (DARPA), under the supervision of Douglas Gage. This project would combine several technologies to record life activities, in order to create a life diary. Shortly after, the notion of lifelogging was identified as a technology and cultural practice that could be exploited by governments, businesses or militaries through surveillance. [15] The DARPA lifelogging project was cancelled by 2004, but this project helped to popularize the idea, and the usage of the term lifelogging in everyday discourse. It contributed to the growing acceptance of using technology for augmented memory. [16]
In 2003, Kiyoharu Aizawa introduced a context-based video retrieval system that was designed to handle data continuously captured from various sources, including a wearable camera, a microphone, and multiple sensors such as a GPS receiver, an acceleration sensor, a gyro sensor, and a brain-wave analyzer. By extracting contextual information from these inputs, the system can retrieve specific scenes captured by the wearable camera. [17]
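As a rough sketch of what such context-based retrieval might look like (this is not Aizawa's actual system; the record schema, field names, and thresholds below are invented for illustration), each video segment can be tagged with sensor context at capture time and then queried by that context:

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical context record: one row per captured video segment,
# tagged with sensor readings at capture time.
@dataclass
class Segment:
    video_id: str
    start_s: float       # offset into the recording, in seconds
    lat: float           # GPS latitude at capture time
    lon: float           # GPS longitude at capture time
    accel_mag: float     # acceleration magnitude, m/s^2

def retrieve(segments, lat, lon, radius_deg=0.001, moving=None):
    """Return segments captured near (lat, lon); optionally filter by
    whether the wearer was moving (high acceleration magnitude)."""
    hits = []
    for s in segments:
        if hypot(s.lat - lat, s.lon - lon) > radius_deg:
            continue
        # A crude motion test: treat readings well above 1 g as movement.
        if moving is not None and (s.accel_mag > 10.5) != moving:
            continue
        hits.append(s)
    return hits

log = [
    Segment("day1", 0.0, 53.385, -6.256, 9.8),    # standing still
    Segment("day1", 30.0, 53.386, -6.257, 12.4),  # walking nearby
    Segment("day1", 60.0, 53.400, -6.300, 9.8),   # elsewhere
]
near = retrieve(log, 53.385, -6.256, radius_deg=0.005)
```

A query can then combine context cues, e.g. `retrieve(log, 53.385, -6.256, 0.005, moving=True)` to find scenes captured while walking near a given place.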
In 2004, conceptual media artist Alberto Frigo began tracking everything his right hand (his dominant hand) had used, [18] then began adding different tracking and documentation projects. His tracking was done manually rather than using technology.
In 2004, Arin Crumley and Susan Buice met online and began a relationship. They decided to forgo verbal communication during their initial courtship, instead speaking to each other via written notes, sketches, video clips, and Myspace. They went on to create an autobiographical film about their experience, Four Eyed Monsters, part documentary and part narrative, with a few scripted elements added. They later produced a two-season podcast about the making of the film to promote it.[ citation needed ]
In 2007 Justin Kan began streaming continuous live video and audio from a webcam attached to a cap, beginning at midnight on March 19, 2007. He created a website, Justin.tv, for the purpose. [19] He described this procedure as "lifecasting".
In recent years, with the advent of smartphones and similar devices, lifelogging became much more accessible. For instance, UbiqLog [20] and Experience Explorer [21] employ mobile sensing to perform life logging, while other lifelogging devices, like the Autographer, use a combination of visual sensors and GPS tracking to simultaneously document one's location and what one can see. [22] Lifelogging was popularized by the mobile app Foursquare, which had users "check in" as a way of sharing and saving their location; this later evolved into the popular lifelogging app, Swarm.[ citation needed ]
Life caching refers to the social act of storing and sharing one's entire life events in an open and public forum such as Facebook. [23] [24] [25] [26] [27] Modern life caching is considered a form of social networking and typically takes place on the internet. The term was introduced in 2005 by trendwatching.com, [28] in a report predicting this would soon be a trend, given the availability of relevant technology. However, life log information is privacy-sensitive, and therefore sharing such information is associated with risks. [29]
To assist their tracking efforts, some lifeloggers use mobile devices and apps. The GPS receivers and motion processors of digital devices enable lifelogging apps to record metadata about daily activities with little effort. Myriad lifelogging apps are available in the App Store (iOS), Google Play and other app distribution platforms; commonly cited apps include Instant, [30] Reporter, [31] Journey, [32] Path, [33] Moves, [34] HeyDay, [35] and insight for Wear (a smartwatch app). [36]
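A minimal sketch of how such an app might persist this metadata, assuming a simple JSON-lines log (the field names and schema here are illustrative, not those of any of the apps listed):

```python
import json
import time
from io import StringIO

def log_event(stream, kind, **data):
    """Append one timestamped metadata record per line (JSON-lines).
    `kind` might be 'location', 'activity', etc.; the vocabulary is invented."""
    record = {"ts": time.time(), "kind": kind, **data}
    stream.write(json.dumps(record) + "\n")
    return record

# A StringIO stands in for a file on the device.
buf = StringIO()
log_event(buf, "location", lat=53.385, lon=-6.256, accuracy_m=12)
log_event(buf, "activity", type="walking", steps=48)

# Reading the log back is one json.loads per line.
entries = [json.loads(line) for line in buf.getvalue().splitlines()]
```

The append-only, one-record-per-line layout suits lifelogging well: writes are cheap on battery-powered devices, and the log can be replayed or synced incrementally later.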
Xperia also has a native mobile application which is called Lifelog. [37] The app works standalone but gets enriched when used with Sony Smart Bands. [38]
Swarm is a lifelogging app that motivates users to check in, recording every place they have visited while encouraging them to visit new places.[ citation needed ]
A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.
Augmented reality (AR) is an interactive experience that combines the real world and computer-generated 3D content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking it). As such, it is one of the key technologies in the reality-virtuality continuum.
William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, high-dynamic-range imaging, and particularly wearable computing. Mann has sometimes been labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada and currently lives in Toronto with his wife and two children. In 2023, Mann unsuccessfully ran for mayor of Toronto.
A webcam is a video camera which is designed to record or stream to a computer or computer network. They are primarily used in video telephony, live streaming and social media, and security. Webcams can be built-in computer hardware or peripheral devices, and are commonly connected to a device using USB or wireless protocols.
Sousveillance is the recording of an activity by a member of the public, rather than a person or organisation in authority, typically by way of small wearable or portable personal technologies. The term, coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the means of observation down to human level, either physically or hierarchically.
An EyeTap is a concept for a wearable computing device worn in front of the eye that acts both as a camera, recording the scene available to the eye, and as a display, superimposing computer-generated imagery on that scene. This structure lets the user's eye operate as both a monitor and a camera: the EyeTap takes in the world around the user and overlays computer-generated data on the scene the user would normally perceive.
Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.
A mobile device or handheld computer is a computer small enough to hold and operate in hand. Mobile devices are typically battery-powered and possess a flat-panel display and one or more built-in input devices, such as a touchscreen or keypad. Modern mobile devices often emphasize wireless networking, to both the Internet and to other devices in their vicinity, such as headsets or in-car entertainment systems, via Wi-Fi, Bluetooth, cellular networks, or near-field communication.
Gesture recognition is an area of research and development in computer science and language technology concerned with the recognition and interpretation of human gestures. A subdiscipline of computer vision, it employs mathematical algorithms to interpret gestures.
MyLifeBits is a life-logging experiment begun in 2001. It is a Microsoft Research project inspired by Vannevar Bush's hypothetical Memex computer system. The project includes full-text search, text and audio annotations, and hyperlinks. The "experimental subject" of the project is computer scientist Gordon Bell, and the project will try to collect a lifetime of storage on and about Bell. Jim Gemmell of Microsoft Research and Roger Lueder were the architects and creators of the system and its software.
Microsoft's SenseCam is a lifelogging camera with a fisheye lens and trigger sensors, such as accelerometers, heat sensing, and audio. It was invented by Lyndsay Williams, with a patent granted in 2009. Usually worn around the neck, SenseCam is used for the MyLifeBits project, a lifetime storage database. Early developers were James Srinivasan and Trevor Taylor.
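The trigger-sensor idea can be sketched as follows: rather than shooting on a fixed timer, the camera fires when a sensor reading changes sharply. The threshold, sensor choice, and function names are invented for illustration and do not reflect SenseCam's actual firmware:

```python
def triggered_captures(readings, threshold=2.0):
    """readings: list of (timestamp, value) pairs from one sensor.
    Returns the timestamps at which a photo would be triggered, i.e.
    whenever the value has jumped by more than `threshold` since the
    last capture (the first reading always triggers a capture)."""
    captures = []
    last = None
    for ts, value in readings:
        if last is None or abs(value - last) > threshold:
            captures.append(ts)
            last = value
    return captures

# Simulated light-level readings: a sharp jump at t=3 stands in for an
# event such as the wearer stepping outdoors.
light = [(0, 1.0), (1, 1.2), (2, 1.1), (3, 8.0), (4, 8.1)]
shots = triggered_captures(light)
```

Event-driven capture of this kind keeps the photo stream aligned with changes in the wearer's situation instead of filling storage with near-duplicate frames.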
Mobile blogging is a method of publishing to a website or blog from a mobile phone or other handheld device. A moblog helps habitual bloggers to post write-ups directly from their phones even when on the move. Mobile blogging has been made possible by technological convergence, as bloggers have been able to write, record and upload different media all from a single, mobile device. At the height of its growth in 2006, mobile blogging experienced 70,000 blog creations a day and 29,100 blog posts an hour. Between 2006 and 2010, blogging among teens declined from 28% to 14%, while blogging among adults over 30 increased from 7% to 11%. However, the growing number of multi-platform blogging apps has increased mobile blogging's popularity in recent years, creating a new market that celebrities, regular bloggers and specialists are using to widen their social reach.
Wearable technology is any technology that is designed to be used while worn. Common types of wearable technology include smartwatches and smartglasses. Wearable electronic devices are often close to or on the surface of the skin, where they detect, analyze, and transmit information such as vital signs, and/or ambient data and which allow in some cases immediate biofeedback to the wearer.
A smartwatch is a portable wearable computer that resembles a wristwatch. Most modern smartwatches are operated via a touchscreen, and rely on mobile apps that run on a connected device in order to provide core functions.
The Narrative Clip is a small wearable lifelogging camera. Its development began in 2012 by the Swedish company Memoto after a successful crowd funding via Kickstarter. It can automatically take a picture every 30 seconds whilst being worn throughout the day, a practice known as "life-logging". At the end of the day the Clip uploads the photos and videos it made into the vendor's cloud service, where they are processed and organized into collections called Moments, available to the user through a web client or mobile apps. The Moments or individual photos and videos can be shared through other apps or through the company's own social network.
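Narrative has not published how Moments are computed, but a simple way to group interval-shot photos into moments is to split the sequence wherever the gap between consecutive shots exceeds a threshold; the 300-second gap used here is an invented parameter:

```python
def group_into_moments(timestamps, max_gap_s=300):
    """Group photo timestamps (seconds) into 'moments': consecutive shots
    separated by at most `max_gap_s` belong to the same moment."""
    moments = []
    for ts in sorted(timestamps):
        if moments and ts - moments[-1][-1] <= max_gap_s:
            moments[-1].append(ts)   # continue the current moment
        else:
            moments.append([ts])     # a long gap starts a new moment
    return moments

# Photos every 30 s during two activities separated by a 20-minute gap.
shots = [0, 30, 60, 90, 1290, 1320, 1350]
moments = group_into_moments(shots)
```

With a 30-second shooting interval, a day of wear yields thousands of photos, so some such clustering step is essential before presenting them to the user.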
An activity tracker is an electronic device or app that measures and collects data about an individual's movements and physical responses, towards the goal of monitoring and improving their health, fitness or psychological wellness over time.
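As a toy example of the kind of data processing involved, a naive step counter can count upward threshold crossings in the accelerometer magnitude. Real trackers use calibrated and filtered algorithms; the threshold here is invented:

```python
def count_steps(accel_mag, threshold=11.0):
    """Count steps as upward crossings of `threshold` in a stream of
    accelerometer magnitudes (m/s^2). Resting magnitude is ~9.8 (1 g);
    each stride produces a brief peak above it."""
    steps = 0
    above = False
    for a in accel_mag:
        if a > threshold and not above:
            steps += 1       # count the crossing once per peak
            above = True
        elif a <= threshold:
            above = False    # re-arm after the signal falls back
    return steps

# Simulated magnitudes: rest around 9.8 with three walking peaks.
signal = [9.8, 9.9, 12.1, 9.7, 9.8, 12.5, 9.6, 12.2, 9.8]
steps = count_steps(signal)
```

The `above` flag ensures each peak is counted once, however many consecutive samples exceed the threshold.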
Autographer is a hands-free, wearable digital camera developed by OMG Life. The camera uses five different sensors to determine when to take photos automatically and can take up to 2,000 pictures a day. It was released in July 2013 and is used primarily for lifelogging, entertainment and travel. In October 2016, OMG Life, the company behind Autographer, discontinued operations.
Cathal Gurrin is an Irish professor and lifelogger. He is the Head of the Adapt Centre at Dublin City University, a Funded Investigator of the Insight Centre, and the director of the Human Media Archives research group. He was previously the deputy head of the School of Computing.
Smartglasses or smart glasses are eye- or head-worn wearable computers. Many smartglasses include displays that add information alongside or on top of what the wearer sees. The term is also sometimes applied to glasses that can change their optical properties, such as smart sunglasses programmed to change tint by electronic means, or to glasses that include headphone functionality.
Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.