iCLIPS  
   Integrating Context through Annotation and Linking in Information Retrieval for Personal Information Archives
iCLIPS Lifelogs

We have recorded 20 months of PC, laptop, mobile phone and SenseCam activity for three subjects. Details of the lifelog contents and the method of their creation are provided below.

Overview

As part of our ongoing work on personal lifelogs we are gathering long-term lifelog collections from three subjects. These lifelogs contain PC and laptop computer items accessed (web pages viewed, files created or accessed, emails sent and received, etc.), SMSs sent and received, digital photographs taken, and passively captured images of an individual's activity. Lifelog items are annotated with rich sources of automatically generated context data: file name; file location; extension type; date and time information; light status; geo-location; weather conditions; people present; and mobile phone calls made and received. One month of the lifelog items is additionally annotated with biometric response information.

Lifelog Items

Computer activity: Recorded using the Slife package. Slife monitors computer activity and records the event of a window being brought to the foreground. For each event it records: type of application (e.g. web, chat), document source (e.g. Microsoft Word), window title, and the begin and end times of the event. Window title, application and document source were used to determine extension type (e.g. pdf, doc). The textual content inside the window (e.g. the text of an email, web page or document being written) and the path to each file are obtained using MyLifeBits for the two subjects who use Windows XP, and using in-house scripts for the third subject, who uses Mac OS X.
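The mapping from window metadata to extension type might be implemented along the following lines. This is an illustrative sketch only: the actual rules used in iCLIPS are not published here, and the application-to-extension defaults are hypothetical placeholders.

```python
import re

def infer_extension(window_title: str, application: str) -> str:
    """Guess a document's extension type from window metadata (sketch).

    First look for a file extension in the window title; if none is
    found, fall back to a default keyed on the application name.
    """
    # Many applications put the file name (with its extension) in the title.
    match = re.search(r"\.([A-Za-z0-9]{1,5})\b", window_title)
    if match:
        return match.group(1).lower()
    # Application-based fallback (illustrative values only).
    defaults = {"Microsoft Word": "doc", "Safari": "html", "Firefox": "html"}
    return defaults.get(application, "unknown")

print(infer_extension("report_final.pdf - Preview", "Preview"))  # pdf
```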

SMSs: The subjects all use Nokia N95 mobile phones to capture SMSs. Logs of SMSs sent and received were generated using scripts installed on N95s.

Digital photos: Taken using the N95's built-in 5-megapixel camera.

Passive image capture: A visual log of subjects' activities was created using a Microsoft Research SenseCam. The SenseCam is a digital camera, with a fish-eye lens, worn around a subject's neck. It passively captures images approximately every 20 seconds. Image capture is triggered by changes in the sensor data captured by the device: for example, high acceleration values, a passive infrared (body heat) detection as someone passes in front of the wearer, or a change in light level. If no sensor has triggered an image to be captured, the camera takes one anyway after a period of approximately 30 seconds. When worn continuously, roughly 3,000 images are captured in an average day.
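The capture logic described above (sensor triggers with a timed fallback) can be sketched as follows. The threshold values are illustrative placeholders, not the SenseCam's actual calibration.

```python
def should_capture(accel_magnitude, pir_triggered, light_delta,
                   seconds_since_last, *,
                   accel_threshold=2.0, light_threshold=50,
                   fallback_seconds=30):
    """Decide whether to take a picture, mimicking the SenseCam's
    sensor-triggered capture with a timed fallback (illustrative
    thresholds, not the device's real calibration)."""
    if pir_triggered:                      # body heat detected in front of wearer
        return True
    if accel_magnitude > accel_threshold:  # sudden movement
        return True
    if light_delta > light_threshold:      # change in light level
        return True
    # No sensor fired: fall back to a timed capture.
    return seconds_since_last >= fallback_seconds
```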

Lifelog Context Data

Date & time: Functions were written to derive from each item's timestamp: the month, day of week, part of week (weekend or weekday), hour, minute, second, and period of the day (morning, afternoon, evening or night).
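A minimal version of such a function is sketched below. The cut-off hours for the periods of the day are assumptions for illustration; the boundaries actually used by iCLIPS are not specified in the text.

```python
from datetime import datetime

def datetime_features(ts: datetime) -> dict:
    """Derive the date/time context fields listed above from a timestamp.

    Period-of-day boundaries are illustrative assumptions.
    """
    hour = ts.hour
    if 6 <= hour < 12:
        period = "morning"
    elif 12 <= hour < 18:
        period = "afternoon"
    elif 18 <= hour < 23:
        period = "evening"
    else:
        period = "night"
    return {
        "month": ts.strftime("%B"),
        "day_of_week": ts.strftime("%A"),
        "part_of_week": "weekend" if ts.weekday() >= 5 else "weekday",
        "hour": ts.hour,
        "minute": ts.minute,
        "second": ts.second,
        "period_of_day": period,
    }

print(datetime_features(datetime(2008, 3, 15, 14, 30, 0)))
```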

Geo-location: GPS data, wireless network presence and GSM location data were captured by the Campaignr software, provided to us by UCLA (USA), running constantly on subjects' N95 mobile phones. Geo-location was then derived from this data using in-house scripts.

Light status and weather conditions: Derived using date, time and geo-location information.
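For light status, the classification reduces to comparing the local time against that day's sunrise and sunset for the item's geo-location. In the sketch below the sunrise and sunset times are passed in directly (in practice they would come from an ephemeris or weather-service lookup keyed by date and location).

```python
from datetime import time

def light_status(local_time: time, sunrise: time, sunset: time) -> str:
    """Classify light status at a given local time.

    Sunrise/sunset are assumed to be supplied by an external lookup
    based on the item's date and geo-location.
    """
    return "daylight" if sunrise <= local_time < sunset else "dark"

print(light_status(time(14, 0), time(7, 12), time(18, 45)))  # daylight
```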

People present: The Campaignr software also recorded co-present Bluetooth devices, from which the people present can be inferred.
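Inferring people present then amounts to matching scanned Bluetooth addresses against a registry of known devices. The registry below is a hypothetical example; in practice it would be built from the devices of a subject's regular contacts.

```python
def people_present(scanned_addresses, known_devices):
    """Map co-present Bluetooth device addresses to their owners' names.

    known_devices is a hypothetical address-to-owner table; unknown
    devices are simply ignored.
    """
    return sorted({known_devices[addr]
                   for addr in scanned_addresses
                   if addr in known_devices})

# Hypothetical device registry and scan result.
registry = {"00:1A:2B:3C:4D:5E": "Alice", "11:22:33:44:55:66": "Bob"}
print(people_present(["00:1A:2B:3C:4D:5E", "FF:EE:DD:CC:BB:AA"], registry))  # ['Alice']
```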

Mobile phone call logs: Generated using freeware installed on N95s.

Biometric response: Heart rate data was collected using a Polar Heart Rate Monitor. All other biometric data was collected using a BodyMedia SenseWear Pro2 armband; the data captured includes galvanic skin response, transverse acceleration, longitudinal acceleration, heat flux, skin temperature, and energy expenditure. (1 month only.)

Funded by SFI Research Frontiers Programme 2006. © iCLIPS, Centre for Digital Video Processing, Dublin City University, 2016. (Site maintained by L. Kelly.)