
Interpretation framework of lifelog photos

2013–present

Funded by LG Electronics and

WISET (Korea Center for Women in Science, Engineering, and Technology)



Supervised by

Prof. Hokyoung Ryu

Imagine X Lab, Hanyang University

My role:

UX researcher

Research methodologies:

interviews, surveys,

physiological data analysis


The point-and-shoot action of a camera is an active behavior that reflects the user's intention to capture events and review them later. As camera technologies advance, new cameras requiring less effort have emerged alongside the quantified-self movement, in which people digitally record their daily lives. Lifelogging cameras such as SenseCam™ and Narrative Clip™ automatically capture almost every moment of life, every few seconds. This automatic capture may explain why a large portion of lifelog photos contain only factual information: they show what the wearers did during the day but carry little personal meaning. People may therefore derive different meanings from lifelog photos than from photos taken with a deliberate point-and-shoot action. Because lifelog photos document one's daily life in enough detail to build self-knowledge, they can give wearers opportunities to reconcile their life events and reflect on their current situation. These insights are critical for designing a lifelogging camera application that manages the voluminous lifelog data and gives users opportunities to revisit their meaningful moments by automatically filtering out the relevant photos.


Research Aim

  • Determine how the capture modality of a lifelogging camera affects the types of photo content

  • Articulate how the interpretations of lifelog photos differ from those of manually captured photos

  • Explore the possible effects of reviewing lifelog photos by measuring wearers' brain activation levels as they reviewed the photos

  • Explore the possible effects of sharing lifelog photos with a third person


Publications

  • Lee, A., & Ryu, H. (2019). Comparison of the Change in Interpretative Stances of Lifelog Photos versus Manually Captured Photos Over Time. Online Information Review. DOI: 10.1108/OIR-03-2018-0108

  • Lee, A., & Ryu, H. (2018). Emotional Tagging with Lifelog Photos by Sharing Different Levels of Autobiographical Narratives. Asian CHI Symposium at CHI 2018. (*Best Poster)

  • Lee, A., & Ryu, H. (2016). My mental scrapbook: What to store, what to remember, and what to retrieve in the lifelog data. Proc. of HCI Korea 2016, 396-401.

  • Lee, A. (2015). Challenges for Wearable Camera: Understanding of the meaning behind photo-taking. CHI 2015 Extended Abstracts, 139-144. ACM. DOI: 10.1145/2702613.2726967 (*2nd place, Student Research Competition)

  • Lee, A., Kim, J., & Ryu, H. (2014). Challenges of Designing a Life-Log Sharing System: The Pensieve. Proc. of AHFE 2014 (Advances in Affective and Pleasurable Design), 469-475. (*Best paper in Affective and Pleasurable Design 2014)
