A 4-D Diary Co-authored by Human and Machine: Interactively Visualize, Control, and Understand Data Collection and ML Computation within Built Environments
Advisors: Daniel Cardoso Llach, Mayank Goel
Abstract:
Ubiquitous data acquisition and machine learning (ML) systems, in particular smart cameras, have proliferated throughout all kinds of built environments for surveillance or profit-oriented analysis. However, building occupants are rarely consciously aware of their presence and have no access to the collected data or its analysis. Transparency and privacy concerns are exacerbated when ML inference extracts additional information from the collected data. Occupants should gain not only awareness of but also control over the data collected from them and the computation that analyzes it. Critically investigating ML-powered smart cameras, I first performed a self-surveillance experiment documenting my personal life. It generated a vivid 4-D diary that captured lively details and recreated memories by combining machine recognition results with personal diary entries. Extending this first experiment, I transformed the unidirectional data acquisition and ML computation pipeline into a bidirectional interactive system: a 4-D diary co-authored by the machine and the occupants. Through a user study of a digital prototype and semi-structured interviews, I examined the dynamics between occupants and the data collection and ML computation system. Interactive data visualization, control, and communication with a chatbot promoted occupants’ understanding of the system and addressed transparency and privacy concerns. This research seeks to empower occupants with an active voice amid today’s passive data acquisition and ML computation systems, and to reintroduce the rich, humane aspects that automation has overshadowed. It demonstrates a methodology for guiding technology development empathetically.