Teachable Reality: Prototyping Tangible Augmented Reality with Everyday Objects by Leveraging Interactive Machine Teaching

Kyzyl Monteiro, Ritik Vatsal, Neil Chulpongsatorn, Aman Parnami, Ryo Suzuki

chi-2023-monteiro.pdf

Abstract

This paper introduces Teachable Reality, an augmented reality (AR) prototyping tool for creating interactive tangible AR applications with arbitrary everyday objects. Teachable Reality leverages vision-based interactive machine teaching (e.g., Teachable Machine) to capture real-world interactions for AR prototyping. It identifies user-defined tangible and gestural interactions with an on-demand computer vision model, and a trigger-action authoring interface lets the user turn these recognized interactions into functional AR prototypes without programming. This approach affords flexible, customizable, and generalizable tangible AR applications, addressing the limitations of current marker-based approaches. We explore the design space and demonstrate a range of AR prototypes, including tangible and deformable interfaces, context-aware assistants, and body-driven AR applications. The results of our user study and expert interviews confirm that our approach lowers the barrier to creating functional AR prototypes while allowing flexible and general-purpose prototyping experiences.
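The core pipeline described above can be pictured as a vision classifier whose output labels act as triggers for authored AR actions. The sketch below is a minimal illustration of that trigger-action pattern, not the paper's implementation: `classify_frame` is a stand-in for the on-demand model the user trains by demonstration, and the label names, actions, and camera handling are illustrative assumptions.

```python
# Minimal sketch of a trigger-action loop driven by a vision classifier.
# `classify_frame` is a placeholder for a user-taught (Teachable Machine-style)
# model; labels and actions below are hypothetical examples.
import cv2  # assumes OpenCV is available for camera capture


def classify_frame(frame) -> str:
    """Stand-in for the on-demand vision model; returns a user-defined state label."""
    return "idle"  # e.g., "box_open", "box_closed", "thumbs_up", ...


# Trigger-action rules the user would author in the tool's interface;
# here a plain dictionary plays that role.
ACTIONS = {
    "box_open": lambda: print("Show AR label: 'Box opened'"),
    "box_closed": lambda: print("Hide AR label"),
    "thumbs_up": lambda: print("Advance to next AR scene"),
}


def run(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    last_label = None
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            label = classify_frame(frame)
            # Fire the authored action only when the recognized state changes.
            if label != last_label and label in ACTIONS:
                ACTIONS[label]()
            last_label = label
    finally:
        cap.release()


if __name__ == "__main__":
    run()
```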

Keywords: Augmented Reality, Mixed Reality, Prototyping Tools, Tangible Interactions, Everyday Objects, Interactive Machine Teaching, Human-Centered Machine Learning

Reference

Kyzyl Monteiro, Ritik Vatsal, Neil Chulpongsatorn, Aman Parnami, and Ryo Suzuki. Teachable Reality: Prototyping Tangible Augmented Reality with Everyday Objects by Leveraging Interactive Machine Teaching. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '23). ACM, New York, NY, USA, 1-15. DOI: https://doi.org/10.1145/3544548.3581449