We present ChameleonControl, a real-human teleoperation system for scalable remote instruction in hands-on classrooms. In contrast to existing video- or AR/VR-based approaches to remote hands-on education, ChameleonControl uses a real human as a surrogate for a remote instructor. Building on existing human-based telepresence approaches (e.g., ChameleonMask), we contribute a novel method to teleoperate a human surrogate through synchronized mixed reality (MR) hand gestural navigation and verbal communication. By overlaying the remote instructor's virtual hands on the local user's MR view, the remote instructor can guide and control the local user as if the instructor were physically present. This allows the local user/surrogate to synchronize their hand movements and gestures with the remote instructor's, effectively ``teleoperating'' a real human. We evaluate our system through an in-the-wild deployment in physiotherapy classrooms, as well as lab-based experiments in other application domains such as mechanical assembly, sign language, and cooking lessons. The study results confirm that our approach can increase engagement and the sense of co-presence, showing potential for the future of remote hands-on classrooms.
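Although the abstract does not describe implementation details, the core mechanism of the MR gestural guidance can be pictured as streaming the remote instructor's tracked hand poses to the local user's headset, where they are rendered as virtual hand overlays for the surrogate to mirror. Below is a minimal sketch of that pipeline under assumed design choices: a UDP/JSON transport and a 21-joint hand skeleton. All names here (HandPose, PoseStreamer, PoseReceiver) are hypothetical illustrations, not the paper's actual implementation.

    # Sketch of one-way hand-pose streaming: the instructor's tracked hand
    # joints are serialized and sent to the local MR headset, which would
    # render them as virtual hand overlays in the user's view.
    # All class and field names are illustrative assumptions.

    import json
    import socket
    from dataclasses import dataclass, field, asdict

    NUM_JOINTS = 21  # common hand-skeleton size (wrist + 4 joints per finger)

    @dataclass
    class HandPose:
        """One tracked hand: a timestamp plus NUM_JOINTS joint positions (meters)."""
        timestamp_ms: int
        handedness: str                             # "left" or "right"
        joints: list = field(default_factory=list)  # [[x, y, z], ...] of length 21

    class PoseStreamer:
        """Instructor side. UDP keeps latency low and tolerates drops,
        since only the freshest pose matters for live gestural guidance."""
        def __init__(self, host: str, port: int):
            self.addr = (host, port)
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

        def send(self, pose: HandPose) -> None:
            self.sock.sendto(json.dumps(asdict(pose)).encode("utf-8"), self.addr)

    class PoseReceiver:
        """Local/surrogate side. The MR renderer would draw virtual hands
        at the received joint positions, overlaid on the user's view."""
        def __init__(self, port: int):
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            self.sock.bind(("0.0.0.0", port))

        def recv(self) -> HandPose:
            data, _ = self.sock.recvfrom(65536)
            return HandPose(**json.loads(data.decode("utf-8")))

    if __name__ == "__main__":
        # Loopback demo: stream one dummy pose and read it back.
        rx = PoseReceiver(9999)
        tx = PoseStreamer("127.0.0.1", 9999)
        tx.send(HandPose(timestamp_ms=0, handedness="right",
                         joints=[[0.0, 0.0, 0.0]] * NUM_JOINTS))
        print(rx.recv().handedness)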
Mehrad Faridan, Bheesha Kumari, and Ryo Suzuki. 2023. ChameleonControl: Teleoperating Real Human Surrogates through Mixed Reality Gestural Guidance for Remote Hands-on Classrooms. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '23). ACM, New York, NY, USA, 1–13. DOI: https://doi.org/10.1145/3544548.3581381