iLab Invited Talk Series

2021-01-25

Creating Smart Everyday Things

Abstract

In my vision, the user interfaces of the future lie in a blend of smart physical and virtual environments. My research focuses on the physical side by bringing interactivity to everyday things. I believe this vision is achievable only if people with varying backgrounds and abilities can work together in an accessible and collaborative environment. In this talk, I will describe two major threads of research: interactive everyday things and hardware prototyping tools. The first thread investigates interactive systems that sense the context in which everyday things are used or estimate a user’s intention when touch input data is noisy. For example, I will demonstrate a tablecloth augmented with a fabric sensor that can sense and recognize non-metallic objects placed on a table, such as food, different types of fruit, liquids, and plastic and paper products. I will also show examples of how this technique can be used for contextual applications. The second thread investigates tools that lower the barrier to entry for prototyping electronics, an essential skill for creating smart everyday things. The goal of this line of work is to enable more people with varying backgrounds and abilities to create smart everyday things and, ultimately, to improve the user experience of smart environments. For example, I will demonstrate an audio-tactile tutorial system that helps blind or low-vision learners understand circuit diagrams, an important task in the circuit prototyping pipeline. Both threads share a common goal: creating a better user experience in smart environments.

Bio

Xing-Dong Yang is an Assistant Professor of Computer Science at Dartmouth College. His research is broadly in Human-Computer Interaction (HCI), where he creates interactive systems using sensing techniques and haptics to enable new applications in smart physical and virtual environments. Xing-Dong’s work has been recognized with a Best Paper award at UIST 2019 and eight Honorable Mention awards: one at UIST 2020, six at CHI (2010, 2016, 2018, 2019 × 2, 2020), and one at MobileHCI 2009. Beyond academic publications, Xing-Dong’s work has attracted broad public interest through coverage in a variety of media outlets, including TV (e.g., Discovery Daily Planet), print (e.g., The Wall Street Journal, Forbes), and online news (e.g., MIT Technology Review, New Scientist).

2021-01-15

'Mechanical Shells' for Actuated Tangible UIs - Hybrid Architecture of Active and Passive Machines for Interaction Design

Abstract

Actuated and shape-changing Tangible User Interfaces (TUIs) have been explored widely in HCI over the past few decades to enrich interaction with digital information in physical and dynamic ways. In this effort, various types of generic actuated TUI devices have been investigated, including pin-based shape displays, actuated curve interfaces, and swarm user interfaces. While these approaches are intended to be dynamically reconfigurable and thus offer generic interactivity, each piece of hardware is inherently limited to its fixed configuration. How can we further expand the versatility of actuated TUIs to fully realize their capability for tangible interaction and motion and shape representation? In my talk, I propose the ‘mechanical shell’, a design concept for actuated TUIs based on modular, interchangeable components that extend and convert the shape, motion, and interactivity of the underlying hardware. Compared with the actuated TUI alone, each mechanical shell offers much more specialized and customized interactivity, while the system as a whole can adapt to a much wider range of interactions. I present two research instances that demonstrate this concept, based on a pin-based shape display and a swarm user interface, and introduce proof-of-concept implementations as well as a range of applications. By introducing this novel interaction architecture, my research envisions a future physical environment where active and passive machines coexist to enrich tangible and embodied interactions.

Bio

Ken is an interaction designer and HCI researcher from Japan. He is currently a Ph.D. candidate in the Tangible Media Group at the MIT Media Lab. He is interested in developing interfaces that integrate digital information and computational aids into everyday physical tools and materials to create novel physical and perceptual experiences. His research has been presented at top HCI conferences (ACM CHI, UIST, TEI, etc.), demonstrated in various exhibitions, and recognized with awards including Ars Electronica, the A' Design Award, and the Japan Media Arts Festival.