Access to digital information is an important right in our society: it determines whether one can integrate into society and escape “digital exclusion”. However, the future form of digital interaction is becoming increasingly exclusive. With the development of Mixed Reality and the Internet of Things, people are linking the digital and the physical, making the intangible tangible. Yet as we approach this exciting era filled with floating AR interfaces and gesture-controlled smart devices, we seem to have forgotten to make it beneficial to EVERYONE. As the world enters this hand-gesture-controlled future driven by machine learning, we realize it will be difficult to accommodate disabilities, due to a lack of datasets. While we can use thousands of people’s hand-motion videos to train one model, it is almost impossible to find two people with exactly the same form of disability. Disability is often highly individual, and this is not reflected in machine learning. Can we imagine asking an upper-limb amputee to use HoloLens? Dots is an inclusive interface for future spatial computing that empowers disabled people to design their own way of interacting with Mixed Reality and the Internet of Things, based on their body conditions. Dots is a two-point body-gesture recognition system. Each dot contains one IMU sensor and can be attached to any movable body part. After a simple calibration process, inertial-navigation technology senses the relative motion between any two body parts, which expresses the four fundamental operations of a 3D interface (selecting, moving, rotating, scaling). The user can enjoy full control of any spatial interface in MR and IoT. We worked closely with the disabled community and The Alex Lewis Trust in the UK to validate the product through several rounds of user trials.
Our key participant, Alex Lewis, is a thought leader in the global disabled community and the founder of The Alex Lewis Trust; his remarkable experience fighting his illness was made into a documentary by Channel 4. As not only a user but an expert within the disabled community, he believes “Dots can have a real impact on disabled people globally, as it’s the first time a product has considered their differences and helped them enter the future digital world.” Let’s make the future more inclusive, together.
Assistive technology and inclusive design for disabled people are well-studied areas, but with emerging technologies they face new challenges. Over the past few years, the development of immersive experiences through Virtual Reality (VR) and Augmented Reality (AR) has revealed the possibility of a mixed-reality (MR), screenless future, marking our entry into an era of ubiquitous computing. Furthermore, the Natural User Interface (NUI) has become one of the emerging interaction approaches due to the rise of ubiquitous/pervasive technologies. Nevertheless, the accessibility of these technologies for people with physical disabilities has not been sufficiently addressed. Although some approaches have been introduced, the constraint of limited body parts (in our case, hands) and the methods of existing gesture recognition need to be explored further. We therefore think inclusive design needs a new approach for people with physical disabilities in relation to assistive technologies within the context of ubiquitous computing. This project introduces a customizable inclusive interface for disabled people who have difficulty interacting with spatial computing environments.
Since the era of the 2D interface, accessibility and inclusivity have been considered key factors in developing user interactions and interfaces. Researchers working on inclusive HCI have attempted to open various social opportunities to disabled people by improving connectivity and interaction. To take two disparate examples, IKEA’s “ThisAbles” project created furniture and electronics add-ons with bigger handles and buttons so that people with disabilities can use them, and one interactive system uses breath as a form of input. While such approaches provide accessible interactions for people with certain disabilities, they lack generality in terms of inclusive design, as disability is often highly individual. For instance, many inclusive design projects, including the examples above, target small groups of people, which requires designers to develop different types of systems to embrace people’s diverse yet individual conditions.
We are entering an era of 3D spatial interaction that will often require people to interact within virtual environments, and people with disabilities will be no exception. In this context, we believe that designing a system for people with different individual disabilities is necessary in order to expand the idea of inclusive design. Unlike traditional inclusive design, where designers, researchers and engineers must propose a system that embraces different types of users, we instead want different users to adapt to one flexible, customizable system. In this paper, we introduce a work-in-progress inclusive NUI system that enables people with physical disabilities to access 3D interactions via a customizable, wearable, embodied interface.
To explore the possibility of an inclusive NUI for people with physical disabilities, we conducted several experiments. Their goal was to extract and understand full-body, movement-based interactions in relation to 3D object manipulation.
Participants were asked to use their full body to perform the four basic interactions of 3D object manipulation (selection, positioning, rotation and scaling) with the body parts available to them. We ran the same experiment with two target users: one quadruple amputee and one paralysed patient with limited motor ability in the upper limbs. In this study, we observed that most of these interactions and manipulations are possible via two points in 3D space. Below is the two-points model we built through this study.
Two-Points Model
Selection - two points quickly approach each other.
Positioning - one point keeps still while the other moves.
Scaling - two points move apart or toward each other at the same time.
Rotation - two points rotate around a pivot point.
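To make the model concrete, the four patterns above can be sketched as a simple classifier over two tracked 3D points. This is an illustrative sketch, not the actual Dots implementation: the function name, point representation and thresholds are our assumptions.

```python
import math

def classify_gesture(p1_prev, p1_cur, p2_prev, p2_cur,
                     move_eps=0.01, approach_thresh=0.05):
    """Classify the motion of two tracked points (x, y, z tuples)
    between two frames into one of the four two-points patterns.
    Thresholds (in metres) are illustrative, not tuned values."""
    d_prev = math.dist(p1_prev, p2_prev)   # inter-point distance before
    d_cur = math.dist(p1_cur, p2_cur)      # inter-point distance after
    moved1 = math.dist(p1_prev, p1_cur) > move_eps
    moved2 = math.dist(p2_prev, p2_cur) > move_eps

    if moved1 and moved2 and d_prev - d_cur > approach_thresh:
        return "selection"    # both points quickly approach each other
    if moved1 != moved2:
        return "positioning"  # one point keeps still, the other moves
    if moved1 and moved2 and abs(d_cur - d_prev) > move_eps:
        return "scaling"      # both move apart or together
    if moved1 and moved2:
        return "rotation"     # both move, distance constant: pivot motion
    return "idle"
```

The ordering matters: a fast approach is read as selection before the slower distance change that signals scaling, mirroring the “quickly” qualifier in the model.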
The purpose of building this model was to help us understand how people with different body conditions could express interaction intent through the movement of certain body parts in relation to 3D object manipulation. Understanding how people interact was necessary in order to design an inclusive interface or system for people with physical disabilities.
Dots is a customizable body interface for the future of inclusive pervasive computing. It is composed of a wireless charger, sensors and two dot-like slices. Users (people with physical disabilities) can gain full control over MR interfaces and IoT devices. The two-points model was used to develop this experiential work-in-progress prototype.
To use Dots, users first attach the two dots to any of their body parts, depending on their body conditions, making sure that the two chosen parts can move relative to each other to accomplish at least one interaction pattern. The two dots can also be attached to objects such as a table, depending on the types of interaction users want to achieve.
Each dot contains one IMU sensor, one Bluetooth module and one battery. The two IMU sensors measure the relative movement of the dots and thus identify the interaction pattern the user wishes to make. Dots also has its own calibration system to improve its accuracy in interpreting user behaviour.
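One way such relative-movement sensing can work, assuming each IMU fuses its readings into a unit orientation quaternion (w, x, y, z), is to express one dot’s orientation in the frame of the other. This is a minimal sketch under that assumption; the function names are ours, not part of the Dots firmware.

```python
import math

def quat_conjugate(q):
    """Conjugate (inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def relative_rotation(q_ref, q_other):
    """Orientation of the second dot expressed in the first dot's frame."""
    return quat_multiply(quat_conjugate(q_ref), q_other)

def rotation_angle(q):
    """Angle (radians) of the rotation a unit quaternion represents."""
    w = max(-1.0, min(1.0, q[0]))
    return 2.0 * math.acos(abs(w))
```

Tracking how this relative rotation (and, with inertial navigation, relative translation) changes over time is what lets the two-points patterns be detected independently of where the dots are worn; calibration would establish the initial reference frames.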
Dots interacts with MR equipment and IoT devices, allowing users to accomplish multiple tasks through body interactions: for example, creating 3D art in HoloLens, remotely controlling smart-home devices or browsing the internet. It empowers disabled people to interact with technology by letting them tailor it to their own environment and body.
We have tested our prototype with ten participants, who provided qualitative feedback. Below are several themes they raised after trying it.
Freedom - they enjoyed the freedom to customize the way of using their body parts.
Novelty - our design brought them a totally new way of thinking and working.
Equality - they felt that they are now on an equal footing with everyone else.
The ‘customizable’ feature of Dots goes beyond general inclusive design. It offers a variety of choices for different situations, which helps break down the barriers imposed by disabled people’s physical limitations. The implicit aim of this project is to explore the feasibility of the body interface within the context of NUI.
The idea of a customizable body interface was conceived by recognizing the intersection between the trend of spatial computing and the physical limitations of hand-gesture control. We want to make the future accessible to everyone by providing an inclusive device through which users can adapt to one flexible, customizable system or interface.