3DKnITS: Three-dimensional Digital Knitting of Intelligent Textile Sensor for Activity Recognition and Biomechanical Monitoring
In 3DKnITS, we demonstrated a novel piezo-resistive knit textile across textile design, fabrication, a wireless hardware system, deep-learning-based recognition methods, and application scenarios. We propose a digital knitting approach that combines functional and common yarns to design, architect, and engineer the electrical, mechanical, and visual properties of an interactive sensing textile. By incorporating a melting fiber, we propose a method to shape and personalize a three-dimensional pressure-imaging textile that conforms to the human body through thermoforming. This results in an intimate and robust textile structure that eliminates sensor drift and maximizes accuracy while preserving comfort. Our approach enables the fabrication of 2D-to-3D pressure-sensitive textile interiors and wearables, including an intelligent mat, sock, and shoe, for applications ranging from rehabilitation and sport science to gaming interfaces. The convolutional neural network (CNN) models that give these textiles intelligence classify various basic activities and exercises, and yoga poses, with high accuracies of around 99.6% and 98.7%, respectively, offering strong prospects for high-accuracy detection or prediction based on a deep-learning approach.
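As an illustration of how a CNN of this kind consumes a pressure heat-map, the sketch below runs one forward pass over a single 32×32 frame in plain NumPy. The frame size, kernel counts, layer depth, and seven-class output are illustrative assumptions, not the architecture reported in this work.

```python
import numpy as np

def conv2d(x, kernels, bias):
    """Valid 2D convolution: x is (H, W, Cin), kernels is (kh, kw, Cin, Cout)."""
    kh, kw, _, cout = kernels.shape
    H, W, _ = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1, cout))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # contract the (kh, kw, Cin) patch against all Cout kernels at once
            out[i, j, :] = np.tensordot(x[i:i+kh, j:j+kw, :], kernels, axes=3) + bias
    return out

def max_pool(x, s=2):
    """Non-overlapping s x s max pooling over the spatial dimensions."""
    H, W, C = x.shape
    return x[:H//s*s, :W//s*s, :].reshape(H//s, s, W//s, s, C).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
N_CLASSES = 7                              # hypothetical number of activity classes
frame = rng.random((32, 32, 1))            # one normalized pressure heat-map
k1, b1 = rng.standard_normal((3, 3, 1, 8)) * 0.1, np.zeros(8)
h = max_pool(np.maximum(conv2d(frame, k1, b1), 0.0))  # conv -> ReLU -> pool: (15, 15, 8)
W_fc = rng.standard_normal((h.size, N_CLASSES)) * 0.01
probs = softmax(h.reshape(-1) @ W_fc)      # per-class probabilities
```

In practice such a network would be trained end-to-end with a framework like PyTorch or TensorFlow; the point here is only the data flow from raw pressure frame to class probabilities.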
To demonstrate practical applications of our knitted intelligent textile mat, we built a sliding-window algorithm on top of our classification results to infer transient activities such as walking, running, and jumping. We interfaced the recognition results with a Minecraft video game in real time to gamify exercise. With real-time yoga pose classification, we also demonstrated an application that informs the user whether the correct balance or pose has been achieved, trained on data supervised by an expert. The 3D knitted shoe or sock could also be used to gather form-fitting biomechanical data and track activities "in the wild" for wearable and mobile applications, which are useful not only for athletes and dancers, but also for prosthetic designers and shoemakers. The same fabrication principles can be applied to develop other types of intelligent apparel, including sleeves, gloves, or shirts. Finally, since textiles are ubiquitous in our environments, the underlying process and technology of 3DKnITS can spark intelligent textile applications spanning health and activity tracking, sports and rehabilitation, wearable and ubiquitous computing, robotics, and human-computer interaction, and inspire new kinds of interactive objects and environments.
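A minimal version of such a sliding-window smoother, assuming per-frame class labels arrive as a stream, takes a majority vote over the most recent frames so that brief misclassifications do not flip the inferred activity. The window length here is an arbitrary illustration, not the value used in this work.

```python
from collections import Counter, deque

def make_sliding_classifier(window=5):
    """Return a step function that smooths per-frame labels by majority
    vote over the last `window` frames (window size is illustrative)."""
    buf = deque(maxlen=window)
    def step(frame_label):
        buf.append(frame_label)
        return Counter(buf).most_common(1)[0][0]
    return step

step = make_sliding_classifier(window=5)
stream = ["stand", "stand", "walk", "stand", "walk",
          "walk", "walk", "jump", "walk", "walk"]
smoothed = [step(s) for s in stream]
# the lone spurious "jump" frame is voted out of the smoothed sequence
```

A real deployment would tune the window length against the frame rate: longer windows suppress more noise but add latency before a transition such as walking-to-running is reported.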
Most current efforts in functional and electronic textiles focus on coating, screen-printing, embedding, or attaching electronic devices on fabrics. These manual, hand-made approaches, while valuable in some respects, restrain researchers and designers from rapid prototyping, large-scale manufacturing, and translation of electronic textiles. Recently, advances in mechatronics, digital fabrication, and computer-aided design have revolutionized three-dimensional (3D) knitting with computerized knitting machines. These additive manufacturing machines enable users to design and fabricate textile patterns and structures through a specialized visual programming environment and various types of fibers and yarns. In this work, we combine digital knitting on flat-bed and circular knitting machines with thermoforming to realize a set of 2D-to-3D piezoresistive matrix textile mats and wearables that detect multipoint pressure across their surfaces in real time. This fabrication approach allows us to explore various parameters, including interconnect resistance, matrix resolution, pressure sensitivity, and the fabric's visual, mechanical, and electrical properties.
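As a sketch of how a row-column scan of such a piezoresistive matrix could be turned into a pressure frame, the snippet below assumes a hypothetical `read_adc(row, col)` driver, a 10-bit ADC, and a simple series voltage-divider readout model; these values and names are illustrative assumptions, not the hardware described here.

```python
import numpy as np

ADC_MAX, R_REF = 1023, 10_000.0  # assumed 10-bit ADC, 10 kOhm series reference resistor

def counts_to_resistance(counts):
    """Voltage-divider model: sensing element in series with R_REF, ADC reads
    across R_REF, so R = R_REF * (ADC_MAX / counts - 1)."""
    counts = np.clip(counts, 1, ADC_MAX - 1)   # avoid division by zero at the rails
    return R_REF * (ADC_MAX / counts - 1.0)

def frame_from_scan(read_adc, n_rows=32, n_cols=32):
    """Drive each row line, sample every column, and return a pressure frame
    normalized to [0, 1] (more pressure -> lower resistance -> higher value)."""
    counts = np.array([[read_adc(r, c) for c in range(n_cols)]
                       for r in range(n_rows)], dtype=float)
    g = 1.0 / counts_to_resistance(counts)     # conductance rises with pressure
    return (g - g.min()) / (np.ptp(g) + 1e-12)

# usage with a fake driver: ADC readings grow toward the bottom rows
frame = frame_from_scan(lambda r, c: 200 + 4 * r)
```

The normalization step makes frames from different sensors comparable before they are fed to a recognition model; a real driver would also need to mitigate crosstalk between unselected rows and columns.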
We are motivated by the fact that most of our physical gestures and interactions involve contact between parts of our body and a surface. As we perform daily activities such as walking, sitting, exercising, or sleeping, a characteristic spatiotemporal contact and pressure pattern can be monitored and identified by sensing through the fabrics in our apparel or upholstery. As we balance and redirect our center of mass through our feet, we exert force on the ground, and by detecting the pressure distribution of the feet through our intelligent mat, we can extract rich contextual information about our posture and activities. In this project, we treat our spatiotemporal 2D pressure sensor data, or heat-maps, like image frames. As a subset of machine learning, deep learning has flourished in solving complex image-processing and speech-recognition challenges, as it provides an efficient way to learn high-level features from raw signals, without complex hand-crafted feature extraction, by training an end-to-end neural network. We presented two application scenarios: an intelligent mat connected to a virtual environment to gamify exercise and motivate users to move their body and play, and real-time yoga posture recognition for balance training purposes.
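To sketch how a stream of heat-map frames could be grouped into fixed-length spatiotemporal training samples for such an end-to-end network, the helper below stacks overlapping clips and assigns each the majority label. The clip length, stride, and frame size are illustrative choices, not parameters from this work.

```python
import numpy as np

def make_clips(frames, labels, clip_len=10, stride=5):
    """Group a stream of pressure frames into fixed-length spatiotemporal
    clips, each labeled with the majority per-frame label."""
    X, y = [], []
    for start in range(0, len(frames) - clip_len + 1, stride):
        clip = frames[start:start + clip_len]
        lab = labels[start:start + clip_len]
        X.append(np.stack(clip))               # (clip_len, H, W)
        vals, counts = np.unique(lab, return_counts=True)
        y.append(vals[np.argmax(counts)])      # majority label for the clip
    return np.array(X), np.array(y)

# usage with a synthetic stream: 20 frames, activity changes at frame 12
frames = [np.full((4, 4), i, dtype=float) for i in range(20)]
labels = ["walk"] * 12 + ["run"] * 8
X, y = make_clips(frames, labels, clip_len=10, stride=5)
```

Overlapping strides multiply the amount of training data from a single recording, at the cost of correlated samples, so held-out evaluation should split by recording session rather than by clip.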
To develop a tubular knit textile, we employed a digital circular knitting machine and a combination of polyester, spandex, conductive, and TPU yarns in the knitting process. The machine greatly increases productivity because the relatively slow reciprocating motion of flat-bed knitting machines is replaced by a continuous, faster circular motion. Circular knitting is mostly used to make various tubular garments such as socks, shoes, sleeves, underwear, or t-shirts. To realize form-fitting apparel or prosthetic linings customized to the wearer, 3D-scanning of the human body and 3D-printing could be performed to create printed models of the body parts for thermoforming and shaping the tubular textile.
Since soccer is one of the world's most practiced sports, significant research effort has been devoted to the science behind it. We demonstrated the functionality of our 3D knitted sensing shoe or sock in this particular sport because it involves varied biomechanical movements, including gait, balance, and coordination of muscles when running, sliding, and kicking a ball, as well as positioning of the ball on the shoe to ensure the right angle, power, and trajectory. This 3D knitted sensing shoe or sock could find many applications in prosthetics, kinesiology, rehabilitation, and sport science. Unlike camera-based systems, which typically raise privacy concerns over continuous, potentially invasive sensing and recognition, the pressure-imaging approach is less intrusive and is not sensitive to line of sight or lighting levels.