TOCABI is a life-size humanoid robot designed from the ground up for research in humanoid robotics. The robot is capable of both autonomous control and of serving as a remote avatar controlled by a person in VR. Unlike many of the animatronic robots available today, TOCABI is capable of walking and interacting with the environment in meaningful ways. This lets a user enter a "virtual" space, which is in fact a real-world scene streamed to their VR headset, and, with hand tracking and foot pedals, remotely control the robot by moving themselves and interacting with that space. The cameras and audio equipment let the user hold conversations in real time, with abstracted facial animations created from the live audio of the operator's voice. This type of virtual-to-real avatar has important applications, including remote work and one day enabling people with physical disabilities to interact with the built environment in new ways.
TOCABI stands for Torque-controlled CompliAnt BIped. It is 1.8 meters tall, weighs 100 kg, and has 33 degrees of freedom. The torque-control and compliance algorithms controlling the robot help make human-robot interaction safer by limiting the energy sent to the motors to only what the expected tasks require. The design of the robot was a collaboration between the New Jersey Institute of Technology and Seoul National University.
A well-designed object simultaneously takes into account aesthetics, functionality, durability, and usability. In robotic design, this means a design should account for visual meaning (aesthetics), the goal of the robot (functionality), how long it will last (durability), and how researchers will interact with the robot (usability). This project was more about exploring novel solutions and approaches to the design of a rare tool than an exercise in working through compromises. While movies and media provide endless visualizations and concepts of what humanoid robots will look like, only a handful of life-size humanoids in the entire world are capable of walking and interacting with the environment. Beyond the years of development required, there is little reference for how to design a humanoid robot, making it an innovative, exciting, and challenging design process.
TOCABI (Torque-controlled CompliAnt BIped) is a life-size humanoid robot designed from the ground up for research in humanoid robotics. It is 1.8 meters tall, weighs 100 kg, and has 33 degrees of freedom. The robot was created in two stages (lower and upper body), with the lower body finished in 2014 and the upper body completed in 2021. The design serves multiple purposes and posed unique challenges, both aesthetic and functional. While most robots seen in the media are covered by metal or plastic casings that hide the wiring and structural support, the need for constant access to the robot's components in the lab meant an exterior shell would almost never be used, and would create additional difficulties whenever the robot needed to be modified. At the same time, human perception of how a robot moves is greatly affected by its aesthetic and form, so a series of box-like body parts was not acceptable. TOCABI is the result of a unique design solution: a scientific instrument for robotics research that also represents human form and the intent for human-like movement. The aesthetic of the robot is built directly into the structure -- eliminating the need for external plastic casings -- while providing instant access and easy-to-modify components.
Our robot is built from the ground up; we make the motor housings, try multiple versions of designs, and need to switch between ideas quickly. We use Fusion 360 much as we use git -- we can track what everyone is working on, see recent updates, and merge new parts as they become available. Much of the design process occurred with the main designer in New Jersey and the main engineer in South Korea, a 12-hour time difference. This made it vital that we always knew what was being worked on and could, over a video conference, discuss the same model, making changes and decisions that were immediately updated for the other team. The best example is probably the robot's head, which we redesigned and implemented within two months of the semifinals of the Avatar competition. The overall head shape, which attaches to the neck joint, went through multiple iterations; the face/screen was updated or changed multiple times; and the video cameras used to display the environment to an operator were changed as well.
The integration of multiple features in one collaborative platform enabled us to create a tight feedback loop, with motors and joint locations defined in one place (the robot has 33 degrees of freedom) and the links we were designing using that information in another. Likewise, the simulations built into the modeling program let us communicate the design and engineering requirements much more clearly. Once our structural simulations were set up, it was easy to discuss the required safety factor and iterate through redesign and analysis. Sometimes we would use a feature just to check our own decisions, like topology optimization. Across the center of the robot is a split bar that is vital to the overall strength, connecting both arms and sitting atop the torso. We originally designed this based on intuition, but by making the bar solid and running topology optimization, we were able to verify some of those decisions.
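The redesign-analyze loop above can be illustrated with a back-of-the-envelope check like the following sketch. The yield strength (roughly 503 MPa, typical for 7075-T6 aluminum) and the target safety factor are assumptions for illustration, not the project's actual requirements; in practice the peak stress would come from the structural simulation.

```python
# Illustrative safety-factor check for the redesign-analyze loop.
# Material and target values below are assumptions, not project data.

YIELD_STRENGTH_MPA = 503.0   # typical yield strength of 7075-T6 aluminum
TARGET_SAFETY_FACTOR = 2.0   # hypothetical design requirement

def safety_factor(peak_stress_mpa):
    """Ratio of material yield strength to simulated peak stress."""
    return YIELD_STRENGTH_MPA / peak_stress_mpa

def needs_redesign(peak_stress_mpa):
    """True if the simulated peak stress leaves too little margin."""
    return safety_factor(peak_stress_mpa) < TARGET_SAFETY_FACTOR

# e.g. a simulated peak stress of 180 MPa gives a factor of ~2.8
ok = not needs_redesign(180.0)
```

Each simulation pass produces a new peak stress; the part is reworked until the check passes, which is the iterative loop described above.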
The robot was manufactured largely through multi-axis CNC machining at high precision in heat-treated 7075 aluminum. As a specialized scientific tool, the design process was tightly integrated with the engineering. Motor sizes and specifications had to be balanced against the weight of the structure. The placement of electronics had to be considered with respect to the software control and algorithms that drive the robot. This balance is best understood relative to the technologies and capabilities of other robots: for example, while there are semi-lifelike robotic faces, the humanoid robots using them cannot walk on their own and are instead stationary animatronics.
---------
The following description provides insight into a few key design elements throughout the robot:
To provide visual suggestions that the legs are of human proportions, we took special care to emphasize the size and curvature of large muscle groups. We used flat mounting plates that connect the robot links (the parts between joints) directly to the motors. From the side view, these plates follow the tangential lines created by the curve of the front plate and rear cylinders. The direct motor connections at each joint prevent the structure from achieving the same proportions as a human. This challenge is most pronounced in the legs: whereas a human ankle stabilizes the entire body in a very compact form, providing multiple degrees of freedom at the foot requires multiple large motors to support the entire robot, while the robot's knee needs only one motor for flexion/extension. While the three planar joints in the side view are equal in size, the combination of the thicker cylinder on the upper link and the larger protrusion of the upper plate creates an aesthetic closer to human proportions.
The usability of this product lies mostly in researchers' ability to interact with and modify its components. Easy access to motor drivers and electronics (such as wires that degrade over time) was built into the structure, with the side bars of the lower legs held in place by just two screws. Removing these two screws provides direct access to the motor drivers for that link. Additionally, multi-axis machining allowed the mounting bar for the heatsink to be integrated into the design -- reducing the number of extra brackets required to secure the electronics. Both the thigh and shin have three holes by which the electronics assembly is easily attached and detached.
In the upper body, the motor drivers and electronics are integrated more tightly into the chest of the robot. While this benefits locomotion control algorithms (the mass sits closer to the center), it also creates free, open channels in the arms where additional components can be easily integrated. Most representative of this feature is the ability to quickly change the robot's hands -- an ongoing research topic in object manipulation and mapping between human and robot dexterity.
While I use the manufacturing tab in Fusion for 3-axis CNC in much of my work and in the design studios I teach, for this project we relied much more on additive manufacturing for fast prototyping and visualization. Both my own lab and the SNU lab have the same FormLabs 2 3D printer, so when we needed to discuss a physical model, it was straightforward to 3D print the same part and have matching physical prototypes during our remote meetings. In fact, the constantly changing requirements for the head made us realize it would be easier to create a mounting bracket in the aluminum that accepts 3D-printed camera holders with a standardized mount, so that we could swap out different cameras and microcontrollers.
-------
While there are a variety of ways to develop a robotic system, TOCABI uses what is known as torque control. Robotic arms in a manufacturing facility are typically controlled by commanding a position, which can be dangerous for a human who does not anticipate their movement; torque control instead first determines where the robot should go and then calculates and commands the motor torques needed for that movement. The robot is capable of both autonomous control and of serving as a remote avatar controlled by a person in VR. Unlike many of the animatronic robots available today, TOCABI can walk and interact with the environment in meaningful ways. This type of virtual-to-real avatar has important applications, including remote work and one day enabling people with physical disabilities to interact with the built environment in new ways -- from emergency-responder to worker avatars -- with far-reaching implications for everything from occupational safety to the transformation of industrial economies.
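The torque-control idea described above can be sketched in a few lines. Below is a minimal, illustrative single-joint controller assuming a common PD feedback law with gravity compensation; the function names, link parameters, and gains (`kp`, `kd`, `gravity_torque`) are hypothetical and not TOCABI's actual control code. The commanded torque is the feedback on position/velocity error plus the torque needed to hold the link against gravity, so at rest at the target it injects only the energy the task requires -- the source of the compliance described earlier.

```python
# Minimal sketch of joint-space torque control for a single link.
# All names, masses, and gains here are illustrative assumptions.
import math

def gravity_torque(theta, mass=5.0, com_dist=0.3, g=9.81):
    """Torque needed to hold a link of given mass at angle theta
    (rad, measured from vertical) against gravity."""
    return mass * g * com_dist * math.sin(theta)

def torque_command(theta, theta_dot, theta_des, kp=80.0, kd=8.0):
    """PD feedback on position/velocity error plus gravity feedforward."""
    error = theta_des - theta
    return kp * error - kd * theta_dot + gravity_torque(theta)

# At the desired angle with zero velocity, the command reduces to
# pure gravity compensation -- no extra energy is injected.
tau = torque_command(theta=0.5, theta_dot=0.0, theta_des=0.5)
```

In contrast, a position-commanded arm would drive to the target regardless of what it hits; here, a disturbance only produces a torque proportional to the resulting error, which is what makes unplanned contact with a human safer.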