How a Robot Can Give You Superhuman Strength: Dr. DelPreto’s Gesture Language

Dr. Joseph DelPreto has superpowers. He can lift a sofa with one hand.

However, he didn’t get his powers from a radioactive spider bite or from the planet Krypton; his powers come from a robot instead. As a researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology (MIT), he studies how regular people like you and me can become robot-powered superheroes, too (DelPreto, n.d.-b)!

What do you think of when you hear the word “robot”? Perhaps you imagine Wall-E, the friendly, Earth-cleaning, animated character from the 2008 film of the same name. Or perhaps you recall Jarvis from Iron Man’s armor, the English-speaking Marvel Studios creation. In fact, some robots from science fiction are known for cooperating with humans, such as R2-D2 helping the Jedi save the galaxy throughout the Star Wars saga (Aylward et al., 2015).

Unfortunately, outside of the movie theater, computers and other technology can be confusing or overly complex for many people. You may have experienced this if you have ever argued with Siri or struggled to change a setting on your phone. Fortunately, through his research on human-computer interaction, DelPreto strives to make it easier and more intuitive for us non-engineers and non-scientists to use computers and robots for everyday tasks, such as lifting sofas and other heavy objects.

Figure 1: This is Baxter, the robot used by DelPreto for his human-computer interaction experiments. Like a human teammate, the robot’s arm has joints and a hand, which can move similarly to a human arm. On top of the robot’s main body, DelPreto’s team has even placed a wig and a screen on which Baxter can display a friendly face! Photo Credit: Audrey Lee (2021)

To do this, DelPreto has his own version of R2-D2: the Rethink Robotics Baxter robot (DelPreto & Rus, 2019). As shown in Figure 1, Baxter’s arm is very similar to a human arm (CNBC International TV, 2014). It has joints (circled in blue) that allow it to move and twist its hand (boxed in green). Additionally, its hand can have various attachments that are shaped like a claw or the prongs of a forklift, which Baxter uses to grab and lift objects in the same way that we use our hands to grab and lift objects (MIT CSAIL, 2019).

To communicate with Baxter, DelPreto needs his “gesture language.” Although we might understand the command “lift it,” “levántalo,” or “举起它,” robots and computers do not. That’s why DelPreto has developed a way for Baxter to determine what a human user wants it to do based on the user’s gestures and muscle movements.

To better understand what a gesture language entails, try holding your forearm as you move your wrist from side to side. Your muscle movements may be imperceptible to you, but many wearable electronic devices can detect them. Wearable electronic devices are any computing devices that can be worn on the body, such as Bluetooth headsets or smartwatches (Carp, n.d.). If you’ve ever worn a Fitbit or Apple Watch, you have used a wearable electronic device. Similar to how these health-monitoring watches can measure a person’s heart rate or how many steps they have taken in a day, the wearable electronics that DelPreto uses can measure how their wearer is moving their limbs (Fitbit LLC, n.d.).

Many superheroes obtain their powers from a variety of tools, including Iron Man’s lasers and Spider-Man’s webs. Dr. DelPreto’s tools are two wearable electronic devices: the MyoWare Muscle Sensors and the Thalmic Labs’ Myo gesture control armband (DelPreto & Rus, 2019, 2020). Both devices are electromyography (EMG) interfaces, which collect and amplify muscle signals (DelPreto & Rus, 2019, 2020). When our brains send signals telling our muscles how to move, the muscles produce electrical signals of their own as they follow the brain’s instructions; measuring this muscle activity is known as electromyography (Chowdhury et al., 2013). By placing EMG interfaces on a person’s arms, DelPreto can measure which muscle movements that person is making, and how strongly, whether they are bending their wrist, clenching their fist, or raising their arm.
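For readers who like to tinker, here is a toy sketch of the basic idea behind an EMG interface. This is not DelPreto’s actual pipeline, and the numbers are made up for illustration: we rectify a raw muscle signal (take its absolute value), smooth it into an “envelope,” and flag a contraction whenever the envelope crosses a threshold.

```python
def emg_envelope(samples, window=5):
    """Rectify a raw EMG signal and smooth it with a moving average."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        chunk = rectified[start:i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

def detect_contraction(samples, threshold=0.5, window=5):
    """Return True if the smoothed muscle activity ever crosses the threshold."""
    return any(v > threshold for v in emg_envelope(samples, window))

# A resting muscle produces small, noisy voltages -> no gesture detected.
resting = [0.02, -0.01, 0.03, -0.02, 0.01, 0.02, -0.03, 0.01]
# A clenched fist makes the signal's amplitude jump -> gesture detected.
clenched = [0.02, -0.01, 0.9, -1.1, 1.0, -0.8, 0.95, -1.05]

print(detect_contraction(resting))   # False
print(detect_contraction(clenched))  # True
```

Real EMG systems are far more sophisticated, but the same rectify-smooth-threshold pattern is a common first step in turning raw muscle voltages into a detected gesture.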

Figure 2: DelPreto wears the MyoWare Muscle Sensors on his bicep and the Thalmic Labs’ Myo gesture control armband around his forearm. Photo Credit: Joseph DelPreto

Together, Baxter and these wearable electronics translate a human’s gestures into a robot’s actions. DelPreto’s go-to example is lifting a sofa with Baxter. Although you can’t speak to it, Baxter can “watch” how you move your arm. In other words, it will use your muscle measurements to reconstruct the motions of your arms as you position them under the sofa, grip the sofa, and lift. These motions are then copied by the robot arm, which similarly positions its forklift-like hand under the sofa, grips the sofa, and lifts. If you already know how to lift one side of a sofa, you don’t have to learn anything new in order to lift something with Baxter!
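The “copying” step can be sketched in a few lines of code. This is an assumed simplification, not DelPreto’s implementation, and the height limits are invented for illustration: the human’s hand height, reconstructed from muscle measurements, becomes the target height for the robot’s hand, clamped to what the robot arm can physically reach.

```python
# Illustrative limits on how high and low the robot's hand can reach (meters).
ROBOT_MIN_HEIGHT = 0.1
ROBOT_MAX_HEIGHT = 1.2

def robot_target_height(human_hand_height):
    """Mirror the human's hand height, clamped to the robot's reachable range."""
    return min(ROBOT_MAX_HEIGHT, max(ROBOT_MIN_HEIGHT, human_hand_height))

# As the person lifts their side of the sofa from 0.4 m to 0.7 m,
# the robot's hand tracks the same heights, keeping the sofa level.
lift_trajectory = [0.4, 0.5, 0.6, 0.7]
robot_trajectory = [robot_target_height(h) for h in lift_trajectory]
print(robot_trajectory)  # [0.4, 0.5, 0.6, 0.7]
```

The clamping is the important design choice: a teammate robot should follow the human’s lead, but never be commanded past its own physical limits.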

DelPreto, on the other hand, did have to learn various subjects in order to develop his “gesture language to robot motion” translator. Throughout his undergraduate studies at Columbia University, he explored a variety of barely connected interests, including tinkering, electronics, psychology, and neuroscience. His innate enthusiasm for all these fields is clear as he jokes about completing a degree in electrical engineering and two minors in mechanical engineering and psychology by simply “taking classes that looked like fun.”

Figure 3: Baxter and DelPreto demonstrate lifting a thin mattress. By “watching” how DelPreto keeps his arm at a certain height, Baxter can match that height and keep the mattress level. Photo Credit: Joseph DelPreto

Later, through various undergraduate research internships, DelPreto added another interest to his list: developing robots that can directly work with and help people with difficult tasks. In the summer of 2021, with the guidance of MIT’s Prof. Daniela Rus, his Ph.D. advisor and current boss, DelPreto completed his thesis on using the gesture language for robots. He was finally able to integrate his interests into one field: psychology and neuroscience are used in studying people’s gestures while electronics, robotics, tinkering, and both of his engineering degrees come into play when studying the control of robots as part of a human-robot team.

“Team” is a key part of DelPreto’s research. As easy as our lives would be if robots did all of the heavy lifting, DelPreto understands that users may have concerns about “the AI takeover.” You may recognize this term from movie franchises such as The Matrix or Terminator, which take the fear of humans being overtaken or enslaved by human-made technology to an extreme. As a result of the concerns sparked by such media, DelPreto emphasizes “combining the strengths of the robot with the strengths of the person.” For example, while robots can be repetitive and physically stronger than humans, a person’s strengths include “creativity, strategic thinking, [and] long term planning.” DelPreto believes that a human and a robot working together can achieve greater goals than a human or a robot alone.

For now, Baxter, the robot teammate, exists only in DelPreto’s lab at MIT due to some problems that still need to be addressed. What if the measurements are wrong? What if the person and the robot aren’t synchronized? What if different people lift with different muscles? DelPreto isn’t sure what the answers to these questions are just yet.

Regardless, DelPreto looks forward to the day when his technologies are used by you, me, and anyone else in the world. From helping some people lift boxes in warehouses to helping others move furniture at home, he hopes that robots that use intuitive gesture-language communication will become widespread among all types of users. Eventually, he also hopes to expand the gesture language beyond just parroting the user’s movements, so that each gesture carries its own meaning. Excitedly demonstrating some potential gestures, DelPreto urges us to imagine a world in which waving your hand means “make me a coffee,” clenching your fist means “open the door,” or raising your arm means “let’s move a sofa together.”

November 2021

References

Aylward, D., Bolger, D., Earley, S., Gorey, C., Hayes, B., Killeen, C., King, N., Maleney, I., Melia, J., Murphy, R., O’Brien, L., Robinson, J., Rogers, S., & Van Nguyen, D. (2015, January 23). The 50 greatest robots in pop culture history. Silicon Republic. https://www.siliconrepublic.com/machines/the-50-greatest-robots-in-pop-culture-history-25-1

Bland. (n.d.). 10 times film showed us the power of AI. National Museums Liverpool. Retrieved July 18, 2022, from https://www.liverpoolmuseums.org.uk/stories/10-times-film-showed-us-power-of-ai

Carp, A. (n.d.). Wearable Electronics | Electrical and Computer Engineering Design Handbook. Retrieved November 2, 2021, from https://sites.tufts.edu/eeseniordesignhandbook/2015/wearable-electronics/

Chowdhury, R. H., Reaz, M. B. I., Ali, M. A. B. M., Bakar, A. A. A., Chellappan, K., & Chang, T. G. (2013). Surface electromyography signal processing and classification techniques. Sensors (Basel, Switzerland), 13(9), 12431–12466. https://doi.org/10.3390/s130912431

CNBC International TV. (2014, August 1). Baxter, The Bionic Robot | The Edge. https://www.youtube.com/watch?v=JWBqXLHlqjE

DC Entertainment. (2012, February 23). Superman. DC. https://www.dccomics.com/characters/superman

DelPreto, J. (n.d.-a). Controlling drones and other robots with gestures – Joseph DelPreto. Retrieved November 1, 2021, from https://www.josephdelpreto.com/portfolio/control-drones-and-other-robots-with-gestures/

DelPreto, J. (n.d.-b). Joseph DelPreto. Retrieved November 2, 2021, from https://www.josephdelpreto.com/

DelPreto, J. (n.d.-c). Lifting objects with robots using muscle signals – Joseph DelPreto. Retrieved November 15, 2021, from https://www.josephdelpreto.com/portfolio/using-muscle-signals-to-lift-objects-with-robots/

DelPreto, J. (2021, October 29). Personal interview with author [Personal communication].

DelPreto, J., & Rus, D. (2019). Sharing the Load: Human-Robot Team Lifting Using Muscle Activity. 2019 International Conference on Robotics and Automation (ICRA), 7906–7912. https://doi.org/10.1109/ICRA.2019.8794414

DelPreto, J., & Rus, D. (2020). Plug-and-Play Gesture Control Using Muscle and Motion Sensors. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 439–448. https://doi.org/10.1145/3319502.3374823

Fitbit LLC. (n.d.). SpO2, Heart Rate Variability | Fitbit Technology. Retrieved November 2, 2021, from https://www.fitbit.com/global/us/technology/health-metrics

Marvel. (n.d.). Spider-Man (Peter Parker) In Comics Powers, Villains, Enemies | Marvel. Marvel Entertainment. Retrieved November 2, 2021, from https://www.marvel.com/characters/spider-man-peter-parker/in-comics

MIT CSAIL. (2019, May 22). Using muscle signals to lift objects with robots. https://www.youtube.com/watch?v=t4iJRy41d3Y

Audrey Lee

About the Author

Audrey Lee is a member of the class of 2025 who is majoring in Electrical Engineering and Computer Science (Course 6-2) with minors in Mechanical Engineering (Course 2) and Writing (Course 21W). Outside of the lecture hall, she can be found tinkering with MIT’s Roboteam, making music with MIT’s Symphony Orchestra, or crocheting with her fellow sponges (Simmons residents!). As an aspiring professor, she strives to be able to explain any concept to anyone, no matter how complicated or convoluted, such as through these “Science Writing for the Public” pieces that she wrote for Angles.

Subject: 21W.035, Elements of Science Writing for the Public

Assignment: Assignment #2, “Awesome Profile”