MIT researchers use muscle signals to control drones

Novice drone pilots typically need a long time to get comfortable with the various joystick-style controllers. Researchers at MIT, however, have come up with a more intuitive way to control unmanned aerial vehicles: reading the operator's muscle signals. The Conduct-A-Bot system is worn on the user's right arm, with multiple EMG sensors positioned over the biceps, triceps, and forearm to pick up muscle activity.


Joseph DelPreto demonstrates using muscle control to guide a drone through rings (image: MIT)

The sensors detect muscle and arm movements and relay the data to an attached microcontroller, where machine learning-based algorithms identify the different arm gestures.
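To illustrate the idea of classifying gestures from EMG data, here is a minimal sketch: it is not MIT's actual pipeline, and the gesture names and feature choice are assumptions, but it shows how windowed muscle-signal features can be matched against per-gesture templates.

```python
import numpy as np

# Hypothetical sketch, NOT Conduct-A-Bot's real algorithm: classify a
# window of multi-channel EMG samples by comparing a simple amplitude
# feature against per-gesture centroids learned from labeled examples.

def emg_features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value per EMG channel -- a common, simple EMG feature."""
    return np.mean(np.abs(window), axis=0)

def fit_centroids(windows, labels):
    """Average the feature vectors of all training windows for each gesture."""
    feats = np.array([emg_features(w) for w in windows])
    labels = np.array(labels)
    return {g: feats[labels == g].mean(axis=0) for g in set(labels)}

def classify(window, centroids):
    """Return the gesture whose centroid is nearest to this window's features."""
    f = emg_features(window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))
```

In a real system a stronger classifier and richer features would be used, but the structure (feature extraction, training on labeled gestures, nearest-match prediction) is the same.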

Each gesture is programmed in advance, so the system can readily translate the user's movements into specific commands and transmit them wirelessly to a Parrot Bebop 2 quadcopter.


The system can also be fine-tuned to adapt to each user's unique EMG signals (image: MIT)

By default, stiffening the upper arm tells the UAV to hover; clenching the fist moves it forward; rotating the fist clockwise or counterclockwise turns it accordingly; and up/down or left/right hand motions shift it vertically or horizontally.
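The default mapping above amounts to a simple lookup from recognized gestures to drone commands. The sketch below mirrors that table; the gesture and command names are made up for illustration and are not Conduct-A-Bot's actual API.

```python
# Illustrative gesture-to-command table (names are hypothetical).
# Unrecognized input falls back to "hover", a safe default for a drone.
GESTURE_TO_COMMAND = {
    "stiffen_upper_arm": "hover",
    "clench_fist": "move_forward",
    "rotate_cw": "turn_right",
    "rotate_ccw": "turn_left",
    "flick_up": "ascend",
    "flick_down": "descend",
    "flick_left": "strafe_left",
    "flick_right": "strafe_right",
}

def command_for(gesture: str) -> str:
    """Look up the drone command for a recognized gesture; hover if unknown."""
    return GESTURE_TO_COMMAND.get(gesture, "hover")
```

Keeping the mapping in one table makes it easy to re-bind gestures per user without touching the recognition code.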

In the most recent tests, the Bebop responded correctly to 82% of roughly 1,500 commands, a figure that is expected to improve as the system is developed further.

Video: Controlling a drone with gestures

Researcher Joseph DelPreto said: "This system takes us an important step toward seamless human-robot cooperation, making robots a more effective and intelligent tool for everyday tasks."

Looking ahead, the technology could not only be made compatible with other unmanned aerial vehicles, but also power assistive robots that help elderly or disabled people live more comfortably.