A world of Wi-Fi-controlled robots sounds like a dream until someone hacks the drone that’s picking up lunch and your Taco Bell goes to feed a hacker. Fortunately, researchers from MIT have developed a way to protect drone swarms from being hacked.
“There are a lot of really great things on the horizon,” Stephanie Gil, an MIT research scientist and the study’s lead author, tells Inverse. “As we start introducing more robotic systems into the mainstream, I think it’s important to start to think about security in different ways.”
The key, the researchers say, is an algorithm that turns the wireless signal controlling a group of robots into a unique identifier in and of itself. Any deviation from the expected signal is read as false by the drone swarm. This connection between the wireless signal and the physical world makes things like drone swarms, drone delivery, and potentially autonomous cars significantly safer. The algorithm is a novel way to make sure the robots entering our lives can tell which signals are telling the truth and which ones are tricks. The researchers published their findings in Autonomous Robots at the end of February.
“I think we think of wireless signals as having a main function, which is just to transfer data,” says Gil. “But it turns out that in traveling through the environment it gains a lot of information.” As wireless signals travel from one place to another, they bounce off obstacles or are weakened by passing through objects. A signal’s trip from one device to another therefore has a unique energy pattern, shaped by the space between the devices. The algorithm created by Gil and her team compares a signal’s original strength with its final strength in every direction, creating a “fingerprint” for each device.
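To make that idea concrete, here is a minimal sketch of how such a fingerprint check might work, assuming a receiver that can measure incoming signal energy per direction of arrival. The function names, toy noise model, and threshold are illustrative assumptions, not the researchers’ actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
ANGLE_BINS = 360  # one bin per degree of arrival (illustrative choice)

def directional_profile(raw_energy):
    """Normalize a per-direction energy measurement so profiles taken
    at different transmit powers can be compared directly."""
    energy = np.abs(np.asarray(raw_energy, dtype=float))
    total = energy.sum()
    return energy / total if total > 0 else energy

def looks_like(registered, observed, threshold=0.05):
    """Accept a transmission only if its directional energy profile
    matches the fingerprint registered for that identity."""
    distance = np.linalg.norm(registered - directional_profile(observed))
    return distance < threshold

def fake_measurement(peak_deg, width=10):
    """Toy stand-in for a real antenna measurement: energy concentrated
    around one direction of arrival, plus background noise."""
    e = 0.02 * rng.random(ANGLE_BINS)
    e[peak_deg:peak_deg + width] += 1.0
    return e

# Register a fingerprint for a device transmitting from ~45 degrees.
registered = directional_profile(fake_measurement(45))

# A later packet from the same spot matches; a spoofer at ~200 degrees
# claiming the same identity does not.
print(looks_like(registered, fake_measurement(45)))    # True  -> accept
print(looks_like(registered, fake_measurement(200)))   # False -> red flag
```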
“What we’ve tested so far are flying vehicles and small ground robots,” says Gil. In a drone swarm, every drone can carry the protective algorithm, which lets each robot determine whether it is getting correct location information from all the other drones. “We can actually protect the actions of our drones against malicious spoofers in our environment,” says Gil.
The researchers tested the fingerprint by loading the algorithm onto a delivery drone making a series of deliveries to different locations in the lab. Then they launched a series of spoofing attacks against the drone, trying to convince its computer system that the deliveries belonged at a location specified by the attacker. Because the drone could identify that the requests weren’t coming from where they claimed to be, it barely wavered in performing the correct deliveries.
“If the origin of the signal is being reported as one position, but you’re not getting any energy from that direction, you might want to red flag that signal,” says Gil. This prevents a hacker from disguising a flood of requests as if they were coming from multiple places, because the algorithm can identify that all of the requests originate from the same spot in the physical world.
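A rough sketch of that red-flag logic, under the simplifying assumption that the receiver can estimate a single direction of arrival per request; the positions, bearings, and tolerance value are all hypothetical:

```python
import math

def bearing_to(receiver_xy, claimed_xy):
    """Direction (in degrees) the signal *should* arrive from if the
    sender really is at its claimed position."""
    dx = claimed_xy[0] - receiver_xy[0]
    dy = claimed_xy[1] - receiver_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def consistent(claimed_bearing, measured_bearing, tolerance_deg=15):
    """Red-flag a request whose claimed origin does not line up with
    where the signal energy actually came from."""
    diff = abs(claimed_bearing - measured_bearing) % 360
    return min(diff, 360 - diff) <= tolerance_deg

receiver = (0.0, 0.0)

# Three requests claim three different origins, but the measured
# direction of arrival is roughly the same for all of them: one
# hacker pretending to be three separate clients.
requests = [((10.0, 0.0), 172.0),    # claims east, energy from the west
            ((0.0, 10.0), 171.0),    # claims north, energy from the west
            ((-10.0, 1.0), 173.0)]   # claims west, energy from the west

for claimed_xy, measured in requests:
    ok = consistent(bearing_to(receiver, claimed_xy), measured)
    print(claimed_xy, "-> accept" if ok else "-> red flag")
```

Note that the one claim that happens to match the attacker’s true direction still passes, which is the point of the check: a spoofer broadcasting from a single physical spot can sustain at most one identity, not many.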
Inverse asked Gil if she thought her algorithm could be used to protect the autonomous cars that Charlize Theron hacks in the new trailer for The Fate of the Furious. “I would say that sounds like an interesting area for future research,” she says, laughing.