With smart algorithms, the simplest of robots can accomplish tasks

Anyone who has had children knows that while controlling one child can be difficult, controlling multiple children at the same time can be nearly impossible. Getting swarms of robots to work together can be equally difficult unless researchers carefully choreograph their interactions, similar to planes in formation, using increasingly sophisticated components and algorithms. But what can be done reliably when the robots available are simple, inconsistent, and lack sophisticated programming for coordinated behavior?

A team of Georgia Institute of Technology researchers led by Dana Randall, ADVANCE Professor of Computing, and Daniel Goldman, Dunn Family Professor of Physics, sought to demonstrate that even the most basic of robots can perform tasks far beyond the capabilities of one, or even a few, of them. The team’s attempt to complete these tasks with “dumb robots” (essentially mobile granular particles) exceeded their expectations: the researchers report being able to remove all sensors, communication, memory, and computation, instead completing a set of tasks by leveraging the robots’ physical characteristics, a trait that the team refers to as “task embodiment.”


BOBbots, or “behaving, organizing, buzzing bots,” named after granular physics pioneer Bob Behringer, are “about as dumb as they get,” according to Randall. “Their cylindrical chassis have vibrating brushes underneath and loose magnets on their periphery, causing them to spend more time in areas with more neighbors.” To study aspects of the system that were inconvenient to probe in the lab, the experimental platform was supplemented by precise computer simulations led by Georgia Tech physics student Shengkai Li.

Despite the BOBbots’ simplicity, the researchers discovered that as the robots move and collide, “compact aggregates form that are capable of collectively clearing debris that is too heavy for one alone to move,” according to Goldman. “Whereas most people build increasingly complex and expensive robots to ensure coordination, we wanted to see what complex tasks could be accomplished with extremely simple robots.”

Simple robots, smart algorithms

Their research, which was published in the journal Science Advances on April 23, 2021, was inspired by a theoretical model of particles moving around on a chessboard. To rigorously study a mathematical model of the BOBbots, the team developed a theoretical abstraction known as a self-organizing particle system. Using ideas from probability theory, statistical physics, and stochastic algorithms, the researchers demonstrated that as magnetic interactions increase, the theoretical model undergoes a phase change, abruptly shifting from a dispersed state to aggregation in large, compact clusters, similar to the phase changes seen in everyday systems such as water and ice.
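The dispersed-to-aggregated transition can be sketched with a toy Metropolis-style simulation on a lattice. This is an illustrative model only, not the authors' code: particles on a torus attempt moves to empty neighboring sites, and a move is accepted with probability proportional to lam raised to the change in neighbor count, so a bias parameter lam above 1 (standing in for the magnetic attraction) favors crowded configurations. The function names and parameter values here are invented for the sketch.

```python
import random

def neighbors(site):
    """Four lattice neighbors of a site (before torus wrapping)."""
    x, y = site
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def avg_neighbors(occupied, size=20):
    """Average number of occupied lattice neighbors per particle."""
    def wrap(s):
        return (s[0] % size, s[1] % size)
    return sum(
        sum(wrap(n) in occupied for n in neighbors(s)) for s in occupied
    ) / len(occupied)

def run_sops(n_particles=40, size=20, lam=4.0, steps=200_000, seed=1):
    """Metropolis-style dynamics: a particle's move to an empty neighboring
    site is accepted with probability lam**(new_neighbors - old_neighbors),
    so lam > 1 biases particles toward regions with more neighbors."""
    rng = random.Random(seed)
    def wrap(s):
        return (s[0] % size, s[1] % size)
    cells = [(x, y) for x in range(size) for y in range(size)]
    sites = rng.sample(cells, n_particles)  # scattered starting positions
    occupied = set(sites)
    for _ in range(steps):
        i = rng.randrange(n_particles)
        src = sites[i]
        dst = wrap(rng.choice(neighbors(src)))
        if dst in occupied:
            continue  # destination taken; reject the move
        others = occupied - {src}
        old = sum(wrap(n) in others for n in neighbors(src))
        new = sum(wrap(n) in others for n in neighbors(dst))
        if rng.random() < lam ** (new - old):
            occupied.remove(src)
            occupied.add(dst)
            sites[i] = dst
    return occupied
```

Running this with lam well above 1 leaves particles with a noticeably higher average neighbor count than an unbiased run (lam = 1), echoing the dispersed-versus-aggregated behavior the analysis predicts.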

Robots are designed to manipulate objects: perceiving, picking up, moving, modifying, or even destroying them. This frees people from repetitive tasks, which robots can perform without becoming bored, distracted, or exhausted.

While Modular Robotic Systems (MRS) are expected to be used in a variety of workspaces, scales, and structures, practical implementations of such systems lag behind their potential in performing real-world tasks. The challenges of improving MRS capabilities include not only designing reliable, responsive, and robust hardware, but also developing software and algorithms that can effectively fulfill tasks by performing fundamental functions such as shape-formation, locomotion, manipulation, and so on.

Machine learning is a popular technology that allows computers to act intelligently on a specific task or problem without being explicitly programmed. The computer uses algorithms to derive knowledge from data and interpret it on its own, correcting its outputs as more data is presented to the machine learning application.
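As a minimal illustration of learning from data rather than from explicit rules (a toy example, unrelated to the paper), a least-squares line fit recovers a hidden linear relationship purely from observed points; the function name and sample data are invented for the sketch.

```python
def fit_line(xs, ys):
    """Learn y ~ a*x + b from data via least squares: no rule is
    hand-coded, the slope and intercept come from the samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# "Training data" generated by the hidden rule y = 3x + 1
xs = [0, 1, 2, 3, 4]
ys = [1, 4, 7, 10, 13]
a, b = fit_line(xs, ys)  # recovers a = 3, b = 1
```

Feeding the fit more samples refines the estimated parameters, which is the sense in which such a system "corrects outputs as more data is presented."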

“The rigorous analysis not only showed us how to build the BOBbots, but it also revealed an inherent robustness of our algorithm that allowed some of the robots to be faulty or unpredictable,” says Randall, who is also a computer science professor and an adjunct professor of mathematics at Georgia Tech.