Evolving Soft Robots Using a Genetic Algorithm
In this project, my partner Heather Chevlin and I wrote a genetic algorithm to evolve soft robots in simulation, starting from a design of our own. Soft robots are robots composed of non-rigid material. Our goal was to evolve the fastest walking robot.
Physics Simulator + Robot Construction
Our robot was composed of cubes of different types of material. We created a physics simulator that models each cube as a mass-and-spring system: each of the eight vertices carries a mass of 0.1 kg, the vertices are connected by massless springs, and each cube has a side length of 0.1 m. By varying the spring coefficients, we created cubes that mimic different materials.
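To make the construction concrete, here is a minimal sketch of how one such cube could be assembled. The `Mass` and `Spring` containers, and the choice to connect every pair of vertices (edges plus diagonals, for rigidity), are illustrative assumptions rather than the exact structure of our simulator.

```python
# Sketch of building one mass-spring cube (illustrative names, not our exact code).
from dataclasses import dataclass
from itertools import combinations

CUBE_SIDE = 0.1   # m, cube edge length
VERTEX_MASS = 0.1  # kg, mass at each vertex

@dataclass
class Mass:
    position: tuple          # (x, y, z) in metres
    mass: float = VERTEX_MASS

@dataclass
class Spring:
    i: int                   # index of first mass
    j: int                   # index of second mass
    rest_length: float       # metres
    k: float                 # spring coefficient, set by material type

def make_cube(origin, k):
    """Create the 8 vertex masses of one cube and springs between every pair of vertices."""
    ox, oy, oz = origin
    masses = [Mass((ox + dx * CUBE_SIDE, oy + dy * CUBE_SIDE, oz + dz * CUBE_SIDE))
              for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]
    springs = []
    for i, j in combinations(range(8), 2):   # 28 springs: edges, face and body diagonals
        pi, pj = masses[i].position, masses[j].position
        rest = sum((a - b) ** 2 for a, b in zip(pi, pj)) ** 0.5
        springs.append(Spring(i, j, rest, k))
    return masses, springs
```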
Our robot was composed of four different types of material: tissue, bone, and two types of muscle.
The four types of material were given different spring coefficients: tissue had the lowest and bone the highest. The two types of muscle shared the same coefficient, but unlike tissue and bone, their spring rest lengths changed over time following a sinusoidal pattern, giving each muscle cube a "breathing" effect. The two muscle types "breathe" out of phase with each other, which is what gives the robot its motion.
Muscle cubes: L(t) = L_rest + b·sin(ω(t + c)), where L_rest is the spring's rest length, b is the breathing amplitude, ω the actuation frequency, t the simulation time (starting at 0.01 s), and c a phase offset that differs between the two muscle types.
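A sketch of how this actuation could be applied at each timestep is below. The amplitude b, the frequency ω, and the mapping of material numbers 3 and 4 to the two muscle types are assumptions chosen for illustration.

```python
import math

OMEGA = 2 * math.pi                      # actuation frequency (assumed value)
B = 0.02                                 # breathing amplitude in metres (assumed value)
PHASE = {3: 0.0, 4: math.pi / OMEGA}     # phase offsets c; the half-period shift makes
                                         # the two muscle types breathe out of phase

def actuated_rest_length(base_rest_length, material, t):
    """Rest length of a muscle spring at simulation time t; non-muscle springs are unchanged."""
    if material in PHASE:                # materials 3 and 4 assumed to be the muscle types
        return base_rest_length + B * math.sin(OMEGA * (t + PHASE[material]))
    return base_rest_length
```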
Representation
Our base design is a worm-like structure: a long main body to which cubes can be attached on the top, bottom, and two sides. To find the fastest robot, we had to evolve both the design (the length of the robot and the placement of additional cubes) and the material of each cube. To implement an evolutionary algorithm, we first needed a way to encode the robot's "genes" so that they could be crossed over during evolution.
We represented each robot as a matrix with dimensions 5 x (length of robot), where each entry is a number from 1 to 4 indicating the material of a cube, or 0 for no cube. The first row represents the body, so its entries are never 0. The other four rows represent the placement and material of any cubes attached to the four possible sides of the body.
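For illustration, here is one way a random individual in this encoding could be generated. The particular material numbering and the uniform sampling are assumptions, not our exact initialization.

```python
import random

MAX_LENGTH = 10            # cap on body length (see the crossover discussion below)
MATERIALS = [1, 2, 3, 4]   # tissue, bone, and the two muscle types (assumed numbering)

def random_genome():
    """Random 5 x length matrix: row 0 is the body (never 0), rows 1-4 are optional attached cubes."""
    length = random.randint(1, MAX_LENGTH)
    body = [random.choice(MATERIALS) for _ in range(length)]
    sides = [[random.choice([0] + MATERIALS) for _ in range(length)] for _ in range(4)]
    return [body] + sides
```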
Evolutionary Parameters
In our evolutionary algorithm, we started with a random population of 20 robots. The fitness of each robot was the distance between its center position at the start and at the end of the simulation. We used fitness-proportionate selection to pick the robots (parents) that would cross over to create children, meaning that robots with greater fitness were more likely to be chosen as parents.
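A minimal sketch of fitness-proportionate (roulette-wheel) selection is shown below; the function name and the pairing scheme are illustrative.

```python
import random

def select_parent_pairs(population, fitnesses, n_pairs):
    """Fitness-proportionate selection: probability of being picked is proportional to fitness."""
    total = sum(fitnesses)
    weights = [f / total for f in fitnesses]
    return [tuple(random.choices(population, weights=weights, k=2)) for _ in range(n_pairs)]
```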
Each pair of parents crossed over to create two children. Because the robot lengths were variable, the lengths of the children were variable as well, especially since the cut points were chosen at random. However, to keep the number of masses and springs manageable in our simulation, we limited the length of each child to no more than 10. Because of the matrix representation, crossing over the matrices produced robots that were also physically crossed over from their parents. Of the four robots (two parents and two children), we kept the top two performers in a holding population that moved on to the next generation.
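A sketch of how such a crossover could work on the matrix encoding is below. Cutting each parent at an independent random column and enforcing the length cap by truncation are illustrative choices, assuming the genome layout sketched earlier.

```python
import random

MAX_LENGTH = 10

def crossover(parent_a, parent_b):
    """Single-point crossover along the length (column) axis of two 5 x length genome matrices."""
    cut_a = random.randint(1, len(parent_a[0]))   # cut point along parent A's body
    cut_b = random.randint(1, len(parent_b[0]))   # cut point along parent B's body
    child1 = [row_a[:cut_a] + row_b[cut_b:] for row_a, row_b in zip(parent_a, parent_b)]
    child2 = [row_b[:cut_b] + row_a[cut_a:] for row_a, row_b in zip(parent_a, parent_b)]
    # keep the number of masses and springs manageable by capping body length at 10
    child1 = [row[:MAX_LENGTH] for row in child1]
    child2 = [row[:MAX_LENGTH] for row in child2]
    return child1, child2
```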
Results
Although we only ran the algorithm for 100 generations, we already saw improvements in the fitness of the robots produced. In the final generation, every robot carried the gene encoding a cube at the front; each robot would fall onto this arm and use it to push itself across the floor. We also noticed that the final population tended toward shorter bodies, and we hypothesize that this is because the pushing arm is more effective on a body with less mass.
However, our robots still have plenty of room for improvement. To obtain better results, we could run the evolution for many more generations and with a larger starting population. Our algorithm also tended to lose diversity quickly, so another improvement would be to add methods that maintain and generate diversity in the population.