Insect LegConNet

Cockroach

The latest version of LegConNet is constructed from physiological neuron and synapse models. It causes both simulated and robotic legs to step with the same rules that the cockroach Blaberus discoidalis uses to coordinate its various joints into different stepping motions, such as walking, inside turning, and outside turning.

The network shown coordinates walking, inside turning, outside turning, and standing still in the middle leg of the cockroach. It is based on neurobiological data from stick insect nervous systems and 3D kinematic data from walking and turning cockroaches. The roles of the stick insect joints were mapped onto the cockroach, which uses its legs somewhat differently. Despite the differences in joint function, the two insects likely share similar neural pathways coordinating their motion.

[Figure: the LegConNet network for the cockroach middle leg]

Sensory information (light blue neurons, top) is transmitted to the central pattern generators (CPGs, red) that control the flexion and extension of each joint. CPGs oscillate without input, and their phase can be reset by stimulating the top row of each; this is how the leg is coordinated. Notice that none of the CPGs are directly connected to one another; they remain in phase only because of shared sensory information. The CPGs modulate the yellow muscle control circuits, each of which works to contract its muscle to a particular length.

The network switches between modes when one of the green context neurons is excited. These neurons excite or inhibit the dark blue sensory interneurons, which normally carry sensory information from the light blue sensory neurons to the red CPGs. When the context neurons are stimulated, sensory information is routed to different CPGs than normal, causing the joints to flex and extend in a different order and producing a different motion. This is likely how insects change between stepping motions, and it produces smooth, stable transitions both in simulation and in a robotic leg.

This is an improvement over the previous version, which fed sensory information into finite state machines at each joint to bistably switch between flexion and extension. That approach worked, but required very precise timing for proper transitions, and the lack of CPGs meant that faulty sensory information could cause the leg to stop stepping. The current network walks both in simulation and on an actual robot.
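
The gating idea can be illustrated with a toy model. The Python sketch below is not the AnimatLab network; it treats each joint's CPG as a free-running phase oscillator and uses a context-dependent routing table to decide which phase each CPG is reset to when a periodic sensory event arrives. The joint names (ThC, CTr, FTi) and the routing values are illustrative assumptions, not measured parameters.

import numpy as np

# Toy sketch, not the AnimatLab network: each joint's CPG is a free-running
# phase oscillator, and a periodic sensory event resets each CPG to a phase
# chosen by the active context.

JOINTS = ["ThC", "CTr", "FTi"]  # thorax-coxa, coxa-trochanter, femur-tibia

# Hypothetical routing tables: for each context, the phase each CPG is reset
# to when the sensory event arrives. Changing the context changes the order
# in which the joints flex and extend.
CONTEXTS = {
    "walk":         np.array([0.00, 0.50, 0.25]),
    "inside_turn":  np.array([0.50, 0.00, 0.25]),
    "outside_turn": np.array([0.25, 0.50, 0.00]),
}

class JointCPG:
    """Free-running oscillator; phase lives in [0, 1)."""
    def __init__(self, freq_hz=2.0):
        self.freq = freq_hz
        self.phase = np.random.rand()

    def step(self, dt):
        self.phase = (self.phase + self.freq * dt) % 1.0

    def reset(self, phase):
        self.phase = phase % 1.0

def simulate(context="walk", t_end=2.0, dt=0.001, sensory_period=0.5):
    cpgs = [JointCPG() for _ in JOINTS]
    routing = CONTEXTS[context]
    t, next_event = 0.0, sensory_period
    while t < t_end:
        for cpg in cpgs:
            cpg.step(dt)
        if t >= next_event:  # periodic "load/unload" sensory event
            for cpg, offset in zip(cpgs, routing):
                cpg.reset(offset)
            next_event += sensory_period
        t += dt
    return {joint: cpg.phase for joint, cpg in zip(JOINTS, cpgs)}

if __name__ == "__main__":
    for ctx in CONTEXTS:
        print(ctx, simulate(context=ctx))

Switching the context changes only how the sensory event is routed, yet the joints settle into a different relative ordering, which is the same principle the context neurons exploit.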

The videos below show the 3D dynamic simulation of a cockroach walking and turning. The body was made invisible to show the leg motion. The model is supporting its own weight and walking on the ground with friction. Forward walking and turning are indicated on screen, and turning should also be clear because the body begins to pivot. One can see that the legs on the inside begin to reach to the side rather than moving backward as usual, while the legs on the outside make more of a pushing motion.

This video shows a single robotic leg on a test stand walking forward. The same nervous system was simulated and used to control servos acting at the joints of the robotic leg. Servo positions are read back into the network, and strain gauges on the leg provide information about loading. One can see that the same basic motion is present. In addition to making robots better walkers, this work is valuable to biology. By constructing models of what the insect is doing, we may be able to perform experiments more easily and explore how the system works. In addition, the modeler must fill gaps in knowledge with hypotheses, which are testable in the lab. This spurs further research.
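
As a rough illustration of that control loop, and not code from the actual test stand, the Python sketch below assumes hypothetical hardware and simulator interfaces (read_servo_angles, read_strain_gauges, command_servos, set_sensory_inputs, integrate, joint_commands) and shows the sense, simulate, actuate cycle described above.

import time

DT = 0.005  # assumed network integration step, in seconds

def control_loop(network, hardware, duration_s=10.0):
    """Sense -> simulate -> actuate loop for the test-stand leg.

    `network` and `hardware` are hypothetical interfaces; the method names
    below are placeholders for whatever drivers and simulator API are
    actually used.
    """
    t_end = time.time() + duration_s
    while time.time() < t_end:
        angles = hardware.read_servo_angles()   # joint proprioception
        loads = hardware.read_strain_gauges()   # leg loading
        network.set_sensory_inputs(angles=angles, loads=loads)
        network.integrate(DT)                   # advance neuron/synapse states
        hardware.command_servos(network.joint_commands())
        time.sleep(DT)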

The plots below show that the joint motions in the simulation and the animal are in fact quite similar (simulation on left, animal on right). Similar ranges of motion are observed, and the same changes take place between walking and turning. No parameter estimation or system optimization took place; parameters were set by hand, and even with these rough approximations, the model clearly shows the same behavior as the animal. This suggests that the model is replicating some of what makes the insect behave as it does.

[Plots: joint angles during walking and turning, simulation (left) and animal (right)]

AnimatLab, the program used to build this simulation, can be downloaded at http://animatlab.com/Download/AnimatLab10/tabid/281/Default.aspx.


Mantis

Our work with cockroaches shows that biological models of central pattern generators (CPGs) can be coordinated into different stepping patterns to propel a simulated body. But what if we want to make deliberate, precise postural adjustments, for instance, toward a target of interest? How does information from head sensors produce coordinated, arrhythmic joint motions?

Because animals have distributed nervous systems, our model performs all computation at the lowest level possible. For instance, each leg has feedback loops that produce joint torques to regulate the body’s translation in all three dimensions, as well as rotation about a vertical axis. Different joints participate in different control loops depending on their orientation. These loops establish motor modules (synergies): constant proportions of muscle co-activation that move the leg in a particular direction. Information from head sensors changes the set point of one or several of these control loops to move the body in a particular direction, while maintaining the reference position in the other directions.
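
A minimal numerical sketch of this arrangement is given below, with made-up synergy weights and gains rather than values from the model: each controlled body degree of freedom has its own proportional loop, and each loop distributes its output across the leg's joints through a fixed synergy vector.

import numpy as np

# Illustrative values only, not parameters from the model: each controlled
# body degree of freedom (x, y, z translation and yaw) has one proportional
# feedback loop, and each loop spreads its output over the leg's joints
# through a fixed synergy vector (constant torque ratios).

DOFS = ["x", "y", "z", "yaw"]

SYNERGIES = np.array([   # rows: body DOF, columns: joint torque weights
    [ 1.0, -0.4,  0.2],  # translate along x
    [ 0.3,  1.0, -0.5],  # translate along y
    [ 0.0,  0.6,  1.0],  # translate along z
    [-0.8,  0.2,  0.4],  # rotate about the vertical axis (yaw)
])
GAINS = np.array([2.0, 2.0, 2.0, 1.5])  # one proportional gain per loop

def leg_torques(body_state, set_points):
    """Sum the independent loops' contributions into joint torques."""
    errors = set_points - body_state        # per-DOF error
    return SYNERGIES.T @ (GAINS * errors)   # one torque per joint

# A head-sensor command only shifts the set point of one loop (here, yaw
# toward a target); the other loops keep holding their references.
body_state = np.zeros(len(DOFS))
set_points = np.array([0.0, 0.0, 0.0, 0.3])
print(leg_torques(body_state, set_points))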

Commands can also be given to move in multiple directions at once, superimposing their effects and producing apparent changes in coordination. For instance, commanding the model to translate different amounts while rotating the same amount will cause one joint to extend in some cases but flex in others. Reversals like these are seen in other insects fleeing from predators. This model suggests that such reversals are not due to a centralized controller, but instead to the interaction of several distributed control loops driven by relatively simple descending commands, as the sketch below illustrates.
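
Reusing the toy synergy matrix from the sketch above (still illustrative values), the short example below shows how superimposing a constant yaw command with different translation commands can flip the sign of one joint's torque, even though no loop or gain has changed.

import numpy as np

# Same toy synergy matrix and gains as above (illustrative values).
SYNERGIES = np.array([
    [ 1.0, -0.4,  0.2],  # x translation
    [ 0.3,  1.0, -0.5],  # y translation
    [ 0.0,  0.6,  1.0],  # z translation
    [-0.8,  0.2,  0.4],  # yaw
])
GAINS = np.array([2.0, 2.0, 2.0, 1.5])

def leg_torques(errors):
    return SYNERGIES.T @ (GAINS * errors)

yaw_cmd = 0.3
for x_cmd in (0.5, 0.0, -0.5):
    tau = leg_torques(np.array([x_cmd, 0.0, 0.0, yaw_cmd]))
    print(f"x={x_cmd:+.1f}, yaw={yaw_cmd:+.1f} -> joint torques {np.round(tau, 2)}")
# The first joint's torque changes sign as the translation command varies,
# even though the yaw command and all gains are constant: a coordination
# "reversal" produced purely by superimposing independent loops.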
