In this episode we hear from researchers at the Harvard Microrobotics Lab about the Science paper published today reporting the first controlled flight of an insect-sized robot. The amazing high-speed video below shows the robot taking off, hovering in place, and steering left and right. This work is part of the RoboBees project, which aims to make swarms of insect robots. You can read our full coverage on Robohub.
Kevin Ma, Pakpong Chirarattananon and Sawyer Fuller
Kevin Ma and Pakpong Chirarattananon are graduate student researchers at the Harvard Microrobotics Lab working with Prof. Robert Wood (listen to Wood’s podcast here). Kevin studies the design and manufacturing of very small-scale robots, while Pakpong’s work focuses on flight control strategies for flapping-wing robots. Sawyer Fuller is a postdoctoral researcher with experience in the control and sensing of biological and robotic flies.
In this episode Per talks to Michael Mangan from the University of Edinburgh about using robotics to study and replicate insect behaviour. Mangan describes his studies of desert ants, which are able to navigate accurately through arduous environments despite having a very small brain (fewer than 400,000 neurons). This is an interesting problem because the desert environment is very challenging: it is too hot for pheromone navigation and nearly featureless, making visual navigation difficult.
Michael Mangan
Michael Mangan trained as an avionics engineer at the University of Glasgow, later deciding to specialize in robotics after taking a course. At that time he was particularly inspired by biorobotics projects in the press, such as MIT’s Robot Tuna and Penguin Boat projects. He was drawn to this approach, which promises improved performance on engineering tasks by taking inspiration from biological systems that solve similar problems.
Keen to work in this area, he then moved to the Insect Robotics Lab at the University of Edinburgh to undertake a PhD with Prof. Barbara Webb (see previous podcast interview). This lab combines robotics techniques with animal behavioural experiments in a synergistic loop aimed at revealing how these organisms achieve such impressive behaviours despite their limited neural hardware and often low-resolution sensory systems. Revealing the parsimonious techniques used by these animals may then allow us to apply them to robotic systems.
Mangan’s current research focuses on the navigational abilities of desert ants. These ants scavenge for food over long distances despite searing surface temperatures, where pheromone trails evaporate too quickly to use for guidance. Instead, the ants rely mainly on visual cues. He has recently documented the impressive individual route-following behaviour of desert ants in southern Spain, and mapped their habitat for the first time. This has allowed the first rigorous testing of robotic and biologically plausible models of navigation in the ant’s world, as viewed by the ant.
Mangan is currently constructing these virtual worlds for public use and they will be available from www.AntNav.org. This webpage is currently under development but he hopes to have initial data uploaded soon, so stay tuned.
She is now involved in FILOSE, a robotic fish locomotion and sensing project whose team builds robots that mimic how fish react and adapt to the water flow around them. In the first part of the interview, Professor Kruusmaa talks about why they are using a novel soft, compliant body for their robotic fish rather than the more common linked chain. She describes how this embodiment helps reduce the computational load and how it allows them to make a simpler, cheaper robot that is more reliable than a more rigid version would be. We also hear about the opportunities that come from sensing and adapting to the flow, and the advantages of robotic fish over conventional underwater vehicles, before talking about possible applications such as underwater archeology.
Professor Kruusmaa has been the R&D Director of Fits.me since 2009, working alongside COO Diana Saarva. They have created a virtual fitting room that enables users to try on clothes virtually before buying them, with the help of shape-shifting robotic mannequins that can grow from slim to muscular in just a few moments. Buyers enter their measurements and see what the clothes would look like on them.
Fits.me robotic mannequin
In the second part of the interview, Professor Kruusmaa and Diana Saarva talk about the Fits.me idea. It is particularly interesting to hear about how they developed the cooperation between the technology/research side and the entrepreneurs/business side.
Diana Saarva joined Fits.me in September 2009 and became the COO in 2011. She is responsible for supervising and coordinating all client operations and for developing new business.
Whether you’re looking at multicellular organisms or social insects such as ants and termites, nature has found powerful ways to make systems self-organize. In these collectives, individuals that are typically simple, unreliable, and limited, cooperate through local interactions to achieve complex behaviors.
Radhika Nagpal has been building on these principles to make modular and swarm robots that work together in a decentralized manner. She tells us about a self-balancing modular table that adapts to the terrain while balancing your cup of coffee. In the TERMES project, robots work together to build the environment in which they operate, creating the very staircase that will allow them to construct a structure. We also look at how her group has made large-scale swarm robotics a reality with the Kilobot project and its 1024 quarter-sized robots, previously featured on our podcast.
Finally, Nagpal tells us about how her insights in mathematics and the theory of self-organization can also help us learn something about biological systems.
He presents his work on self-reconfigurable modular robots done as part of the DRAGON (Distributed Real-time Autonomously Guided OrgaNisms) project. His snake-inspired robot is composed of a set of modules and DRAGON joints that enable the robot to physically connect and disconnect, share energy, and communicate. He tells us about the challenges in building such a robot, including making smart mechanical docking systems, integrating all the functional requirements of a joint into a single mobile structure, and using Model Predictive Control to generate robot motion.
In his current work, Nilsson is focusing on integrating sensor readings to obtain precise motor control. As inspiration, and in collaboration with neurophysiologists, he looks at how the cerebellum fuses proprioceptive sensing and touch to achieve precise motions in humans.