Archive for the ‘Podcast’ Category

August 13th, 2010

Robots: Distributed Flight Array

In this episode, we discover an aerial modular robot called the Distributed Flight Array. To talk about this, we have Raymond Oung from the Swiss Federal Institute of Technology in Zürich.

Then, to celebrate aerial robotics, we’re holding a contest on flying robot noises for a chance to win a WowWee Bladestar.

Raymond Oung

Raymond Oung is the lead researcher of the Distributed Flight Array project at the Swiss Federal Institute of Technology in Zürich under the supervision of Prof. Raffaello D’Andrea (see previous ROBOTS interview).

The idea behind this project is to design a set of vehicles, each equipped with a single propeller and wheels, that can drive around in search of fellow modules with which to dock. A single module cannot fly stably on its own, but once assembled, the flight array is able to take off and achieve coordinated flight. The modules then detach in mid-air, fall to the floor and repeat their search for other modules.

The main challenge in this system is to come up with a distributed controller that allows the modules to work together to achieve coordinated flight. Because of its virtually endless number of configurations, the Distributed Flight Array is the perfect research and pedagogical testbed for studying control theory for complex systems.
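To give a feel for what "distributed" means here, below is a minimal, purely illustrative Python sketch of one classic building block, consensus averaging: each docked module repeatedly nudges its own thrust estimate toward the average of its neighbours' estimates, so the whole array settles on a common value without any central coordinator. The module names, neighbour graph and gain are invented for the example; this is not the DFA's actual flight controller, which also has to deal with attitude and position dynamics.

# Illustrative sketch only: distributed consensus on a common thrust command.
# NOT the DFA's actual controller; module names, neighbour graph and gain are hypothetical.

def consensus_step(thrust, neighbours, gain=0.2):
    """One synchronous round: each module nudges its thrust estimate toward
    the average of its docked neighbours' estimates."""
    updated = {}
    for module, value in thrust.items():
        if neighbours[module]:
            avg = sum(thrust[n] for n in neighbours[module]) / len(neighbours[module])
            updated[module] = value + gain * (avg - value)
        else:
            updated[module] = value  # an undocked module has no one to agree with
    return updated

# Four docked modules in a ring, each starting from a different local guess.
thrust = {"A": 0.40, "B": 0.55, "C": 0.50, "D": 0.35}
neighbours = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "A"]}

for _ in range(50):
    thrust = consensus_step(thrust, neighbours)

print(thrust)  # all four values end up close to a common thrust setting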

Contest

We were trying to imagine the sound of all of these propellers and then realized it would be fun to record the sound of some of the flying objects here at EPFL. If you manage to match the sounds with the correct robot pictures, we’ll be sending you a WowWee Bladestar. If multiple correct answers are received, the winner will be selected randomly. The contest ends on the 27th of August, and answers can be sent via email to info@robotspodcast.com or posted in the comments section below this episode.

WowWee Bladestar

Audio:

Noises of Flying Robots

Images:

1: Eyebot

2: Airburr

3: SMAV

4: Blimp

5: WowWee DragonFly

6: Eyebot



The correct answer was:
1 -> F
2 -> B
3 -> A
4 -> C
5 -> E
6 -> D


Latest News:
For more information and videos of Ishiguro’s Telenoid R1 and the F1 Robocoaster in action, have a look at our forum.



June 18th, 2010

Robots: Modeling Biology

In today’s episode we speak about modeling biology using robots and how lessons learned through this process can feed back into robotics. Our first guest, Barbara Webb, is a world-renowned expert in the field with several seminal papers on the subject, such as “Using robots to understand animal behavior.” This interview follows up on her previous interview with Talking Robots. Our second guest, Steffen Wischmann, from the EPFL and the University of Lausanne, gives us his in-depth overview of the cross-fertilization between biology and robotics and tells us about his interest in artificial evolution.

Barbara Webb

Barbara Webb is director of the Insect Robotics Group at the Institute of Perception, Action and Behaviour at the University of Edinburgh.

Her group researches and models the sensorimotor capabilities of insects ranging from simple reflexive behaviours such as the phonotaxis of crickets, to more complex capabilities such as multimodal integration, navigation and learning.

While her group carries out behavioural experiments on insects, they principally work on computational models of the underlying neural mechanisms, which are often implemented on robot hardware. We’ll be talking to her about insect-inspired robotics as a control system design approach.
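As a purely illustrative aside, and not Webb's actual model (which is built on spiking neurons and the cricket's real auditory physiology), here is a toy Braitenberg-style phonotaxis sketch in Python: the robot simply steers toward whichever "ear" picks up the calling song more strongly. The directional ear model, gains and angles are all invented for the example.

# Toy phonotaxis sketch, NOT Webb's spiking-neuron model: steer toward the
# louder "ear". Ear model, gains and angles are hypothetical.
import math

def phonotaxis_turn(left_level, right_level, gain=1.0):
    """Turn command: positive turns the robot left, negative turns it right."""
    return gain * (left_level - right_level)

def ear_levels(robot_heading, source_bearing):
    """Crude directional hearing: each ear is most sensitive 45 degrees to its side."""
    offset = source_bearing - robot_heading
    left = max(0.0, math.cos(offset - math.radians(45)))
    right = max(0.0, math.cos(offset + math.radians(45)))
    return left, right

heading = math.radians(90)   # robot starts facing "north"
source = math.radians(0)     # calling song comes from due "east"
for _ in range(100):
    left, right = ear_levels(heading, source)
    heading += 0.05 * phonotaxis_turn(left, right)

print(round(math.degrees(heading), 1))  # heading ends up close to the source bearing (0)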

Steffen Wischmann

Steffen Wischmann is a postdoctoral researcher based at the Laboratory of Intelligent Systems at the EPFL and at the Department of Ecology and Evolution at the University of Lausanne. His current research investigates the evolution and the neural mechanisms of cooperation and communication in biological systems using robotic models. After years of studying the close interaction between robotics and biology, he gives us his opinion on when robotic models are interesting for biology, how faithfully such models should replicate biology, and the use of artificial evolution.


Latest News:
For more information and a video on the Ballbot as well as this year’s robot novelties at the Automatica trade fair, visit the Robots Forum!



April 9th, 2010

Robots: URBI Software Platform

In this episode we look at robotics software platforms with Jean-Christophe Baillie who is the CEO of Gostai, a robotics software company out of Paris.

He tells us about URBI, the software he created to help developers program and control robots, and about his motivation for going open source at the International Conference on Robotics and Automation (ICRA) in a couple of weeks.

Jean-Christophe Baillie

Jean-Christophe Baillie received a PhD in Artificial Intelligence from the University of Paris 6 and the Sony Computer Science Lab, and then founded the Cognitive Robotics Lab at ENSTA/ParisTech.

He tells us about his past work on developmental robotics and, more specifically, on the Talking Heads experiments covered in part in an interview with Talking Robots. During this research, he designed URBI to control complex robotic systems like the AIBO. In 2006, Baillie founded Gostai to further develop the URBI technology, which has since been extended to many robotic platforms such as the Nao humanoid used in the RoboCup robotics competition. He also tells us about his plans to make URBI open source and what that entails in terms of business model.


Latest News:

For videos of the current autonomous Audi TTS rally car prototype, details on NASA’s new autonomous exploration system AEGIS, and videos of Ishiguro’s new android, Geminoid F, have a look at the Robots Forum.




March 26th, 2010

Robots: Chaos Control

In this episode we focus on chaos control and ways to generate unpredictable behaviour. Our first guest, Poramate Manoonpong, is a research associate at the Bernstein Center for Computational Neuroscience in Göttingen, Germany, where he studies ways to make an insect-like robot get out of tricky situations by generating chaotic input to a central pattern generator (CPG) in charge of the robot’s gait. We then speak with Alex Pitti from the University of Tokyo about chaos controllers that can synchronise to the dynamics of the body they are controlling, thus creating more complex behaviours while at the same time simplifying the controller.

Poramate Manoonpong

Poramate Manoonpong is a Thai research associate who works at the Bernstein Center for Computational Neuroscience at the University of Göttingen, Germany. He is currently on a JSPS postdoctoral fellowship at the Department of Brain Robot Interface in Kyoto before returning to the University of Göttingen.

His work “Self-organized adaptation of a simple neural circuit enables complex robot behaviour” was recently published in Nature Physics. In this work, he explains chaos control, CPGs and learning applied to one of his insect-like robots. The CPG, composed of only two neurons, is used to control the walking gait of a robot packed with actuators. By peppering the CPG input with a bit of chaos, the robot is able to get itself out of tricky situations by randomly trying out different walking gaits. Learning is then used to help the robot adapt its gait to save energy depending on the inclination of the slope it is walking up. Interestingly, his work can even contribute to biology.
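For readers curious what a two-neuron CPG can look like, below is a minimal Python sketch of a discrete-time, two-neuron recurrent circuit. The weights, biases and drive values are illustrative placeholders rather than the published parameters; the point is only to show the structure: two sigmoid units feeding back on each other, with an external drive that can push the circuit between different dynamical regimes (a short repeating pattern that can serve as a gait rhythm, or irregular, chaos-like wandering, depending on the parameters).

# Minimal sketch of a discrete-time two-neuron recurrent circuit of the kind
# used as a CPG in this line of work. Weights, biases and drive values are
# illustrative placeholders, NOT the parameters from the Nature Physics paper.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def step(o1, o2, drive, w=(-20.0, 6.0, -6.0, 0.0), b=(-3.0, 4.0)):
    """One update of the two-neuron map; `drive` is the external input that
    gets perturbed to push the circuit between different rhythmic patterns."""
    w11, w12, w21, w22 = w
    n1 = sigmoid(w11 * o1 + w12 * o2 + b[0] + drive)
    n2 = sigmoid(w21 * o1 + w22 * o2 + b[1] + drive)
    return n1, n2

for drive in (0.0, 1.0):
    o1, o2 = 0.1, 0.2
    trace = []
    for t in range(220):
        o1, o2 = step(o1, o2, drive)
        if t >= 200:              # keep a few steps after the transient dies down
            trace.append(round(o1, 3))
    # A short repeating pattern in the trace would serve as a gait rhythm; an
    # irregular one would correspond to the chaotic "search" regime.
    print(f"drive={drive}: {trace}")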

Manoonpong has one of the nicest personal websites in the robotics community, so make sure to have a look around to see some of the insect-like, running or modular robots he’s worked on.

Alex Pitti

Alex Pitti is a researcher at the Intelligent Systems and Informatics Lab at the University of Tokyo in Japan. He’s currently working on the JST ERATO Asada Project, whose goal is to study how infants, some of the world’s most complex learning systems, learn to control the dynamics of their bodies. The lessons learned from infants are then applied to the control of complex robots with many non-linear actuators.

Pitti’s recent work has focused on the interaction between an oscillating controller and the morphology and dynamics of the body it is controlling. He tells us how controllers that synchronise with the material properties of the body can produce much more dynamic motions while at the same time reducing the complexity of the controller. A few simple global parameters can then be used to control highly complex synchronised motions, such as the dynamic hopping or running of a robotic leg.
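To illustrate the synchronisation idea in the simplest possible terms, here is a textbook phase-locked-loop style sketch in Python, not Pitti's method: a rhythmic controller adapts its phase and frequency until it locks onto the rhythm it senses from the body, which is reduced here to a fixed-frequency oscillation. Gains, frequencies and the sensing model are all invented for the example.

# NOT Pitti's controller: a generic phase-locked-loop style sketch of a
# rhythmic controller locking onto the body's natural rhythm. All values
# are illustrative.
import math

k_phase, k_freq, dt = 1.0, 2.0, 0.001
omega_body = 4.0          # hidden natural rhythm of the "body" (rad/s)
omega_ctrl = 7.0          # controller's initial (wrong) drive frequency
phi_body = phi_ctrl = 0.0

for _ in range(30_000):                            # 30 simulated seconds
    err = math.sin(phi_body - phi_ctrl)            # phase error sensed from the body
    phi_ctrl += dt * (omega_ctrl + k_phase * err)  # nudge the controller's phase...
    omega_ctrl += dt * (k_freq * err)              # ...and slowly adapt its frequency
    phi_body += dt * omega_body

print(round(omega_ctrl, 2))  # should settle near the body's 4.0 rad/s rhythm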


Latest News:

The Robots Forum has more information on this episode’s news, including a video of RoboSoft’s new care robot, farewell pictures of one of the first AUVs, Woods Hole’s Autonomous Benthic Explorer (ABE), and some background information on the robotic arm support.




January 15th, 2010

Robots: Deep-Sea Exploration

In today’s show we focus on the great depths of our oceans and robotic vehicles capable of taking us deeper than we ever imagined. Alberto Collasius Jr. tells us about his institution’s highly advanced remotely operated vehicle, or ROV, capable of bringing back high-definition video from over 5 km underwater. We then announce the winner of our Christmas contest, now the proud owner of two Didel SA robot kits.

Alberto Collasius Jr.

Alberto Collasius Jr., or Tito to those who know him, is part of the Applied Ocean Physics and Engineering Department at the Woods Hole Oceanographic Institution in Massachusetts in the US. Collasius spends much of his time at sea as expedition leader with the JASON ROV, which is used throughout the world’s oceans to search for old shipwrecks, underwater volcanoes and deep-sea natural environments that are inaccessible to human-occupied vehicles. He tells us about the particular difficulties involved in operating at depths beyond 5000 m and about the sophisticated sensors and control systems on their advanced ROV and base station.


Click to see a video of the underwater volcanic eruption (photo courtesy of Woods Hole Oceanographic Institution)

Contest

Before Christmas, we asked you “Who made the giant six-legged robot?” for a chance to win the two robot kits offered by Didel SA. It turns out there were actually two answers to this question, either of which qualified our many participants for the lottery. The first possible answer was Julie Townsend from NASA and her ATHLETE robot for lunar missions, which was featured in a recent episode. The second giant six-legged robot was actually called “the giant six legged robot” by its creator, Jaimie Mantzel, who was featured in April of last year.




The lucky winner of our competition is Will Preston, who will be receiving his prize shortly.


Latest News:

For more information on this episode’s news, including some first robotics milestones for 2010, videos of ROV JASON’s close encounter with an underwater volcano, and this year’s robot novelties at CES 2010, visit the Robots Forum!


