Archive for the ‘Podcast’ Category

August 13th, 2010

Robots: Distributed Flight Array

In this episode, we discover an aerial modular robot called the Distributed Flight Array. To talk about this, we have Raymond Oung from the Swiss Federal Institute of Technology in Zürich.

Then, to celebrate aerial robotics, we’re holding a contest on flying robot noises for a chance to win a WowWee Bladestar.

Raymond Oung

Raymond Oung is lead researcher of the Distributed Flight Array project at the Swiss Federal Institute of Technology in Zürich under the supervision of Prof. Raffaello D’Andrea (see previous ROBOTS interview).

The idea behind this project is to design a set of vehicles, each equipped with a single propeller and wheels, that can drive around in search of fellow modules to dock with. Single modules are not stable on their own, but once assembled, the flight array is able to take off and achieve coordinated flight. Modules then detach in mid-air, fall to the ground, and start their search for new partners all over again.

The main challenge in this system is to come up with a distributed controller that allows the modules to work together to achieve coordinated flight. Because of its nearly endless number of possible configurations, the Distributed Flight Array is an ideal research and pedagogical testbed for studying control theory for complex systems.
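To make the idea of a distributed controller a little more concrete, here is a minimal, hypothetical Python sketch (not the project's actual algorithm): every module runs the same code, knows only its own position in the assembled array plus a shared estimate of the array's tilt and altitude error, and sets the thrust of its single propeller accordingly. All names, gains and conventions below are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical, simplified sketch of a distributed hover controller.
# Each module runs this same logic independently; it is NOT the DFA's
# actual control law, just an illustration of the "same code everywhere" idea.

@dataclass
class ArrayState:
    roll: float            # rad, positive = right side down (shared estimate)
    pitch: float           # rad, positive = nose down (shared estimate)
    altitude_error: float  # m, desired altitude minus measured altitude

def module_thrust(x: float, y: float, state: ArrayState,
                  hover_thrust: float = 1.0,
                  k_alt: float = 0.4, k_att: float = 0.8) -> float:
    """Thrust command for one module at position (x, y) relative to the
    array's centre of mass: modules on the low side push harder to level
    the array, and every module applies the same altitude correction."""
    thrust = hover_thrust + k_alt * state.altitude_error
    thrust += k_att * (state.roll * y + state.pitch * x)
    return max(thrust, 0.0)   # a propeller cannot push downward

# Example: an array tilted to the right (positive roll) makes the modules
# on the right side (positive y) spin up, and vice versa.
state = ArrayState(roll=0.05, pitch=0.0, altitude_error=0.1)
for pos in [(0.2, 0.2), (0.2, -0.2), (-0.2, 0.2), (-0.2, -0.2)]:
    print(pos, round(module_thrust(*pos, state), 3))
```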

Contest

We were trying to imagine the sound of all of these propellers and realized it would be fun to record the sounds of some of the flying robots here at EPFL. If you manage to match each sound with the correct robot picture, we'll send you a WowWee Bladestar. If we receive multiple correct answers, the winner will be selected at random. The contest ends on the 27th of August; answers can be sent by email to info@robotspodcast.com or posted in the comments section below this episode.

WowWee Bladestar

Audio:

Noises of Flying Robots

Images:

1: Eyebot

2: Airburr

3: SMAV

4: Blimp

5: WowWee DragonFly

6: Eyebot



The correct answer was:
1 -> F
2 -> B
3 -> A
4 -> C
5 -> E
6 -> D

Links:


Latest News:
For more information and videos of Ishiguro’s Telenoid R1 and the F1 Robocoaster in action, have a look at our forum.


Related episodes:

July 2nd, 2010

Robots: R&D at iRobot

In this episode we look at the research and development (R&D) done at iRobot for government applications, with lead roboticist Brian Yamauchi.

Brian Yamauchi

Brian Yamauchi is Lead Roboticist at iRobot in Bedford, MA, where he leads many of the government projects aimed at helping soldiers and first-response teams do their work.

During this interview, Yamauchi covers some of the developments made over the past 10 years, most of which are based on the PackBot robot. In particular, he'll be telling us how they make these robots more robust and what sensors they're using to increase autonomy and even map out the world. One of these sensors, an ultra-wideband radar, was presented at this year's ICRA conference in Alaska (paper).

Beyond the single PackBot, Yamauchi is now looking at how to make robots collaborate, with examples including terrestrial and aerial robot teams and mobile wireless transmitters for the quick deployment of communication networks.

Moreover, because many of the government robots developed at iRobot are being used in Iraq or Afghanistan, he'll be telling us about research into good soldier-robot interaction and the ethics of military robots.

Finally, we'll be learning more about the business side of iRobot and the futuristic projects they're working on, such as the chembot and jambot projects that involve making soft and deformable robots (see video below).

Before working at iRobot, Yamauchi completed a PhD in Computer Science at Case Western Reserve University and worked at the Naval Research Laboratory in Washington, DC.

Poll

In this week's episode we'll be asking for your take on the cross-fertilization between the military and robotics. Make sure you take the poll and join the debate in the comments section below or on our forum.

Links:


Latest News:
For videos of this week’s Robots news, including the autonomous robot lifeguard and the sand swimming salamander robot, have a look at the Robots Forum.


Related episodes:

May 7th, 2010

Robots: 50 Years of Robotics (Part 2)

Welcome to the second part of our 50th episode special! To celebrate 50 episodes of Robots, we’re doing a review of some of the greatest advances in robotics during the last 50 years, and predictions on what we can hope to see in the next half century. In last week’s episode we covered embodied AI, robot toys, androids, underwater robots, education robots and brain-machine interfaces.

In today's episode we speak with Jean-Christophe Zufferey on flying robots, Dan Kara on the robotics market, Kristinn Thórisson on AI, Andrea Thomaz on human-robot interaction, Terry Fong on space robotics and Richard Jones on nano robots.

Finally, don't forget to check out all the new features of our website, including browsing episodes by topic, interviewee and tag, and leaving comments under our blog posts or in the forum.

Jean-Christophe Zufferey

Jean-Christophe Zufferey is a researcher at the Laboratory of Intelligent Systems at the Swiss Federal Polytechnic in Lausanne, Switzerland, where he works on cutting-edge research in Micro Air Vehicles (MAVs). His latest advances have led him to create the startup SenseFly that specializes in small and safe autonomous flying systems for applications such as environmental monitoring and aerial photography.

Dan Kara

Dan Kara is President of Robotics Trends and the Robotics Business Review, web portals and research firms specialized in the robotics market. He'll be telling us about past products that have left a lasting mark and the future developments most likely to bring in the revenue.

Kristinn R. Thórisson

Kristinn Thórisson is Associate Professor at the School of Computer Science at Reykjavik University in Iceland. Active in the field of Artificial Intelligence for a couple of decades, Thórisson is pioneering new approaches such as constructivist AI, which he hopes will bring us towards more adaptive and complex artificial systems.

Andrea Thomaz

Andrea Thomaz is a professor at Georgia Tech and the director of the Socially Intelligent Machines Research Laboratory. Lately, she's been seen with her new humanoid Simon and his expressive traits. We were able to catch her at this year's ICRA conference for a little chat on the past and future of human-robot interaction.

Terry Fong

Terry Fong is the Director of the Intelligent Robotics Group at the NASA Ames Research Center. As an expert in space robotics, he’ll be telling us about robots leaving the solar system to explore our universe and how humans and robots will work together towards this endeavor.

Richard Jones

Richard Jones is the author of the book Soft Machines: Nanotechnology and Life and of a blog on the subject, also named Soft Machines. Based at the University of Sheffield in the UK, where he is Professor of Physics, Jones has been looking at how to make nanoscale robots that could eventually be used in the body for medical applications.

Links:


Latest News:

For more information on this episode’s news, including a video of Kumagai’s balancing BallIP robots, McGill’s rapid ice sculpture prototyping system, and Stanford’s perching UAV as well as more coverage from the ICRA 2010 conference, visit the Robots Forum.


Related episodes:

January 29th, 2010

Robots: Quadrotors

Today's show is centered on robots in the air, and more specifically on Unmanned Aerial Vehicles (UAVs) of the quadrotor variety. We chat with Joshua Portlock from Cyber Technology about their portfolio of different-sized UAVs, with special emphasis on the CyberQuad, a four-rotor helicopter with advanced autonomous capabilities. Near the end of the show we also start what will hopefully be an animated debate on what exactly the definition of a robot is, so join in the discussion!

Joshua Portlock

Joshua Portlock is the project manager of the CyberQuad project at Cyber Technology, based in Perth, Australia. Portlock tells us about his company's fleet of UAVs and their increasingly broad range of applications in the civil market. He then gets into the nitty-gritty of his own creation, the CyberQuad, a four-rotor autonomous aircraft that is the result of years of research started while he was still an engineering student at the Curtin University of Technology.




The CyberQuad is a highly optimized quadrotor that uses ducted fans to increase the efficiency of the drivetrain and provide protection from obstacles. Recently featured in Wired magazine, the platform can fly and hover in constrained environments and has already been used to visually inspect oil platforms and bridges, and to search for bushfires in the Australian outback before they get out of control.




What is a Robot?

Have you ever wondered what a robot really is? Over coffee the other day we were trying to find a sleek and simple one-size-fits-all definition for all the robots we've covered on the show, from molecular robots to smart houses, humanoids, and flying, crawling and jumping robots. However, for every definition we came up with, there was a counter-example that either didn't fit the definition, or did fit it without really being what we think of as a robot! For example, the definition "A robot is a machine with inputs and outputs" was not satisfying because a calculator fits that definition although it is not a robot.

Therefore, every episode from now on will explore a new or modified definition and submit it to the "counter-example" test until we are satisfied with the result. We'll be asking our friends, colleagues and you, our listeners, for your best answer to the question "What is a robot?". If you think you have a good answer, email us your short definition at info@robotspodcast.com along with your phone number so that we can call you and ask you directly on the air!
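Purely for fun, the "counter-example test" can be pictured in a few lines of Python: a candidate definition is a predicate over a machine's properties, and it fails as soon as it misclassifies something. The property names and example machines below are made up for illustration; only the calculator counter-example comes from the discussion above.

```python
# A playful sketch of the "counter-example test". A definition is a
# predicate; it fails if it admits something we would not call a robot
# (or rejects something we would). All entries are invented examples.

machines = {
    "calculator":  {"has_inputs": True, "has_outputs": True,
                    "senses_world": False, "acts_on_world": False,
                    "is_robot": False},
    "PackBot":     {"has_inputs": True, "has_outputs": True,
                    "senses_world": True, "acts_on_world": True,
                    "is_robot": True},
    "smart house": {"has_inputs": True, "has_outputs": True,
                    "senses_world": True, "acts_on_world": True,
                    "is_robot": True},
}

def definition_v1(m):
    """'A robot is a machine with inputs and outputs.'"""
    return m["has_inputs"] and m["has_outputs"]

def counter_examples(definition):
    """Return every machine the candidate definition misclassifies."""
    return [name for name, m in machines.items()
            if definition(m) != m["is_robot"]]

print(counter_examples(definition_v1))  # ['calculator'] -- back to the drawing board
```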

Links:


Latest News:

For more information on the future of the Spirit Mars Rover, the plans for a fuel-cell powered exoskeleton and Korea’s Mahru-Z robot, visit the Robots Forum.

View and post comments on this episode in the forum


Related episodes:

August 14th, 2009

Robots: Brain-Machine Interfaces

In today’s show we’ll be speaking with two experts in the field of brain-machine interfaces. Our first guest, Charles Higgins from the University of Arizona tells us how he uses insects to control robot motion and how they might be used in the future to develop new biological sensors for artificial systems.
We then speak with Steve Potter from the Georgia Institute of Technology. Instead of taking a fully developed brain and connecting it to a robot, he grows neural circuitry in a Petri-dish and interfaces it with robots, with the ambition to discover how we learn and memorize.

Charles Higgins

Charles Higgins is an associate professor and the leader of the Higgins Lab at the University of Arizona. Though he started as an electrical engineer, his fascination with the natural world has led him to study insect vision and visual processing, and to try to meld together the worlds of robotics and biology. This fascination, and his interest in sharing it with others, brings him every year to the Neuromorphic Engineering Workshop in Telluride, Colorado, where he met our interviewer Adam and took him dragonfly-hunting!

Higgins first tells us about his experiments with natural systems such as dragonflies, and how he's learning about the way their brains work in the hope of applying some concepts from neurobiology to engineered systems. He then talks about his most recent work in trying to use the amazing visual system of a dragonfly as a sensor to control a robot, and in turn to provide motion stimulus back to the dragonfly in a closed-loop system. He finishes by telling us a bit about a future in which we will design insect-inspired robots, or even have insects built into them directly!

Steve Potter

Steve Potter is the Director of the Potter Group, which is part of the Laboratory for NeuroEngineering, a collective research unit shared between Emory University and the Georgia Institute of Technology. To understand how the neurocircuitry of the brain can lead to learning and memory, he's been growing neural circuits in Petri dishes and hooking them up to the sensors and actuators of robots. The embodiment provides the stimulus needed for the brain to develop. Because the neurons are in a dish, they can easily be monitored over time, providing a close-up sneak peek into brain activity.

Robots that have been hooked up to this system include the Koala and Khepera wheeled robots from K-Team, as well as a robotic artist named MEART (Multi-Electrode Array Art). MEART was built in collaboration with the SymbioticA Research Group and went on tour around the world, drawing pictures based on stimulation from its in-vitro brain and feeding back camera images of its art. After weeks of stimulation, the cultured brain actually calms down, providing insight into the possible treatment of epilepsy.
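As a rough mental model of such an embodiment loop, here is a heavily simplified Python sketch. The mea and robot objects and their methods (read_firing_rates, drive, read_distance_sensors, stimulate) are placeholders invented for illustration; they do not correspond to the actual interfaces used by the Potter Group.

```python
import time

# Heavily simplified sketch of a closed embodiment loop between a cultured
# neural network on a multi-electrode array (mea) and a wheeled robot.
# All object interfaces are hypothetical placeholders.

def neural_to_motor(rates):
    """Map mean firing rates of two electrode groups to wheel speeds."""
    left = sum(rates["left_group"]) / len(rates["left_group"])
    right = sum(rates["right_group"]) / len(rates["right_group"])
    return left, right

def sensors_to_stimulation(distances):
    """Closer obstacles translate into stronger stimulation pulses."""
    return [max(0.0, 1.0 - d) for d in distances]

def embodiment_loop(mea, robot, period_s=0.1):
    """Each cycle: record the culture, drive the robot, stimulate back."""
    while True:
        rates = mea.read_firing_rates()                    # activity in the dish
        robot.drive(*neural_to_motor(rates))               # activity drives the robot
        distances = robot.read_distance_sensors()          # what the robot senses...
        mea.stimulate(sensors_to_stimulation(distances))   # ...is fed back as stimulation
        time.sleep(period_s)
```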



MEART Robotic Arm

Finally, Potter gives us his take on whether these hybrid living robots (Hybrots), or Animats, are more life than machine.

Links:


Latest News:

For more information on the LEGO Moonbots challenge, the AUVSI conference and the Evolta robot, visit the Robots Forum.
View and post comments on this episode in the forum


Related episodes: