Archive for the ‘Podcast’ Category

June 20th, 2008

Robots: A Robot Fly at Harvard and at the MoMA - Transcript

This episode features an interview with Robert Wood about his micro-robotic fly, as well as a talk with the curator of design at the Museum of Modern Art in Manhattan.

Rob Wood

Rob Wood's Robot fly

Professor Robert Wood is the founder and director of the Harvard Microrobotics Lab at Harvard University. He started out in Ron Fearing's Biomimetic Lab at Berkeley, working on the Micromechanical Flying Insect (MFI) project (see Talking Robots interview). Drawing on his experience designing at the tiny scale, he went on to build his own microscale robots for aerial, terrestrial, and aquatic environments. His recent article in IEEE Spectrum Magazine, "Fly, Robot Fly", describes the first flight of his tethered fly:

« It began when I took a stick-thin winged robot, not much larger than a fingertip, and anchored it between two taut wires, rather like a miniature space shuttle tethered to a launchpad. Next I switched on the external power supply. Within milliseconds the carbon-fiber wings, 15 millimeters long, began to whip forward and back 120 times per second, flapping and twisting just like an actual insect’s wings. The fly shot straight upward on the track laid out by the wires. As far as I know, this was the first flight of an insect-size robot. »



Now that the micromechanical structure has proven it can generate enough thrust to lift the robot off the ground, the questions turn to how to power the robotic insect and what sensors and control systems could allow it to carry out its intended long-term applications, namely search and rescue, hazardous-environment exploration, environmental monitoring, and reconnaissance.

Wood also gives us some insight into how biology has been driving his research and how he hopes to return the favor by using his platform to study flies in nature.

Controlled chaotic flight, robot insect swarms, tech-driving miniaturization… let's wait and see.

Paola Antonelli

Rob Wood’s robotic fly was featured as part of an exhibition at the Museum of Modern Art (MoMA) in New York City entitled Design and the Elastic Mind. We had a talk with Paola Antonelli, the curator of the Department of Architecture and Design at MoMA, about the role of design in helping people cope with momentous changes in science and technology. How will designers help people adapt as robots become ubiquitous in our daily lives? How does our experience in nature affect the design of future robotic systems? Paola takes us through a brief tour of a designer’s perspective of science and technology.


Latest News:

Visit the Robots Forum for links and discussions about iRobot's "Seaglider" underwater robot, the DARPA contract awarded to iRobot for the Chembot, the sale of the autonomous car "Odin", and EMA the robotic girlfriend, all mentioned in the podcast.

View and post comments in the forum


Related episodes:

June 6th, 2008

Robots: Cornell Racing Team and Velodyne’s LIDAR Sensor - Transcript

Our inaugural episode centers on the 2007 DARPA Urban Challenge, featuring interviews with professor Daniel Huttenlocher from Team Cornell and Rick Yoder from Velodyne, a producer of LIDAR sensors used by several teams in the robot car race.

Dan Huttenlocher

Team Cornell's Robot Racing Team

Dan Huttenlocher is professor of Computing, Information Science and Business at Cornell University in Ithaca, New York. As co-leader of Cornell's racing team for the 2007 DARPA Urban Challenge, he spent countless hours testing the autonomous car, which finished among the six final automobiles capable of following California's road code over 56 miles of a mock urban environment. With design in mind, his team of 13 students managed to discreetly outfit a slick black 2007 Chevy Tahoe with a Velodyne LIDAR, three IBEO 1.5D LIDARs, five 1D SICK LIDARs, five millimeter-wave radars, and four cameras. Of course, millions of data points per second don't come for free, and Cornell's trunk is home to 17 dual-core processors.

Since a pile of impressive hardware and CPUs is not enough, Team Cornell also developed the artificial intelligence and control software needed to let their robot represent its position on the road locally and then figure out, on a more global scale, where it really was in the world. Moreover, the Cornell car also needed to localize and track other objects in the environment and, ideally, reason about their next moves. So what went wrong in the little fender bender with MIT's car (see video below)? I guess the professional human drivers at the challenge weren't wrong when they said that Cornell's car drove like a human.



Velodyne LIDAR

Velodyne's LIDAR robot sensor

Rick Yoder is an employee at Velodyne, a newcomer in the field of LIDAR (Light Detection and Ranging) sensors. The HDL-64E LIDAR uses an impressive 64 fixed lasers mounted on a base rotating at 900 rpm. The sensor was designed specifically for the 2007 DARPA Urban Challenge and was used by around a third of the participating teams, although others may have been put off by the hefty US $75,000 price tag! Though not yet destined for the consumer market, Rick hints at a new series of sensors that may soon find their way into your car.


Latest News:

Visit the Robots Forum for links and discussions about the IEEE Spectrum Magazine’s Singularity, Robin Murphy’s Survivor Buddy, Georgia Tech’s Sandbot, the rapid-prototyper robot “RepRap” and the Japanese Navirobo teddybear mentioned in the podcast.

View and post comments in the forum


Related episodes:

April 25th, 2008

Talking Robots: Neurobotic Prosthetics
Go to original website

In this episode of Talking Robots we speak with Yoky Matsuoka, director of the Neurobotics Laboratory at the University of Washington in Seattle, USA. Boosted by her nomination as a MacArthur Fellow, she has been recognized as a leader in the emerging field of neurobotics. With her team, she focuses on understanding how the central nervous system coordinates musculoskeletal action and how robotic technology can enhance the mobility of people with manipulation disabilities.

Read more...

Related episodes:

February 29th, 2008

Talking Robots: BioMicroRobots
Go to original website

In this episode we interview Brad Nelson, Professor of Robotics and Intelligent Systems at ETH Zürich. Working at the roots of BioMicroRobotics, Nelson has designed microrobots for retinal surgery applications. Pushing the principle of "embodiment" to the extreme, he embeds the intelligence of his robots within their physical bodies. In the end, their shape, material, and physical properties allow them to interact with the environment and thereby harvest energy, perform sensing, and navigate through the human body. Using similar principles, Nelson's lab won the 2007 RoboCup Nanogram Competition, the first year the event was held. The goal was to use autonomous microrobots smaller than 300 µm to perform a series of soccer-related tasks.

Read more...

Related episodes:

January 18th, 2008

Talking Robots: Autonomous Robots
Go to original website

In this interview we talk to Roland Siegwart, full professor at the Autonomous Systems Lab at ETH Zurich. Drawing on his experience with the 18 robots he has created, he shares his know-how on autonomous robotics and the ongoing research on robot navigation, localization, and mapping.

Read more...

Related episodes: