Archive for the ‘Podcast’ Category

December 27th, 2014

Robots: 3D SLAM

In this episode, Audrow Nash speaks with Professor John Leonard from MIT about his research on dense, object-based 3D Simultaneous Localization And Mapping (SLAM).

Leonard explains what SLAM is, as well as its practical applications, including what it means for SLAM to be object-based (versus feature-based) and to use dense (versus sparse) environmental mapping. The interview closes with advice for aspiring roboticists.
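To make that distinction concrete, below is a minimal, hypothetical sketch (not Leonard's actual system) contrasting a sparse feature-based map, which stores isolated point landmarks, with a dense object-based map, which stores whole objects together with their poses and surface points. All class and field names are illustrative.

# Minimal sketch: sparse feature-based vs. dense object-based map
# representations for SLAM. Hypothetical data structures for illustration only.
from dataclasses import dataclass, field

@dataclass
class SparseFeatureMap:
    # Each landmark is just an (x, y, z) point with an ID.
    landmarks: dict = field(default_factory=dict)   # id -> (x, y, z)

    def add_landmark(self, lid, position):
        self.landmarks[lid] = position

@dataclass
class ObjectLandmark:
    label: str            # semantic class, e.g. "chair"
    pose: tuple           # (x, y, z, roll, pitch, yaw) in the map frame
    model_points: list    # dense surface points belonging to this object

@dataclass
class DenseObjectMap:
    objects: list = field(default_factory=list)

    def add_object(self, obj: ObjectLandmark):
        # Object-based SLAM reasons over whole-object poses rather than
        # individual point features, so a loop closure can be expressed as
        # "I am seeing this chair again".
        self.objects.append(obj)

# Tiny usage example with made-up coordinates.
sparse = SparseFeatureMap()
sparse.add_landmark(0, (1.0, 2.0, 0.5))
dense = DenseObjectMap()
dense.add_object(ObjectLandmark("chair", (1.0, 2.0, 0.0, 0.0, 0.0, 0.0), [(1.02, 2.01, 0.4)]))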

John Leonard
John J. Leonard is Professor of Mechanical and Ocean Engineering and Associate Department Head for Research in the MIT Department of Mechanical Engineering. He is also a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). His research addresses the problems of navigation and mapping for autonomous mobile robots. He holds a B.S.E.E. in Electrical Engineering and Science from the University of Pennsylvania (1987) and a D.Phil. in Engineering Science from the University of Oxford (1994), where he studied under a Thouron Fellowship and a Research Assistantship funded by the ESPRIT program of the European Community.

Prof. Leonard joined the MIT faculty in 1996, after five years as a Post-Doctoral Fellow and Research Scientist in the MIT Sea Grant Autonomous Underwater Vehicle (AUV) Laboratory. He has served as an associate editor of the IEEE Journal of Oceanic Engineering and of the IEEE Transactions on Robotics and Automation. He was team leader for MIT’s DARPA Urban Challenge team, one of eleven teams to qualify for the Urban Challenge final event and one of six to complete the race. He is the recipient of an NSF CAREER Award (1998), an E.T.S. Walton Visitor Award from Science Foundation Ireland (2004), and the King-Sun Fu Memorial Best Transactions on Robotics Paper Award (2006), and he is an IEEE Fellow (2014).

Links:


May 31st, 2013

Robots: Curved Artificial Compound Eye

In this episode, we speak with Ramon Pericet Camara and Michal Dobrzynski from EPFL about their Curved Artificial Compound Eye (CurvACE), published in the Proceedings of the National Academy of Sciences. Inspired by the fly’s vision system, their sensor enables a large range of applications that require motion detection using a small plug-and-play device. As shown in the video below, such sensors could be used to control small robots navigating an environment, even in the dark, or to equip a small autonomous flying robot with limited payload. Other applications include home automation, surveillance, medical instruments, prosthetic devices, and smart clothing.


The artificial compound eye features a panoramic, hemispherical field of view with a resolution identical to that of the fruit fly, in a package less than 1 mm thick. It can also extract images three times faster than a fruit fly, and its neuromorphic photoreceptors allow motion perception in a wide range of lighting conditions, from a sunny day to moonlight. To build the sensor, the researchers align an array of microlenses, an array of photodetectors, and a flexible PCB that mechanically supports and electrically connects the ensemble.
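To give a flavour of how motion can be computed from neighbouring photoreceptors, the sketch below implements a Reichardt-style elementary motion detector, the classic fly-inspired scheme. It is only an illustration under that assumption and is not the actual CurvACE signal processing.

# Minimal sketch of a Reichardt-style elementary motion detector (EMD):
# two neighbouring photoreceptor signals are delayed and cross-multiplied,
# and the difference of the two branches gives a direction-selective output.
# Illustrative only; not the CurvACE implementation.

def low_pass(signal, alpha=0.1):
    """Simple first-order low-pass filter acting as the delay element."""
    out, state = [], 0.0
    for s in signal:
        state += alpha * (s - state)
        out.append(state)
    return out

def reichardt_emd(left, right, alpha=0.1):
    """Return per-sample motion estimates from two photoreceptor signals."""
    left_delayed = low_pass(left, alpha)
    right_delayed = low_pass(right, alpha)
    # Positive output ~ motion from left to right, negative ~ right to left.
    return [ld * r - rd * l for ld, r, rd, l in zip(left_delayed, right, right_delayed, left)]

# Example: a brightness edge passes over the left receptor, then the right one.
left = [0, 0, 1, 1, 0, 0, 0, 0]
right = [0, 0, 0, 0, 1, 1, 0, 0]
print(reichardt_emd(left, right))  # mostly positive values: left-to-right motion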

This work is part of the European project CurvACE, which brings together a total of 15 people from four partners in France, Germany and Switzerland.

You can read our full coverage about this new sensor on Robohub.

Ramon Pericet Camara
Ramon Pericet Camara is the scientific coordinator for the CurvACE project and a postdoctoral researcher at the Laboratory of Intelligent Systems at EPFL. His research interests are oriented towards bio-inspired robotics, soft robotics, and soft-condensed matter physics.

Ramon received a master’s degree in Physics in 2000 from the University of Granada (Spain) and a PhD in Multidisciplinary Research from the University of Geneva (Switzerland) in 2006. Subsequently, he was granted a fellowship for prospective researchers from the Swiss National Science Foundation to join the Max Planck Institute for Polymer Research in Mainz (Germany).

Michal Dobrzynski
Michal Dobrzynski is a PhD student at the Laboratory of Intelligent Systems at EPFL. He obtained his master’s degree in Automatic Control and Robotics in 2006 from the Warsaw Technical University (Poland). He then joined the SGAR S.L. company (Barcelona, Spain) as a Robot and PLC Software Engineer, where his work focused on programming and visualization for industrial robots and automated lines. In 2007, he joined the Numerical Methods Laboratory at the University Politehnica of Bucharest (Romania), where he spent two years working as a researcher in the FP6 “Early Stage Training 3” project.

Links:


February 10th, 2012

Robots: Senseable Robots

In today’s episode we look at some of the work done by the Senseable City Lab. We’ll be talking to Carlo Ratti, the director of the Lab, about two of its many projects: Flyfire and Seaswarm.

Carlo Ratti

An architect and engineer by training, Carlo Ratti practices in Italy and teaches at the Massachusetts Institute of Technology, where he directs the Senseable City Lab. He graduated from the Politecnico di Torino and the École Nationale des Ponts et Chaussées in Paris, and later earned his MPhil and PhD at the University of Cambridge, UK.

As well as being a regular contributor to the architecture magazine Domus and the Italian newspaper Il Sole 24 Ore, Carlo has written for the BBC, La Stampa, Scientific American and The New York Times. His work has been exhibited worldwide at venues such as the Venice Biennale, the Design Museum Barcelona, the Science Museum in London, GAFTA in San Francisco and The Museum of Modern Art in New York. His Digital Water Pavilion at the 2008 World Expo was hailed by Time Magazine as one of the ‘Best Inventions of the Year’. Carlo was recently a presenter at TED 2011 and is serving as a member of the World Economic Forum Global Agenda Council for Urban Management. He is also a program director at the Strelka Institute for Media, Architecture and Design in Moscow and a curator of the 2012 BMW Guggenheim Pavilion in Berlin.

Carlo founded the Senseable City Lab in 2004 within the City Design and Development group at the Department of Urban Studies and Planning, in collaboration with the MIT Media Lab. The Lab’s mission is to creatively intervene and investigate the interface between people, technologies and the city. While fostering an interdisciplinary approach, the Lab’s work draws on diverse fields such as urban planning, architecture, design, engineering, computer science, natural sciences and economics to capture the full nature of urban problems and deliver research and applications that empower citizens to make choices that result in a more liveable urban condition.

Links:


January 28th, 2011

Robots: Odor Source Localization

In this episode we revisit robot olfaction and take a closer look at the problem of odor source localization. Our first guest, Hiroshi Ishida from the Tokyo University of Agriculture and Technology, is an expert in the field whose sniffing robots range from blimps to ground and underwater robots. Our second guest, Thomas Lochmatter from EPFL, talks about the tradeoffs between biologically inspired and probabilistic approaches to navigating a gas plume.

Hiroshi Ishida

Hiroshi Ishida is Associate Professor in the Department of Mechanical Systems Engineering, Tokyo University of Agriculture and Technology, Japan.
The focus of his research group is to develop robots that can find the sources of airborne gas plumes or underwater chemical plumes. To this end, they developed the Active Stereo Nose, a differential gas sampling system inspired by the dog’s nose, and the Crayfish robot, which mimics the mechanism crayfish use in nature to create unidirectional water currents.
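As a toy illustration of the idea behind differential gas sampling (and not Ishida’s actual controller), a robot with two laterally separated gas sensors can simply steer toward the side reporting the higher concentration. The read_left and read_right callables below are hypothetical sensor hooks.

# Toy sketch of differential ("stereo nose") gas-gradient steering:
# turn toward whichever of two laterally separated gas sensors reads a
# higher concentration, while driving forward. Hypothetical sensor/actuator API.

def steer_toward_plume(read_left, read_right, gain=0.5, forward_speed=0.2):
    """Return (linear_velocity, angular_velocity) from two sensor readings."""
    left, right = read_left(), read_right()
    # Positive angular velocity turns left; steer toward the stronger reading.
    angular = gain * (left - right)
    return forward_speed, angular

# Example with fake readings: the plume is stronger on the right side.
lin, ang = steer_toward_plume(lambda: 0.3, lambda: 0.8)
print(lin, ang)  # angular < 0, i.e. turn right toward the stronger signal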

Thomas Lochmatter

During his PhD at the Distributed Intelligent Systems and Algorithms Lab at EPFL in Switzerland, Thomas Lochmatter developed a modular odor-sensing system for the Khepera III robot. His research focused on the pros and cons of biologically inspired and probabilistic algorithms for odor source localization, in both single- and multi-robot systems.
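For contrast with reactive, bio-inspired strategies, a probabilistic approach might maintain a grid of candidate source locations and update it from odor detections and the wind direction. The sketch below is a rough, hypothetical illustration of that idea, not any of the specific algorithms discussed in the episode.

# Toy sketch of a probabilistic flavour of odor source localization:
# keep a grid of source-location probabilities and raise the weight of
# cells the robot is downwind of whenever an odor "hit" is detected.
# Illustrative only; all parameters are made up.
import math

def update_source_grid(grid, robot_xy, wind_dir, hit):
    """grid: dict {(x, y): probability}; wind_dir: radians the wind blows toward."""
    rx, ry = robot_xy
    for (x, y), p in grid.items():
        # Angle from the candidate source cell to the robot.
        angle = math.atan2(ry - y, rx - x)
        aligned = math.cos(angle - wind_dir)  # ~1 if the robot is downwind of (x, y)
        likelihood = 0.5 + 0.4 * aligned if hit else 0.5 - 0.4 * aligned
        grid[(x, y)] = p * max(likelihood, 1e-6)
    total = sum(grid.values())
    for cell in grid:
        grid[cell] /= total  # renormalize to a probability distribution
    return grid

# Example: wind blows toward +x, robot at (2, 1) detects odor, so cells
# upwind of the robot (smaller x) gain probability.
grid = {(x, y): 1.0 / 9 for x in range(3) for y in range(3)}
print(update_source_grid(grid, robot_xy=(2, 1), wind_dir=0.0, hit=True))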

Links:


November 5th, 2010

Robots: Autonomous Vehicles

In today’s episode we take a deeper look at what’s behind the hype over autonomous vehicles, and talk to two experts in the field, Alberto Broggi, leader of the Vislab Intercontinental Vehicle Challenge, and Raul Rojas, leader of the Made in Germany autonomous vehicle project.

Alberto Broggi

Alberto Broggi is the Director of the Artificial Vision and Intelligent Systems Lab at the University of Parma.

His main milestones are the ARGO project (a 2000+ km test on Italian highways back in 1998, in which the ARGO vehicle drove itself autonomously) and the setup of the Terramax vehicle, which reached the finish line of the 2005 DARPA Grand Challenge. The Vislab Intercontinental Vehicle Challenge was completed when the vehicle expedition reached Shanghai on October 28th, after crossing two continents in a journey of more than three months.

Raúl Rojas

Raúl Rojas is a professor of Computer Science and Mathematics at the Free University of Berlin and a renowned specialist in artificial neural networks.

The FU-Fighters, football-playing robots he helped build, were world champions in 2004 and 2005. He formerly led an autonomous car project called Spirit of Berlin and is now leading the development of the Made in Germany car, a spin-off of the AutoNOMOS project. Although most of his current research and teaching revolves around artificial intelligence and its applications, he holds academic degrees in mathematics and economics.

Links:


Latest News:
For more information on this week’s news, including pictures and videos of the two new robotic grippers, have a look at the Robots Forum.
