Archive for the ‘Podcast’ Category

October 31st, 2015

Robots: Embodied Quadrotors - Transcript

In this interview, Audrow Nash speaks with Dr. Davide Scaramuzza, Assistant Professor of Robotics at the University of Zurich and leader of the Robotics and Perception Group, about autonomous unmanned aerial vehicles (UAVs) that navigate using only on-board sensing and computation, without GPS or motion capture systems.

Below are some videos of Scaramuzza’s research.


Davide Scaramuzza

Davide Scaramuzza (1980, Italian) is Assistant Professor of Robotics at the University of Zurich. He is founder and director of the Robotics and Perception Group, where he develops cutting-edge research on low-latency vision and visually-guided micro aerial vehicles. He received his PhD (2008) in Robotics and Computer Vision at ETH Zurich (with Roland Siegwart). He was a postdoc at both ETH Zurich and the University of Pennsylvania (with Vijay Kumar and Kostas Daniilidis). From 2009 to 2012, he led the European project “sFly”, which introduced the world’s first autonomous navigation of micro quadrotors in GPS-denied environments using vision as the main sensor modality. For his research contributions, he was awarded an ERC Starting Grant (2014), the IEEE Robotics and Automation Early Career Award (2014), and a Google Research Award (2014). He coauthored the book “Introduction to Autonomous Mobile Robots” (MIT Press). He is the author of the first open-source Omnidirectional Camera Calibration Toolbox for MATLAB, also used at NASA, Bosch, and Daimler. He is also the author of the 1-point RANSAC algorithm, an effective and computationally efficient reduction of the standard 5-point RANSAC for visual odometry when vehicle motion is non-holonomic. He is an Associate Editor of the IEEE Transactions on Robotics and has numerous publications in top-ranked robotics and computer vision journals, such as PAMI, IJCV, T-RO, IJRR, JFR, and AURO. His hobbies are piano and magic tricks.
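The 1-point RANSAC idea mentioned in the bio exploits the fact that, for a non-holonomic vehicle moving on a plane, the frame-to-frame motion can be described by a single parameter, so each RANSAC hypothesis needs only one correspondence and very few iterations. The sketch below is a generic illustration of that one-sample-per-hypothesis structure using a toy one-parameter model (a pure 2D rotation about the origin); it is not Scaramuzza's actual motion parameterization, and the function names, thresholds, and test data are made up for the example.

```python
import math
import random

def one_point_ransac(matches, model_from_match, residual, threshold, iterations=20):
    """RANSAC where every hypothesis is generated from a single correspondence,
    so only a handful of iterations is needed to draw an outlier-free sample."""
    best_model, best_inliers = None, []
    for _ in range(iterations):
        model = model_from_match(random.choice(matches))
        inliers = [m for m in matches if residual(model, m) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

# Toy one-parameter model: a pure 2D rotation about the origin, which a single
# correspondence ((x, y) -> (xp, yp)) determines completely.
def rotation_from_match(match):
    (x, y), (xp, yp) = match
    return math.atan2(x * yp - y * xp, x * xp + y * yp)

def rotation_residual(theta, match):
    (x, y), (xp, yp) = match
    rx = x * math.cos(theta) - y * math.sin(theta)
    ry = x * math.sin(theta) + y * math.cos(theta)
    return math.hypot(rx - xp, ry - yp)

if __name__ == "__main__":
    true_theta = 0.3
    pts = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(50)]
    matches = [((x, y),
                (x * math.cos(true_theta) - y * math.sin(true_theta),
                 x * math.sin(true_theta) + y * math.cos(true_theta)))
               for x, y in pts]
    matches += [((1.0, 0.0), (random.uniform(-1, 1), random.uniform(-1, 1)))
                for _ in range(10)]  # gross outliers
    theta, inliers = one_point_ransac(matches, rotation_from_match,
                                      rotation_residual, threshold=0.05)
    print(round(theta, 3), len(inliers))
```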



April 17th, 2015

Robots: Soft Robotics Toolkit - Transcript

In this podcast, Ron Vanderkley speaks to Donal Holland of Harvard University about his team’s work on the Soft Robotics Toolkit.

Soft robotics, a class of elastically soft, versatile, and biologically inspired machines, represents an exciting and highly interdisciplinary paradigm in engineering that could revolutionize the role of robotics in healthcare, field exploration, and cooperative human assistance.

The Soft Robotics Toolkit is a collection of shared resources to support the design, fabrication, modelling, characterization, and control of soft robotic devices. The toolkit was developed as part of educational research being undertaken in the Harvard Biodesign Lab. The ultimate aim of the toolkit is to advance the field of soft robotics by allowing designers and researchers to build upon each other’s work. The website contains an open-source fluidic control board and detailed designs for a wide range of soft robotic components, including actuators and sensors.
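As a rough picture of the kind of closed-loop control a fluidic control board performs, here is a minimal sketch of a proportional controller inflating a pneumatic actuator toward a target pressure. The simulated actuator, gains, and numbers are illustrative assumptions only; this is not the toolkit's actual firmware or API.

```python
class SimulatedActuator:
    """Stand-in for real hardware: pressure rises while the inlet valve is
    open and slowly leaks otherwise. All constants are illustrative only."""
    def __init__(self):
        self.pressure_kpa = 0.0

    def read_pressure_kpa(self):
        return self.pressure_kpa

    def set_valve_duty(self, duty, dt=0.01):
        duty = min(max(duty, 0.0), 1.0)     # valve command clamped to [0, 1]
        self.pressure_kpa += 200.0 * duty * dt  # inflation through the valve
        self.pressure_kpa -= 5.0 * dt           # slow leak

def hold_pressure(actuator, target_kpa, kp=0.05, steps=500):
    """Proportional control: open the valve in proportion to the pressure error."""
    for _ in range(steps):
        error = target_kpa - actuator.read_pressure_kpa()
        actuator.set_valve_duty(kp * error)
    return actuator.read_pressure_kpa()

print(hold_pressure(SimulatedActuator(), target_kpa=35.0))
```

A real control board would replace the simulated reads and valve commands with an analog pressure sensor and a PWM-driven solenoid valve, but the control loop structure stays the same.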

The site’s growing popularity is now attracting hobbyists and makers alike. The Soft Robotics Toolkit team has announced two competitions intended to reward students, researchers, makers, and designers of all levels for their contributions to the field of soft robotics.


Donal Holland


Donal Holland is a Visiting Lecturer in Engineering Sciences at the Harvard School of Engineering and Applied Sciences. He was previously a PhD student at Trinity College Dublin, a Visiting Fellow at the Harvard School of Engineering and Applied Sciences, and a Research Assistant at Treocht Ltd.






February 20th, 2015

Robots: Sensors for Autonomous Driving - Transcript

In this episode, Audrow Nash interviews Christoph Stiller from the Karlsruhe Institute of Technology. Stiller speaks about the sensors required for various levels of autonomous driving, as well as the ethics of autonomous cars and his experience in the Defense Advanced Research Projects Agency (DARPA) Grand Challenge.


Christoph Stiller

Christoph Stiller studied Electrical Engineering in Aachen, Germany and Trondheim, Norway, and received the Diploma degree and the Dr.-Ing. degree (Ph.D.) from Aachen University of Technology in 1988 and 1994, respectively. He worked with INRS-Telecommunications in Montreal, Canada for a post-doctoral year as Member of the Scientific Staff in 1994/1995. In 1995 he joined the Corporate Research and Advanced Development of Robert Bosch GmbH, Germany. In 2001 he became chaired professor and director of the Institute for Measurement and Control Systems at Karlsruhe Institute of Technology, Germany.

Dr. Stiller serves as immediate Past President of the IEEE Intelligent Transportation Systems Society and as Associate Editor for the IEEE Transactions on Intelligent Transportation Systems (2004-ongoing) and the IEEE Intelligent Transportation Systems Magazine (2012-ongoing); he previously served as Associate Editor for the IEEE Transactions on Image Processing (1999-2003) and as Editor-in-Chief of the IEEE Intelligent Transportation Systems Magazine (2009-2011). He was Program Chair of the IEEE Intelligent Vehicles Symposium 2004 in Italy and General Chair of the IEEE Intelligent Vehicles Symposium 2011 in Germany. His automated driving team AnnieWAY was a finalist in the DARPA Urban Challenge 2007 and winner of the Grand Cooperative Driving Challenge in 2011.



December 27th, 2014

Robots: 3D SLAM

In this episode, Audrow Nash speaks with Professor John Leonard from MIT about his research on dense, object-based 3D Simultaneous Localization And Mapping (SLAM).

Leonard explains what SLAM is, as well as its practical applications. The explanations include what it means for SLAM to be object-based (versus feature-based) and to have dense (versus sparse) environmental mapping. The interview closes with advice for aspiring roboticists.
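As a rough illustration of the distinction drawn above, the sketch below contrasts what a sparse, feature-based map and a dense, object-based map might store per landmark. The class names and fields are assumptions for illustration, not Leonard's actual representations.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class SparseLandmark:
    """Feature-based SLAM keeps a sparse set of point landmarks, each tied to
    an appearance descriptor used for data association."""
    position: np.ndarray    # 3D point in the world frame
    descriptor: np.ndarray  # e.g. a binary or floating-point feature vector

@dataclass
class ObjectLandmark:
    """Object-based SLAM instead tracks whole objects: a semantic label, a
    full 6-DoF pose, and a (possibly dense) shape model."""
    label: str              # e.g. "chair"
    pose: np.ndarray        # 4x4 homogeneous transform in the world frame
    occupancy: np.ndarray   # dense shape model, here a voxel occupancy grid

@dataclass
class WorldMap:
    points: list = field(default_factory=list)   # sparse feature layer
    objects: list = field(default_factory=list)  # dense, object-level layer

world = WorldMap()
world.points.append(SparseLandmark(np.array([1.0, 0.2, 3.5]), np.zeros(32)))
world.objects.append(ObjectLandmark("chair", np.eye(4), np.zeros((32, 32, 32))))
print(len(world.points), len(world.objects))
```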

John Leonard
John J. Leonard is Professor of Mechanical and Ocean Engineering and Associate Department Head for Research in the MIT Department of Mechanical Engineering. He is also a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). His research addresses the problems of navigation and mapping for autonomous mobile robots. He holds the degrees of B.S.E.E. in Electrical Engineering and Science from the University of Pennsylvania (1987) and D.Phil. in Engineering Science from the University of Oxford (1994). He studied at Oxford under a Thouron Fellowship and a Research Assistantship funded by the ESPRIT program of the European Community. Prof. Leonard joined the MIT faculty in 1996, after five years as a Post-Doctoral Fellow and Research Scientist in the MIT Sea Grant Autonomous Underwater Vehicle (AUV) Laboratory. He has served as an associate editor of the IEEE Journal of Oceanic Engineering and of the IEEE Transactions on Robotics and Automation. He was team leader for MIT’s DARPA Urban Challenge team, which was one of eleven teams to qualify for the Urban Challenge final event and one of six teams to complete the race. He is the recipient of an NSF Career Award (1998), an E.T.S. Walton Visitor Award from Science Foundation Ireland (2004), and the King-Sun Fu Memorial Best Transactions on Robotics Paper Award (2006), and he is an IEEE Fellow (2014).



May 31st, 2013

Robots: Curved Artificial Compound Eye

In this episode, we speak with Ramon Pericet and Michal Dobrzynski from EPFL about their Curved Artificial Compound Eye (CurvACE) published in the Proceedings of the National Academy of Sciences. Inspired by the fly’s vision system, their sensor can enable a large range of applications that require motion detection using a small plug-and-play device. As shown in the video below, you could use these sensors to control small robots navigating an environment, even in the dark, or equip a small autonomous flying robot with limited payload. Other applications include home automation, surveillance, medical instruments, prosthetic devices, and smart clothing.

The artificial compound eye features a panoramic, hemispherical field of view with a resolution identical to that of the fruit fly, all in less than 1 mm of thickness. Additionally, it can extract images 3 times faster than a fruit fly, and it includes neuromorphic photoreceptors that allow motion perception in a wide range of environments, from a sunny day to moonlight. To build the sensors, the researchers align an array of microlenses, an array of photodetectors, and a flexible PCB that mechanically supports and electrically connects the ensemble.
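The fly-inspired motion detection mentioned above is classically modelled with Hassenstein-Reichardt elementary motion detectors, which correlate a delayed photoreceptor signal with its neighbour's current signal. The sketch below is a generic illustration of that textbook model, not the CurvACE photoreceptor circuitry; the filter constant and the drifting-bar stimulus are made-up values.

```python
import numpy as np

def reichardt_emd(frames, alpha=0.3):
    """Hassenstein-Reichardt elementary motion detectors on a 1D photoreceptor row.

    Each detector correlates one photoreceptor's low-pass-filtered (delayed)
    signal with its neighbour's instantaneous signal, in both directions, and
    subtracts the two; the sign of the output indicates motion direction.

    frames: array of shape (T, N) of photoreceptor intensities over time.
    Returns an array of shape (T, N-1) of motion responses.
    """
    frames = np.asarray(frames, dtype=float)
    delayed = np.zeros_like(frames[0])
    out = np.zeros((frames.shape[0], frames.shape[1] - 1))
    for t, f in enumerate(frames):
        delayed = (1 - alpha) * delayed + alpha * f  # first-order low-pass acts as the delay
        out[t] = delayed[:-1] * f[1:] - f[:-1] * delayed[1:]
    return out

# A bright bar drifting rightward across 8 photoreceptors gives a positive net response.
t, n = 40, 8
stimulus = np.array([[1.0 if j == (i // 5) % n else 0.0 for j in range(n)]
                     for i in range(t)])
print(reichardt_emd(stimulus).sum())
```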

This work is part of the European project CurvACE, which brings together a total of 15 people from four partners in France, Germany, and Switzerland.

You can read our full coverage about this new sensor on Robohub.

Ramon Pericet Camara
Ramon Pericet Camara is the scientific coordinator for the CurvACE project and a postdoctoral researcher at the Laboratory of Intelligent Systems at EPFL. His research interests are oriented towards bio-inspired robotics, soft robotics, and soft-condensed matter physics.

Ramon received a Master’s degree in Physics in 2000 from the University of Granada (Spain) and a PhD in Multidisciplinary Research from the University of Geneva (Switzerland) in 2006. Subsequently, he was granted a fellowship for prospective researchers from the Swiss National Science Foundation to join the Max Planck Institute for Polymer Research in Mainz (Germany).

Michal Dobrzynski
Michal Dobrzynski is a PhD student at the Laboratory of Intelligent Systems at EPFL. He obtained his Master’s degree in Automatic Control and Robotics in 2006 from the Warsaw University of Technology (Poland). He then joined the SGAR S.L. company (Barcelona, Spain) as a Robot and PLC Software Engineer, where his work focused on the programming and visualization of industrial robots and automated lines. In 2007, he joined the Numerical Methods Laboratory at the University Politehnica of Bucharest (Romania), where he spent two years working as a researcher in the FP6 “Early Stage Training 3” project.

