January 10th, 2015

Robots: RoboThespian - Transcript

In today’s podcast, Ron Vanderkley speaks with Will Jackson from Engineered Arts Limited about his team’s work making robot actors.

Engineered Arts was founded in October 2004 by Will Jackson to produce mixed media installations for UK science centres and museums, many of which involved simple mechanical figures animated by standard industrial controllers.

In early 2005, the company began work on the Mechanical Theatre for the Eden Project. This involved three figures, with storylines focused on genetic modification. Rather than designing another set of bespoke figures for this commission, Engineered Arts decided to develop a generic programmable figure that could be used for the Mechanical Theatre and for the succession of similar commissions that would hopefully follow. The result was RoboThespian Mark 1 (RT1).

From then on, Engineered Arts changed direction and now concentrates entirely on the development and sale of an ever-expanding range of humanoid and semi-humanoid robots featuring natural, human-like movement and advanced social behaviours.

RoboThespian, now in its third version, is a life-sized humanoid robot designed for human interaction in a public environment. It is fully interactive, multilingual, and user-friendly. Clients range from NASA's Kennedy Space Center to Questacon, The National Science and Technology Centre in Australia.

Will Jackson
Will Jackson and his RoboThespian

Will Jackson has a BA in 3D design from the University of Brighton, UK, and is the founder of Engineered Arts Ltd.


Related episodes:

December 27th, 2014

Robots: 3D SLAM

In this episode, Audrow Nash speaks with Professor John Leonard from MIT about his research on dense, object-based 3D Simultaneous Localization And Mapping (SLAM).

Leonard explains what SLAM is, as well as its practical applications. The explanations include what it means for SLAM to be object-based (versus feature-based) and to have dense (versus sparse) environmental mapping. The interview closes with advice for aspiring roboticists.

John Leonard
John J. Leonard is Professor of Mechanical and Ocean Engineering and Associate Department Head for Research in the MIT Department of Mechanical Engineering. He is also a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). His research addresses the problems of navigation and mapping for autonomous mobile robots. He holds a B.S.E.E. in Electrical Engineering and Science from the University of Pennsylvania (1987) and a D.Phil. in Engineering Science from the University of Oxford (1994). He studied at Oxford under a Thouron Fellowship and a Research Assistantship funded by the ESPRIT program of the European Community. Prof. Leonard joined the MIT faculty in 1996, after five years as a Post-Doctoral Fellow and Research Scientist in the MIT Sea Grant Autonomous Underwater Vehicle (AUV) Laboratory. He has served as an associate editor of the IEEE Journal of Oceanic Engineering and of the IEEE Transactions on Robotics and Automation. He was team leader for MIT's DARPA Urban Challenge team, which was one of eleven teams to qualify for the Urban Challenge final event and one of six teams to complete the race. He is the recipient of an NSF CAREER Award (1998), an E.T.S. Walton Visitor Award from Science Foundation Ireland (2004), and the King-Sun Fu Memorial Best Transactions on Robotics Paper Award (2006), and he is an IEEE Fellow (2014).


December 13th, 2014

Robots: Robotics in Theatre, Film and Television - Transcript

In this episode, Ron Vanderkley speaks with MythBusters' Grant Imahara and Richard McKenna from The Creature Technology Company about robotics in the film, television and theatre industries.

Grant Imahara

Ron Vanderkley talking with Grant Imahara at Supanova 2014

Grant Imahara graduated from the University of Southern California with a degree in electrical engineering. Shortly afterwards, Imahara was hired as an engineer at Lucasfilm's Industrial Light & Magic, building and operating visual effects, models and robots for popular films and film series such as Star Wars, Galaxy Quest, Jurassic Park, Terminator, The Matrix and A.I. Artificial Intelligence. Imahara also built the Energizer Bunny for the battery company's commercials, the Deadblow robot on BattleBots, and Geoff Peterson from The Late Late Show with Craig Ferguson. Imahara is perhaps best known as a presenter on Discovery Channel's MythBusters, where he is often seen building robots or robotic rigs to aid in the testing of various myths. Imahara appeared at Supanova 2014 promoting his casting as Mr. Sulu in the web series Star Trek Continues.

Richard McKenna
Richard McKenna is Chief Engineer at The Creature Technology Company. He joined CTC in 2010 and has worked on all of the company's major projects since, including How to Train Your Dragon, King Kong and the Sochi Olympic mascots. He has a Bachelor of Engineering (Hons) in Mechatronics, Robotics and Automation Engineering from Swinburne University and is certified as a Chartered Professional Engineer by Engineers Australia, registered on the National Professional Engineers Register (NPER). Prior to joining CTC, the majority of Richard's time was spent in the defence industry; he has also worked in special effects for film and television.


November 29th, 2014

Robots: Mobility Transformation Facility

In this episode, Audrow Nash speaks with Edwin Olson, an Associate Professor at the University of Michigan, about the University’s 32-acre testing environment for autonomous cars and the future of driverless vehicles.

The testing environment, called the "Mobility Transformation Facility," has been designed to simulate the circumstances an autonomous car would experience driving on real-world streets. The facility features "one of everything," says Olson, including a four-lane highway, road signs, stoplights, intersections, roundabouts, a railroad crossing, building facades, and even mechanical cyclists and pedestrians.

Edwin Olson
Edwin Olson is an Associate Professor of Computer Science and Engineering at the University of Michigan. He is the director of the APRIL robotics lab, which studies Autonomy, Perception, Robotics, Interfaces, and Learning. His active research projects include applications to explosive ordnance disposal, search and rescue, multi-robot communication, railway safety, and automobile autonomy and safety.

In 2010, he led the winning team in the MAGIC 2010 competition, developing a collective of 14 robots that semi-autonomously explored and mapped a large-scale urban environment. For winning, the U.S. Department of Defense awarded him $750,000. He was named one of Popular Science's "Brilliant Ten" in September 2012, and in 2013 he was awarded a DARPA Young Faculty Award.

He received a PhD from the Massachusetts Institute of Technology in 2008 for his work on robust robot mapping. During his time as a PhD student, he was a core member of MIT's DARPA Urban Challenge team, which finished the race in fourth place. His work on autonomous cars continues in cooperation with Ford Motor Company on the Next Generation Vehicle project.

 


November 15th, 2014

Robots: Finding Objects Using RFID - Transcript

In this episode, Sabine Hauert speaks with Travis Deyle about his IROS-nominated work on RFID tags, his blog Hizook, and the career path that took him from academia to founding his own start-up and, finally, to working at Google[x].

For his PhD at Georgia Tech with Dr. Charles C. Kemp, Deyle helped robots find household objects by tagging them with small Band-Aid-like Ultra High Frequency (UHF) Radio-Frequency Identification (RFID) labels. The tags allowed robots to precisely identify tagged objects. Once identified, the robots would follow a series of simple behaviors to navigate up to the objects and orient towards them.

Compared to vision and lasers, RFID can detect objects that are hidden while providing precise information and identification. This could allow a robot to find a bottle of medication in a cupboard, and make sure it’s the correct medication, before bringing it to a person. Furthermore, the technology can scale to large numbers of objects, and be used to map their location in the environment.

In the future, such tags augmented with better energy, sensing and computation capabilities could form the basis of the Internet of Things and provide a smart environment for robots to interact with.

Travis Deyle
Travis Deyle earned a PhD in Fall 2011 from Georgia Tech's School of Electrical and Computer Engineering (ECE). His PhD thesis with Dr. Charles C. Kemp at the Healthcare Robotics Lab was entitled "Ultra High Frequency (UHF) Radio-Frequency Identification (RFID) for Robot Perception and Mobile Manipulation."

After his PhD, Deyle worked with Dr. Matt Reynolds as a postdoctoral researcher at Duke University, where he focused on a software-defined radio receiver to decode, in real time, the high-speed biotelemetry signals reflected by a custom neuro-telemetry chip. This system was designed to capture high-fidelity neural signals from a dragonfly in flight: a "cyborg dragonfly".

He then co-founded the successful company Lollipuff.com, an online auction site dedicated exclusively to women's designer clothes and accessories.

Deyle currently works at Google[x], where he was part of the team that developed the "smart contact lens" to measure tear glucose levels, recently licensed to Novartis.

He also founded the well-known blog Hizook.com, a robotics website for academic and professional roboticists.
