October 3rd, 2014

Robots: Quest for Computer Vision

In this episode, Audrow Nash interviews Peter Corke from Queensland University of Technology about computer vision, the subject of his plenary talk at IROS 2014. He begins with a brief history of biological vision before discussing some early and more modern implementations of computer vision. Corke also talks about resources for those interested in learning computer vision, including his book, Robotics, Vision and Control, and a massive open online course (MOOC) that he plans to release in 2015.

 

Peter Corke
Peter Corke joined Queensland University of Technology at the start of 2010 as a Professor of Robotic Vision. He is now also director of the ARC-funded Centre of Excellence for Robotic Vision. Peter is known for his research in vision-based robot control, field robotics and wireless sensor networks. He received B.Eng. and M.Eng.Sc. degrees, both in Electrical Engineering, and a PhD in Mechanical and Manufacturing Engineering, all from the University of Melbourne, Australia.

Links:


Related episodes:

September 21st, 2014

Robots: AirDog

In this episode, Audrow Nash interviews Edgars Rozentals, the CEO and Founder of Helico Aerospace Industries. They talk about Helico's upcoming product 'AirDog', an autonomous quadrocopter designed to record video for action sports. AirDog uses an 'AirLeash' (worn on the user's person) to track the user as they move and to give the user simple control of AirDog. The AirLeash is waterproof and has large buttons so it can be operated while wearing gloves. For advanced control, there is a smartphone application that allows the user to set the flight path, the following angle and height, and to mark obstacles.

AirDog recently completed a successful Kickstarter campaign (raising $1.368M against a goal of $200K), and Helico plans to make deliveries in December 2014.

Edgars Rozentals
Edgars Rozentals is the CEO and Founder of Helico Aerospace Industries. He is a self-described "visionary" and "adventurer" who founded numerous software and web-service ventures before founding Helico and creating AirDog. Edgars hopes that AirDog challenges people to be creative, and to push themselves and their skills to the next level.

Links:


Related episodes:

September 5th, 2014

Robots: M-Blocks

In this episode, Audrow Nash interviews John Romanishin from MIT about his modular robotics project 'M-Blocks'. M-Blocks are small cubes (5 cm on a side) that have no external actuators, yet they manage to move and even jump. They do this by spinning an internal mass at high speed and then stopping it suddenly, which transfers angular momentum to the cube and causes it to pivot. The rotating mass can change the plane in which it spins, allowing the cube to move in any direction. By combining this inertial actuator with permanent magnets, M-Blocks can move over similar robots (or more M-Blocks) and precisely line up. The future of this project is best described by John, who says, "We want hundreds of cubes, scattered randomly across the floor, to be able to identify each other, coalesce, and autonomously transform into a chair, or a ladder, or a desk, on demand."
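
To make the momentum-transfer idea concrete, here is a minimal back-of-the-envelope sketch, not the MIT implementation: a flywheel is spun up inside the cube and braked abruptly, and conservation of angular momentum tips the cube about one edge. Apart from the 5 cm side length mentioned above, every numerical value is an illustrative assumption rather than a measured M-Block parameter.

```python
# Simplified sketch of the M-Blocks pivoting mechanism (illustrative only).
# Assumes the flywheel axis is parallel to the pivot edge and that all of the
# flywheel's angular momentum is transferred to the cube when it is braked.

import math

CUBE_MASS = 0.15          # kg, assumed
CUBE_SIDE = 0.05          # m (5 cm, from the episode description)
FLYWHEEL_INERTIA = 2e-5   # kg*m^2, assumed
FLYWHEEL_SPEED = 2000.0   # rad/s before braking, assumed
G = 9.81                  # m/s^2

def cube_pivots(flywheel_inertia, flywheel_speed):
    """Return True if braking the flywheel gives the cube enough rotational
    energy to tip over the edge it pivots on."""
    # Moment of inertia of a solid cube about one edge: I = (2/3) * m * a^2
    cube_inertia = (2.0 / 3.0) * CUBE_MASS * CUBE_SIDE ** 2
    # Angular momentum handed over to the cube when the flywheel is braked.
    angular_momentum = flywheel_inertia * flywheel_speed
    cube_speed = angular_momentum / cube_inertia
    kinetic_energy = 0.5 * cube_inertia * cube_speed ** 2
    # Energy needed to lift the centre of mass from a/2 up to a/sqrt(2),
    # i.e. to balance the cube on its edge before it falls onto the next face.
    tipping_energy = CUBE_MASS * G * CUBE_SIDE * (1.0 / math.sqrt(2) - 0.5)
    return kinetic_energy >= tipping_energy

print(cube_pivots(FLYWHEEL_INERTIA, FLYWHEEL_SPEED))
```

With these assumed numbers the braking impulse comfortably exceeds the tipping energy, which is the point of the design: a single internal flywheel, braked hard, is enough to roll a rigid cube with no external moving parts.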

John Romanishin
John Romanishin is currently a graduate student studying mechanical engineering and researching self-reconfigurable modular robots in the Distributed Robotics Laboratory, led by Daniela Rus at MIT.

Links:


Related episodes:

August 22nd, 2014

Robots: Birdly

In this episode, Audrow Nash interviews Max Rheiner from Zurich University of the Arts (ZHDK) about his project Birdly. Birdly explores the experience of a bird in flight through several methods. Unlike in a common flight simulator, the user embodies a bird, the Red Kite. To evoke this embodiment, Birdly relies mainly on sensory-motor coupling: the participant controls the simulator with their hands and arms, which correspond directly to the wings and primary feathers of the bird. Those inputs are fed into the bird's flight model and rendered physically by the simulator through pitch, roll and heave movements.

Visualized through a head-mounted display (Oculus Rift), the whole scene is perceived from the first-person perspective of a bird. To intensify the embodiment, Birdly adds sound, olfactory and wind feedback. The participant hears the roaring of the wind and the flapping of the wings. The olfactory feedback changes with the scenery and ranges from the scent of a forest or of soil to several other odors of the wilderness. The simulator regulates the headwind with a fan according to the speed of the bird.
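
As a rough illustration of that sensory-motor coupling (this is not ZHDK's actual flight model), the sketch below maps two measured arm angles onto normalized pitch, roll and heave commands for a Birdly-like platform. The function name, gains and ranges are invented for the example.

```python
# Illustrative mapping from arm inputs to platform commands (assumed values).

def wing_inputs_to_platform(left_arm_angle, right_arm_angle):
    """Arm angles in radians (positive = arm raised). Returns platform
    commands, each clamped to the range [-1, 1]."""
    # Raising or lowering both arms together pitches the bird up or down.
    pitch = 0.5 * (left_arm_angle + right_arm_angle)
    # A difference between the two arms banks (rolls) the bird.
    roll = 0.5 * (left_arm_angle - right_arm_angle)
    # The mean absolute deflection stands in for flapping effort, driving heave.
    heave = 0.5 * (abs(left_arm_angle) + abs(right_arm_angle))
    clamp = lambda x: max(-1.0, min(1.0, x))
    return {"pitch": clamp(pitch), "roll": clamp(roll), "heave": clamp(heave)}

# Example: right arm dipped slightly lower than the left banks the bird right.
print(wing_inputs_to_platform(left_arm_angle=0.3, right_arm_angle=-0.1))
```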

Max Rheiner
Max Rheiner is a senior lecturer at Zurich University of the Arts (ZHDK), where he teaches in the bachelor's and master's programs of the Department of Interaction Design. He also developed the Physical Computing Laboratory there. He received his Diploma from Zurich University of the Arts in the field of New Media Arts in 2003.

Rheiner's research and artistic interests center on interactive experiences that use methods from Virtual/Augmented Reality and Immersive Telepresence. His artistic work has been recognized and exhibited at a number of renowned international venues, such as the Venice Biennale, Italy; Ars Electronica, Linz, Austria; and the Yamaguchi Center for Arts and Media, Japan.

Links:


Related episodes:

August 8th, 2014

Robots: Stiquito

In this episode, Audrow Nash interviews James Conrad, professor at the University of North Carolina at Charlotte, about the history of the autonomous walking robot Stiquito. Stiquito is a small, inexpensive hexapod (i.e., six-legged) robot that has been used since 1992 by universities, high schools, and hobbyists. It is propelled by nitinol, an alloy actuator wire that expands and contracts and roughly emulates the operation of a muscle: nitinol contracts when heated and returns to its original size and shape when cooled. The robot can be outfitted with several sensors for more advanced behavior, such as obstacle avoidance, line following, and light tracking.
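
As a purely illustrative sketch of how a hobbyist might sequence nitinol legs like these in software (this is not code from the Stiquito books or kit), the snippet below alternates two tripods of legs, heating each set to contract the wires and then letting them cool. The hardware hook and the timings are assumptions.

```python
# Illustrative Stiquito-style leg sequencing. set_leg_current() is a
# hypothetical stand-in for whatever driver circuit (transistor, GPIO pin,
# etc.) actually switches heating current to each nitinol wire.

import time

HEAT_TIME = 0.5   # seconds of current to contract the wire (assumed)
COOL_TIME = 1.0   # seconds with no current so the wire relaxes (assumed)

def set_leg_current(leg, on):
    """Hypothetical hardware hook: switch heating current for one leg."""
    print(f"leg {leg}: {'heating' if on else 'cooling'}")

def step(tripod):
    """Contract, then release, one tripod (a set of three alternating legs)."""
    for leg in tripod:
        set_leg_current(leg, True)    # resistive heating -> wire contracts
    time.sleep(HEAT_TIME)
    for leg in tripod:
        set_leg_current(leg, False)   # wire cools -> returns to original length
    time.sleep(COOL_TIME)

# Alternating tripod gait: legs 0, 2, 4 move, then legs 1, 3, 5.
for _ in range(3):
    step((0, 2, 4))
    step((1, 3, 5))
```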

Jonathan Mills of Indiana University developed Stiquito as an inexpensive vehicle for research. The robot became popular after the publication of Stiquito: Advanced Experiments with a Simple and Inexpensive Robot in 1997, which included a kit to build a Stiquito robot. Since then, two additional books have been published, and Stiquito has been used to introduce students to the concepts of analog electronics, digital electronics, computer control, and robotics. It has also been used for advanced topics such as subsumption architectures, artificial intelligence, and advanced computer architecture.

The video below shows an explanation and demo of Stiquito. You can find more videos about Stiquito here.

James Conrad
James M. Conrad is a professor at the University of North Carolina at Charlotte. He has served as an assistant professor at the University of Arkansas and as an instructor at North Carolina State University. He has also worked at IBM, Ericsson/Sony Ericsson, and BPM Technology. He has been elected to serve on the IEEE Board of Directors as Region 3 director for 2016-2017. He is the author of numerous books, book chapters, journal articles, and conference papers in the areas of embedded systems, robotics, parallel processing, and engineering education.

Links:


Related episodes: