Team members Hiroaki Wagatsuma, David Chik, and Gyanendra Tripathi identified a need for home danger detection, both to assist Japan's aging population and to monitor children inside the home. The project is currently called 'Brain-Inspired Robots for Home Safety Evaluation.' What is remarkable about this project is that, as the name implies, the robot is designed to mimic certain elements of the human brain. To do this, the team developed an intelligent algorithm that can learn to recognize the causes of dangerous situations in the home and predict home accidents – we like to think of it as a robot babysitter. For instance, if a child walks over to a hot stove, the robot will follow the child and announce 'Danger' until the child steps away. The algorithm currently runs on a TurtleBot, provided by Clearpath Robotics, and the robot's motion is powered by software called RoboRealm.
“Robots can’t have the exact same feeling as we do, but they can understand it.”
Humans have memory capability: when we experience something, we remember it because we like it or dislike it, or because it feels good or it hurts. These cues cause us to learn, recognize, and remember the environment. Robots, on the other hand, "can't have the exact same feeling as we do, but they can understand it," explained Wagatsuma.
The team studied monkey and mouse brains to identify behavioral patterns. Once identified, these findings were integrated into the algorithm so that the robot gains a capacity for the kind of social monitoring and emotion that humans have. The algorithm records each experience together with its result (in humans, we call this memory), creating a process that replicates the human brain's mechanism and gives the robot a fuller understanding of its environment.
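The article does not describe the algorithm's internals, so purely as an illustration (not the team's actual code), the idea of recording an experience together with its result, then using those memories to judge new situations, might be sketched like this in Python. The class name, situation labels, and the +1/-1 outcome scale are all hypothetical:

```python
from collections import defaultdict

class ExperienceMemory:
    """Illustrative sketch: store each observed situation with its
    outcome, then use the stored experiences to evaluate new ones."""

    def __init__(self):
        # situation label -> list of recorded outcomes (+1 safe, -1 harmful)
        self.episodes = defaultdict(list)

    def record(self, situation, outcome):
        """Store one experience: what happened and whether it was good or bad."""
        self.episodes[situation].append(outcome)

    def evaluate(self, situation):
        """Average the remembered outcomes; unseen situations score neutral (0.0)."""
        outcomes = self.episodes.get(situation)
        if not outcomes:
            return 0.0
        return sum(outcomes) / len(outcomes)

memory = ExperienceMemory()
memory.record("child_near_stove", -1)   # a remembered bad outcome
memory.record("child_near_stove", -1)
memory.record("child_on_sofa", +1)      # a remembered harmless one

print(memory.evaluate("child_near_stove"))  # -1.0 -> predicted dangerous
print(memory.evaluate("child_on_sofa"))     # 1.0 -> predicted safe
```

In this toy form, "emotion" is just a signed outcome attached to an experience; the real system would learn from richer sensory and behavioral data.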
The Roadmap Ahead
The notion of home safety evaluation by a robot (versus a human) is a new concept, so not all of the details of the solution are fully worked through – yet. The team needed to develop an intelligent algorithm from scratch, and although they have made exciting developments, they remarked "we still have lots to do." Currently, the robot-and-algorithm combo uses tracking tags to identify objects; however, they don't expect 'real life' humans to wear tags around the house, so the next step is to implement visual recognition.
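To make the tag-based approach concrete, here is a minimal Python sketch of how tag positions could feed a danger check: a person's tag is compared against the tagged positions of known hazards, and any hazard within a threshold distance triggers a warning. The hazard names, coordinates, and radius are invented for illustration; this is not the project's implementation:

```python
import math

# Hypothetical map of tagged hazards: name -> (x, y) position in metres
HAZARDS = {"stove": (4.0, 1.0)}
DANGER_RADIUS = 0.8  # assumed warning distance in metres

def check_danger(person_pos, hazards=HAZARDS, radius=DANGER_RADIUS):
    """Return the hazards whose tags lie within `radius` of the person's tag."""
    nearby = []
    for name, (hx, hy) in hazards.items():
        if math.hypot(person_pos[0] - hx, person_pos[1] - hy) <= radius:
            nearby.append(name)
    return nearby

# A child's tag reported near the stove triggers a warning; elsewhere it does not.
print(check_danger((3.5, 1.2)))  # ['stove']
print(check_danger((0.0, 0.0)))  # []
```

Replacing the tags with visual recognition, as the team plans, would mean estimating these positions from camera input instead of reading them directly.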
Despite not yet having visual recognition, the team successfully developed a prototype robot that can navigate its environment, memorize observed information, reason about it, recognize sudden movement, and predict home accidents.
The team successfully demonstrated the robot during Open Campus in May at the Kyushu Institute of Technology. They look forward to continued development and hope to present at IROS 2013 in October. Dreaming big: The team would love to commercialize the product to make it available for household use and, at the same time, encourage individuals to build trust in robots because of their reliability.
Check out this video of the project overview and a demo of the system: