Although there are those of us who would use a robot hand for evil purposes, the scientists at Ishikawa Oku Laboratory have designed a hand with a more light-hearted purpose. Japanese researchers at the University of Tokyo have developed a janken, or rock-paper-scissors robot, that wins every time. Whether this invention is good or evil, however, is up for debate.
Part of a class of robots that rely on human-machine cooperation, the janken robot uses vision-based motion detection to recognize which gesture its opponent is about to throw. By measuring the angle of the human's wrist joint, it predicts the hand shape about 1 ms before the gesture is completed — technically cheating — and responds with the appropriate winning gesture. Applications for a robot with such a quick reaction time could range from assistance in surgical procedures to preventative military technology.
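The final step — choosing the winning response once the human's gesture has been predicted — is simple to sketch. The Python below is purely illustrative: the gesture labels and the lookup table are assumptions standing in for the lab's high-speed vision pipeline, which does the hard part of classifying the wrist angle within a millisecond.

```python
# Each gesture in rock-paper-scissors is beaten by exactly one other,
# so the counter can be a fixed lookup table.
WINNING_COUNTER = {
    "rock": "paper",      # paper covers rock
    "paper": "scissors",  # scissors cut paper
    "scissors": "rock",   # rock blunts scissors
}

def counter_gesture(predicted: str) -> str:
    """Return the gesture that beats the opponent's predicted throw."""
    return WINNING_COUNTER[predicted]

if __name__ == "__main__":
    # Simulated predictions; in the real system these would come from
    # the camera-based wrist-angle classifier.
    for throw in ("rock", "paper", "scissors"):
        print(f"human throws {throw} -> robot throws {counter_gesture(throw)}")
```

Because the mapping never changes, the robot's entire "strategy" reduces to a constant-time table lookup; all of the engineering effort goes into making the prediction fast enough that the human never notices the delay.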
According to Gregory Hager’s “Human-machine Cooperative Manipulation with Vision-based Motion Constraints” in Lecture Notes in Control and Information Sciences, “It is shown that algorithms originally designed for vision-based control of manipulators can be easily converted into control algorithms that provide virtual fixtures. As a result it is possible to create advanced human-machine cooperative manipulation systems that take complete advantage of information provided by vision, yet permit the user to retain control of essential aspects of a given task.”
Past research on optics in computing made the janken robot possible, combining optics and electronics in the fashion Hager describes above. Ishikawa Oku Laboratory develops its robotics around four focus areas: sensor fusion theory and its implementation in engineering systems, dynamic image control based on high-speed visual information, massively parallel image processing through a vision chip, and meta-perception otherwise unavailable to humans and robots alike.
The Ishikawa Oku Laboratory mission statement reads, “Interactions in the real world (not only physical but also social and psychological interactions) are inherently parallel phenomena. By constructing models and engineering systems that take into account such parallelism, one can expect a better understanding of the real world as well as enhanced performance of systems when dealing with practical applications. These fundamental considerations lead us to concentrate on parallel processing for sensory information.”