The newest version of Cog, developed at the MIT AI Laboratory.
For robots to be profitably integrated into the everyday lives of humans within military, commercial, educational or domestic contexts, robots must be able to interact with humans in more meaningful, natural ways. As artificial agents inundate our lives, it will be increasingly important to enable multi-modal, intuitive modes of communication that include speech, gesture, movement, affect, tactile stimulation and context. Body dictates behavior, and if we want a robot to relate with and learn from humans, it must be able to map its body to our own.
An ambitious project at MIT is based on the premise that humanlike intelligence requires humanoid interactions with the world. These researchers are developing a robot they call Cog as a set of sensors and actuators that approximates the sensory and motor dynamics of a human body. Cog is equipped with a sophisticated visual system capable of saccades, smooth pursuit, and vergence, and it coordinates head and eyes by modeling the human vestibulo-ocular reflex. Cog responds not only to visual stimulation, but also to sounds and to the ways people move its body parts. By exploiting its ability to interact with humans, Cog can learn a diverse array of behaviors, from playing with a Slinky to using a hammer. Eventually, Cog will be able to be tasked naturally and quickly by users who may not know beforehand what tasks the robot will need to accomplish. The user should be able to demonstrate actions and supply auditory and visual cues to help the robot correctly perceive the instructions.
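The vestibulo-ocular reflex that Cog models can be illustrated with a minimal control loop: the eyes counter-rotate against measured head velocity so that gaze stays fixed in the world. The sketch below is an idealized illustration under assumed units and a unit gain, not Cog's actual controller.

```python
def vor_eye_velocity(head_velocity: float, gain: float = 1.0) -> float:
    """Counter-rotate the eye against head motion (an ideal VOR has gain near 1)."""
    return -gain * head_velocity

def stabilized_gaze(head_positions: list[float], gain: float = 1.0) -> list[float]:
    """Integrate eye commands step by step; gaze = head angle + eye angle.

    With gain 1.0, gaze stays at its initial direction no matter how
    the head moves, which is exactly what the reflex accomplishes.
    """
    eye = 0.0
    prev = head_positions[0]
    gaze = []
    for head in head_positions:
        eye += vor_eye_velocity(head - prev, gain)  # per-step counter-rotation
        prev = head
        gaze.append(head + eye)
    return gaze
```

Running `stabilized_gaze([0.0, 5.0, 10.0])` returns a constant gaze direction, showing how a simple negative-feedback term cancels head motion.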
Kismet interacts with one of its creators at the MIT AI Laboratory.
There are many problems still to be solved. How should a robot respond to actions that are physically impossible for it to imitate? How should it recognize these situations? For imitative learning techniques to succeed, the robot must have some way of knowing which aspects of the environment it should attend to and precisely which actions it should try to reproduce. For instance, the robot should not imitate a cough or an itch when being shown how to turn a crank. To guide robots through the process of imitative learning, we must give them the ability to recognize and respond to the natural cues we give unconsciously via body language. Another MIT project, called Kismet, is training a robot head with eyebrows, eyelids, ears and a mouth to discern social cues such as nodding and eye contact, which are crucial to correctly guiding interaction.
WENDY. S. Sugano Laboratory, Waseda University.
WENDY is a human symbiotic robot that consists of two anthropomorphic arms, a head and a torso; it has wheels instead of legs. WENDY is designed to work with humans, often in the same working space, carrying out physical, informational and psychological interaction. For human symbiotic robots, safety is a key issue. To ensure impact safety, the joints are equipped with force sensors that detect collisions, and reliable shock absorption is provided by covering the arms with a special material. Robust, dexterous handling is accomplished with a pressure-adjustment mechanism modeled on human fingertips. After studying how human fingers pick up awkward objects, the researchers built realistic fingertips that can apply pressure much as a human does. The fingers even include fingernails for picking up small, flat objects. The robot hand can accomplish a number of real tasks, such as chopping vegetables and grasping very small coins.
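The joint force sensing that WENDY uses for impact safety reduces, in its simplest form, to a threshold test: a reading well above the expected load is treated as a collision, and the arm is stopped. The threshold values and sensor readings below are hypothetical illustrations, not WENDY's actual parameters.

```python
def detect_collision(force_readings, expected_load=5.0, margin=3.0):
    """Return the index of the first force sample that exceeds the safe
    threshold (expected load plus a safety margin), or None if no sample does.

    expected_load and margin are hypothetical values in newtons, chosen
    only to illustrate the thresholding idea.
    """
    threshold = expected_load + margin
    for i, force in enumerate(force_readings):
        if abs(force) > threshold:
            return i  # collision detected at this sample; stop motion here
    return None  # all readings within the expected range
```

A sequence like `[4.0, 5.5, 12.0, 6.0]` would trip the detector at the third sample, where the force jumps far beyond the expected load.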
Hadaly-2. Humanoid Project, Waseda University.
Hadaly-2 is a new humanoid robot designed at Waseda University for interactive communication with humans. Hadaly-2 has an environmental recognition system that uses vision and voice recognition to remain aware of the presence and actions of people around it. Like WENDY, it uses a compliant motion system and achieves mobility with electric wheels. Hadaly-2 uses these capabilities to communicate with humans not only informationally, but also physically. The robot shown here is about 6 feet tall and weighs over 600 pounds.