Researchers from Waseda University have teamed up with Kyushu-based robot manufacturer tmsuk to develop a humanoid robot that uses its entire body to express a variety of emotions. (Watch video.)
Named "KOBIAN," the android integrates features of two previously developed robots -- the WABIAN-2 bipedal humanoid and the WE-4RII emotion expression humanoid -- into a bipedal machine that can walk around, perceive its environment, perform physical tasks, and express a range of emotions. The robot also features a new double-jointed neck that helps it achieve more expressive postures.
KOBIAN can express seven different feelings, including delight, surprise, sadness and dislike. In addition to assuming different poses to match the mood, the emotional humanoid uses motors in its face to move its lips, eyelids and eyebrows into various positions. To express delight, for example, the robot lifts its soft rubbery hands over its head and opens its eyes and mouth wide.
To show sadness, the robot slouches over, hangs its head down and holds a hand up to its face in a gesture of grief.
According to KOBIAN's developers, the robot's expressiveness makes it better equipped to interact with humans and assist with daily activities. In the future, the robot may seek work in the field of nursing.
WABOT-2, an intelligent humanoid keyboard player developed by Waseda University in the 1980s, was considered the most advanced robot of its time. In addition to camera eyes that could read musical notation and deft hands that could tap out tunes of average difficulty, WABOT-2 could listen to accompanying singers and adjust its tempo, as well as carry on basic conversation. The android demonstrated its musical skills at Expo '85 in Tsukuba, Japan, with a performance of Kitaro's new age classic "Silk Road." (Watch a clip.)
Built in order to develop the basic technology, strength and skills for robots of the 21st century, WABOT-2 was equipped with a hierarchical system of 80 microprocessors modeled after the human nervous system, and its arms and legs had 50 degrees of freedom -- more than any other robot in existence at the time. Waseda University regards WABOT-2 as a landmark achievement in the evolution of personal robots.
[Video: The Computer Chronicles (1985 broadcast) - Parts 1, 2, 3]
On October 9, professors Atsuo Takanishi of Waseda University and Akitoshi Katsumata of Asahi University unveiled an oral rehabilitation robot, called "WAO-1" (Waseda Asahi Oral Rehabilitation Robot 1), which is designed to help treat mouth, jaw and facial disorders by performing therapeutic face massages. In November, the developers will begin clinical testing of a prototype robot -- built by dental X-ray equipment manufacturer Asahi Roentgen -- on patients in Yokohama.
Equipped with two 50-cm (20-inch) arms that protrude from a chair-sized aluminum box, WAO-1 performs massages by pressing the patient's face from both sides. Each arm's position and angle can be precisely controlled, as can the direction of the pressure applied to the face. WAO-1 also relies on a complex system of software and fuses to ensure the pressure does not exceed a certain level, and it is equipped with a "torque limiter function" that allows the arms to bend back should the robot begin to exert too much force. Much of WAO-1's control technology, which can also be found in humanoid robots, is the product of Takanishi's well-known work on robots that walk and express emotions.
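The layered safety approach described above can be sketched in code. The following is a hypothetical illustration of a software force cap of the kind the article describes, not WAO-1's actual control software; the function names and threshold value are invented, and the real system would back this up with fuses and the mechanical torque limiter.

```python
# Hypothetical sketch of a software force limit like the one described
# for WAO-1. The threshold and names are invented for illustration;
# a hardware torque limiter would still act as a final backstop,
# letting the arm bend back if the software limit ever failed.

MAX_FORCE_N = 5.0  # invented safety threshold, in newtons


def clamp_arm_force(commanded_force_n: float) -> float:
    """Cap the commanded massage pressure at the safety threshold."""
    if commanded_force_n > MAX_FORCE_N:
        return MAX_FORCE_N
    return commanded_force_n
```

In a design like this, the software cap handles normal operation, while the mechanical limiter guarantees safety even if the software misbehaves.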
While the parts for the prototype cost about 8 million yen ($70,000), Takanishi says the robot is cost-effective because it can be used to massage other body parts and perform other tasks, such as holding a patient's mouth open during treatment.
Facial massage, which is known to combat dry mouth by stimulating salivation, is used in the treatment of various mouth and jaw disorders. An estimated 10 million people in Japan are believed to suffer from oral conditions such as dry mouth and temporomandibular joint disorder -- a condition, sometimes caused by stress or age, that makes it painful to open the mouth. WAO-1's creators, who are confident they can develop a commercial version soon, hope the robot can help offset the shortage of skilled practitioners in this high-growth area.
On June 21, researchers at Waseda University's Institute of Egyptology unveiled the computer-generated facial image of an ancient Egyptian military commander who lived about 3,800 years ago. The image is based on CAT scans taken of a mummy.
Researchers claim the mummy, which was unearthed near Cairo at an archeological site in North Dashur, dates from ancient Egypt's 13th dynasty (c.1756 to c.1630 BC), and according to inscriptions on the sarcophagus, it appears to be that of a military commander named Senw.
The facial image, created by a team of graphic artists from the Joshibi University of Art and Design, is based on bone structure data obtained from CAT scans of the mummy. The research group determined that he was a middle-aged to elderly man, and from information such as his military title they hypothesized he was of mixed race. Drawing on ancient Egyptian pictorial representations and sculptures for reference, the artists worked to give the face strong, commander-like features.
"We had to rely on artistic imagination for the parts we did not understand," says Sakuji Yoshimura, the Waseda University professor who led the research team. Several faces were generated, and the one that most resembled that of a military commander was chosen.
The face will be on display to the public beginning in July at Fukuoka City Museum.