Tag: ‘Smart-Tech’

Fukitorimushi: Autonomous floor-wiping robot

07 May 2009


Move over, Roomba. Make way for Fukitorimushi, an autonomous floor-cleaning robot that crawls like an inchworm and uses a super-absorbent nanofiber cloth to wipe up microscopic dust and residue that ordinary vacuums leave behind. Unveiled at the recent Tokyo Fiber Senseware exposition in Milan, Fukitorimushi (lit. "wipe-up bug") is designed by Panasonic and incorporates nanofiber technology developed by textile maker Teijin, Ltd.

The robot cleans by simply dragging its nanocloth belly across the floor as it slowly crawls around in search of dirt. (Watch the video.)

Fukitorimushi, which moves around by flexing and stretching its body like an inchworm, uses "feelers" of blue-white light to search for floor grime. When it finds a dirty spot, the robot emits a red light and devotes extra effort to cleaning that area. After it has finished cleaning, the machine returns to its charging station to replenish its battery.
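
The article doesn't describe Fukitorimushi's control logic beyond this search-clean-recharge behavior, but it can be pictured as a simple state machine. Here is a minimal sketch in Python; the sensor and actuator names are invented for illustration, since Panasonic has not published the robot's actual code.

```python
import random

# A toy model of the search-clean-recharge cycle described above.
# All sensor/actuator names are hypothetical.

class Fukitorimushi:
    def __init__(self, battery=100):
        self.battery = battery

    def sense_grime(self):
        """Stand-in for the blue-white light 'feelers' that look for floor grime."""
        return random.random() < 0.2  # pretend one spot in five is dirty

    def step(self):
        if self.battery < 10:
            return "crawl back to charging station"
        self.battery -= 1
        if self.sense_grime():
            return "emit red light; wipe this spot with extra effort"
        return "flex and stretch forward, dragging the nanocloth belly"

robot = Fukitorimushi()
for _ in range(5):
    print(robot.step())
```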

Fukitorimushi's body is covered in Teijin's Nanofront cloth, which is made of polyester filament fibers measuring 700 nanometers in diameter (roughly 1/100 the diameter of a human hair, or about 1/7,500 its cross-sectional area). The nanofibers significantly increase the fabric's surface area and porosity, giving it superior wiping characteristics and the ability to absorb oil and ultra-fine dust particles less than one micron in diameter. The large surface contact area also increases the fabric's friction with the floor and makes it resistant to sliding. The robot relies on this increased friction to push itself forward while wiping the floor.


According to its creators, Fukitorimushi is also designed to engage the emotions and foster a closer relationship between humans and machines. The way the machine creeps across the floor may seem a little strange at first, but the designers say people tend to grow fond of the robotic creature after watching it for a while. In addition, the owner must periodically replace Fukitorimushi's nanocloth cover with a clean one. The designers suggest this task of looking after the Fukitorimushi may encourage a pet-like affection for the machine.

[Link: Tokyo Fiber '09 Senseware Guide (PDF)]

CB2 baby robot developing social skills

06 Apr 2009


In the nearly two years since it was first unveiled to the world, the Child-robot with Biomimetic Body, or CB2, has been developing social skills by interacting with humans and watching their facial expressions, according to its creators at Osaka University.

Composed of robotics engineers, brain specialists, psychologists and other experts, the research team has been teaching the android to think like a baby by having it evaluate facial expressions and classify them into basic categories, such as happiness and sadness.


[Video: CB2 from June 2007]

The 130-centimeter (4 ft 3 in) tall, 33-kilogram (73 lb) robot is equipped with eye cameras that record emotional expressions. Designed to learn like an infant, the robot can memorize facial expressions and match them with physical sensations, which it detects via 197 pressure sensors under a suit of soft silicone skin.
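
The article doesn't say how CB2's software actually groups what it sees, but the basic idea of sorting facial-expression features into categories such as happiness and sadness can be sketched with off-the-shelf clustering. The feature values below are invented for illustration; the Osaka team's real learning method is surely more sophisticated.

```python
# A minimal sketch of expression clustering: group facial-expression
# feature vectors into basic categories. Features are hypothetical
# (e.g., mouth-corner lift, brow height), one row per observed face.
import numpy as np
from sklearn.cluster import KMeans

expressions = np.array([
    [0.9, 0.8],   # smiling faces
    [0.8, 0.7],
    [0.1, 0.2],   # sad faces
    [0.2, 0.1],
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(expressions)
print(model.labels_)  # e.g., [1 1 0 0]: two basic categories emerge
```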


In addition to watching faces, CB2 has been learning to walk. With 51 pneumatic "muscles," the little android can now amble through a room more smoothly than it could nearly two years ago, though it still requires the aid of a human.


Within two years, the researchers hope the robot will gain the intelligence of a two-year-old child and the ability to speak in basic sentences. In the coming decades, the researchers expect to develop a "robo species" that has learning abilities somewhere between those of humans and chimps.


[Link: AFP]

Asimo robot controlled by human thought (video)

31 Mar 2009

[Photo: Honda's brain-machine interface for robot control]

Honda has developed new brain-machine interface (BMI) technology that allows humans to control the Asimo humanoid robot simply by thinking certain thoughts.

The BMI system, which Honda developed along with Advanced Telecommunications Research Institute International (ATR) and Shimadzu Corporation, consists of a sensor-laden helmet that measures the user's brain activity and a computer that analyzes the thought patterns and relays them as wireless commands to the robot. (Watch video.)

When the user simply thinks about moving their right hand, the pre-programmed Asimo responds several seconds later by raising its right arm. Likewise, Asimo lifts its left arm when the person thinks about moving their left hand, begins to walk when the person thinks about moving their legs, and holds a hand up in front of its mouth when the person thinks about moving their tongue.

The high-precision BMI technology relies on two complementary types of brain activity measurement. EEG (electroencephalography) sensors measure the slight fluctuations in electrical potential on the scalp that occur when thinking, while NIRS (near-infrared spectroscopy) sensors measure changes in cerebral blood flow. Newly developed information extraction technology processes the complex data from these two sensor types, yielding a more accurate reading. The system reportedly has an accuracy rate of more than 90%.
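
Honda has not published the details of its information extraction technology, but the general idea of fusing the two measurement types and classifying the result into one of the four commands can be sketched as follows. The feature dimensions and training data are invented, and an ordinary linear classifier stands in for Honda's proprietary processing.

```python
# A rough sketch of the sensor-fusion idea: concatenate EEG and NIRS
# features into one vector and classify it as one of four imagined
# movements. All data here is simulated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

COMMANDS = ["right hand", "left hand", "legs", "tongue"]

rng = np.random.default_rng(0)
# Hypothetical training set: 200 trials, 64 EEG + 32 NIRS features each.
eeg = rng.normal(size=(200, 64))
nirs = rng.normal(size=(200, 32))
labels = rng.integers(0, 4, size=200)

X = np.hstack([eeg, nirs])          # fuse the two measurement types
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# At run time, a new fused reading is mapped to an Asimo command.
reading = np.hstack([rng.normal(size=64), rng.normal(size=32)])
print("Asimo command:", COMMANDS[int(clf.predict(reading.reshape(1, -1))[0])])
```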

The use of EEG and NIRS sensors makes the new system more compact than previous BMI systems that rely on bulkier fMRI (functional magnetic resonance imaging) technology. Although the system is small enough to be transported from place to place, the developers plan to further reduce the size.

Honda, which has been conducting BMI research and development with ATR since 2005, is looking into the possibility of one day using this type of interface technology with artificial intelligence and robotics to create devices that users can operate without having to move.

[Source: Honda press release]

Video: Synchronized electric face stimulus test

17 Mar 2009

In his latest video, experimental media artist Daito Manabe choreographs a synchronized face dance for four friends by hooking them up to the Face Visualizer, a device which converts music into electrical impulses that stimulate the facial muscles.



[Via: Daito Manabe - YouTube]

‘Magic mirror’ shows real-time muscle data

02 Mar 2009


Researchers at the University of Tokyo have developed a computerized, sensor-based "magic mirror" that analyzes muscular activity and shows real-time computer-generated images of how hard the user's muscles are being worked while exercising.

The magic mirror, developed under the leadership of professor Yoshihiko Nakamura of the Information and Robot Technology Research Initiative (IRT), was unveiled at the University of Tokyo last Friday. In a demonstration for the media, the system's display monitor showed a real-time computer-generated image of a male model's musculoskeletal system while he performed a series of physical exercises.

The system, which is currently capable of monitoring the activity of 30% of the body's roughly 300 skeletal muscle pairs, consists of 16 electromyographs (instruments that record the electrical waves associated with muscle activity) attached to the user's body, 10 motion-capture cameras, and a pair of floor sensors to measure the force exerted on the legs.

On the monitor, each muscle is shown in a different color depending on how much it is being used at a particular moment. Active muscles are shown in red, while inactive muscles are shown in yellow.
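
The mapping from activity level to display color can be pictured as a simple yellow-to-red gradient. The sketch below illustrates the color scheme described above; it is not the IRT system's actual rendering code.

```python
# Map a muscle's activation level (0.0 = inactive, 1.0 = fully active)
# onto a yellow-to-red RGB gradient, as in the on-screen display.

def muscle_color(activation: float) -> tuple[int, int, int]:
    """Interpolate from yellow (255, 255, 0) to red (255, 0, 0)."""
    activation = max(0.0, min(1.0, activation))
    green = round(255 * (1.0 - activation))
    return (255, green, 0)

print(muscle_color(0.0))  # (255, 255, 0) -> yellow, at rest
print(muscle_color(1.0))  # (255, 0, 0)   -> red, working hard
```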

[Photo: Muscle images can also be overlaid on the video image of the user's body]

The magic mirror system uses newly developed software that is reportedly 10 times faster than previous technology, allowing the system to operate in real time even when the user is moving rapidly.

The researchers, who are already working on a more compact version that incorporates the cameras directly into the display, envision the system being used in homes, gyms and hospitals. In addition to helping people get into shape, the system might also help doctors more effectively treat conditions that affect the muscles.

[Sources: Robot Watch, Yomiuri, Nikkei]

Scientists extract images directly from brain

12 Dec 2008


Researchers from Japan's ATR Computational Neuroscience Laboratories have developed new brain analysis technology that can reconstruct the images inside a person's mind and display them on a computer monitor, it was announced on December 11. According to the researchers, further development of the technology may soon make it possible to view other people's dreams while they sleep.

The scientists were able to reconstruct various images viewed by a person by analyzing changes in their cerebral blood flow. Using a functional magnetic resonance imaging (fMRI) machine, the researchers first mapped the blood flow changes that occurred in the cerebral visual cortex as subjects viewed various images held in front of their eyes. Subjects were shown 400 random 10 x 10 pixel black-and-white images for a period of 12 seconds each. While the fMRI machine monitored the changes in brain activity, a computer crunched the data and learned to associate the various changes in brain activity with the different image designs.

Then, when the test subjects were shown a completely new set of images, such as the letters N-E-U-R-O-N, the system was able to reconstruct and display what the test subjects were viewing based solely on their brain activity.
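
The published method combines multi-scale local image decoders, but the core idea of learning a mapping from voxel activity to pixels and then applying it to unseen images can be sketched with a plain linear decoder. The voxel data below is simulated for illustration; it is not the ATR team's implementation.

```python
# Toy decoder: learn a linear mapping from fMRI voxel activity to each
# of the 100 pixels of a 10 x 10 binary image, then reconstruct an
# unseen image from brain activity alone.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_voxels, n_trials = 500, 400

# Hypothetical training set: 400 random 10 x 10 binary images (as in
# the experiment) and simulated voxel responses linearly related to them.
images = rng.integers(0, 2, size=(n_trials, 100)).astype(float)
weights = rng.normal(size=(100, n_voxels))
voxels = images @ weights + 0.1 * rng.normal(size=(n_trials, n_voxels))

decoder = Ridge(alpha=1.0).fit(voxels, images)

# "Read out" an image the decoder has never seen, from voxels alone.
test_image = rng.integers(0, 2, size=(1, 100)).astype(float)
test_voxels = test_image @ weights
reconstruction = (decoder.predict(test_voxels) > 0.5).astype(int)
print(reconstruction.reshape(10, 10))
```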

For now, the system is only able to reproduce simple black-and-white images. But Dr. Kang Cheng, a researcher from the RIKEN Brain Science Institute, suggests that improving the measurement accuracy will make it possible to reproduce images in color.

"These results are a breakthrough in terms of understanding brain activity," says Dr. Cheng. "In as little as 10 years, advances in this field of research may make it possible to read a person's thoughts with some degree of accuracy."

The researchers suggest a future version of this technology could be applied in the fields of art and design -- particularly if it becomes possible to quickly and accurately access images existing inside an artist's head. The technology might also lead to new treatments for conditions such as psychiatric disorders involving hallucinations, by providing doctors a direct window into the mind of the patient.

ATR chief researcher Yukiyasu Kamitani says, "This technology can also be applied to senses other than vision. In the future, it may also become possible to read feelings and complicated emotional states."

The research results appear in the December 11 issue of US science journal Neuron.

[Source: Chunichi]

1,000 Paro robots migrating to Denmark

21 Nov 2008

[Photo: PARO Mental Commit Robot]

The largest-ever migration of baby harp seal robots from Japan is about to begin, following an agreement by Denmark to purchase 1,000 of them for use in health care facilities. Paro, a human-interactive robotic seal developed by Japan's National Institute of Advanced Industrial Science and Technology (AIST), has scientifically demonstrated the ability to elicit emotions, activate the mind and calm nerves in patients at hospitals and nursing homes, earning it the Guinness title of "world's most therapeutic robot." Although the well-traveled Paro now resides at welfare institutions in more than 20 nations around the world, the Danish government is the first organization to make a large-scale purchase. Denmark aims to have the Paro robots in their new homes by 2011.

[Sources: Jiji, Chunichi]

Midori-san, the blogging houseplant

07 Oct 2008

[Photo: Midori-san, the blogging houseplant, at bowls Donburi Cafe in Kamakura]

If houseplants could blog, what would they say? To find out, Kamakura-based IT company KAYAC Co., Ltd. has developed a sophisticated botanical interface system that lets plants post their thoughts online. A succulent Sweetheart Hoya (Hoya kerii) named "Midori-san" is now using the system to blog daily from its home at bowls Donburi Cafe in Kamakura.

The plant interface system, which is built around technology developed by Satoshi Kuribayashi at the Keio University Hiroya Tanaka Laboratory, uses surface potential sensors to read the weak bioelectric current flowing across the surface of the leaves. This natural current fluctuates in response to changes in the immediate environment, such as temperature, humidity, vibration, electromagnetic waves and nearby human activity. A specially developed algorithm translates this data into Japanese sentences, which are used as fodder for the plant's daily blog posts.
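
KAYAC hasn't published the translation algorithm, but the general shape of it (bucketing surface-potential statistics into moods and filling in template sentences) might look something like the sketch below. Thresholds and phrasings are invented, and rendered in English rather than Japanese.

```python
# A minimal sketch of the sensor-to-sentence step: summarize the day's
# surface-potential readings and pick a matching template sentence.
# All thresholds and wording are hypothetical.

def plant_post(readings: list[float]) -> str:
    mean = sum(readings) / len(readings)
    variance = sum((r - mean) ** 2 for r in readings) / len(readings)
    if variance > 1.0:
        mood = "It was a lively day with lots of people around."
    elif mean > 0.5:
        mood = "It was bright and warm today. I felt great."
    else:
        mood = "It was quiet and a little dark today."
    return mood + " Thank you for the light!"

print(plant_post([0.8, 0.7, 0.9, 0.6]))
```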

[Diagram: Plant interface system]

Midori-san started blogging about a week ago. So far, the plant's highly structured posts summarize the day's weather, temperature and lighting conditions, describe its overall physical condition, tell how much light it received via the user-activated lamp (see below), and explain how much fun the day was. Each post also includes a self-portrait photo and a plant-themed pun (in Japanese), which Midori-san likely did not write. A graph at the top of the sidebar shows the plant's surface potential in real time.

Readers can also treat Midori-san to a dose of fluorescent light, either through the website or via an embeddable widget.

To activate a web-controlled fluorescent lamp positioned next to the plant inside the cafe, click the "Give Light to Midori-san" button at the bottom of the widget, enter your name (or a nickname), and click OK. (Get the widget code here.)

Once the lamp is activated, the widget shows a real-time view of Midori-san under the light.

Judging from the blog content and the numerous "thank yous" below the fold of each post, Midori-san seems to really appreciate every chance it gets to photosynthesize.

In addition to exploring the potential of intelligent networks that involve the natural environment around us, KAYAC hopes this entertaining plant interface system will inspire people to think about the environment in new ways.

[Link: Kyo no Midori-san]

Photos: Robots at CEATEC 2008

01 Oct 2008

Robots old and new are on display at the CEATEC 2008 home electronics trade show currently underway in Chiba, Japan.

[Photo: Nissan BR23C Biomimetic Robot Car]

Nissan unveiled the bumblebee-inspired BR23C Biomimetic Robot Car, which is equipped with a prototype collision avoidance system developed in cooperation with the University of Tokyo. The next-generation safety technology is modeled after the way that bees avoid crashing into each other.

* * * * *

[Photo: Mechadroid Type C3]

The Mechadroid Type C3 receptionist robot developed by Business Design Laboratory relies on face recognition technology, a touch panel display, speech, and facial expressions to interact with visitors and guide them to their destination.

* * * * *

[Photo: ifbot]

Ifbot -- also developed by Business Design Laboratory -- is a speech-capable robot that can identify emotions in the voice and word choice of the person talking. The robot can also communicate its own emotions with a range of facial expressions.

* * * * *

[Photo: Murata Seiko-chan and Seisaku-kun (a.k.a. Murata Boy)]

Murata Manufacturing Co., Ltd.'s popular robot bicyclist, Murata Seisaku-kun (a.k.a. Murata Boy), was joined on stage by his recently unveiled younger cousin, Murata Seiko-chan, who is well-balanced enough to ride a unicycle.

* * * * *

[Photo: Nabaztag]

The Nabaztag Wi-Fi Smart Rabbit manufactured by Violet is a bunny-shaped personal assistant that connects to your home wireless network.


In addition to announcing the latest news, weather and traffic information, the rabbit can tell the time, light up when email arrives, stream Internet radio and podcasts, and respond to spoken commands.

* * * * *

[Photo: Enon leads the way to the wine section]

Fujitsu's Enon robot demonstrated the ability to interact with customers and guide them to the wine section.

[Photo: Enon takes a break]