Tag: ‘Health’

Tiny robot reduces need for surgery

26 Feb 2007

Surgical microbot --

On February 26, researchers from Ritsumeikan University and the Shiga University of Medical Science completed work on a miniature robot prototype that, once inserted into the body through an incision, can be freely controlled to perform medical treatment and capture images of affected areas. The plastic-encased minibot, which measures 2 cm (0.8 inch) in length and 1 cm (0.4 inch) in diameter, can be maneuvered through the body by controlling an external magnetic field applied near the patient.

While other types of miniature swallowable robots have been developed in the past, their role has mostly been limited to capturing images inside the body. According to Ritsumeikan University professor Masaaki Makikawa, this new prototype robot has the ability to perform treatment inside the body, eliminating the need for surgery in some cases.

The researchers developed five different kinds of prototypes with features such as image capture functions, medicine delivery systems, and tiny forceps for taking tissue samples. MRI images of the patient taken in advance serve as a map for navigating the minibot, which is said to have performed swimmingly in tests on animals. Sensor and image data are relayed back to a computer via an attached 2-mm diameter cable, which can apparently also serve as a safety line in case the minibot gets lost or stranded.

[Source: Chugoku Shimbun]

Sayaka: Next-generation capsule endoscope

05 Jan 2007

Sayaka: Next-generation endoscopic capsule --

Endoscopic capsules, ingestible pill-shaped devices designed to capture images from inside the digestive tract, have been around for quite a while. But Sayaka, an endoscopic capsule developed by RF System Lab in December 2005, dramatically improves overall image quality by repositioning the camera and enabling it to rotate.

While conventional capsules -- including RF System Lab's own Norika -- typically have cameras at one end of the capsule, Sayaka's camera has been moved to the side, where it has a better view of the intestinal walls. In addition, a tiny stepper motor rotates the camera as the capsule passes through the digestive tract, allowing Sayaka to capture images from every angle.

Like Norika, Sayaka receives its power wirelessly from an external source, primarily so that no harmful battery substances get into the body.

On a typical 8-hour, 8-meter (26 feet) journey through the gastrointestinal tract, Sayaka snaps approximately 870,000 photos, which are sent to a receiver located near the body. Image mosaicking technology is then used to stitch the images together into a flat, high-resolution rectangular map of the intestines, which can be magnified up to 75 times. In addition to scouring the maps for problem areas, gastroenterologists can compare maps from previous sessions to track changes in a patient's condition.
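For the curious, the stitching step works roughly like classic image mosaicking: estimate how much each new frame overlaps the previous one, then merge the frames along that seam. The sketch below is not RF System Lab's actual algorithm -- the function name and the correlation heuristic are my own -- but it illustrates the idea by aligning two consecutive camera strips at the overlap with the highest normalized correlation:

```python
def stitch_strips(strip_a, strip_b, max_shift=20):
    """Merge two consecutive camera strips (lists of pixel rows) by
    finding the vertical overlap with the highest normalized correlation.
    Real mosaicking also corrects rotation and lens distortion; this
    sketch handles vertical overlap only."""
    def similarity(rows_a, rows_b):
        flat_a = [p for row in rows_a for p in row]
        flat_b = [p for row in rows_b for p in row]
        dot = sum(a * b for a, b in zip(flat_a, flat_b))
        norm = (sum(a * a for a in flat_a) ** 0.5) * \
               (sum(b * b for b in flat_b) ** 0.5)
        return dot / (norm + 1e-9)

    limit = min(max_shift, len(strip_a), len(strip_b))
    best_shift = max(range(1, limit + 1),
                     key=lambda s: similarity(strip_a[-s:], strip_b[:s]))
    # Drop the overlapping rows from the second strip and append the rest.
    return strip_a + strip_b[best_shift:]
```

Repeated over 870,000 frames, the merged strips build up the flat map of the intestines described above.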

And as if all that were not enough, RF System Lab has released a trippy Sayaka promo video featuring a smooth disco/house soundtrack and starring a naked humanoid that floats over the Nazca Lines, shooting beams from its eyes as it scans the landscape below. Cool.

[Link: Sayaka homepage]

Model train controlled via brain-machine interface

17 Nov 2006

Hitachi brain-machine interface -- Hitachi has successfully tested a brain-machine interface that allows users to turn power switches on and off with their mind. Relying on optical topography, a neuroimaging technique that uses near-infrared light to map blood concentration in the brain, the system can recognize the changes in brain blood flow associated with mental activity and translate those changes into voltage signals for controlling external devices. In the experiments, test subjects were able to activate the power switch of a model train by performing mental arithmetic and reciting items from memory.
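In principle, the signal chain is simple: establish a resting baseline of blood-flow readings, then flip the switch when sustained mental effort pushes the readings past a threshold. Here is a minimal sketch of that idea; the function name, window length, and threshold are illustrative assumptions, not Hitachi's actual parameters:

```python
def switch_state(signal, window=10, threshold=0.2):
    """Decide whether a switch should be 'on' from a stream of
    blood-oxygenation readings (arbitrary units). Mental activity
    raises local blood flow, so a sustained rise above the resting
    baseline is read as an 'on' command."""
    baseline = sum(signal[:window]) / window   # resting level
    recent = sum(signal[-window:]) / window    # current level
    return recent - baseline > threshold
```

Feeding it a flat signal leaves the switch off; a signal that climbs during mental arithmetic turns it on.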

The prototype brain-machine interface allows only simple control of switches, but with a better understanding of the subtle variations in blood concentrations associated with various brain activities, the signals can be refined and used to control more complex mechanical operations.

In the long term, brain-machine interface technology may help paralyzed patients become independent by empowering them to carry out actions with their minds. In the short term, Hitachi sees potential applications for this brain-machine interface in the field of cognitive rehabilitation, where it can be used as an entertaining tool for demonstrating a patient's progress.

The company hopes to make this technology commercially available in five years.

[Source: Yomiuri Shimbun via Seihin World]

Intelligent wheelchair sees all

21 Sep 2006

SOS: Stereo Omnidirectional System --

On September 20, Japan's National Institute of Advanced Industrial Science and Technology (AIST) unveiled an intelligent wheelchair that relies on an omnidirectional camera for a view of its surroundings, avoids collisions with people and obstacles, and knows when something is wrong with the chair's occupant. Developed with the cooperation of the National Rehabilitation Center for Persons with Disabilities, the new technology is expected to improve the safety and security of electric wheelchairs for the disabled and elderly.

While electric wheelchairs have improved the mobility of people with serious disabilities, their growing prevalence has also led to more collisions and accidents. To boost wheelchair safety, AIST engineers incorporated elements of intelligent automotive systems, which can calculate the risk of a collision before it happens and automatically apply the brakes when necessary.

The prototype wheelchair is equipped with a camera system -- interestingly dubbed Stereo Omnidirectional System (SOS) -- whose 360-degree field of vision has no blind spot. Relying on the camera images, the chair detects potential hazards that arise while in motion and decelerates or stops accordingly. The chair also checks the occupant for signs of abnormality (unusual posture) and is equipped with a function that allows the occupant to control the chair by gesturing (pointing). Check out the AIST press release for videos of the chair in action.
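The collision-avoidance logic borrowed from automotive systems typically boils down to time-to-collision (TTC): divide an obstacle's distance by the closing speed and brake when the result drops below a threshold. A sketch of that rule -- the thresholds and function name are my own, not AIST's:

```python
def avoidance_action(distance_m, closing_speed_mps,
                     stop_ttc=1.0, slow_ttc=3.0):
    """Map an obstacle's range and closing speed to a braking action
    via time-to-collision (seconds). Thresholds are illustrative."""
    if closing_speed_mps <= 0:          # obstacle not approaching
        return "cruise"
    ttc = distance_m / closing_speed_mps
    if ttc < stop_ttc:
        return "stop"
    if ttc < slow_ttc:
        return "decelerate"
    return "cruise"
```

A pedestrian half a meter ahead triggers a full stop, while one a few meters out merely slows the chair.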

The wheelchair is currently equipped with a function for transmitting the camera's color video via wireless LAN, and AIST is investigating the possibility of enabling the video to be delivered via cellular phone and providing support for remote-control functions. AIST will soon subject the prototype to rigorous testing and continue upgrading the functions.

This intelligent wheelchair technology will be demonstrated at the 2006 Home Care and Rehabilitation Exhibition scheduled for September 27 to 29 in Tokyo.

[Sources: MYCOM Journal and AIST press release]

Mind-controlled wheelchair

11 Aug 2006

Brain waves

A University of Electro-Communications team of researchers led by professor Kazuo Tanaka has developed a prototype of an electric wheelchair that the user can steer simply by thinking of which direction he or she would like to go.

The wheelchair interprets the user's intended direction by means of a skull cap outfitted with a system of sensors. The sensors read the wearer's brain waves, enabling the user to steer simply by thinking "move left" or "move right." Tests have shown that the wheelchair achieves 80% accuracy in interpreting the user's intentions and moving in the desired direction.
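A common way to turn brain-wave readings into a left/right decision is a nearest-centroid classifier: average the feature vectors recorded while the user rehearses "move left" and "move right" during training, then assign each new reading to the closer average. This sketch illustrates the idea -- it is not the team's published method:

```python
def train_centroids(samples):
    """samples: {"left": [feature_vectors], "right": [...]}.
    Returns one mean feature vector per intended direction."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(v[i] for v in vecs) / n
                            for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, vec):
    """Pick the direction whose training centroid is nearest."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], vec))
```

Misclassifications in the remaining ~20% of cases would come from readings that land between the two centroids.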

The field of mind-controlled technology has seen a number of significant developments recently, and the promise of wheelchairs, televisions and other devices that can be controlled by people with physical disabilities looms on the horizon.

The developers of the wheelchair also envision applications in computer games and in the field of entertainment.

The idea of using a brain interface in entertainment reminds me of this video excerpt from "Music for Solo Performer," a sound piece composed by Alvin Lucier in 1964. In this performance, EEG electrodes attached to the performer's scalp pick up brain waves, which are used to control a variety of percussion instruments. The resulting music has a nice, mind-altering effect.

[Source: Nikkei Net]

New pharmaceutical pictograms

21 Jul 2006

The Risk/Benefit Assessment of Drugs-Analysis and Response (RAD-AR) Council of Japan has released a new batch of pictograms for use on pharmaceutical packaging. No more deciphering complicated dosage directions and warnings -- a glance is all it takes now.

Medical pictograms

Get your copy of all 51 pictograms here.

[Via Iza!]

Face to face with high-tech medical devices

13 Jul 2006

The International Modern Hospital Show 2006 is being held from July 12 to 14 in Tokyo (Tokyo Big Sight), where nearly 400 companies have gathered to showcase the latest in healthcare-related technology. The theme of the show is "Reliable Health, Medical Treatment, and Care -- Aiming for High Quality Service," a theme whose success evidently depends on high technology. Below are photos (via Impress Watch) and explanations of a few of the devices appearing at the show. Despite appearances, these fellows are here to help.

The first photo shows a patient simulator developed by IMI Corporation and Paramount Bed Co., Ltd., a system consisting of a monitor connected to a sensor-laden mannequin whose physiology changes realistically according to the treatment it receives. Great for training future medical professionals. Great for your haunted house, too.

Patient simulator developed by IMI and Paramount Bed

The next photo shows a transnasal endoscope developed by FUJIFILM Medical Co., Ltd. and Fujinon Toshiba ES Systems Co., Ltd. Surveys show that 90% of patients who have undergone endoscopy find insertion through the nose (as opposed to the mouth or anus) more comfortable. I hope the expression on this guy's face is no indication of his comfort level.

Nasal endoscope developed by FUJIFILM and Fujinon Toshiba ES Systems

The next photo shows Muu Socia 3.0 (left), a cute cyclopean teardrop-shaped "communication support" robot developed by ATR and Systec Akazawa. The robot is designed to serve as a social mediator that livens up the communication between care giver and care recipient. Muu Socia has voice recognition, voice synthesis, speech processing and face recognition capabilities. And it starts bouncing around when something obstructs its view (watch the 5-second video (WMV)).

Pictured on the right is a home appliance control robot developed by RayTron Co., Ltd. Voice recognition capabilities allow patients to operate their home appliances by remote control. It looks sort of like an owl.

Muu Socia 3.0 (left) and home appliance control robot (right)

You can see more photos and read about the other technology on display at the link below.

[Source: Impress Watch]

Nasal airflow regulator amplifies whispers

21 Jun 2006

Nasal airflow regulator

On June 20, an Okayama University team of researchers led by Professor Shogo Minagi unveiled a nasal airflow regulator designed to alleviate voice loss such as that which sometimes occurs after a stroke.

In normal speech, the soft palate (located at the back of the roof of the mouth) regulates the amount of air expelled through the mouth and nose. When the nerves controlling it are damaged -- by a stroke, for example -- the soft palate may sag, preventing air from escaping through the nose and making certain speech sounds impossible to pronounce.

When inserted into the nostrils, the device forces air through the nasal passage when speaking, enabling sounds to be produced. According to the developers, the device allows people with nasal airflow problems -- even those whose speech is all but inaudible -- to be clearly heard.

[Source: Akita Sakigake Shimpo, Jiji]

Robo-patient tells where it hurts

08 May 2006

A "sick" robot developed by researchers at Gifu University's Graduate School of Medicine is providing hands-on educational assistance to future medical practitioners. When students touch its head and abdomen in the places where it feels pain, the robot says, "That hurts."

Robot patient

With 24 sensors embedded in its head and body under a layer of soft, warm (near body temperature) silicone skin, the robot can detect the hand pressure applied by the examiner. And depending on which of the 8 pre-programmed medical conditions -- which range from acute gastroenteritis to appendicitis -- it is suffering from, the robot provides a vocal response to the examiner's questions and manual pressure.
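The response logic the article describes amounts to a lookup: given the pre-programmed condition and the sensor region being pressed, decide whether the robot reports pain. A toy version of that lookup -- the condition names, regions, and pressure scale are all hypothetical, since the robot's actual tables aren't public:

```python
# Hypothetical (condition, region) pairs that should elicit pain.
PAIN_MAP = {
    ("appendicitis", "lower_right_abdomen"),
    ("acute_gastroenteritis", "upper_abdomen"),
}

def palpation_response(condition, region, pressure):
    """Return the robot's vocal reaction to pressure on one sensor region."""
    if pressure < 0.2:                      # too light to register
        return "..."
    if (condition, region) in PAIN_MAP:
        return "That hurts."
    return "That doesn't hurt."
```

Swapping in a different one of the eight conditions changes which regions answer "That hurts," which is what lets the same hardware present different illnesses.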

Developers claim the robot helps students cultivate medical examination skills, and it is being used in classes beginning this academic year. Students appreciate the robot, saying it helps build their confidence before they perform examinations on real people.

"Great pains were taken to provide the sensors with a human level of sensitivity," says professor Yuzo Takahashi (57), who developed the robot. "We would like to make further improvements and expand the number of symptoms the robot can respond to."

[Source: Yomiuri Shimbun]