Archives: ‘Sci/Tech’ Category

Artificial bones made with 3D inkjet printers

13 Aug 2007

Researchers from the Tissue Engineering Department at the University of Tokyo Hospital and venture company Next 21 are using 3D inkjet printers to produce tailor-made artificial bones for use in facial reconstructive surgery. Following initial trials performed on a Welsh corgi and 10 people over the past year and a half, the researchers are set to begin a more extensive second round of human testing this autumn.

To make an artificial bone with this technology, a 3D computer model of the bone is first created based on the patient's X-ray and CT scan data. The computer model is then sliced into a large number of cross-sections and the data is sent to a special 3D inkjet printer, which works sort of like an ordinary inkjet printer by transferring tiny droplets of liquid onto a surface. However, unlike ordinary printers that print on paper, this one prints onto thin layers of powdered alpha-tricalcium phosphate (alpha-TCP). The "ink" is a water-based polymer adhesive that hardens the alpha-TCP it comes into contact with. By repeatedly laying down the powder and printing successive layers on top of one another, the printer is able to physically reproduce the desired bone to an accuracy of one millimeter.
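For the code-curious, here's a rough Python sketch of the slice-and-print loop described above. Everything here is a toy simplification (the point cloud, the layer height, the droplet handling are all invented for illustration); the real system works from CT-derived geometry and proprietary printer control software.

```python
def slice_model(voxels, layer_height=1.0):
    """Group a set of (x, y, z) points into horizontal cross-sections.

    Returns a dict mapping layer index -> list of (x, y) positions where
    the printer would jet adhesive droplets onto the alpha-TCP powder.
    """
    layers = {}
    for x, y, z in voxels:
        idx = int(z // layer_height)  # which cross-section this point falls in
        layers.setdefault(idx, []).append((x, y))
    return layers

def print_bone(layers):
    """Simulate the build cycle: lay powder, print one cross-section, repeat."""
    hardened = []
    for idx in sorted(layers):  # bottom layer first
        # (1) spread a fresh layer of alpha-TCP powder
        # (2) jet adhesive droplets at every (x, y) in this cross-section,
        #     hardening the powder it touches
        hardened.extend((x, y, idx) for x, y in layers[idx])
    return hardened

# Example: a tiny 2 x 2 x 2 block standing in for a bone model
voxels = [(x, y, z) for x in range(2) for y in range(2) for z in range(2)]
layers = slice_model(voxels)
solid = print_bone(layers)
```

The layer-by-layer loop is the essence of the technique: each pass only ever deals with a 2D cross-section, which is what lets an inkjet-style print head build an arbitrary 3D shape.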

Strong, lightweight and porous, the printed bones have characteristics similar to natural bone, and because they are tailored to fit exactly where they need to go, they are quick to integrate with the surrounding bone. The printed bone is also designed to be resorbed by the body as the surrounding bone slowly grows into it and replaces it.

In initial human trials conducted between March 2006 and July 2007, the effectiveness and safety of the artificial bones were tested in plastic surgery operations performed on 10 male and female patients between the ages of 18 and 54. In the second round of trials beginning this autumn at 10 medical institutions across Japan, the researchers plan to print up and implant synthetic bones in 70 volunteer patients with face or skull bones that have been damaged or removed due to injury or surgery.

While the printed bones are still not considered strong enough to replace weight-bearing bones, they are ten times stronger than conventional artificial bones made from hydroxylapatite, a naturally occurring mineral that is also the main component of natural bone. The printed bones are also cheaper and easier to make than hydroxylapatite implants, which must be sintered, or heated to a high temperature to get the particles to adhere to each other. In addition to taking longer to produce, sintered implants also take longer for the body to resorb.

The next round of human trials will be conducted at Dokkyo Medical University, Saitama Medical University, Tokyo Dental College, University of Tokyo, Juntendo University, Tsurumi University, Kyoto University, Osaka Medical College, Kobe University and Osaka City General Medical Center.

The researchers hope to make the technology commercially available by 2010.

[Source: Fuji Sankei, The Chemical Daily]

“World’s smallest” gas turbine engine

09 Aug 2007

Researchers at Tohoku University have developed a working prototype of what they are calling the world's smallest gas turbine engine, a palm-sized motor they hope will one day be used to power autonomous robots and serve as a portable engine for personal transportation devices.

The research team led by professor Shuji Tanaka from Tohoku University's Nano-Precision Mechanical Fabrication Lab worked with researchers from IHI Corporation and the University of Tokyo to create the tiny engine, which measures 10 cm (4 in.) in diameter and 15 cm (6 in.) in length. With a 16 mm (0.63 in.) compressor rotor diameter and a 17 mm (0.67 in.) turbine rotor diameter and combustion chamber, the engine boasts a rotational speed of 500,000 to 600,000 rpm, which is made possible by special air bearings the researchers developed.

Unlike battery-powered engines that need to stop for periodic recharging, gas turbine engines can run continuously as long as fuel is supplied. Furthermore, gas turbine engines feature a higher power density than fuel cell and battery-powered engines, and they run cleaner than reciprocating piston engines.

With demand expected to increase for robots that use commonly available fuels and compact motors for personal transportation for the elderly, the Tohoku University researchers have been working with IHI since 2000 to develop a portable, lightweight and quiet engine able to operate for long periods of time between refuelings. After 7 years of work, they have broken the 20 mm diameter rotor barrier, a goal long shared by their microturbine-minded peers around the globe.

The engine has not yet been outfitted with a generator because it is still under development, but space has been set aside for it within the engine.

The engine will be officially unveiled at PowerMEMS 2007 scheduled for November 28-29 in Freiburg, Germany.

[Source: Nikkei Net]

Fourth-generation pig clones born

07 Aug 2007

In a development that brings us one step closer toward the mass-cloning of animals for use in regenerative medicine, researchers from Meiji University have succeeded in creating the world's first fourth-generation pig clones.

Since creating a pair of pig clones in April 2004, professor Hiroshi Nagashima, the research team leader, has been recloning the clones using cells from their salivary glands. The fourth-generation piglets -- three of them born on July 23 -- are clones of clones of clones of clones, so they share exactly the same DNA as the original pig.

Scientists have been seeking to advance pig cloning technology because pig organs are physiologically similar to human organs, meaning they could be the key to alleviating the worldwide shortage of organs for transplant.

Past examples of animals that have been cloned through multiple generations include mice, which have been recloned up to the sixth generation, and cows, which have been recloned to the second generation. While recloning technology may promise to boost the productivity of cloning operations, there are some drawbacks. For instance, in somatic cell nuclear transfer -- a reproductive cloning technique where the nucleus from a donor adult cell (somatic cell) is transferred to an enucleated egg cell, which is then implanted in the uterus of a surrogate mother -- DNA damage accumulates with each generation that is cloned. After a number of generations, the cumulative damage to the DNA could result in an animal that is significantly different from the original.

So far, however, the fourth-generation pig clones show no signs of abnormalities, and the researchers are planning to reclone them again.

In addition to creating the world's first fourth-generation pig clones, Nagashima's team also reported success in using a combination of gene transfer technology and cloning technology to create transgenic diabetic pigs -- pigs with human genes that exhibit symptoms of diabetes mellitus. The researchers worked with BIOS Inc., a venture company based in Kanagawa prefecture, to engineer the pigs.

While we have seen transgenic diabetic mice in the past, these transgenic diabetic pigs are reportedly the first of their kind. With the anatomical similarities between man and swine, transgenic diabetic pigs could lead to a cure for diabetes by helping scientists develop transplant technology involving the use of pig pancreatic tissue, a potential source of insulin. In addition, the pigs can also serve as models for observing the complications associated with diabetes, such as arteriosclerosis, and they could help researchers develop new medicines.

Professor Nagashima suggests that, in addition to serving as model animals for human diseases, these bioengineered pigs could carry an individual's own cells, allowing that person to test the effectiveness and safety of regenerative medicine therapies before undergoing them.

[Source: Yomiuri, Jiji]

‘Tondon’: Balinese-style robot janitor

02 Aug 2007

An autonomous robot janitor built by Subaru (Fuji Heavy Industries) and Sumitomo has landed a job cleaning the outdoor hallways of a new 14-story Bali-themed luxury apartment complex in Tokyo. Lovingly nicknamed 'Tondon' in an apparent reference to a legendary Balinese snake god, the robot is a close relative of RFS1, the autonomous floor cleaning robot that received Japan's 2006 Robot of the Year Award last December.

Like the RFS1, which currently cleans hallway floors inside ten Tokyo-area office buildings, Tondon works unsupervised and relies on an optical communication system to control the building's elevators, allowing it to move freely from floor to floor as needed. To improve the robot's ability to clean gritty outdoor surfaces, Tondon's makers have added a set of heavy-duty brushes designed to sweep up leaves and dirt from hallway floors and drains. Furthermore, Tondon's outer shell has been strengthened and waterproofed to protect its internal components from the elements, and it has been painted with a unique design to complement the apartment building's Bali-themed decor.

Tondon also has a number of safety features that help it better coexist with the building's residents, including proximity sensors that help prevent collisions with people, as well as bumper switches that stop the robot in its tracks when it is touched. A protective guard around the brushes prevents the robot from giving people unwanted shoeshines, while lamps and voice announcements provide ample warning when it is approaching.

A set of video cameras has also been added to the robot. With four cameras that record the robot's every move and a hard disk that stores the video feed, human overlords can keep close tabs on Tondon to make sure it doesn't nap on the job. The cameras can also be used for hallway surveillance, the company says, allowing the robot to double as a watchful security guard as it cleans.

[Source: Fuji press release]

Motion Portrait: Instant 3D animation from photos

01 Aug 2007

Motion Portrait is a killer little piece of digital animation technology that easily transforms an ordinary digital photograph of a face into a living 3D animation that can blink and move its eyes, turn its gaze to follow the movement of the mouse cursor, express a range of emotions, sneeze and more -- all in a matter of seconds. In addition to being fast and easy to use, Motion Portrait is much lighter than conventional 3D animation engines, making it ideal for use in cellphones, handheld games and other portable devices.

Originally developed at the Sony Kihara Research Center two years ago, Motion Portrait now belongs to Motion Portrait Inc., which was founded this July after the research center closed its doors last year. As the company continues to develop Motion Portrait, it is on the lookout for new ways to put the technology to use.

While the technical details are being kept secret, the company says Motion Portrait works by automatically recognizing the eyes, nose, mouth and other facial features in ordinary digital photographs to create an instant three-dimensional map of the head. The data is then run through Motion Portrait's Expression Engine to create a range of facial expressions. Here are a few examples of Motion Portrait at work: 1, 2, 3.
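Since the actual algorithms are proprietary, here's a purely illustrative Python sketch of the pipeline the company outlines: locate 2D facial features in a photo, lift them to a rough 3D head map, then let an "expression engine" perturb that map. All names, coordinates and depth values below are invented.

```python
# Hypothetical 2D landmarks detected in a photo (pixel coordinates)
landmarks = {"left_eye": (40, 50), "right_eye": (80, 50),
             "nose": (60, 70), "mouth": (60, 90)}

# Assign each feature an assumed depth to form a crude 3D head map
DEPTH = {"left_eye": 10, "right_eye": 10, "nose": 25, "mouth": 12}

head_map = {name: (x, y, DEPTH[name]) for name, (x, y) in landmarks.items()}

def apply_expression(head_map, expression):
    """Offset 3D features to fake an expression (e.g. raise the mouth)."""
    offsets = {
        "smile": {"mouth": (0, -3, 0)},
        "blink": {"left_eye": (0, 2, -1), "right_eye": (0, 2, -1)},
    }
    result = dict(head_map)
    for name, (dx, dy, dz) in offsets.get(expression, {}).items():
        x, y, z = result[name]
        result[name] = (x + dx, y + dy, z + dz)
    return result

smiling = apply_expression(head_map, "smile")  # mouth shifts up, rest unchanged
```

The key idea is that once the photo has been lifted into even a rough 3D map, a whole catalog of expressions can be generated as small geometric offsets -- which is presumably why a single image suffices as input.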

In addition to photographs of human faces, Motion Portrait can also breathe life into pictures of pets, with a little manual input. Check out this fun yet sinister-looking dog (pictured above). Grrrr!

The technology can also be used to create anime from illustrations, and it is already being used in a PSP game -- The Promise of Suzumiya Haruhi (Suzumiya Haruhi no Yakusoku) by Bandai Namco Games -- due out in early August. With the ability to create a wide range of facial animations from a single image, Motion Portrait promises to reduce game development costs and improve quality. In addition, when used in conjunction with the company's voice recognition technology, Motion Portrait automatically syncs the animated lips with voice data to create realistic-looking talking mouths. Here are two anime examples: 1, 2.

The company also sees potential uses for Motion Portrait in advertising and in creating avatars used on social networking sites. Further, as a simulation tool, beauty salons can use Motion Portrait to show customers how they might look with different hairstyles and makeup, and eyeglass shops can use it to help customers choose their next pair of eyeglasses.

No word yet on when Motion Portrait will be made available to the general user.

[Source: Motion Portrait via IT Media]

Halluc II: 8-legged robot vehicle

26 Jul 2007

On July 25, researchers at the Chiba Institute of Technology unveiled a working prototype of the Halluc II, a robotic vehicle with eight wheels and legs designed to drive or walk over rugged terrain. The agile robot, which the developers aim to put into practical use within the next five years, can move sideways, turn around in place and drive or walk over a wide range of obstacles. The researchers hope the robot's abilities will help out with rescue operations, and they would like to see Halluc II's technology put to use in transportation for the mobility-impaired.

The operator can put Halluc II into one of three modes depending on the terrain -- Vehicle, Insect or Animal mode. In Vehicle mode, Halluc II drives around on its eight wheels, and as it moves over uneven surfaces, each of the legs moves up and down in sync with the terrain to provide a smooth ride that keeps the cab at a constant height. In Insect mode, Halluc II does not use the wheels; instead, it walks with an insect-like gait, with its legs extended outward from the cab. In Animal mode, Halluc II keeps its legs directly beneath the cab while it walks, allowing it to pass through tight spaces. With wireless LAN capabilities and a system of cameras and sensors that monitor the distance to potential obstacles, Halluc II constantly assesses how best to adjust the position of its legs and wheels.
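The three modes boil down to two choices -- whether the wheels are driving, and how the legs are posed. Here's a toy Python sketch of that mode logic; the configuration values are invented for illustration, not taken from fuRo's control software.

```python
MODES = {
    # mode: (wheels powered?, leg posture)
    "vehicle": (True,  "level"),   # rolls on all 8 wheels; legs absorb terrain
    "insect":  (False, "spread"),  # walks with legs extended outward from the cab
    "animal":  (False, "tucked"),  # walks with legs beneath the cab (narrow stance)
}

def configure(mode):
    """Return a drive configuration for the requested mode."""
    wheels_powered, leg_posture = MODES[mode]
    return {"wheels_powered": wheels_powered, "leg_posture": leg_posture}

cfg = configure("animal")  # the narrow stance used for tight spaces
```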

Here's a short video of the model in action.

Halluc II's design calls for a total of 56 motors -- 2 for each leg joint (3 joints per leg), plus 1 for each wheel. Equipping each joint with 2 motors provides the legs with abundant power and allows for a smoother ride, say the researchers, who have devoted a great deal of attention to the cutting-edge multi-motor control system, a key component of Halluc II's design.
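The arithmetic checks out, for anyone counting along at home:

```python
# Sanity-checking the motor count quoted above
LEGS = 8
JOINTS_PER_LEG = 3
MOTORS_PER_JOINT = 2
WHEELS = 8  # one motor per wheel

total_motors = LEGS * JOINTS_PER_LEG * MOTORS_PER_JOINT + WHEELS
# 8 * 3 * 2 + 8 = 56
```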

According to Mr. Yoshida, chief researcher at Chiba Institute of Technology's Future Robotics Technology Center (fuRo), the high price of precision motors poses some challenges, but as costs come down in the future, it will become easier to incorporate greater numbers of motors into drive systems. Halluc II appears to be a more advanced version of fuRo's 8-wheeled Hallucigenia01 robot created in 2003.

In designing Halluc II, the researchers have enlisted the help of renowned industrial designer Shunji Yamanaka, who has worked on everything from furniture and watches to robots and transportation. "Human beings have a large number of muscles, which allows for a great degree of freedom," says Yamanaka. "By incorporating greater redundancy into the vehicle's functions, we can give it more flexibility and speed and enable it to continue operating even when obstacles are in the way."

The Halluc II prototype is scheduled to go on display at Miraikan in Tokyo beginning August 1. At the exhibit, visitors will be allowed to operate the vehicle from a remote-control cockpit with a large screen showing real-time video shot from the onboard camera.

[Source: Nikkei BP]

UPDATE: Robot Watch also has a lot of great photos and videos. Here's a clip showing Halluc II in Insect mode and Animal mode. I can't wait to drive one through rush-hour traffic.

Hitachi finger vein money

25 Jul 2007

On July 24, Hitachi announced the development of a biometric cardless credit payment system, called "finger vein money," which allows shoppers to pay for purchases using only their fingertips. The company plans to begin field testing the system in September.

Finger vein money relies on Hitachi's finger vein authentication technology, which verifies a person's identity by reading the pattern of blood vessels in his or her fingers. These blood vessel patterns are unique to each individual, much like fingerprints or retinas, only they are hidden securely under the skin, making them all the more difficult to counterfeit. Hitachi's finger vein authentication technology is already being used to verify user identities for ATMs, door access control systems and computer log-in systems in Japan and elsewhere.

In the finger vein money system, consumers first register their finger vein pattern data with the credit card company. The data is then entered into a database along with the individual's credit account information. Later, when shoppers want to pay for something, they simply go to the cash register and place their finger in a vein reader, which uses infrared LEDs and a special camera to capture a detailed image of their vein structure. The image is converted into a readable format and sent to the database, where it is checked against the records on file. When the system verifies the identity of the shopper, the purchase is charged to the individual's credit account.
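The transaction flow described above can be sketched in a few lines of Python. This is a deliberately naive stand-in -- the real system matches infrared vein images probabilistically, while here a pattern is reduced to an opaque string and compared exactly, purely to show the register/match/charge sequence.

```python
accounts = {}  # vein pattern -> linked credit account record

def register(pattern, name):
    """Registration step: the customer's vein data is stored with the
    credit card company alongside their account information."""
    accounts[pattern] = {"name": name, "charged": 0}

def pay(scanned_pattern, amount):
    """Checkout: the scanned pattern is checked against the records on
    file; on a match, the purchase is charged to the linked account."""
    record = accounts.get(scanned_pattern)
    if record is None:
        return False               # no match -- payment refused
    record["charged"] += amount    # match -- bill the account
    return True

register("pattern-A", "Taro")
ok = pay("pattern-A", 500)         # matching finger: charge succeeds
rejected = pay("pattern-X", 500)   # unknown finger: charge refused
```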

Hitachi's three-month field test, which is set to begin in September, involves 200 Hitachi employees volunteering to use finger vein money at the company cafeteria and shops in the Hitachi System Plaza Building located in Shin-Kawasaki. If all goes well, Hitachi -- which is conducting the test with the cooperation of major credit card company JCB -- plans to expand the trial system to all of its company buildings.

As a cardless payment system that promises the ultimate in convenience and security, finger vein money could help contribute to the disappearance of credit cards and all the anxieties associated with their loss and theft. When that day comes, we may only need to worry about losing our fingers.

[Source: Nikkei Net]

MOTOMAN: Industrial-strength taiko drummer

23 Jul 2007

As industrial equipment manufacturer Yaskawa Electric forces the MOTOMAN robot out of its comfort zone on the factory floor, we see it quickly acquiring new skills. First the robot developed the ability to sort mail. Now it has learned to play taiko drums.

On July 21, a team of four MOTOMAN machines -- two dual-armed MOTOMAN-DIA10 robots and two MOTOMAN-HP3 welding robots -- gave a special taiko performance at the nearly 400-year-old Kokura Gion Daiko Festival in Kitakyushu, which is famous for its traditional drumming competition. Organizers invited the robots to spice up the special opening ceremony for the competition's 60th anniversary. The robots -- the first ever to play taiko drums at the ancient festival -- were paraded through the crowd of spectators on a float while they performed.

Yaskawa worked with festival organizers for four months to teach the robots the proper rhythm, technique and choreography for the performance, which was seen as a success. Here's a short video.

[Source: Robot Watch]

NEC’s drive-thru face recognition system

20 Jul 2007

On July 19, electronics giant NEC announced it has developed the world's first automated border control system that uses facial recognition technology capable of identifying people inside their automobiles. The system is already in operation at checkpoints on the Hong Kong - Shenzhen border.

Built around NEC's NeoFace biometric face recognition system, as well as NEC's electronic passport technology, the system is designed to boost the speed and efficiency of Hong Kong Immigration Department operations by allowing residents with microchipped national ID cards to remain in their vehicles while automated cameras verify their identities. Hong Kong residents aged 11 or over are required by law to carry a national ID card (HKID), and the recently issued "smart" IDs are embedded with chips that contain biometric and personal data.

The system works by first reading a vehicle's license plate as it approaches a border gate. Because each vehicle in Hong Kong is registered to an individual driver, a simple automated database check determines who the driver should be. Next, the cameras scan the face of the driver and a database search is performed. If there is a match, the immigration process is completed and the gate opens, allowing the vehicle to pass through.
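The checkpoint logic amounts to a short pipeline -- plate, registered driver, face match, gate -- which can be sketched in Python. The data and the matching function below are stand-ins; NeoFace's internals are not public, and real face matching is a probabilistic image comparison rather than an exact check.

```python
registrations = {"HK-1234": "driver-042"}      # plate -> registered driver ID
face_db = {"driver-042": "face-template-042"}  # driver ID -> stored face data

def faces_match(captured, stored):
    """Stand-in for the NeoFace comparison (here, a trivial equality test)."""
    return captured == stored

def process_vehicle(plate, captured_face):
    driver_id = registrations.get(plate)       # who should be driving?
    if driver_id is None:
        return "stop"                          # unregistered vehicle
    if faces_match(captured_face, face_db[driver_id]):
        return "open_gate"                     # identity verified, pass through
    return "stop"                              # refer to manual inspection

result = process_vehicle("HK-1234", "face-template-042")
```

Note how the plate lookup does most of the work: because the database already knows who *should* be driving, the face recognizer only has to confirm one candidate rather than search the whole population.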

For now, NEC's setup only works with truck drivers, but coming improvements promise the ability to identify up to 8 passengers per vehicle. The cameras have been installed at 8 of the 40 border gates on a new road connecting Hong Kong and Shenzhen, with all 40 gates expected to be upgraded by August.

NEC eventually hopes to develop a face recognition system so quick and accurate that it would eliminate the need for fingerprinting.

[Sources: Softbank Business + IT, NEC press release]