Tag: ‘Smart-Tech’

Steering wheel finger vein authentication system

23 Oct 2007

Hitachi biometric finger vein verification technology embedded in a steering wheel

Over the past few years, Hitachi's finger vein authentication technology -- which identifies individuals based on the unique pattern of blood vessels inside their fingers -- has appeared in everything from ATMs and computers to building entrances and cardless payment systems. Hitachi's latest development puts the biometric security technology inside the car steering wheel and couples it with a system that allows the engine to start only for drivers whose finger vein patterns the vehicle recognizes.

While providing an extra layer of security against car theft, Hitachi's steering wheel finger vein authentication system also works to improve in-vehicle comfort when used with seats, mirrors and air conditioners that auto-adjust according to the preferences of the driver touching the wheel. Furthermore, the finger vein reader, which is small enough to be embedded inconspicuously on the back of the steering wheel, can be used as a programmable multi-purpose switch that lets the driver perform different functions with different fingers. The driver could, for example, use different fingers to turn on the stereo, open the sunroof, and operate the navigation system -- all while concentrating on the road and maintaining a natural grip on the wheel.
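The finger-to-function switch described above can be sketched as a simple lookup table. This is purely illustrative, assuming a recognizer that has already identified which registered finger touched the reader; the finger labels and action names are invented.

```python
# Hypothetical map from a recognized finger to an in-car function.
# A real system would bind finger vein templates, not string labels.
ACTIONS = {
    "index": "toggle_stereo",
    "middle": "open_sunroof",
    "ring": "show_navigation",
}

def dispatch(finger: str) -> str:
    """Return the action bound to a recognized finger; ignore unknown fingers."""
    return ACTIONS.get(finger, "no_action")
```

A driver could then remap fingers to functions simply by editing the table, which is what makes the switch "programmable."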

The company also sees great future potential for the steering wheel finger vein reader as cars become smarter and equipped with increasingly complex IT-based functions. In Hitachi's vision, the reader will one day be used with on-board electronic payment systems that literally keep you in the driver's seat while making secure payments at drive-thrus, as well as with services that let you pay for and download music while on the road.

Hitachi first brought its finger vein authentication technology to automobiles in 2005, with a keyless car door lock that checks finger veins and opens only for the vehicle's registered driver. The technology, which Hitachi originally developed in 1997, relies on image sensors and near-infrared light passing through the finger to measure the vein patterns inside. Each individual finger has a unique pattern of blood vessels, much like a fingerprint, which can be used as a form of biometric identification.
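At its core, verification of this kind reduces to comparing a captured vein pattern against a stored template and accepting the match only above a threshold. The toy comparison below treats a vein map as a flat list of binary pixels and uses a plain overlap score; Hitachi's actual matcher is proprietary and far more robust, so this is only a conceptual sketch.

```python
# Illustrative only: compare two binarized vein maps (lists of 0/1 pixels)
# with a simple pixel-agreement score.
def vein_similarity(a, b):
    if len(a) != len(b) or not a:
        raise ValueError("maps must be non-empty and the same size")
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

def is_registered_driver(probe, template, threshold=0.9):
    """Accept only if the probe pattern closely matches the enrolled template."""
    return vein_similarity(probe, template) >= threshold
```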

A model vehicle equipped with Hitachi's steering wheel finger vein authentication system will be on display at the 2007 Tokyo Motor Show from October 27 to November 11.

[Source: Hitachi press release]

NTT to test digital aromatic signs

18 Oct 2007

Digital aroma-emitting sign by NTT

NTT Communications (NTT Com) has announced plans to begin testing its latest aroma-emitting digital sign technology, called "Kaoru Digital Signage," in Tokyo. The tests, which will take place outside the Kirin City Beer Hall in the underground Yaesu Shopping Mall (JR Tokyo station) from October 21 to the end of December, will involve internet-controlled signs that display electronic imagery of beer while emitting aromas such as lemon and orange. The researchers aim to study the sign's effectiveness in drawing passersby into the restaurant.

Billed as the world's first advertising sign system capable of emitting multiple aromas while displaying electronic images, the signs combine NTT's Spot Media digital signage service (currently used in marketing and customer service applications at banks, hospitals, public offices and retail stores) with its Kaori Tsushin online fragrance communication service. Kaori Tsushin, which gives users web-based control over aroma-emitting devices, is already in use at retail stores and cafes, where it is reportedly helping to improve on-site customer satisfaction.

NTT's new sign system consists of a 30 x 50 x 15 centimeter (12 x 20 x 6 inch) aroma diffuser, a 19-inch display and an NTT Spot Media content receiver, which are used to deliver aroma and display images of beer (and live shots from inside the restaurant) based on instructions received via a web connection. In the tests, the sign's smell will change according to the time of day, dispersing appetizing orange and lemon aromas at lunchtime, and releasing a more relaxing "woody" aroma at night.
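The time-of-day scheduling described above amounts to selecting an aroma "recipe" by the clock. A minimal sketch, assuming invented lunchtime and evening windows (the article does not give exact hours) and invented recipe fields:

```python
# Hypothetical time-of-day recipe selector mirroring the test schedule:
# citrus aromas at lunchtime, a relaxing woody aroma at night.
def select_recipe(hour: int) -> dict:
    if 11 <= hour < 14:            # lunchtime window (assumed)
        return {"aroma": "orange-lemon", "strength": "high"}
    if hour >= 18:                 # evening window (assumed)
        return {"aroma": "woody", "strength": "low"}
    return {"aroma": "none", "strength": "off"}
```

In the real system this recipe would be delivered to the diffuser over the web rather than computed locally.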

The aroma diffuser contains three 450-milliliter bottles of aroma oil. When the "recipe" -- which determines the type and strength of smell -- is received via the web, the device releases a vapor created by blasting the oil with a series of ultrasonic waves. With the ability to deliver fragrances across a 500 square meter (5,400 square feet) area, the new aroma diffuser is a great deal more powerful than NTT's Aromageur, which was developed for personal use in spaces the size of a small bedroom.

The scheduled testing follows a spate of aroma-related experiments conducted by NTT earlier this year. On Valentine's Day, NTT researchers conducted an experiment with vanilla fragrance in an office lobby. When vanilla fragrance was periodically released near free chocolate (labeled with a "Please Take One" sign) placed on a reception counter, the researchers found that passersby were nearly twice as likely to take a chocolate. In other experiments conducted at Tokyo-area bookstores from May to September, relaxing orange and lavender aromas were found to boost monthly sales by nearly 5%.

[Sources: NTT press release, IT Media]

Brain-computer interface for Second Life

12 Oct 2007

Brain-computer interface controls Second Life avatar

While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in conjunction with the Second Life online virtual world -- until now.

A research team led by professor Jun'ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts -- the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user's imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
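The final step of the pipeline above, where the keyboard emulator converts a classified motor-imagery label into a keystroke for Second Life, can be sketched as a lookup. The labels and key names below are illustrative; the article does not specify which keys the emulator sends.

```python
# Simplified sketch of the last pipeline stage: a classified motor-imagery
# label becomes the keystroke a keyboard emulator would send to Second Life.
KEYMAP = {
    "feet": "UP_ARROW",          # imagined foot movement -> walk forward
    "right_arm": "RIGHT_ARROW",  # imagined right-arm movement -> turn right
    "left_arm": "LEFT_ARROW",    # imagined left-arm movement -> turn left
}

def imagery_to_key(label: str):
    """Return the key for a recognized imagery label; None means stay still."""
    return KEYMAP.get(label)
```

The upstream EEG analysis that produces the label is where the real difficulty lies; this sketch only shows why a stock virtual world needs no modification to be driven by a BCI.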

The researchers plan to improve the BCI so that users can make Second Life avatars perform more complex movements and gestures. They hope the mind-controlled avatar, which was created through a joint medical engineering project involving Keio's Department of Rehabilitation Medicine and the Tsukigase Rehabilitation Center, will one day help people with serious physical impairments communicate and do business in Second Life.

(For video of the Second Life BCI, check the links on the Ushiba & Tomita Laboratory news page, right above the first photo.)

[Source: Nikkei Net]

OKAO Vision: Real-time smile analysis

07 Sep 2007

OKAO Vision real-time smile measurement technology

On September 5, the Omron Corporation unveiled smile recognition software that promises to improve the ability of machines to read human emotions.

Built around Omron's previous face recognition technology, the new "OKAO Vision Real-time Smile Measurement Technology" is designed to automatically identify faces in digital images and assign each corresponding smile a score of 0% to 100%. The system works by automatically fitting a 3D face model onto the subject's face and analyzing a number of key points, such as the degree to which the eyes and mouth are open, the shape of wrinkles at the edges of the eyes and mouth, and changes in the position of facial features. The entire process, from the time an image is input until the time the smile score is output, takes 0.044 seconds on a 3.2 GHz Pentium 4 processor.
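A smile score of this kind can be thought of as a weighted combination of normalized facial measurements, clamped to the 0-100% range. The features and weights below are invented for illustration; Omron has not published how its score is actually computed.

```python
# Illustrative smile score: a weighted sum of normalized facial features
# (each in 0.0-1.0), clamped and expressed as a 0-100% score.
WEIGHTS = {"mouth_openness": 0.4, "eye_narrowing": 0.3, "cheek_wrinkles": 0.3}

def smile_score(features: dict) -> float:
    """Combine feature measurements into a single 0-100% smile score."""
    raw = sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return round(min(max(raw, 0.0), 1.0) * 100, 1)
```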

OKAO Vision can analyze multiple faces simultaneously as long as they are at least 60 pixels wide and facing the camera (tilted less than 30 degrees to either side and less than 15 degrees up or down), and the software does not require faces to be registered beforehand. OKAO Vision, which Omron says is more than 90% accurate, was developed by studying the facial expressions of 15,000 individuals ranging from infants to the elderly, from a variety of countries.
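The operating limits stated above translate directly into a pre-check that a face is analyzable before scoring it, a sketch of which follows (the function name and signature are assumptions, but the thresholds are the ones the article gives):

```python
# Validity check mirroring OKAO Vision's stated limits: faces must be at
# least 60 pixels wide, yawed less than 30 degrees to either side, and
# pitched less than 15 degrees up or down.
def face_is_analyzable(width_px: float, yaw_deg: float, pitch_deg: float) -> bool:
    return width_px >= 60 and abs(yaw_deg) < 30 and abs(pitch_deg) < 15
```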


The 46-KB, Windows 2000/XP-compatible program can easily be incorporated into a variety of devices, say the developers, who hope to see it put to use in digital cameras designed to capture the perfect smile, or in robots that can recognize when humans are happy. Masato Kawade, OKAO project leader, says he hopes the technology "can contribute to the development of an ideal society where machines operate in harmony with human emotions."

Omron plans to release the new OKAO Vision system later this year, making it the latest addition to a line of face recognition technology that boasts features such as the ability to determine a subject's age, gender and line of sight. In the future, the company plans to shift the focus away from shiny, happy people and develop technology that can read faces for anger and sadness.

[Sources: Kyoto Shimbun, RBB Today]

NEC’s drive-thru face recognition system

20 Jul 2007

Drive-thru face recognition system

On July 19, electronics giant NEC announced it has developed the world's first automated border control system that uses facial recognition technology capable of identifying people inside their automobiles. The system is already in operation at checkpoints on the Hong Kong - Shenzhen border.

Built around NEC's NeoFace biometric face recognition system, as well as NEC's electronic passport technology, the system is designed to boost the speed and efficiency of Hong Kong Immigration Department operations by allowing residents with microchipped national ID cards to remain in their vehicles while automated cameras verify their identities. Hong Kong residents aged 11 or over are required by law to carry a national ID card (HKID), and the recently issued "smart" IDs are embedded with chips that contain biometric and personal data.

The system works by first reading a vehicle's license plate as it approaches a border gate. Because each vehicle in Hong Kong is registered to an individual driver, a simple automated database check determines who the driver should be. Next, the cameras scan the face of the driver and a database search is performed. If there is a match, the immigration process is completed and the gate opens, allowing the vehicle to pass through.
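The checkpoint logic described above is a short decision chain: plate lookup, then face comparison, then a gate decision. A minimal sketch, with a stand-in plate database and a stand-in for the NeoFace comparison (all names here are hypothetical):

```python
# Stand-in for Hong Kong's vehicle registry: each plate maps to one driver.
PLATE_TO_DRIVER = {"HK1234": "driver_42"}

def gate_decision(plate, face_matches):
    """face_matches(driver_id) stands in for the NeoFace comparison step."""
    driver = PLATE_TO_DRIVER.get(plate)
    if driver is None:
        return "refer_to_officer"   # unregistered plate: manual processing
    if face_matches(driver):
        return "open_gate"          # identity verified: complete immigration
    return "refer_to_officer"       # face mismatch: manual processing
```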

For now, NEC's setup only works with truck drivers, but planned improvements promise the ability to identify up to 8 passengers per vehicle. The cameras have been installed at 8 of the 40 border gates on a new road connecting Hong Kong and Shenzhen, with all 40 gates expected to be upgraded by August.

NEC eventually hopes to develop a face recognition system so quick and accurate that it would eliminate the need for fingerprinting.

[Sources: Softbank Business + IT, NEC press release]

Video: Kansei robot fears war

07 Jun 2007

Kansei robot

Kansei, a robot face capable of 36 expressions that vary according to emotional interpretations of words it hears, is the latest achievement to emerge from a Meiji University research lab working to develop conscious and self-aware robots. When Kansei hears a word, it uses software to access a database of 500,000 keywords, create word associations and determine an emotion -- ranging from happiness to sadness, anger and fear -- which is expressed by a system of 19 actuators under its silicone skin.
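The word-to-emotion step can be pictured as a lookup into an association table, with the resulting emotion then driving the actuators. The toy table below is invented for illustration and is a tiny stand-in for Kansei's 500,000-keyword database:

```python
# Toy stand-in for Kansei's keyword-association database: a heard word
# maps to an emotion label, which would then drive the facial actuators.
ASSOCIATIONS = {
    "war": "fear",
    "sushi": "happiness",
    "president": "mixed",
}

def react(word: str) -> str:
    """Return the emotion Kansei would express for a heard word."""
    return ASSOCIATIONS.get(word.lower(), "neutral")
```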

"What we are trying to do here is to create a flow of consciousness in robots so that they can make the relevant facial expressions," said project leader Junichi Takeno, a professor at Meiji University. "I believe that's going to be a key to improving communication between humans and robots."

Check out the video to see how Kansei reacts to the word "president."

Link: Reuters video

[Source: Yahoo!]

CB2 baby humanoid robot

01 Jun 2007

CB2, baby humanoid robot

On June 1, researchers from Osaka University's Graduate School of Engineering unveiled a robot that acts like a human infant, which they hope may one day help scientists better understand the child development process.

The researchers have named the baby robot "CB2," and for now, it is designed to function as a 1- to 2-year-old child, gazing intently at its surroundings, squirming about on the floor and lighting up the room with child-like charm.

CB2, baby humanoid robot

The 130 cm long, 33 kg robot features 56 air cylinders that serve as muscles. With cameras for eyes and microphones for ears, and with 197 tactile sensors embedded in the layer of soft silicone skin covering its entire body, CB2 is well-equipped to take in environmental stimuli. When CB2's shoulders are tapped, it blinks as if surprised, stops moving, and turns its gaze toward the person who touched it, and when a toy is dangled in front of its eyes, it appears to devote all its energy to trying to reach for it. CB2 also has a set of artificial vocal cords that it uses to speak baby talk.

The researchers say that once CB2 is equipped with software that gives it the ability to learn, they will be very interested in undertaking the long-term challenge of teaching it how to walk and talk.

[Source: Asahi]

====================

CB2, baby humanoid robot

UPDATE: Check out videos of little CB2 -- whose full name is "Child-robot with Biomimetic Body." Fans of Actroid and Geminoid might recognize one of the faces in the videos -- that of robot designer Dr. Ishiguro.

Video 1: Toward the end of this report, the announcer says that within the next four years, researchers at the Japan Science and Technology Agency (JST) -- who worked with Osaka University to develop CB2 -- hope to create a slightly more advanced version of the robot that has the vocabulary and cognitive skills of a 3-year-old child. At the end of the report, the Osaka University project leader says this type of "soft" robot technology will facilitate communication between humans and robots, which will prove useful for research purposes and for developing robots that can better assist and entertain us in our day-to-day lives.

Video 2: This report also mentions that the research team hopes to eventually create a robot that children can play with.

Walkman-style brain scanner

23 May 2007

Portable brain scanner

Hitachi has successfully trial manufactured a lightweight, portable brain scanner that enables users to keep tabs on their mental activity during the course of their daily lives. The system, which consists of a 400 gram (14 oz) headset and a 630 gram (1 lb 6 oz) controller worn on the waist, is the result of Hitachi's efforts to transform the brain scanner into a familiar everyday item that anyone can use.

The rechargeable battery-operated mind reader relies on Hitachi's so-called "optical topography" technology, which interprets mental activity based on subtle changes in the brain's blood flow. Because blood flow increases to areas of the brain where neurons are firing (to supply glucose and oxygen to the tissue), changes in hemoglobin concentrations are an important index by which to measure brain activity. To measure these hemoglobin concentrations in real time, eight small surface-emitting lasers embedded in the headset fire harmless near-infrared rays into the brain and the headset's photodiode sensors convert the reflected light into electrical signals, which are relayed to the controller.
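The conversion from detected light to a hemoglobin concentration change in near-infrared systems of this kind is typically based on the modified Beer-Lambert law: a drop in detected intensity relative to baseline implies an increase in absorbing hemoglobin. The rough sketch below uses placeholder constants, not Hitachi's calibration values, and is only meant to show the shape of the calculation.

```python
import math

# Rough modified Beer-Lambert sketch: convert a change in detected
# near-infrared intensity into a relative hemoglobin concentration change.
# epsilon (extinction coefficient), path_len (source-detector distance, cm)
# and dpf (differential pathlength factor) are placeholder values.
def delta_hemoglobin(i_detected, i_baseline, epsilon=1.0, path_len=3.0, dpf=6.0):
    """delta-concentration ~ delta optical density / (epsilon * d * DPF)."""
    delta_od = -math.log10(i_detected / i_baseline)
    return delta_od / (epsilon * path_len * dpf)
```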

The real-time brain data can either be stored in Flash memory or sent via wifi to a computer for instant analysis and display. A single computer can support up to 24 mind readers at a time, allowing multiple users to monitor brain activity while communicating or engaging in group activities.

In addition to health and medical applications, Hitachi foresees uses for the personal mind reader in fields such as psychology, education and marketing. Although it is unclear what neuromarketing applications the company has in mind, it is pretty clear that access to real-time customer brain data would provide marketers with a better understanding of how and why shoppers make their purchasing decisions. One can also imagine interactive campaigns that, for example, ask customers to think positive thoughts about a certain product in exchange for discount coupons or the chance to win a prize.

The technology could also be used in new forms of entertainment such as "mind gaming," where the player's physical brain activity becomes a part of game play. It is also feasible to integrate the brain scanner with a remote control brain-machine interface that would allow users to operate electronic devices with their minds.

Hitachi has yet to determine when the personal mind reader will be made commercially available.

[Source: Tech-On!]

NTT’s eye-tracking system monitors pupil size, blinking

15 Mar 2007

Eye-tracking system recognizes viewer interest

The NTT Group has unveiled technology that analyzes the interest level of TV viewers and web surfers by monitoring their eye movement, pupil size and blinking.

Improving on conventional eye-tracking systems that provide an understanding of where viewers cast their gaze, this new computer-operated system features cameras that monitor and analyze unconscious physiological reactions to interesting viewing material -- namely, enlarged pupils and changes in the rate of blinking.
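One way to picture an interest metric built from these cues is to combine pupil dilation relative to a baseline with the deviation of blink rate from baseline. NTT has not published its actual formula, so the weights and scaling below are invented purely to make the idea concrete:

```python
# Hypothetical interest index from the two physiological cues the article
# names: pupil dilation and blink-rate change, both relative to baseline.
def interest_index(pupil_mm, base_pupil_mm, blinks_per_min, base_blinks_per_min):
    dilation = (pupil_mm - base_pupil_mm) / base_pupil_mm
    blink_shift = abs(blinks_per_min - base_blinks_per_min) / base_blinks_per_min
    return round(0.7 * dilation + 0.3 * blink_shift, 3)  # weights are assumed
```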

The technology, which became commercially available on March 14, was developed by NTT Learning Systems (NTTLS) and the Visual Interactive Sensitivity Research Institute (VIS), both of which are involved in visual content creation. NTTLS says the technology can be used in conjunction with driver safety training videos, and negotiations with a major automaker are now underway.

NTTLS claims the system appeals to a wide range of potential users, including those involved in TV commercial advertising and web content creation. Television audience ratings alone do not provide producers a clear picture of the level of interest in commercials, and web traffic stats do not show which parts of a web page visitors find interesting. With this system, however, producers can get a more accurate understanding of what the audience is looking at and how interesting they find it.

Judging from the large size (and presumably high cost) of the device that sits between the viewer and the monitor, though, the system is clearly designed for use in the laboratory. But it's just a matter of time before this is standard computer monitor/TV screen equipment and producers keep one eye on the real-time audience pupil data while they develop and deliver content.

[Source: Asahi]