U-Tsu-Shi-O-Mi virtual humanoid

12 Oct 2007

U-Tsu-Shi-O-Mi Virtual Humanoid --

U-Tsu-Shi-O-Mi is an interactive "mixed reality" humanoid robot that appears as a computer-animated character when viewed through a special head-mounted display. A virtual 3D avatar that moves in sync with the robot's actions is mapped onto the machine's green cloth skin (the skin functions as a green screen), and the sensor-equipped head-mounted display tracks the angle and position of the viewer's head and constantly adjusts the angle at which the avatar is displayed. The result is an interactive virtual 3D character with a physical body that the viewer can literally reach out and touch.
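
The compositing step described above can be sketched in a few lines. This is only an illustration of the general green-screen idea, not the actual U-Tsu-Shi-O-Mi pipeline: the threshold values and function names are invented, and the real system also re-renders the avatar from the tracked head pose before compositing.

```python
import numpy as np

def chroma_key_mask(frame, g_min=0.6, rb_max=0.4):
    """Flag 'green screen' pixels (the robot's cloth skin) in an RGB
    frame with channel values in 0..1. Thresholds are illustrative."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (g > g_min) & (r < rb_max) & (b < rb_max)

def composite(camera_frame, avatar_frame):
    """Overwrite green pixels in the camera view with the rendered
    avatar, so the character appears exactly where the robot stands."""
    out = camera_frame.copy()
    mask = chroma_key_mask(camera_frame)
    out[mask] = avatar_frame[mask]
    return out
```

In practice the avatar would be re-rendered every time the head-mounted display reports a new viewing angle, which is what keeps the character aligned with the robot as the viewer moves around it.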

Researcher Michihiko Shoji, formerly of NTT DoCoMo, helped create U-Tsu-Shi-O-Mi as a tool for enhancing virtual reality simulations. He is now employed at the Yokohama National University Venture Business Laboratory, where he continues to work on improving the virtual humanoid. The system, which currently requires a lot of bulky and expensive equipment to run, will likely see its first real-world applications in arcade-style video games. However, Shoji also sees a potential market for personal virtual humanoids, and is looking at ways to reduce the size and cost to make it suitable for general household use.

Here is a video of U-Tsu-Shi-O-Mi.

The virtual humanoid will be on display at ASIAGRAPH 2007 in Akihabara (Tokyo) from October 12 to 14.

[Source: Robot Watch]

Brain-computer interface for Second Life

12 Oct 2007

Brain-computer interface controls Second Life avatar --

While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in conjunction with the Second Life online virtual world -- until now.

A research team led by professor Jun'ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts -- the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user's imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
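
As a rough sketch of that chain from imagined movement to avatar motion: motor imagery suppresses mu-band (8-12 Hz) power over the corresponding patch of motor cortex, a classifier picks the most likely imagined limb, and a keyboard emulator presses the arrow key Second Life already understands. Everything below is an invented toy, not the Keio team's algorithm (though C3, Cz and C4 are standard EEG electrode sites over the motor cortex):

```python
# Assumed resting mu-band power per electrode; a real system would
# calibrate this per user and per session.
BASELINE = {"C3": 1.0, "Cz": 1.0, "C4": 1.0}

def classify_imagery(mu_power):
    """Pick the electrode with the strongest mu-power drop from
    baseline. C3 (left hemisphere) maps to the right arm and C4 to
    the left arm; foot imagery shows up at the midline site Cz."""
    drops = {site: BASELINE[site] - p for site, p in mu_power.items()}
    site, drop = max(drops.items(), key=lambda kv: kv[1])
    if drop < 0.2:  # no clear imagery -> no command
        return None
    return {"C3": "right_arm", "C4": "left_arm", "Cz": "feet"}[site]

# Keyboard emulation: the same keys a player would press by hand.
KEYMAP = {"feet": "Up", "right_arm": "Right", "left_arm": "Left"}

def to_key_event(mu_power):
    """Translate one window of band-power readings into a key name,
    or None when the user isn't imagining any movement."""
    return KEYMAP.get(classify_imagery(mu_power))
```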

Future plans are to improve the BCI so that users can make Second Life avatars perform more complex movements and gestures. The researchers hope the mind-controlled avatar, which was created through a joint medical engineering project involving Keio's Department of Rehabilitation Medicine and the Tsukigase Rehabilitation Center, will one day help people with serious physical impairments communicate and do business in Second Life.

(For video of the Second Life BCI, check the links on the Ushiba & Tomita Laboratory news page, right above the first photo.)

[Source: Nikkei Net]

Clannad canned bread in Akiba vending machine

04 Oct 2007

Canned bishoujo bread --

A vending machine at Tokyo's Akihabara station is now offering a limited run of canned bishoujo bread in celebration of the new Clannad TV anime series that begins October 4.

Clannad, which was first released as an interactive love adventure game (visual novel) for the PC in April 2004, follows the adventures of a high school delinquent as he develops relationships with some of his female classmates.

The 350-yen ($3) canned Clannad treats -- available in chocolate, strawberry, green tea, butter, raisin, blueberry, and milk -- are manufactured and distributed by Pan Akimoto, the company that originally developed canned bread as an emergency food in the aftermath of the 1995 Kobe earthquake. In Akihabara, where cuisine like canned oden enjoys widespread popularity, the future of canned bishoujo bread looks bright.

A vending machine on platform 6 (for Sobu line trains bound for Chiba) will be dispensing the canned bread until the end of October.

[Source: Mainichi]

Canned Final Fantasy VII Potion

31 Aug 2007

Final Fantasy VII Potion character cans --

To help commemorate the 10th anniversary of the release of Final Fantasy VII, Suntory has announced plans to begin selling its vitamin-packed Final Fantasy VII Potion carbonated drink in special aluminum cans featuring depictions of Cloud, Sephiroth and other characters from the game. Sixteen different character cans will be available at 191 yen each when the limited-edition shipment of 4.8 million cans hits shelves on October 23.

[Source: Mantan Web]

Motion Portrait: Instant 3D animation from photos

01 Aug 2007

Motion Portrait --

Motion Portrait is a killer little piece of digital animation technology that easily transforms an ordinary digital photograph of a face into a living 3D animation that can blink and move its eyes, turn its gaze to follow the movement of the mouse cursor, express a range of emotions, sneeze and more -- all in a matter of seconds. In addition to being fast and easy to use, Motion Portrait is much lighter than conventional 3D animation engines, making it ideal for use in cellphones, handheld games and other portable devices.

Originally developed at the Sony Kihara Research Center two years ago, Motion Portrait now belongs to Motion Portrait Inc., which was founded this July after the research center closed its doors last year. As the company continues to develop Motion Portrait, it is on the lookout for new ways to put the technology to use.

While the technical details are being kept secret, the company says Motion Portrait works by automatically recognizing the eyes, nose, mouth and other facial features in ordinary digital photographs to create an instant three-dimensional map of the head. The data is then run through Motion Portrait's Expression Engine to create a range of facial expressions. Here are a few examples of Motion Portrait at work: 1, 2, 3.
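
Since the Expression Engine itself is secret, here is only a toy illustration of the general idea of driving an expression from automatically detected landmarks. Every landmark name and the blink parameterization below are invented for illustration:

```python
def blink(landmarks, amount):
    """Close the eyes by pulling each upper-lid landmark toward the
    lower lid. `amount` runs from 0.0 (eyes open) to 1.0 (shut).
    `landmarks` maps invented names to (x, y) image coordinates."""
    out = dict(landmarks)
    for eye in ("left_eye", "right_eye"):
        ux, uy = landmarks[eye + "_upper"]
        lx, ly = landmarks[eye + "_lower"]
        out[eye + "_upper"] = (ux + amount * (lx - ux),
                               uy + amount * (ly - uy))
    return out
```

A real engine would deform a textured 3D mesh rather than a handful of points, but the principle (parameterized offsets applied to automatically recognized facial features) is the same.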

In addition to photographs of human faces, Motion Portrait can also breathe life into pictures of pets, with a little manual input. Check out this fun yet sinister-looking dog (pictured above). Grrrr!

The technology can also be used to create anime from illustrations, and it is already being used in a PSP game -- The Promise of Suzumiya Haruhi (Suzumiya Haruhi no Yakusoku) by Bandai Namco Games -- due out in early August. With the ability to create a wide range of facial animations from a single image, Motion Portrait promises to reduce game development costs and improve quality. In addition, when used in conjunction with the company's voice recognition technology, Motion Portrait automatically syncs the animated lips with voice data to create realistic-looking talking mouths. Here are two anime examples: 1, 2.

The company also sees potential uses for Motion Portrait in advertising and in creating avatars used on social networking sites. Further, as a simulation tool, beauty salons can use Motion Portrait to show customers how they might look with different hairstyles and makeup, and eyeglass shops can use it to help customers choose their next pair of eyeglasses.

No word yet on when Motion Portrait will be made available to the general user.

[Source: Motion Portrait via IT Media]

Virtual itasha invade Forza 2

18 Jun 2007

Virtual itasha in Forza 2 --

Since the release of the Forza Motorsport 2 racing sim for Xbox 360 several weeks ago, players worldwide have been using the in-game custom paint function to create incredible designs for their cars, which they can either race online or buy, sell and trade through the game's virtual auction house. Japan's digital racers have been in on the fun from the start, turning their virtual rides into magnificent itasha that scream otaku pride.

Here are links to two enormous online galleries (Gallery 1, Gallery 2) of virtual otaku-mobile paint jobs incorporating loads of Japanese-flavored eye candy, from anime and manga to games, food packaging and more. What makes these detailed paint jobs even more impressive is the fact they are created entirely with the game's basic paint tools -- a limited selection of vector shapes that can be colored, scaled, rotated and layered endlessly to create complex designs. Graphics cannot be imported from external sources, so everything is created manually step by step in what is undoubtedly a time-consuming process.

Here's a tiny sample of some of the work found in the galleries:

Virtual itasha in Forza 2 --

[Link: Gallery 1, Gallery 2 via TECHSIDE]

Walkman-style brain scanner

23 May 2007

Portable brain scanner --

Hitachi has successfully trial manufactured a lightweight, portable brain scanner that enables users to keep tabs on their mental activity during the course of their daily lives. The system, which consists of a 400 gram (14 oz) headset and a 630 gram (1 lb 6 oz) controller worn on the waist, is the result of Hitachi's efforts to transform the brain scanner into a familiar everyday item that anyone can use.

The rechargeable battery-operated mind reader relies on Hitachi's so-called "optical topography" technology, which interprets mental activity based on subtle changes in the brain's blood flow. Because blood flow increases to areas of the brain where neurons are firing (to supply glucose and oxygen to the tissue), changes in hemoglobin concentrations are an important index by which to measure brain activity. To measure these hemoglobin concentrations in real time, eight small surface-emitting lasers embedded in the headset fire harmless near-infrared rays into the brain and the headset's photodiode sensors convert the reflected light into electrical signals, which are relayed to the controller.
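
The relationship being exploited here is the modified Beer-Lambert law used in near-infrared spectroscopy: a drop in detected light intensity corresponds to a rise in light-absorbing hemoglobin along the optical path. A minimal sketch, with placeholder coefficient values rather than Hitachi's actual calibration:

```python
import math

def hemoglobin_change(i_baseline, i_now, epsilon, pathlength_cm, dpf):
    """Modified Beer-Lambert law:
        delta_A = -log10(I / I0)                 (change in optical density)
        delta_C = delta_A / (epsilon * d * DPF)
    where epsilon is the chromophore's extinction coefficient, d the
    source-detector distance, and DPF the differential pathlength
    factor that accounts for light scattering in tissue."""
    delta_a = -math.log10(i_now / i_baseline)
    return delta_a / (epsilon * pathlength_cm * dpf)
```

So when neurons fire and oxygenated blood floods into a region, detected intensity at a hemoglobin-absorbing wavelength falls and the computed concentration change rises, which is the signal the headset graphs in real time.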

The real-time brain data can either be stored in Flash memory or sent via wifi to a computer for instant analysis and display. A single computer can support up to 24 mind readers at a time, allowing multiple users to monitor brain activity while communicating or engaging in group activities.

In addition to health and medical applications, Hitachi foresees uses for the personal mind reader in fields such as psychology, education and marketing. Although it is unclear what neuromarketing applications the company has in mind, it is pretty clear that access to real-time customer brain data would provide marketers with a better understanding of how and why shoppers make their purchasing decisions. One can also imagine interactive campaigns that, for example, ask customers to think positive thoughts about a certain product in exchange for discount coupons or the chance to win a prize.

The technology could also be used in new forms of entertainment such as "mind gaming," where the player's physical brain activity becomes a part of game play. It is also feasible to integrate the brain scanner with a remote control brain-machine interface that would allow users to operate electronic devices with their minds.

Hitachi has yet to determine when the personal mind reader will be made commercially available.

[Source: Tech-On!]

Gemotion screen shows video in living 3D

22 Jan 2007

Gemotion --

Here's a groovy display for people looking to add that extra dimension to their viewing material...

Gemotion is a soft, 'living' display that bulges and collapses in sync with the graphics on the screen, creating visuals that literally pop out at the viewer.

Yoichiro Kawaguchi, a well-known computer graphics artist and University of Tokyo professor, created Gemotion by arranging 72 air cylinders behind a flexible, 100 x 60 cm (39 x 24 inch) screen. As video is projected onto the screen, image data is relayed to the cylinders, which then push and pull on the screen accordingly.
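
The frame-to-cylinder mapping can be sketched very simply: downsample each frame to one value per actuator and scale brightness to push distance. The grid layout and stroke length below are assumptions (the article only says there are 72 cylinders), and the real system may well drive the cylinders from something other than raw brightness:

```python
import numpy as np

ROWS, COLS = 6, 12      # 72 actuators; the actual arrangement is assumed
MAX_STROKE_CM = 5.0     # assumed maximum cylinder travel

def cylinder_strokes(frame):
    """Average each grid cell of a grayscale frame (values 0..1) and
    scale the result to a push distance, so bright regions of the
    image bulge out of the screen. Frame dimensions must divide
    evenly by the grid."""
    h, w = frame.shape
    cells = frame.reshape(ROWS, h // ROWS, COLS, w // COLS)
    return cells.mean(axis=(1, 3)) * MAX_STROKE_CM
```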

"If used with games, TV or cinema, the screen could give images an element of power never seen before. It could lead to completely new forms of media," says Kawaguchi.

The Gemotion screen will be on display from January 21 to February 4 as part of a media art exhibit (called Nihon no hyogen-ryoku) at National Art Center, Tokyo, which recently opened in Roppongi.

[Source: Asahi]

Video: Asteroid impact avoidance system

12 Oct 2006

This video shows a very simple, yet ingenious way to save the planet from destruction by a giant meteor.