Tag: ‘Tokyo-University’

scoreLight turns shapes into sound

08 Dec 2009

"scoreLight" is a laser-based musical device that generates real-time sound based on the shape of drawings or objects.


+ scoreLight (ver.1)

Relying on 3D tracking technology developed at the Ishikawa-Komuro Laboratory in 2003, scoreLight uses lasers to trace the outline of a drawing or object. As the laser dances along the contours, scoreLight produces and modulates sound according to their curvature, angle, texture, color, and contrast. An abrupt change in the direction of a line triggers a discrete sound, such as a glitch or percussion hit. When the laser follows a looped path, these hits repeat as a steady rhythm, with the size and shape of the loop determining the tempo and structure of the beat. When multiple laser points explore different parts of a drawing at once, the device weaves their outputs into a layered tapestry of sound.
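
For readers curious about how such a contour-to-sound mapping might work in practice, here is a minimal Python sketch (an illustration only, not the actual scoreLight software): it walks a laser point around a closed contour, fires a percussive event at each sharp corner, and derives the loop tempo from the contour's length. The tracing speed and angle threshold are assumed values.

```python
import math

# A toy illustration of the contour-to-sound mapping described above, not the
# actual scoreLight software: sharp corners become percussive events, and the
# length of the loop sets the tempo. Tracing speed and the angle threshold
# are assumed values.

def corner_events(contour, angle_threshold_deg=60.0, laser_speed=200.0):
    """Walk a closed contour (a list of (x, y) points) and return
    (time_in_seconds, turn_angle) pairs for each sharp corner the laser
    passes, plus the period of one full loop."""
    events = []
    travelled = 0.0
    n = len(contour)
    for i in range(n):
        p_prev = contour[i - 1]
        p_curr = contour[i]
        p_next = contour[(i + 1) % n]
        travelled += math.dist(p_prev, p_curr)   # arc length covered so far
        # Angle between the incoming and outgoing segments at this point.
        a_in = math.atan2(p_curr[1] - p_prev[1], p_curr[0] - p_prev[0])
        a_out = math.atan2(p_next[1] - p_curr[1], p_next[0] - p_curr[0])
        turn = abs(math.degrees(a_out - a_in)) % 360.0
        turn = min(turn, 360.0 - turn)
        if turn > angle_threshold_deg:
            events.append((travelled / laser_speed, turn))
    return events, travelled / laser_speed   # corner events and loop period

# A square drawn 100 units on a side yields four evenly spaced hits per loop:
hits, period = corner_events([(0, 0), (100, 0), (100, 100), (0, 100)])
print(f"{len(hits)} hits, one loop every {period:.2f} s")   # 4 hits, 2.00 s
```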

Here is some video of scoreLight making music from a sketch of a brain:


+ NOU-ISE

scoreLight's developers include Alvaro Cassinelli (concept, hardware and software), Kuribara Yusaku (software), Daito Manabe (sound concept and programming) and Alexis Zerroug (electronics). See Cassinelli's YouTube channel for more videos.

[Link: scoreLight]

Tangible hologram projector

06 Aug 2009

Researchers at the University of Tokyo have developed a holographic projector that displays three-dimensional virtual objects you can feel with your bare hands.


+ Video

The system consists of a Holo display (developed by Provision Interactive Technologies), a pair of Wii Remotes that track the position of the user's hand in front of the screen, and an "Airborne Ultrasound Tactile Display" unit that shoots focused ultrasonic waves at the hand to create the sensation of pressure on the skin.

By controlling the movement of these focused ultrasonic waves -- which can produce up to 1.6 grams-force of pressure within a 20-millimeter-wide focal point -- the projector can recreate virtual objects that seem to have physical mass. In the video above, the projector displays a tangible virtual bouncing ball, raindrops, and a small creature that runs around on the user's hand.
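
To make that control loop concrete, here is a rough Python sketch of how such a system might decide where and how hard to focus the ultrasound. The UltrasoundArray class is a made-up stand-in for the actual Airborne Ultrasound Tactile Display interface, and the spring-like force model is an assumption; only the 1.6 grams-force limit comes from the article.

```python
from dataclasses import dataclass

# A rough sketch of the haptic render loop described above. UltrasoundArray
# is a made-up stand-in, not the research group's API, and the spring-like
# force model is an assumption; only the 1.6 gf peak output is from the article.

MAX_FORCE_GF = 1.6   # reported peak pressure output of the tactile display

class UltrasoundArray:
    def focus_at(self, point_mm, force_gf):
        print(f"focus at {point_mm}, {force_gf:.2f} gf")
    def idle(self):
        pass

@dataclass
class Sphere:
    center: tuple   # (x, y, z) position in millimeters
    radius: float   # millimeters

def haptic_step(hand_pos_mm, ball: Sphere, array: UltrasoundArray):
    """One iteration of the loop: if the tracked hand is inside the virtual
    ball, focus the ultrasound at the hand, scaling force with penetration
    depth so the ball feels springy."""
    d = [h - c for h, c in zip(hand_pos_mm, ball.center)]
    dist = sum(x * x for x in d) ** 0.5
    penetration = ball.radius - dist
    if penetration > 0:
        force = min(MAX_FORCE_GF, MAX_FORCE_GF * penetration / ball.radius)
        array.focus_at(hand_pos_mm, force)
    else:
        array.idle()

# Example: hand 10 mm inside a 40 mm ball centered at the origin.
haptic_step((0.0, 0.0, 30.0), Sphere((0.0, 0.0, 0.0), 40.0), UltrasoundArray())
```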

The tangible hologram projector is now on display at SIGGRAPH 2009 in New Orleans.

[Link: Touchable Holography (PDF) via @GreatDismal]

Video: Robot baseball

24 Jul 2009

To demonstrate the latest advances in high-speed industrial robot technology, researchers at the University of Tokyo have pitted a baseball-pitching robotic arm against a mechanical batter with a near-perfect swing.


+ Video

The robot pitcher consists of a high-speed, three-fingered hand (developed by professor Masatoshi Ishikawa and his team from the Graduate School of Information Science and Technology) mounted on a mechanical arm (developed at the Massachusetts Institute of Technology). With superb control of nimble fingers that can open and close at a rate of up to 10 times per second, the robot can release the ball with perfect timing. Precise coordination between the fingers, hand and arm allows the robot pitcher to hit the strike zone 90% of the time.

The robot batter is an upgraded version of a machine that Ishikawa's team developed in 2003.

In the demonstration -- which was designed to showcase the speed at which multiple high-speed industrial robots can respond to external circumstances and perform activities together -- the researchers placed the robot pitcher 3.5 meters (11 ft) away from the mechanical batter. The pitcher's 40-kph (25-mph) sidearm throws posed little challenge to the batter, whose 1000-frame-per-second camera eyes allow it to see the ball in super slow motion as it approaches. The robot batter has a near-perfect batting average when swinging at pitches in the strike zone.
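
Some quick arithmetic on the figures above shows how tight the timing is: at 40 kph the ball covers the 3.5-meter gap in roughly a third of a second, which the batter's 1000-frame-per-second vision samples as about 300 frames; at the planned 150 kph, that window shrinks to under a tenth of a second.

```python
# Back-of-the-envelope timing using the figures quoted above.

def flight_time_s(distance_m, speed_kph):
    return distance_m / (speed_kph / 3.6)

DISTANCE_M = 3.5      # pitcher-to-batter distance in the demonstration
CAMERA_FPS = 1000     # frame rate of the batter's vision system

for speed_kph in (40, 150):    # current pitch speed and the planned upgrade
    t = flight_time_s(DISTANCE_M, speed_kph)
    print(f"{speed_kph} kph: ball in flight for {t * 1000:.0f} ms, "
          f"about {t * CAMERA_FPS:.0f} camera frames")
# 40 kph  -> ~315 ms, ~315 frames
# 150 kph -> ~84 ms,  ~84 frames
```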

To make future contests more interesting, the researchers plan to increase the robot pitcher's throwing speed to 150 kph (93 mph) and teach it to throw breaking balls and changeups. In addition, they plan to train the robot batter to repeatedly hit balls to the same target.

[Source: Mainichi]

‘Magic mirror’ shows real-time muscle data

02 Mar 2009

Magic mirror system by IRT --

Researchers at the University of Tokyo have developed a computerized, sensor-based "magic mirror" that analyzes muscular activity and shows real-time computer-generated images of how hard the user's muscles are being worked while exercising.

The magic mirror, developed under the leadership of professor Yoshihiko Nakamura of the Information and Robot Technology Research Initiative (IRT), was unveiled at the University of Tokyo last Friday. In a demonstration for the media, the system's display monitor showed a real-time computer-generated image of a male model's musculo-skeletal system while he performed a series of physical exercises.

The system, which is currently capable of monitoring the activity of 30% of the body's roughly 300 skeletal muscle pairs, consists of 16 electromyographs (instruments that record the electrical waves associated with muscle activity) attached to the user's body, 10 motion-capture cameras, and a pair of floor sensors to measure the force exerted on the legs.

On the monitor, each muscle is shown in a different color depending on how much it is being used at a particular moment. Active muscles are shown in red, while inactive muscles are shown in yellow.
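
As a rough illustration of the display side of this, the short Python sketch below maps a normalized muscle activation value onto the yellow-to-red scale described above. How the IRT system actually estimates activation from the EMG, camera and floor-sensor data is not covered here, and the linear color interpolation is an assumption.

```python
# A sketch of the display-side color step only: normalized activation
# (0 = at rest, 1 = fully active) interpolated from yellow to red. How the
# IRT system derives activation from EMG, motion capture and floor sensors
# is not shown, and the linear interpolation is an assumption.

def activation_to_rgb(activation):
    """Map a muscle activation level in [0, 1] to an (R, G, B) color."""
    a = max(0.0, min(1.0, activation))
    return (255, int(round(255 * (1.0 - a))), 0)   # yellow (255,255,0) -> red (255,0,0)

print(activation_to_rgb(0.0))   # (255, 255, 0) -- resting muscle: yellow
print(activation_to_rgb(0.5))   # (255, 128, 0) -- moderately active: orange
print(activation_to_rgb(1.0))   # (255, 0, 0)   -- fully active: red
```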

Magic mirror system by IRT --
(Muscle images can also be overlaid on the video image of the user's body.)

The magic mirror system uses newly developed software that is reportedly 10 times faster than previous technology, allowing the system to operate in real-time, even when the user is moving rapidly.

The researchers, who are already working on a more compact version that incorporates the cameras directly into the display, envision the system being used in homes, gyms and hospitals. In addition to helping people get into shape, the system might also help doctors more effectively treat conditions that affect the muscles.

[Sources: Robot Watch, Yomiuri, Nikkei]

Photos: Yoichiro Kawaguchi’s robot designs

02 Feb 2009

More photos of Yoichiro Kawaguchi's robot designs on display at Yushima Seidō temple in Tokyo:

Yoichiro Kawaguchi robot designs --

‘Organic’ robots to mimic primitive life

27 Jan 2009

Primitive lifeforms as robots --

A University of Tokyo research team led by professor/computer graphic artist Yoichiro Kawaguchi is developing robots designed to imitate primitive life forms. Mockups have been put on display at a Confucian temple in Tokyo, and working versions of the robots are scheduled for completion in two years.

According to the researchers, these robots are being developed as a way to explore artificial life and gain insights into how living things survive in a world governed by the law of the jungle.

Primitive lifeforms as robots --

Kawaguchi and his team are developing a basic reflex system for the primitive artificial life forms, as well as a visual processing system equipped with eyes that recognize and instinctively track certain objects.

In addition, the researchers are working to create powerful biomimetic actuators for locomotion. As part of their research, Kawaguchi and his team have conducted computer simulations to investigate the use of neural oscillators in a locomotion system that imitates the way centipedes crawl. They are also working on simple, mechanical tentacles that extend and contract to move the robot in a specified direction. If all goes according to schedule, they will have a fully functional robot in two years.
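
For a flavor of the neural-oscillator approach mentioned above, here is a toy Python sketch (not the team's simulation) in which a chain of coupled phase oscillators, one per body segment, settles into a fixed phase lag so that actuator commands travel down the body as a wave, roughly the way a centipede's legs ripple. All parameters are invented for the example.

```python
import math

# Toy central-pattern-generator sketch: one phase oscillator per segment,
# each pulled toward trailing its forward neighbour by a fixed phase lag,
# so the segments end up moving in a travelling wave. Parameters are
# invented for the example; this is not the research team's simulation.

N_SEGMENTS = 12
FREQ_HZ = 1.0                            # intrinsic oscillation frequency
COUPLING = 4.0                           # coupling strength between neighbours
PHASE_LAG = 2 * math.pi / N_SEGMENTS     # desired lag between adjacent segments
DT = 0.001

phases = [0.0] * N_SEGMENTS
for _ in range(5000):                    # 5 seconds of simulated time
    new = []
    for i, phi in enumerate(phases):
        dphi = 2 * math.pi * FREQ_HZ
        if i > 0:   # pull toward trailing the previous segment by PHASE_LAG
            dphi += COUPLING * math.sin(phases[i - 1] - phi - PHASE_LAG)
        new.append(phi + dphi * DT)
    phases = new

# Each segment's actuator command would be some function of its phase, e.g.:
commands = [math.sin(phi) for phi in phases]
print(" ".join(f"{c:+.2f}" for c in commands))   # a wave pattern along the body
```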

Primitive lifeforms as robots --

With a more thorough understanding of how primitive life forms survive, the researchers believe they can provide robots with a better ability to move, hunt, sense danger, and escape. They suggest that strong survival and hunting skills can be put to use in applications ranging from security guard dog robots to swarm robots tasked with exploring the surface of an alien planet.

Primitive lifeforms as robots --

Kawaguchi, a professor at the University of Tokyo Graduate School of Interdisciplinary Information Studies, has become known for creating artistic computer graphics programs that exhibit "lifelike" behavior such as self-organization and self-propagation. The robot mockups, which are three-dimensional models of his previous computer graphics work, will remain on display at Yushima Seidō temple until February 8.

[Source: Robot Watch]

UPDATE: More photos HERE.

Tiny doll made of living cells

23 Jan 2009

Tiny doll made of cell capsules --

To demonstrate a new method for fabricating three-dimensional living biological structures, researchers at the University of Tokyo's Institute of Industrial Science (IIS) have created a 5-millimeter tall doll composed of living cells.

According to an announcement made on January 22, the researchers created the tiny figurine by cultivating 100,000 cell capsules -- 0.1-millimeter balls of collagen, each coated with dozens of skin cells -- together inside a doll-shaped mold for one day. After the cell capsules had coalesced to form the doll-shaped mass of tissue, it was placed in a culture solution, where it reportedly survived for more than a day.
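
As a quick sanity check on those figures, the short Python snippet below estimates the volume the 100,000 capsules would occupy; the 64% random packing fraction is an assumption, not part of the announcement.

```python
import math

# Sanity check of the figures quoted above: do 100,000 capsules of 0.1 mm
# diameter plausibly fill a 5 mm figurine? The packing fraction is assumed.

capsule_diameter_mm = 0.1
n_capsules = 100_000
packing_fraction = 0.64   # typical for randomly packed spheres (assumption)

capsule_volume = (4 / 3) * math.pi * (capsule_diameter_mm / 2) ** 3
material_volume = n_capsules * capsule_volume
mold_volume = material_volume / packing_fraction

print(f"capsule material: {material_volume:.1f} mm^3")   # ~52 mm^3
print(f"filled mold:      {mold_volume:.1f} mm^3")       # ~82 mm^3
# Roughly the volume of a 4-5 mm cube, so the quoted numbers are the right
# order of magnitude for a 5-millimeter figurine.
```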

The researchers, led by IIS professor Shoji Takeuchi, also successfully tested the biofabrication method with human liver cells. According to Takeuchi, the technique can be used to create bodily organs and tissues with complex cellular structures, which may prove useful in the fields of regenerative medicine and drug development.

"The overall shape can be controlled by changing the mold," said Takeuchi, who expressed a desire to combine multiple types of cells to create a complex system that functions as a living organism.

[Sources: Yomiuri, 47NEWS]

Photos of JAXA’s origami space shuttles

08 Oct 2008

Oriplane, paper shuttle --

Japanese precision machinery manufacturer Castem has sent nine origami space shuttles to the Japan Aerospace Exploration Agency (JAXA) office in Houston, it was announced on October 7. If all goes as planned, the paper planes will conduct experimental flights from the space station to Earth early next year.

Oriplane, paper space plane --

The 29-gram (1 oz) origami shuttles, which measure 38 centimeters (14 in) long and 22 centimeters (9 in) wide, are made from lightweight but durable sugar cane fiber paper that has been chemically treated to resist heat and water. Developed by JAXA and the University of Tokyo, the special paper has already been used to construct a miniature prototype shuttle, which was tested in a hypersonic wind tunnel in January. In that test, the prototype survived wind speeds of Mach 7 (8,600 kph / 5,300 mph) and temperatures of around 200 degrees Celsius (nearly 400 degrees Fahrenheit).

Oriplane, paper spacecraft --

If NASA approves, the Space Shuttle Discovery will carry the origami planes to the International Space Station (ISS) in February 2009. JAXA astronaut Dr. Koichi Wakata, who will be living aboard ISS when the origami planes arrive, will carry out the experiment from the Kibo Japanese Experiment Module. It is yet to be decided whether Wakata himself will throw the paper planes or whether he will use the space station's robotic arm.

In either case, JAXA estimates it will take two days for the origami shuttles to complete the 400-kilometer (250 mi) journey from the ISS to Earth's surface.
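
A little arithmetic on the published figures gives a feel for why such a slow flight is plausible; note that the real descent is a gradual orbital decay rather than a straight vertical drop, so these are order-of-magnitude numbers only.

```python
# Rough numbers from the figures quoted above. The real descent is a slow
# orbital decay rather than a vertical drop, so these are only for intuition.

mass_kg = 0.029
length_m = 0.38
width_m = 0.22
altitude_km = 400
descent_days = 2

wing_loading = mass_kg / (length_m * width_m)             # kg per m^2 of planform
avg_descent_m_s = altitude_km * 1000 / (descent_days * 86400)

print(f"planform loading:     ~{wing_loading:.2f} kg/m^2")   # ~0.35 kg/m^2
print(f"average descent rate: ~{avg_descent_m_s:.1f} m/s")   # ~2.3 m/s
# Such a low loading lets the paper shuttle shed its speed high in the thin
# upper atmosphere, which helps keep peak heating low.
```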

Oriplane, origami space shuttle --

A message printed beneath the wings identifies the plane, explains that it has completed a return journey from the space station, and requests the finder to contact JAXA. The message is printed in 10 different languages, including Japanese, English, Chinese, Hindi, and Arabic.

Japan Origami Airplane Association chairman Takuo Toda, a strong proponent of the experiment, says he hopes the test flights will help engineers develop new types of lightweight spacecraft in the future.

[Source: Asahi // Photos: Oriplane]

Stretchable circuitry for soft machines

13 Aug 2008

Stretchable electronic circuit --

In a technological advance that opens up new possibilities in the fields of robotics and wearable computing, researchers at the University of Tokyo have developed a stretchable, rubbery material that conducts electricity and can be incorporated into electronic devices.

The researchers -- led by assistant professor Takao Someya of the University of Tokyo -- were able to create elastic electronic circuits that could be stretched up to 1.7 times their original size without affecting performance, thanks to conductive wires made from a new carbon nanotube-polymer composite they developed.

In recent years, scientists have made advances in blending carbon nanotubes (good conductors of electricity) with polymers to make flexible conductive materials, but success has been limited because nanotubes tend to cluster together, causing the composite to harden when too many nanotubes are added. The University of Tokyo researchers were able to overcome this hurdle by mixing the nanotubes with an ionic liquid containing charged particles that keep the nanotubes evenly distributed and prevent them from clumping together. The result is a stretchable material that conducts electricity more than 500 times better than other commercially available carbon nanotube-polymer blends.

With the list of potential uses of stretchable electronic circuits limited only by the imagination, the researchers envision applications ranging from high-tech suits that enhance athletic performance and monitor the wearer's physical condition, to soft machines with flexible mechanical parts. For robots, elastic electronic circuits will enable layers of soft, sensor-laden skin to be stretched tightly across the curves of their bodies, giving them both a more lifelike appearance and greater sensitivity to touch.

The research results were published in the online edition of Science (August 8).

[Link: Yomiuri]

See also: Robot beauty goes skin-deep