SIGGRAPH Emerging Technologies: Breakthroughs in Haptics, Robotics and Gaming

CHICAGO — (BUSINESS WIRE) — May 18, 2009 — SIGGRAPH 2009's Emerging Technologies presents innovative technologies and applications in many fields, including alternative displays, robotics, input interfaces, gaming, audio, haptics/VR, and experimental sensory experiences.

Presented as a combination of curated demonstrations and juried interactive installations, at least 29 of the more than 100 international juried submissions were selected; they will be on display and available for hands-on interaction with attendees in New Orleans this summer.

“These installations showcase how technology and computer graphics might soon be enhancing the average person's everyday work and life,” stated Manabu Sakurai, SIGGRAPH 2009 Emerging Technologies Chair. “From helping those with physical challenges to improving the entertainment experience, Emerging Technologies offers a unique look at how complex technologies can have a major impact in the future.”

The following are some of the highlights of this popular venue [high-resolution images and video are available].

Sound Scope Headphones

Masatoshi Hamanaka, SeungHee Lee - University of Tsukuba

The Sound Scope Headphones let users control an audio mixer through natural movements, enabling even a musical novice to listen to each instrument in a group performance independently.
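
The control idea can be sketched as a simple mapping from head orientation to per-instrument mixer gains, boosting whichever instrument the listener turns toward. The Python sketch below is purely illustrative, with invented instrument positions and a hypothetical yaw reading; it is not the authors' implementation.

```python
import math

# Hypothetical sketch: boost whichever instrument the listener "faces" and
# attenuate the others. Angles are in degrees; instrument_angles maps each
# track to its direction on a virtual stage (all values invented).
instrument_angles = {"guitar": -60.0, "piano": 0.0, "sax": 60.0}

def mixer_gains(head_yaw_deg, floor=0.1):
    """Return a gain in [floor, 1] per instrument based on head yaw."""
    gains = {}
    for name, angle in instrument_angles.items():
        # Smallest angular difference between head direction and instrument.
        offset = abs((head_yaw_deg - angle + 180.0) % 360.0 - 180.0)
        # Full gain when facing the instrument, fading toward the floor.
        gains[name] = max(floor, math.cos(math.radians(min(offset, 90.0))))
    return gains

print(mixer_gains(head_yaw_deg=55.0))  # sax emphasized, guitar near the floor
```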

Potential Future Use:

The Sound Scope Headphones will allow a novice user to control the levels of individual parts within a musical piece in a way that until now has been possible only with state-of-the-art commercial equipment. For example, when listening to jazz, one might want to hear the guitar clearly while reducing or eliminating the sound of the sax.

The UnMousePad - The Future of Touch Sensing

Ilya Rosenberg, Ken Perlin, Charles Hendee, Alexander Grau, Nadim Awad, Adrian Secord, Merve Keles - New York University; Christian Miller - University of Texas - Austin; Julien Beguin - Gotham Wave Games

The UnMousePad is based on a flexible and inexpensive sensor technology called Interpolating Force-Sensitive Resistance (IFSR) that enables the acquisition of high-quality multi-touch pressure images. The core advantage of this sensor technology is that, by interpolating pressure readings, it allows tracking at high resolution using a fairly coarse grid of electrodes.
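
To illustrate only the interpolation idea (not the actual IFSR signal processing), the sketch below recovers a touch position finer than the electrode pitch by taking a pressure-weighted centroid over a coarse grid of invented readings.

```python
# Illustrative sketch only: estimate a touch position at sub-electrode
# resolution from a coarse grid of pressure readings, using a
# pressure-weighted centroid. The 4x4 grid below is invented data.
pressure = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.2, 0.6, 0.1],
    [0.0, 0.5, 1.0, 0.2],
    [0.0, 0.1, 0.3, 0.0],
]

def touch_centroid(grid):
    """Return (x, y) in fractional electrode coordinates, or None if no touch."""
    total = sum(sum(row) for row in grid)
    if total == 0.0:
        return None
    x = sum(v * col for row in grid for col, v in enumerate(row)) / total
    y = sum(v * r for r, row in enumerate(grid) for v in row) / total
    return (x, y)

print(touch_centroid(pressure))  # roughly (1.83, 1.83) for the data above
```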

Potential Future Use:

Multi-touch input has been an active area of research for over two decades and has appeared on CNN, but it has not yet reached everyone's desks, computer screens, tabletops, walls, and floors. This technology allows inexpensive, flexible, and sensitive touch imaging to move into commercial and mainstream use.

HeadSPIN: A One-to-Many 3D Video Teleconferencing System

Andrew Jones, Magnus Lang, Graham Fyffe, Xueming Yu, Jay Busch - University of Southern California, Institute for Creative Technologies; Ian McDowall - Fakespace Labs; Mark Bolas - University of Southern California, Institute for Creative Technologies & School of Cinematic Arts; Paul Debevec - University of Southern California, Institute for Creative Technologies

This installation presents a 3D teleconferencing system that enables true eye contact between a three-dimensionally transmitted subject and multiple participants in an audience. The system is able to reproduce the effects of gaze, attention, and eye contact not available in traditional teleconferencing systems.

Potential Future Use:

This device will take teleconferencing to a much more personal level, allowing participants to make eye contact as if they were interacting face-to-face.

Graphical Instruction for a Garment Folding Robot

Yuta Sugiura - Graduate School of Media Design, Keio University / JST, ERATO, Tokyo; Takeo Igarashi - The University of Tokyo / JST, ERATO, Tokyo; Hiroki Takahashi - Waseda University / JST, ERATO, Tokyo; Tabare Akim Gowon - Harvard University / JST, ERATO, Tokyo; Charith Lasantha Fernando, Maki Sugimoto, Masahiko Inami - Graduate School of Media Design, Keio University / JST, ERATO, Tokyo

This project proposes an interactive graphical editor for giving garment-folding instructions to a robot in a household environment. The editor lets the user specify instructions through simple editing operations (clicking and dragging) to teach the robot how each garment should be folded.
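
One way to picture the editor's output is as an ordered list of fold operations, each defined by a fold line and a grasp point derived from the user's click-and-drag edits. The data structure below is a hypothetical sketch, not the project's actual instruction format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FoldStep:
    """One click-and-drag operation: fold the garment across a line."""
    line_start: Tuple[float, float]   # fold-line endpoints in cloth coordinates
    line_end: Tuple[float, float]
    grasp_point: Tuple[float, float]  # where the robot should pinch the fabric

def plan_shirt_fold() -> List[FoldStep]:
    # Hypothetical three-step plan: fold in both sides, then fold in half.
    return [
        FoldStep((0.25, 0.0), (0.25, 1.0), (0.05, 0.5)),
        FoldStep((0.75, 0.0), (0.75, 1.0), (0.95, 0.5)),
        FoldStep((0.0, 0.5), (1.0, 0.5), (0.5, 0.95)),
    ]

for i, step in enumerate(plan_shirt_fold(), 1):
    print(f"Step {i}: fold across {step.line_start}-{step.line_end}")
```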

Potential Future Use:

This technology provides a glimpse of a future in which everyday chores are eased for consumers and folding tasks become far more efficient for industrial users.

Pull-Navi

Yuichiro Kojima, Yuki Hashimoto, Shogo Fukushima, Hiroyuki Kajimoto - The University of Electro-Communications

While many tactile navigation systems have used the hands or arms, the developers created a novel, intuitive, and energy-efficient walking-navigation interface that "pulls the ears," and confirmed that users naturally tended to move in the pulled direction without experiencing pain or strong force. The device is simply worn on the ears and lightly leads or pulls the user in the desired direction.
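
A hypothetical sketch of the control logic: convert the angular error between the user's current heading and the target direction into left- or right-ear pull intensities. The mapping and the saturation threshold are assumptions, not the authors' design.

```python
def ear_pull_commands(current_heading_deg, target_heading_deg, max_pull=1.0):
    """Map heading error to (left_pull, right_pull) intensities in 0..max_pull.

    Hypothetical mapping: the larger the error, the stronger the pull on the
    ear on the side the user should turn toward; saturates beyond 90 degrees.
    """
    error = (target_heading_deg - current_heading_deg + 180.0) % 360.0 - 180.0
    strength = min(abs(error) / 90.0, 1.0) * max_pull
    if error > 0:       # target is to the right
        return (0.0, strength)
    elif error < 0:     # target is to the left
        return (strength, 0.0)
    return (0.0, 0.0)   # already on course

print(ear_pull_commands(current_heading_deg=0.0, target_heading_deg=30.0))
# -> (0.0, 0.333...) : a gentle pull on the right ear
```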
