ACM SIGGRAPH 2003 Symposium on Interactive 3D Graphics
Submission date was 5pm PST, 25 October 2002. Submissions are closed!
We received a record number of submissions for the symposium: 102 paper submissions! The previous record for I3D was 95 papers submitted in 1994 for I3D 1995, also in Monterey.
Call for Participation
28 - 30 April 2003
Monterey Marriott, Monterey, California
Sponsored by ACM SIGGRAPH.
We are seeking Paper, Panel, Live Demo & Video submissions for I3D!
Submissions are due by 5pm PST, 25 October 2002.
This symposium has historically been the leading-edge conference for all aspects of real-time, interactive 3D computer graphics. While the title of our symposium has remained the same, our focus has often changed to highlight the hot research directions of the day. In 1990 at Snowbird, we stated the following:
The purpose of the symposium is to look at what research groups are doing with their high-performance, real-time, interactive graphics systems, to find out what special purpose architectures are on the drawing board, to discuss what are the most user-friendly paradigms for interaction with such systems, and to learn what applications are still waiting for an appropriate 3D interactive system.
Today graphics hardware is a commodity item, but programmability of that hardware opens new and interesting, yet unknown possibilities. User-friendly paradigms are on our desktop, but not in the virtual environment. The computer game industry is now larger than the motion picture industry and much of that work is impressive, but we are not yet running Toy Story in real-time 3D.
Things have changed since the start of this symposium series, but the core technological exploration embodied in it is still interactive 3D graphics - and it is still an active area. With this call for participation, we express that exploration in terms of today's research problems, and provide a categorization of what we have been examining.
Focus for the Symposium on Interactive 3D Graphics
The Symposium on Interactive 3D Graphics will focus on research and application for the real-time, interactive 3D domain including the following broad areas:
- Interactive 3D visual display
- Networked interactive systems
- Human computer interaction
- Technologies for immersion
- Computer generated autonomy
Interactive 3D visual display
- Algorithms and systems for interacting with and managing large and complex data; representations; game engines;
- Interactive model-building tools - shaping, building, sculpting; interactive assembly and manipulation of systems of parts;
- Exploitation of programmable graphics engines;
- Languages, APIs, and tools.
Networked interactive systems
- Software architectures for large-scale, media-rich, networked interactive systems; interoperability, scalability, and dynamic extensibility.
- Interactive systems distributed over local and wide-area networks;
- High bandwidth networks - experimentation and utilization of next-generation Internet technologies for large-scale, 3D interactive systems.
- Wireless - handheld interactive 3D devices.
Human computer interaction
- Interaction techniques for 3D systems, including domain-specific interaction methods.
- Innovative human-machine interface paradigms for navigating, working, and playing in complex, real-time graphics environments, including virtual worlds, Web-based systems, computer games and visualization systems;
- Perceptual and psychological issues in multimodal interaction and operation in complex virtual spaces;
- Multimodal interfaces, task analysis, spatial orientation and navigation, performance evaluation, interaction techniques, interaction devices, virtual ergonomics, usability engineering, training transfer, human perception.
Technologies for immersion
- Image generation - real-time, high-performance architectures for the generation of complex imagery; rendering on clusters; handheld and body-worn devices;
- Novel display technologies; driving displays from clusters, multi-projector display systems.
- Tracking - technologies for tracking human participants in virtual environments; avatar control.
- Full sensory interfaces - technologies for providing a wide range of sensory stimuli: visual, auditory, olfactory, and haptic.
- Novel sound systems for the generation and delivery of both interactive and recorded media. Spatial sound. Immersive sound & psychoacoustics.
Computer generated autonomy
- Human representations & models - avatars that look, move, and speak like humans.
- Computer-generated characters - technologies for providing animations and behaviors; technologies that give characters adaptability and learning.
- Interactive computer-generated story; story and drama engines; game AI.
The symposium will consist of formal paper sessions, panels and hands-on demonstrations where research groups and vendors will show the state-of-the-art in the field.
Panels We Wish to See Submitted
- What is the state of the art in interactive, computer-generated story?
- What is the future for interactive, networked entertainment?
- Interactive education technology that could make a difference
- The web in our pocket - what changes when we have sufficient storage to hold the Library of Congress in a package the size of a cigarette case?
- Will we ever want to play 3D games on our cell phones?
- Will the web ever be 3D? Will we ever achieve cyberspace?
- SIGGRAPH or SIGSENSES - should the focus for our special interest group be broadened?
Paper, Panel, Live Demo & Video Submissions Deadline:
5 PM, PST, October 25, 2002