DART is designed to support rapid prototyping of AR experiences that use see-through displays (either transparent displays or video-mixed displays) to overlay graphics and audio on a user's view of the world.

DART is built as a collection of extensions to the Macromedia Director multimedia programming environment, the de facto standard for multimedia content creation. DART is designed to leverage the power of Director and assumes that developers are familiar with it.

Our long-term goal is to enable designers to rapidly develop and test their AR experiences, in the same environment that will be used to deploy the final experience. This last point is critical; while our research is focused on supporting early design activities, designers using DART can gradually evolve their prototypes as they see fit. Polished content can be mixed with crude content, elaborate narratives and complex behaviors can be tested as desired, and changes to “complete” experiences can be rapidly prototyped.

DART's functionality is provided by a set of extensions (Xtras, casts, and behaviors) to the Macromedia Director (version 8.5 or higher) multimedia authoring system. Through these extensions, Director becomes a platform for coordinating an entire AR experience: 3D objects, video, sound, and tracking information.

DART supports the streaming of live video into Director's 3D world, real-time tracking of markers in the video stream (currently via ARToolKit), and real-time streaming of data from a wide range of trackers and sensors commonly used in VR and AR (via the VRPN sensor package). Distributed shared-memory objects (built on facilities provided by VRPN) can be used to coordinate between two or more computers. Because DART communicates with sensors and between multiple processes using VRPN, DART programs can be easily integrated with programs written in other languages.
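As a sketch of how such cross-machine coordination might look from Lingo, the fragment below assumes a hypothetical parent script named "DartSharedMemory" with hypothetical setValue/getValue handlers and a made-up "GameState@hostA" address; none of these names are actual DART API.

```lingo
-- Hypothetical sketch only: "DartSharedMemory", its handlers, and
-- the "GameState@hostA" address are illustrative assumptions.

on startMovie
  global gShared
  -- connect both machines to the same VRPN-backed shared object
  gShared = new(script "DartSharedMemory", "GameState@hostA")
end

on exitFrame me
  global gShared
  gShared.setValue(#score, 42)   -- a write made on one machine...
  put gShared.getValue(#score)   -- ...becomes readable on the other
end
```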

Virtually all of DART's functionality (beyond the low-level functionality described in the previous paragraph) is provided by the interpreted scripts in the DART behavior palettes, and is thus editable by the application developer. It is not our intention to provide a complete collection of behaviors that satisfies the needs of all AR application designers; such an effort would be doomed to failure. Rather, these behaviors are designed to be modular and extensible, and to provide a framework for interactive, event-based applications that designers can easily appropriate for their own needs.
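To give a concrete flavor of such a behavior, here is a minimal hypothetical Lingo sketch; the property name and the TrackerUpdate event are assumptions for illustration, not actual DART API. The getPropertyDescriptionList handler is standard Director machinery that produces the parameters dialog when a behavior is dropped onto a sprite.

```lingo
-- Hypothetical behavior sketch (not actual DART code).

property pObjectName  -- name of the 3D model this behavior drives

on getPropertyDescriptionList me
  -- standard Director handler: defines the parameters dialog shown
  -- when the behavior is dropped onto a sprite
  description = [:]
  description.addProp(#pObjectName, [#comment: "3D object to drive", #format: #string, #default: "cube"])
  return description
end

on beginSprite me
  put "Behavior attached to sprite" && me.spriteNum
end

on TrackerUpdate me, newPosition
  -- hypothetical custom event, e.g. broadcast with sendAllSprites
  put "New position:" && newPosition
end
```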

The DART system consists of:

1. A Director Xtra that communicates with cameras, the marker tracker, hardware trackers and sensors, and distributed shared memory. Information from all sources is made available to Director via direct callbacks into Lingo, Director's scripting language.

2. A collection of Director behavior palettes containing drag-and-drop behaviors for controlling the functionality of the AR application, from low-level device control to high-level actors and actions. The designer creates an experience by defining new object-sprites in the score, dropping the appropriate behaviors on the sprites, and adding multimedia content in the form of external casts containing video and audio. The designer can also write new behaviors or additional scripts.
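From the Lingo side, the Xtra's callback mechanism described in item 1 might look like the following; the handler name and arguments here are assumptions for illustration, since the real Xtra defines its own callback signatures.

```lingo
-- Hypothetical callback sketch: handler name and arguments are
-- illustrative assumptions, not the Xtra's actual signatures.

on MarkerFound me, markerID, transform
  -- called directly from the Xtra when the tracker sees a marker;
  -- a real behavior would apply the transform to a 3D model here
  put "Marker" && markerID && "at" && transform
end
```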

Detailed information on how to prepare the video and audio content, attach tracking devices, and script the experience is contained in the DART co-web (NOTE: co-web is not currently running). The DART files can be downloaded from this site.