Progress Report II – AR for Participation with Outdoor Art

artExplore Project Progress Report II:

Summary

Our Argon2 application, artExplore, is progressing toward our goal of supporting interactions with very large art or images in an outdoor setting. In this document we describe the evolving vision for the app and report on the proof-of-concept milestones that are currently working or in progress. In addition, we lay out our goals for the remaining time in the semester.

Current Deployment

The system is deployed at http://artexplore.aws.af.cm (not a typo, actually ‘cm’) using a Node.js backend, a MySQL database, and multi-image-tracking filesets generated with Qualcomm Vuforia. The code for the current deployment is available at https://github.com/indivisibleatom/artExplore.

Demonstration Materials

Sub-image Multi-image Tracking Method, by Anshul.

http://www.youtube.com/watch?v=qxnyeYdej_0

Server Backend, and showing comments based on estimating the area of interest using multi-image tracking, by Mukul.

http://youtu.be/fmTIUaQtDes

Showing and Hiding Comments With Click Interaction, by Andy.

http://www.youtube.com/watch?v=sYyQuxHymRk

Content Creation Desktop App, by Mukul.
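The area-of-interest estimation shown in the backend demo can be sketched roughly as follows, assuming the mural is covered by sub-image targets with known positions. The tile names, layout, and coordinates below are illustrative assumptions, not the actual deployment's data:

```javascript
// Hypothetical sketch: estimate the viewer's area of interest on a large
// mural from whichever sub-image targets the tracker currently reports
// as visible. Each sub-target covers a known rectangle of the mural
// (here in mural pixel coordinates; values are made up for illustration).
const tileRects = {
  tile_0_0: { x: 0,    y: 0,   w: 1024, h: 768 },
  tile_0_1: { x: 1024, y: 0,   w: 1024, h: 768 },
  tile_1_0: { x: 0,    y: 768, w: 1024, h: 768 },
  tile_1_1: { x: 1024, y: 768, w: 1024, h: 768 },
};

// The estimated area of interest is the union bounding box of the
// rectangles of all currently-tracked tiles (null if nothing is tracked).
function areaOfInterest(trackedTileNames) {
  const rects = trackedTileNames.map((n) => tileRects[n]).filter(Boolean);
  if (rects.length === 0) return null;
  const x0 = Math.min(...rects.map((r) => r.x));
  const y0 = Math.min(...rects.map((r) => r.y));
  const x1 = Math.max(...rects.map((r) => r.x + r.w));
  const y1 = Math.max(...rects.map((r) => r.y + r.h));
  return { x: x0, y: y0, w: x1 - x0, h: y1 - y0 };
}
```

The server can then return only the comments anchored inside this box, which is how a view-dependent comment display could work.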

Progress Report I – AR for Participation with Outdoor Art

Brief overview:

We are running a bit behind schedule because prototyping revealed issues that required us to try out alternatives to our initial plans. As we see it, this should not affect the overall delivery schedule, but we may have to rethink the user interaction so as to provide the best possible experience within our technical constraints.

Tasks laid out: 

  1. Prototype/explore registration issues, gather media, begin IRB approval process for User Studies.
  2. Registration of images in lab setting (backend, client side code, image placement). Meet with stake-holders for interaction ideation and wants/needs analysis, level of detail viewing.

Tasks completed: 

  1. Prototype/explore registration issues, gather media.
  2. Registration of images in lab setting. Meet with stake-holders for interaction ideation and wants/needs analysis, level of detail viewing.

Prototypes: http://cc.gatech.edu/~msati3/argon/multiMarkers/basic.html (simultaneous tracking of multiple image markers) and http://cc.gatech.edu/~msati3/argon/frameMarkerTransform/index.html (explorations into using the phone's rotation angles to determine the viewing frustum in the absence of frame-markers). As things stand, we believe we will be able to achieve the desired interaction using a multi-image target, as demonstrated by the prototype @ http://cc.gatech.edu/~msati3/argon/multiImage/basic1.html. Initial mural media @ https://www.dropbox.com/sh/epslvenxgcm4xip/HSodB1c-tP. An initial sync-up with the people involved is scheduled.
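The orientation-based frustum exploration relies on the device's attitude angles. A minimal sketch of the underlying math, assuming the alpha/beta/gamma angles (in degrees) reported by the browser's DeviceOrientationEvent and the Z-X'-Y'' rotation convention from the W3C deviceorientation specification, might look like this (this is our own illustration, not code from the prototype):

```javascript
// Convert degrees to radians.
const deg = (d) => (d * Math.PI) / 180;

// Build the device rotation matrix R = Rz(alpha) * Rx(beta) * Ry(gamma),
// the intrinsic Z-X'-Y'' composition used by the W3C deviceorientation
// specification. Rotating the camera's forward vector by R tells us which
// part of the scene the viewing frustum should cover.
function rotationMatrix(alpha, beta, gamma) {
  const cA = Math.cos(deg(alpha)), sA = Math.sin(deg(alpha));
  const cB = Math.cos(deg(beta)),  sB = Math.sin(deg(beta));
  const cG = Math.cos(deg(gamma)), sG = Math.sin(deg(gamma));
  return [
    [cA * cG - sA * sB * sG, -sA * cB, cA * sG + sA * sB * cG],
    [sA * cG + cA * sB * sG,  cA * cB, sA * sG - cA * sB * cG],
    [-cB * sG,                sB,      cB * cG],
  ];
}

// Apply a 3x3 matrix to a column vector [x, y, z].
function apply(R, v) {
  return R.map((row) => row[0] * v[0] + row[1] * v[1] + row[2] * v[2]);
}
```

With zero angles the matrix is the identity; a 90° alpha (compass) rotation maps the x axis onto the y axis, as expected for a rotation about z.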

Tasks not completed/still in progress:

IRB approval – pushed back until prototyping is complete.

Risks update:

1.   Prototyping has shown that the primary risk is finding a way to track a large image target that may not be completely in view at all times. We have approached this problem via subdivision of the target, sensor fusion, and, presently, a single multi-image target. We need to nail down this work-item before proceeding.
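The subdivision approach can be illustrated with a simple tiling computation: split the large mural image into overlapping sub-targets so that at least one tile is likely to be fully in view at typical phone distances. The tile size and overlap below are assumptions for illustration, not the values used in our prototypes:

```javascript
// Sketch: cover an imageW x imageH mural with tileW x tileH sub-targets
// that overlap by `overlap` pixels on each axis. The last row/column is
// clamped so no tile extends past the image boundary.
function subdivide(imageW, imageH, tileW, tileH, overlap) {
  const stepX = tileW - overlap;
  const stepY = tileH - overlap;
  const cols = Math.max(1, Math.ceil((imageW - tileW) / stepX) + 1);
  const rows = Math.max(1, Math.ceil((imageH - tileH) / stepY) + 1);
  const tiles = [];
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      tiles.push({
        x: Math.min(c * stepX, imageW - tileW),
        y: Math.min(r * stepY, imageH - tileH),
        w: tileW,
        h: tileH,
      });
    }
  }
  return tiles;
}
```

Each tile would then be uploaded as a separate image target; the overlap is what lets tracking hand off smoothly as the camera pans across the mural.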

Video:


Project Proposal – AR for Participation with Outdoor Art

AR for Participation with Outdoor Art

Group Members: Andy Pruett, Anshul Bhatnagar, Mukul Sati

Abstract: The central idea of this project is to investigate how AR technology can be deployed to support interaction with outdoor public art. Public art and street art may be perceived as objectionable, may be temporary or experimental, and often promote discussion and debate closely tied to the site of the installation. We propose that an interactive system for showing and exploring public art in its real setting, built with augmented reality tools such as Argon, can serve as a platform for this discussion if it is designed to be usable, to meet the needs of the most interested parties, and to be widely deployable.

Project overview:

Our project will involve the creation of a web-based application that provides an outdoor AR experience supporting user interactions with public art. Salient features of our proposal are:
a) Leveraging Argon, provide a web-based AR experience that allows users to view public art such as murals at augmented locations. Also look into leveraging geo-spotting for remote viewing. Thus we provide a grounding Virtual Environment for user interactions.
b) Investigate and facilitate different interactions between users and outdoor augmented spaces, including techniques such as dynamic level of detail of augmentation based on distance from the tracked site. While enabling such interactions, we hope to explore vision-based tracking techniques, which we’ll also employ in c).
c) Leveraging social collaboration, present an evolving view of the art, consuming and serving content such as multiple artwork photographs taken by people, tags and comments, showing these elements in context, in a coherent way. There is potential for spatial information filtering and visualization techniques to be explored in this contextual setting. Temporal tagging and subsequent retrieval is also possible for evolving art-work. (As an aside, these techniques are equally relevant to several other AR applications, and, if designed appropriately, this component can function as a generic framework – we plan to keep this as a design goal).
d) Working closely with identified stake-holders/users over the course of the project, actively engaging in participatory design.
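The spatial and temporal filtering described in (c) could take a shape like the following sketch, assuming comments are anchored at a point on the artwork and timestamped. The data model here is our own illustration, not a settled design:

```javascript
// Hypothetical comment record: a position on the artwork (in artwork
// coordinates) plus a timestamp, so comments can be filtered both by
// the viewer's current region of interest and by the time range of the
// art-work version being viewed (for evolving works).
function filterComments(comments, region, fromTime, toTime) {
  return comments.filter((c) =>
    c.x >= region.x && c.x <= region.x + region.w &&
    c.y >= region.y && c.y <= region.y + region.h &&
    c.time >= fromTime && c.time <= toTime
  );
}
```

Keeping the filter a pure function of (comments, region, time range) is one way to make this component the generic, reusable framework mentioned above, since nothing in it is specific to a particular artwork or tracker.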