Creating an Immersive Virtual Environment

Concept
Co-creator Laura Rodriguez and I aimed to create a room for a virtual Spielberg Museum that resembles Carol Anne's room in the movie Poltergeist. The design of the room included interactive elements such as lights that change with proximity, a TV that turns on and plays a clip from the movie when you come close, objects swirling in a cyclone, the ability to pick up and release objects, and a window that shatters on touch.

Methodology
Our five-week plan involved dividing and conquering the various interactive and animated elements. Laura dove into iterations of glass animation and shattering techniques for our window in both Maya and Unity, while I sourced scripts to move the objects in a cyclone in Unity and built and textured the walls in Maya. We both sourced 3D models for the static room objects (bed, wall decals, shelves, door, exit sign, window curtains, tree, and a little bunny) through TurboSquid and other free 3D model sources, and we sourced additional models in the same way to modify in Maya and Unity, such as the lamp, records, horse, and television.

When you get close to the television, the lights turn off and the television plays a clip from Poltergeist in which the objects circle in Carol Anne's bedroom; the clip gives context if the user has never seen the movie. With guidance, we created a script that triggers on proximity to the TV: it switches off the lights, turns on the TV, and starts the objects circling. We created hands that could grab the objects and ran many tests to determine how large the collision boxes on the objects needed to be for them to be grabbable as they passed the user. The speed of the cyclone also needed tuning (and perhaps still does) so the user can grab the objects. To simulate a museum experience, once an object is grabbed the user hears descriptive information about the movie or unsettling information about the actors who played the characters in Poltergeist.
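
The circling motion can be sketched roughly as follows in Unity C#. This is a minimal illustration under assumptions (the hypothetical CycloneOrbit name, an empty pivot object at the room's center, and placeholder speed values), not the exact script we sourced:

```csharp
using UnityEngine;

// Hypothetical sketch: spins an object around a central pivot to suggest
// the Poltergeist-style cyclone. Attach to each circling object and assign
// an empty GameObject at the middle of the room as the pivot.
public class CycloneOrbit : MonoBehaviour
{
    public Transform pivot;               // empty object at the cyclone's center
    public float degreesPerSecond = 45f;  // orbit speed; needs tuning so objects stay grabbable
    public float bobAmplitude = 0.25f;    // gentle vertical drift so the ring feels organic
    public float bobFrequency = 0.5f;

    private float startY;

    void Start()
    {
        startY = transform.position.y;
    }

    void Update()
    {
        // Circle around the pivot on the vertical axis.
        transform.RotateAround(pivot.position, Vector3.up, degreesPerSecond * Time.deltaTime);

        // Slow up-and-down bob so the objects do not travel in a perfectly flat ring.
        Vector3 p = transform.position;
        p.y = startY + Mathf.Sin(Time.time * bobFrequency * 2f * Mathf.PI) * bobAmplitude;
        transform.position = p;
    }
}
```

Slowing degreesPerSecond is the same tuning problem mentioned above: if the orbit is too fast, the objects pass the user's hands before a grab can register.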

Discoveries and Looking Ahead
The biggest challenges we faced during this process were the learning curve of Maya and writing the scripts for the interactions. Certain intricacies of Maya made it difficult to transfer objects from Maya to Unity, and it was easy to miss one small click that turned out to be the determining factor in whether something worked. Similarly, writing the scripts required a problem-solving mindset, and although I have some understanding of coding, we needed more in-depth help to get some of these interactions working. The most difficult part was making multiple things happen at the same time, such as proximity to the TV switching off the lights and starting the objects circling (which in turn makes the objects grabbable). The more if/then conditions we added, the more complicated the logic became and the more one behavior began to affect the next. I think this factored into the window shatter and the video clip playing on the TV: at one point the window shattered into pieces on the floor, but as you can see in this video it is now one solid piece of glass, which I believe is a result of the code mentioned above. Similarly, the clip on the TV is distorted in this final video even though it worked before that script was implemented.
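
One way to organize that proximity chain is a single trigger script that drives all of the effects at once, rather than if/then checks spread across scripts. The sketch below is a hypothetical illustration of that idea (the Player tag, the arrays of lights and orbit scripts, and the TVProximityTrigger name are assumptions), not our actual code:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch of the proximity chain described above: walking into a
// trigger volume near the TV switches off the room lights, starts the movie
// clip, and enables the cyclone motion (which is what makes objects grabbable).
[RequireComponent(typeof(Collider))]
public class TVProximityTrigger : MonoBehaviour
{
    public Light[] roomLights;            // lights to switch off
    public VideoPlayer tvPlayer;          // VideoPlayer on the TV screen
    public MonoBehaviour[] cycloneObjects; // e.g. the CycloneOrbit scripts sketched earlier

    private bool activated;

    void OnTriggerEnter(Collider other)
    {
        // React only once, and only to the player (assumed to be tagged "Player").
        if (activated || !other.CompareTag("Player")) return;
        activated = true;

        foreach (Light l in roomLights)
            l.enabled = false;

        tvPlayer.Play();

        // Enabling the orbit scripts starts the circling and, with it, the grabbing.
        foreach (MonoBehaviour c in cycloneObjects)
            c.enabled = true;
    }
}
```

Keeping the whole chain in one place like this is one way to reduce the risk of later additions, such as the window shatter, interfering with behavior that already works.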

Even with all these challenges, our previous experience with choreography, movement, and creating performative spaces made it second nature for us to imagine the possibilities of what things could do in virtual space. Turning lights off, triggering sound, and the flicker of a TV combine to create an immersive experience in the virtual world. Just as in live performance, the audience can suspend their disbelief further when light, sound, and interaction are nuanced. The magic is in the details. Having a deeper knowledge of how these worlds are created gives me insight into their possibilities in performance and into how live and virtual performances can share the same spaces.

Advisor: Shadrick Addy, ACCAD

Digging Augmented Reality

Concept
My co-creator Laura Rodriguez (LROD) and I aimed to create an Augmented Reality application to serve as an educational supplement to Brenda Dixon Gottschild's Digging the Africanist Presence in American Performance: Dance and Other Contexts.

Methodology
We chose ten pages of the book to augment. With each of us accountable for five, we discussed how to most effectively visualize the textual content. We also explored different ways to activate our selected pages, resulting in a blueprint containing a motion-capture example of polyrhythm; several video reliefs, with sound, of the pieces discussed in the book, including the ability to stop and restart each video with a button; one relief of a definition with audio to clarify the pronunciation of the word ephebism; and one relief of a black-and-white photograph that changes to color.
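
For the video reliefs, the stop/restart button can be as simple as a UI button toggling Unity's VideoPlayer. The sketch below assumes that setup (the recognized page simply enables the quad the video plays on); the ReliefVideoButton name and its fields are illustrative, not our exact implementation:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Hypothetical sketch: one button that stops and restarts the video relief
// playing over an augmented page. Attach to the button shown once the page
// (the image target) has been recognized.
public class ReliefVideoButton : MonoBehaviour
{
    public VideoPlayer reliefVideo;   // VideoPlayer rendering onto the relief quad
    public Button toggleButton;       // UI button overlaid on the page

    void Start()
    {
        toggleButton.onClick.AddListener(Toggle);
    }

    void Toggle()
    {
        if (reliefVideo.isPlaying)
            reliefVideo.Pause();      // stop the clip
        else
            reliefVideo.Play();       // restart it (use Stop() first to restart from the beginning)
    }
}
```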

Discoveries and Looking Ahead
As we collected assets and artifacts for the project (e.g., a video of Pearl Primus, a video of Earl "Snake Hips" Tucker, a photo of Dayton Contemporary Dance Company), we realized the project would run into copyright issues. This also came up in our first user test and demonstration during our Grad Day Showing in OSU's Dance Department. Even so, using materials without permissions to create a proof of concept was successful. We have also discussed converting the app for cross-platform use, as it is currently Android-only.

In addition, Vuforia only recognizes images, not text (unless the text is graphically designed), so using Gottschild's "Digging" posed some problems: the content we wanted to highlight appeared on text-only pages, and all of the book's photographs were gathered in a center section. For the pages without imagery, we used found photos and created inserts for those parts of the book. Although effective, it was most satisfying when the photographs printed on the book's own pages came to life. If I were designing an educational tool for augmented reality again, I would prefer to work with the writers while the book is being designed, keeping Augmented Reality in the design process to create a more cohesive visual and educational experience.

Advisor/Instructor: Shadrick Addy, ACCAD