Future Storytelling
Summer 2018 | Developer

![](https://freight.cargo.site/t/original/i/47500ede429568f3fd20f07b8ebd5a9ef4c7c6565a30c4802092c1b1dda58583/qtgLveAQ.jpeg)
Audience members are given staffs (monopods) holding a Galaxy S9+ running the app we developed in Unity. Some are also given SubPacs (wearable audio backpacks), which provide an additional layer of immersion, and others are given headphones. Participants are separated into three groups, each led by actors who guide them. While all participants share the same world - that is, the same AR elements appear in the same physical locations for everybody - additional audio and visual elements are selectively rendered for each group, and each group learns a different ‘power’ along its track. Fiducial markers unlock AR elements while also providing tracking information. The three groups come together in the final act of the experience to combine their powers.
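The shared-world-with-selective-rendering idea above can be sketched as a simple per-group visibility filter. This is a hypothetical illustration in Python, not our Unity implementation (which would more likely use layers and per-camera culling masks); all names here are invented for the sketch:

```python
# Hypothetical sketch: every participant shares one world (same positions),
# but each AR element lists which groups are allowed to see/hear it.

from dataclasses import dataclass, field

@dataclass
class ARElement:
    name: str
    position: tuple                 # shared world-space position, same for everyone
    visible_to: set = field(default_factory=lambda: {"A", "B", "C"})

def visible_elements(scene, group):
    """Return the elements a participant in `group` should have rendered."""
    return [e for e in scene if group in e.visible_to]

scene = [
    ARElement("shared-portal", (0, 0, 5)),                    # rendered for all groups
    ARElement("group-a-glyph", (2, 1, 3), visible_to={"A"}),  # track A only
]

print([e.name for e in visible_elements(scene, "A")])  # both elements
print([e.name for e in visible_elements(scene, "B")])  # shared-portal only
```

The same filter naturally extends to the on-stage projection machines, which simply belong to a "group" that is allowed to see more elements.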
I worked principally as one of four members of the AR Software and Systems group, developing the technology for the performance in Unity. In addition to general development and testing of the app, my specific contributions include: a control infrastructure for cueing audio/visual elements, custom fiducial markers that provide tracking information to Google ARCore, and various custom interactive elements (the ‘powers’), such as phone shakes that create simulated earthquakes by triggering low-frequency rumbles from on-stage speakers. I also worked as part of a user experience group to find HCI metaphors that bridge the story world with the realities of our technical prototype: what does it mean in the story when an audience member’s phone loses tracking? How can we intuitively introduce the capabilities and limits of AR to audience members who may have no prior exposure?
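The phone-shake ‘power’ mentioned above reduces to thresholding accelerometer magnitude over a short window. The following is a minimal Python sketch of that detection logic only - the threshold and window values are assumed for illustration, not our production tuning, and the real app read Unity's accelerometer API on-device:

```python
import math

SHAKE_THRESHOLD = 2.0   # in g; an assumed tuning value for this sketch
WINDOW = 5              # consecutive samples that must exceed the threshold

def is_shaking(samples, threshold=SHAKE_THRESHOLD, window=WINDOW):
    """Return True if the last `window` accelerometer samples exceed `threshold`.

    `samples` is a list of (x, y, z) readings in g. In the show, a detected
    shake fired a cue that triggered low-frequency rumble from on-stage
    speakers, so the audience felt the 'earthquake' physically.
    """
    if len(samples) < window:
        return False
    recent = samples[-window:]
    return all(math.sqrt(x*x + y*y + z*z) > threshold for (x, y, z) in recent)

still = [(0.0, 1.0, 0.0)] * 10     # phone at rest: ~1 g from gravity
shaking = [(2.0, 2.0, 1.0)] * 10   # vigorous movement: magnitude 3 g
print(is_shaking(still))    # False
print(is_shaking(shaking))  # True
```

Requiring several consecutive over-threshold samples, rather than a single spike, keeps a dropped or bumped phone from accidentally triggering the effect mid-performance.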
Audience members share a collaborative AR environment through their phones:
Additional AR elements are used to augment physical action on-stage, such as this particle system dancing around actors in a particularly dramatic moment:
![](https://freight.cargo.site/t/original/i/c7fff3000268d0675b79fc784610b8aabc87ed3dc4e48a958d9bf81e0800d6a7/3ZaE419A.jpeg)
![](https://freight.cargo.site/t/original/i/ccfa48ad7268cbf5c3425ac664e91efc606193e2df82551d47a8028b20af1f0c/DC3mmtng.jpeg)
![](https://freight.cargo.site/t/original/i/02af5129ea402797fed7999f5b094050c945e7ab3a661155a07fa8336f95d27c/xqE34GLQ.jpeg)
Large-scale projections are driven by desktop machines that participate in the same AR environment as the phones but selectively render additional elements to provide further immersion:
![](https://freight.cargo.site/t/original/i/54fd1dc49d533cee5a9428adccd906900aedbff834f2c43b35d61ec2016729ca/nvMldCgg.jpeg)
Custom fiducial markers can be seen scattered around the stage:
![](https://freight.cargo.site/t/original/i/55d9673edd4d9bb4c675a5c75aeeb145bc175a3e1bb272efdaebaf7b01370638/ZP3bTy3g.jpeg)
![](https://freight.cargo.site/t/original/i/568a3a9e8eb9514ac85a9cfe495d8baf70c1e2e94664bc8c5523fcc624c0c6de/ylCQGWGw.jpeg)
![](https://freight.cargo.site/t/original/i/0bccfcd11c5c090c3626da7f0db2259dce5acc9de38d048701c2a73281f9eeca/DfegtyIg.jpeg)