The Making of an Immersical®


Filed under Experience, Featured.

Reading Time: 8 minutes

Live theatre performance in XR (extended reality), which includes both Virtual Reality (VR) and Augmented Reality (AR), is a rapidly growing field of digital drama that sits at the convergence of immersive theatre and interactive games design. While a number of theatre companies have adapted their existing work for VR (notably the Royal Shakespeare Company with Dream Online in March 2021), our research project, Mrs Nemo XR, aims to create new works of musical theatre specifically written, designed and built to utilise the interactive and immersive features of this exciting new performance medium.

Mrs Nemo XR

Our initial project, a short-form immersive musical — which we term an ‘Immersical®’ — is inspired by an episode from Jules Verne’s classic Victorian adventure story, Twenty Thousand Leagues under the Sea (Verne, 1872), featuring the attack of Captain Nemo’s Nautilus submarine by a giant sea creature, the mythical ‘Kraken’. Adding a contemporary twist to the tale, the story is retold from the perspective of a mysterious female character in a Victorian bath chair, who appears at the outset to be Nemo’s wife, but who is later revealed to be a mermaid, one of the many marine creatures that Captain Nemo has collected for the undersea museum which we use as the setting for the show.

Although we draw on Verne’s original story for the undersea environment and basic plotline, we have devised our own narrative to tell the tale from an alternate viewpoint: that of one of the sea creatures whom Nemo has captured. As the audience members join the show, they find themselves embodied as individual avatars of deep-sea divers who have been transported into the submarine and invited by the curator (Mrs Nemo) to explore the gallery displaying the museum artifacts. As Mrs Nemo tells her own tale through improvised dialogue and song, the audience can interact with the actor by talking to her avatar and examining in 3D some of the items in the museum.

VR Platform

Staged on the browser-based online social VR platform Mozilla Hubs, accessible either via desktop computer or VR headset, Mrs Nemo XR is performed by a solo actor working remotely from home, singing live to pre-recorded music tracks. We are limited to one live singer, due to audio latency issues across the network which prevent multiple performers singing in sync. The show also has a short run-time of ten minutes, due to concerns about how prolonged use of VR may adversely affect actors and audience members, especially new users of the technology.

Performances 

Our initial trial run of twelve performances with a live audience began in October 2021 and was staged in collaboration with an informal collective of virtual theatre-makers from around the world under the title OnBoard XR. While the whole event shared an underwater-themed virtual environment, each team designed their own show, including the set, props, avatars, and individual backstage cueing system. This article details the technical, design and production challenges we encountered while making Mrs Nemo XR, with specific reference to the narrative design and scenography of the production.

Performer singing in a VR Headset, while moving her own avatar in VR

Creative Team

This virtual theatre production, as with a typical physical theatre company, has a creative team composed of Producer, Writer, Director, Designer, 3D Artist and Performer: all theatre practitioners and creative technologists, several of us conducting research at Digital Creativity Labs at the University of York, UK. Throughout the four-week production period in the autumn of 2021, team members worked remotely from different locations across the UK and the USA. Pre-production meetings, rehearsals and performances all took place online, managed through a combination of VR, video conferencing, and text and audio messaging. Many of the sessions were intense early-morning or late-night events, due to the complexity of distance working across time zones.

Interactive Narrative Design

Mrs Nemo XR aims to engage the audience with the narrative, while giving them the freedom to explore the immersive submarine environment and interact with the performer and each other. Without a traditional framing device, such as a proscenium arch or cinema screen, we rely on alternative ways of directing audience focus, such as guided direction and environmental storytelling. Our audience, represented by avatars of deep-sea divers, were actively encouraged to become part of the story-world through improvised dialogue and narrative song. The actor also directed their attention to interactive props, such as the toy Kraken that could be picked up, moved and resized, and to scenic effects like the sudden and dramatic appearance of the giant sea monster outside the viewing window of the sub.

Technical Challenges

One of the main technical issues we encountered when using Mozilla Hubs with our Meta Quest 2 (formerly Oculus Quest 2) VR headsets is the limited audio capability afforded by this form of Web-based VR (a VR platform that is easily accessible on multiple devices via the Internet). While we did have the capability for spatial audio, we elected instead to have our performer’s voice override this feature, so that she could be clearly audible regardless of her position in relation to the listener. Since we knew that audio and visual latency would be problematic, our actor ran the music tracks from her laptop while performing in a VR headset and manipulating her own avatar, thus syncing all her actions at the point of transmission.

The scenic cues, including the change of avatar for Mrs Nemo for the final reveal that she is a mermaid, and the attack of the Kraken, were also subject to delays in operation across the internet, and so were run remotely by the backstage crew to avoid distracting the performer with desynchronised scene changes. This also enabled the production team to make live adjustments during the show when latency of operation necessitated the removal of used props from previous scenes, or the ‘respawning’ of set pieces that had entered the scene at the incorrect size or location.

Scenography 

The inspiration for the set design, props, avatars, and interactive artifacts was derived both from Verne’s original story and from the writer’s scripted dialogue and lyrics. From the book it was clear that while the Nautilus did have a library and a salon with a large viewing window, there was no dedicated space for storing or displaying the Captain’s undersea collection. This led to the idea that the salon should become a museum, its walls filled with books and artifacts relevant to the story. Using Mozilla Hubs’ custom editor, Spoke, which allows developers to create 3D worlds with their own designs or pre-made assets, we built the submarine interior, and using a bespoke plugin, the Stage Management System, we were able to program cues to make scenic objects, props and avatars appear, animate, and change size and position within the scene.
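To illustrate the kind of cueing described above, the sketch below models a simple cue sheet and cue runner in TypeScript. This is a hypothetical reconstruction for illustration only — the names, fields and actions are our assumptions, not the actual API of the OnBoard Stage Management System or of Mozilla Hubs.

```typescript
// Hypothetical cue model: each cue targets one scene object or avatar
// and either shows, hides, respawns or swaps it.
type CueAction = "show" | "hide" | "respawn" | "swapAvatar";

interface Cue {
  id: string;
  target: string;                      // scene object or avatar affected
  action: CueAction;
  position?: [number, number, number]; // where to (re)place the object
  scale?: number;                      // size to (re)spawn at
}

// Illustrative cue sheet for the show's key moments (names are invented).
const cueSheet: Cue[] = [
  { id: "kraken-reveal", target: "kraken", action: "show",
    position: [0, 2, -10], scale: 8 },
  { id: "mermaid-reveal", target: "mrsNemo", action: "swapAvatar" },
  // Re-firing the same placement lets a stage manager fix a prop that
  // latency left at the wrong size or location.
  { id: "kraken-respawn", target: "kraken", action: "respawn",
    position: [0, 2, -10], scale: 8 },
];

interface ObjectState {
  visible: boolean;
  position?: [number, number, number];
  scale?: number;
}

// Apply one cue to a scene-state map. "respawn" simply re-applies the
// cue's authoritative position and scale, overwriting any drifted state.
function runCue(scene: Map<string, ObjectState>, cue: Cue): void {
  const prev = scene.get(cue.target) ?? { visible: false };
  switch (cue.action) {
    case "show":
    case "respawn":
      scene.set(cue.target, {
        visible: true,
        position: cue.position,
        scale: cue.scale,
      });
      break;
    case "hide":
      scene.set(cue.target, { ...prev, visible: false });
      break;
    case "swapAvatar":
      scene.set(cue.target, { ...prev, visible: true });
      break;
  }
}
```

In this sketch, a backstage operator would fire cues from the sheet in order, and re-fire a `respawn` cue whenever a prop arrived at the wrong size or location, mirroring the live adjustments described above.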

Tech rehearsals in Zoom and Hubs showing the cues of the stage management system

Working in Mozilla Hubs

In order to perform well across a wide range of devices, Hubs has certain operating limitations, such as the inability to move or ‘spawn’ objects easily, and a maximum scene size of 16MB. As a result, our immersical took on a low-poly style of graphics (an art style in which the number of polygons in a 3D model is reduced, producing deliberately simplified visuals that are efficient and easy to optimise).

Because of internet latency, we could not accurately predict how the scene would appear on the various viewing devices, which would be subject to differing connections, internet speeds and locations. For example, we aimed to trigger the Kraken’s appearance outside the viewing window as a shocking reveal to the audience, especially for those in VR, who would see a gigantic monster attach itself to the side of the submarine. However, the physical distance between the stage manager in York (UK), the performer in New York (US) and the worldwide audience meant that some audience members, depending on the efficacy of their own devices and connections, experienced a noticeable lag between what the actor was saying and what the scene was showing.

Set Design and Layout

Our initial design plan for Mrs Nemo XR was to divide the submarine into smaller clustered areas (see Figure 3) to direct audience movement to where the action in the scene was taking place. We particularly wanted to guide the audience towards the window to provide a good view of the Kraken’s sudden appearance outside the submarine. During rehearsals, however, we realised that the audience tended to cluster around the performer and follow her around the scene. This led us to strip back some of the central blocked-out areas to provide a clear line of sight throughout the submarine interior, so that the performer and the action could be seen at all times.

Preliminary design of the submarine interior

Sub interior with Captain Nemo’s portrait on the wall

The eye of the Kraken at the window

Audience member as diver with props

Conclusion 

In spite of the technical limitations of the Mozilla Hubs platform, in particular the external cueing system, the low-poly graphics, and the audio/visual latency experienced, the show was well received, and we felt we had achieved our central aim of creating our first fully functioning immersical. Through the rehearsal process and by observing the performances, we established three key non-verbal methods of directing audience focus and promoting engagement with the narrative. Firstly, we used scenic devices, such as the viewing window, to frame the attack of the Kraken. Secondly, in spite of internet latency, we were able to sync actions to the music, cueing the movement of scenery and interactive props at appropriate times. Thirdly, we found that guided direction from the performer (e.g. gestures, movement, following her gaze and use of interactive props) supported audience attention and engagement with the story-world.

Our research continues with a new iteration of the show, where we will be using a different VR platform, and extending the length of the performance, working with innovative techniques for reducing audio latency. We also intend to offer greater user agency and interactivity, and explore different styles of narrative design including non-linear and object-based storytelling. We see a great future for live performance in XR, particularly in terms of ease of access and affordability, and look forward to our next collaboration within the rapidly expanding community of VR theatre-makers and their virtual audiences.

Bibliography

Audience of the Future Live (n.d.). Dream. Royal Shakespeare Company. [online]
Royal Shakespeare Company. Dream Online. Latest Press Releases. [online]
Mrs Nemo XR performance video: #OnBoardXR [SHOW3.3] Live Short Series in WebVR. [online]
Verne, J. Twenty Thousand Leagues Under the Sea (first published in English, 1872). Project Gutenberg. [online]

Credits

Mrs Nemo XR Creative Team: Director: David Gochfeld, Writer: Mary Stewart-David, Designer: Daniel Lock, Exec Producer: Cristobal Catalan, 3D Artist: Guy Schofield, Performer: Vivian Belosky
OnBoard Stage Management System developers: Roman Miletitch, David Gochfeld, Clemence Debaig, Michael Morran.
Digital Creativity Labs & Department of Theatre, Film, TV and Interactive Media, University of York

Daniel Lock (Scenographer) is an XR specialist and academic blurring the lines between theatre and Virtual Reality. With an interdisciplinary background in creative technology, theatre and film, his current research focuses on investigating how Virtual Reality developers may make use of theatrical methodology to enhance storytelling, virtual environments and player experience. Daniel holds a Master of Science and is currently undertaking his PhD as a researcher with Digital Creativity Labs and the Department of Theatre, Film, Television and Interactive Media at the University of York, UK.

Mary Stewart-David (Writer) is the creative force behind the Immersical®, an innovative system of narrative design and construction for multi-modal musicals staged in interactive and immersive spaces. A veteran of over twenty West End musicals, plus a similar number of regional productions in London, LA and New York, she is also a screenwriter, novelist, lyricist and composer. Trained in technical theatre at RADA and in performance at Royal Central School of Speech and Drama, she holds an MA in Directing from Goldsmiths and an MA in Film from Westminster. Currently she is a researcher with Digital Creativity Labs, University of York.
