Live theatre performance in XR (extended reality), which includes both Virtual Reality (VR) and Augmented Reality (AR), is a rapidly growing field of digital drama that sits at the convergence of immersive theatre and interactive game design. While a number of theatre companies have adapted their existing work for VR (notably the Royal Shakespeare Company with Dream Online in March 2021), our research project, Mrs Nemo XR, aims to create new works of musical theatre specifically written, designed and built to utilise the interactive and immersive features of this exciting new performance medium.
Mrs Nemo XR
Our initial project, a short-form immersive musical — which we term an ‘Immersical®’ — is inspired by an episode from Jules Verne’s classic Victorian adventure story, Twenty Thousand Leagues Under the Sea (Verne, 1872), featuring the attack on Captain Nemo’s Nautilus submarine by a giant sea creature, the mythical ‘Kraken’. Adding a contemporary twist to the tale, the story is retold from the perspective of a mysterious female character in a Victorian bath chair, who appears at the outset to be Nemo’s wife, but who is later revealed to be a mermaid, one of the many marine creatures that Captain Nemo has collected for the undersea museum which we use as the setting for the show.
Although we draw on Verne’s original story for the undersea environment and basic plotline, we have devised our own narrative to tell the tale from an alternate viewpoint: that of one of the sea creatures whom Nemo has captured. As the audience members join the show, they find themselves embodied as individual avatars of deep-sea divers who have been transported into the submarine and invited by the curator (Mrs Nemo) to explore the gallery displaying the museum artifacts. As Mrs Nemo tells her own tale through improvised dialogue and song, the audience can interact with the actor by talking to her avatar and examining some of the museum’s items in 3D.
Staged on the browser-based online social VR platform Mozilla Hubs, accessible via either desktop computer or VR headset, Mrs Nemo XR is performed by a solo actor working remotely from home, singing live to pre-recorded music tracks. We are limited to one live singer, due to audio latency issues across the network which prevent multiple performers singing in sync. The show also has a short run-time of ten minutes, due to concerns about how prolonged use of VR may adversely affect actors and audience members, especially new users of the technology.
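A back-of-envelope calculation shows why even modest network latency rules out two remote singers performing in sync. The figures below are illustrative assumptions, not measurements from our production:

```python
# Illustrative sketch (assumed, not measured, figures) of why network
# latency prevents two remote performers from singing together: each
# singer hears the other a round trip behind, a large fraction of a beat.

one_way_latency_ms = 150           # assumed one-way audio delay across the network
tempo_bpm = 120                    # assumed song tempo
beat_ms = 60_000 / tempo_bpm       # duration of one beat in milliseconds

# Singer B hears A one trip late; A then hears B's response a full round trip late.
round_trip_ms = 2 * one_way_latency_ms
offset_in_beats = round_trip_ms / beat_ms
print(f"Round trip: {round_trip_ms} ms = {offset_in_beats:.2f} beats")
```

At these assumed values the round trip is over half a beat, which is why the production syncs everything at the point of transmission from a single performer instead.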
Our initial trial run of twelve performances with a live audience began in October 2021 and was staged in collaboration with an informal collective of virtual theatre-makers from around the world under the title OnBoard XR. While the whole event shared an underwater-themed virtual environment, each team designed their own show, including the set, props, avatars, and individual backstage cueing system. This article details the technical, design and production challenges we encountered while making Mrs Nemo XR, with specific reference to the narrative design and scenography of the production.
This virtual theatre production, as with a typical physical theatre company, has a creative team composed of Producer, Writer, Director, Designer, 3D Artist and Performer; all theatre practitioners and creative technologists, several of us conducting research at Digital Creativity Labs, at the University of York, UK. Throughout the four-week production period in the autumn of 2021, team members worked remotely from different locations across the UK and the USA. Pre-production meetings, rehearsals and performances all took place online, managed through a combination of VR, video conferencing and text and audio messaging. Many of the sessions were intense early morning or late-night events, due to the complexity of distance working across time zones.
Interactive Narrative Design
Mrs Nemo XR aims to engage the audience with the narrative, while giving them the freedom to explore the immersive submarine environment and interact with the performer and each other. Without a traditional framing device, such as a proscenium arch or cinema screen, we rely on alternate ways of directing audience focus, such as guided direction and environmental storytelling. Our audience, represented by avatars of deep-sea divers, were actively encouraged to be part of the story-world through improvised dialogue and narrative song, and through the way the actor directed their attention to interactive props, such as the toy Kraken that could be picked up, moved and resized, and to scenic effects like the sudden and dramatic appearance of the giant sea monster outside the viewing window of the sub.
One of the main technical issues we encountered when using Mozilla Hubs with our Meta Quest 2 (previously Oculus Quest 2) VR headsets was the limited audio capability afforded by this form of Web-based VR (a VR platform that is easily accessible on multiple devices via the Internet). While we did have the capability for spatial audio, we elected instead to have our performer’s voice override this feature, so that she could be clearly audible regardless of her position in relation to the listener. Since we knew that audio and visual latency would be problematic, our actor ran the music tracks from her laptop whilst performing in a VR headset and manipulating her own avatar, thus syncing all her actions at the point of transmission.
The scenic cues, including the change of avatar for Mrs Nemo for the final reveal that she is a mermaid, and the attack of the Kraken, were also subject to delays of operation across the internet, and so were run remotely by the backstage crew to avoid distracting the performer with desynchronised scene changes. This also enabled the production team to make live adjustments during the show when latency of operation necessitated removing used props from previous scenes, or ‘respawning’ set pieces that had entered the scene at the incorrect size or location.
The inspiration for the set design, props, avatars and interactive artifacts was derived both from Verne’s original story and from the writer’s scripted dialogue and lyrics. From the book it was clear that while the Nautilus did have a library and salon with a large viewing window, there was no dedicated space for storing or displaying the Captain’s undersea collection. This led to the idea that the salon should become a museum, its walls filled with books and artifacts relevant to the story. Using Mozilla Hubs’ custom editor, Spoke, which allows developers to create 3D worlds with their own designs or pre-made assets, we built the submarine interior, and using a bespoke plugin Stage Management System, we were able to program cues to make scenic objects, props and avatars appear, animate, and change size and position within the scene.
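The cueing workflow described above can be sketched as a simple operator-driven cue list. This is purely an illustrative sketch — the names below (`Scene`, `Cue`, `CueStack`) are hypothetical and do not reproduce the API of the actual OnBoard Stage Management System plugin:

```python
# Hypothetical sketch of a stage-management cue list: a backstage operator
# steps through cues that spawn, resize or swap scene objects, as in the
# Kraken reveal and the Mrs Nemo avatar change. Not the real OnBoard API.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Scene:
    # Maps object name -> properties (position, scale, visibility).
    objects: dict = field(default_factory=dict)

    def spawn(self, name, position=(0.0, 0.0, 0.0), scale=1.0):
        self.objects[name] = {"position": position, "scale": scale, "visible": True}

    def swap(self, old, new):
        # e.g. replace Mrs Nemo's avatar with the mermaid for the final reveal
        self.objects[new] = self.objects.pop(old)

@dataclass
class Cue:
    label: str
    action: Callable[[Scene], None]

class CueStack:
    def __init__(self, scene, cues):
        self.scene = scene
        self.cues = list(cues)
        self.index = 0

    def go(self):
        """Fire the next cue, as a remote operator would on 'GO'."""
        cue = self.cues[self.index]
        cue.action(self.scene)
        self.index += 1
        return cue.label

# Example show: the Kraken appears at the window, then the mermaid reveal.
scene = Scene()
scene.spawn("mrs_nemo_avatar")
show = CueStack(scene, [
    Cue("Kraken at window", lambda s: s.spawn("kraken", position=(0.0, 2.0, -10.0), scale=8.0)),
    Cue("Mermaid reveal", lambda s: s.swap("mrs_nemo_avatar", "mermaid_avatar")),
])
print(show.go())
print(show.go())
```

Running cues from a separate operator, as in the sketch, is what allowed the crew to re-fire (‘respawn’) a cue when an object arrived at the wrong size or position, without interrupting the performer.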
Working in Mozilla Hubs
In order to perform well across a wide range of devices, Hubs has certain operating limitations, such as the inability to move or ‘spawn’ objects easily, and a maximum scene size of 16MB, which resulted in our immersical taking on a low-poly style of graphics (an art style in which the number of polygons in a 3D model is reduced, giving a simplified appearance that is efficient and easy to optimise). Because of internet latency, we could not accurately predict how the scene would appear on the various viewing devices, which would be subject to differing connections, internet speeds and locations. For example, we aimed to trigger the Kraken’s appearance outside the viewing window as a shocking reveal to the audience, especially for those in VR, who would see a gigantic monster attach itself to the side of the submarine. However, the physical distance between the stage manager in York (UK), the performer in New York (US) and the worldwide audience meant a significant delay between cue and action: some audience members, depending on the efficacy of their own devices and connections, experienced a noticeable lag between what the actor was saying and what the scene was showing.
Set Design and Layout
Our initial design plan for Mrs Nemo XR was to divide the submarine into smaller clustered areas (see Figure 3) to direct audience movement to where the action in the scene was taking place. We particularly wanted to guide the audience towards the window to provide a good view of the Kraken’s sudden appearance outside the submarine. During rehearsals, however, we realised that the audience tended to cluster around the performer and follow her around the scene. This led us to strip back some of the central blocked-out areas to provide a clear line of sight throughout the submarine interior, so that the performer and the action could be seen at all times.
In spite of the technical limitations of the Mozilla Hubs platform, in particular the external cueing system, the low-poly graphics, and the audio/visual latency experienced, the show was well received, and we felt we had achieved our central aim of creating our first fully functioning immersical. Through the rehearsal process and by observing the performances, we established methods of directing audience focus and promoting engagement with the narrative in three key non-verbal ways. Firstly, we used scenic devices, like the viewing window, to frame the attack of the Kraken. Secondly, in spite of internet latency, we were able to sync actions to the music, cueing movement of scenery and interactive props at appropriate times. Thirdly, we found that guided direction from the performer (e.g. gestures, movement, following her gaze, and her use of interactive props) supported audience attention and engagement with the story-world.
Our research continues with a new iteration of the show, where we will be using a different VR platform, and extending the length of the performance, working with innovative techniques for reducing audio latency. We also intend to offer greater user agency and interactivity, and explore different styles of narrative design including non-linear and object-based storytelling. We see a great future for live performance in XR, particularly in terms of ease of access and affordability, and look forward to our next collaboration within the rapidly expanding community of VR theatre-makers and their virtual audiences.
RSC Dream Online: Audience of the Future Live (n.d.) Dream. [online]
RSC Dream Online: Royal Shakespeare Company, Latest Press Releases. [online]
Mrs Nemo XR performance video: #OnBoardXR [SHOW3.3] Live Short Series in WebVR. [online]
Verne, J. Twenty Thousand Leagues Under the Sea (first published in English, 1872). Project Gutenberg. [online]
Mrs Nemo XR Creative Team: Director: David Gochfeld, Writer: Mary Stewart-David, Designer: Daniel Lock, Exec Producer: Cristobal Catalan, 3D Artist: Guy Schofield, Performer: Vivian Belosky
OnBoard Stage Management System developers: Roman Miletitch, David Gochfeld, Clemence Debaig, Michael Morran.
Digital Creativity Labs & Department of Theatre, Film, TV and Interactive Media, University of York