Sean Rooney
Mixed reality (MR) refers to emerging technologies and practices that blend the physical and digital worlds, allowing for the creation of new environments in which real and virtual elements co-exist. This integration of real and virtual experience is made possible by advancements in computer vision, modelling, graphical processing, display technologies, input systems, and cloud computing. MR creates a user environment in which digital content and physical reality are combined in a way that allows interaction with and among real-world and virtual objects.
Mixed reality technology already has a wide range of applications across industries. In healthcare, it is used for medical training, patient education, and the visualization of complex procedures. The architecture and engineering fields employ MR for design visualization and collaboration, and the automotive industry uses it for prototyping and assembly line design. In education, MR can enhance learning by providing interactive and immersive experiences. The military and law enforcement use MR for training simulations and situational awareness. The entertainment industry has also taken advantage of mixed reality to create unique and immersive experiences for viewers and players.
When defining their concept ‘mixed reality performance,’ Gabriella Giannachi and Steve Benford turn to early MR theorists Paul Milgram and Fumio Kishino’s reality-virtuality (RV) continuum in order to distinguish between augmented reality (AR) and virtual reality (VR) technologies. Milgram and Kishino identify four categories along their “RV Continuum”:
- unadorned, unaltered reality, consisting solely of real objects.
- augmented reality, where digital and virtual data, images, and objects are superimposed or layered upon the real world.
- augmented virtuality, consisting primarily of virtual spaces with some real objects, images, and data introduced into the virtual world.
- purely virtual, which includes both immersive virtual worlds and those that are only monitor-based, so long as the simulations consist “solely of virtual objects” (N.p.).
While MR is sometimes referred to as a type of AR, it differs from AR in that it enables interactivity between real-world and digital elements, placing it further along the virtuality continuum. Giannachi and Benford point out that mixed reality performances do not always fall into one part of the continuum alone. Instead, “mixed reality performances may simultaneously occupy multiple points along this continuum by combining many real, virtual, augmented reality, and augmented virtuality environments into complex hybrid and distributed performance stages” (3).

The future of mixed reality in theatre is a topic of great interest and speculation in both academic and performing arts communities. MR technology has the potential to revolutionize the way theatre is produced, performed, and consumed, offering new and seemingly endless possibilities for creativity and innovation. One of the most promising aspects of MR in theatre is its ability to create immersive and interactive environments for the audience. The age-old theatrical imperative to create an environment that allows for the suspension of disbelief is here tied to advances in technology; as MR technology improves, it will become increasingly possible to blend the physical and virtual worlds seamlessly, allowing audiences to engage and interact fully with the performance and to participate in scene-building and storytelling.
This level of engagement could potentially break down traditional barriers between performers and spectators, creating a more dynamic and inclusive theatrical experience. Another exciting aspect of MR in theatre is its potential to expand the scope of production design. By creating virtual sets and environments, designers can push the boundaries of traditional stage design and create elaborate and fantastical worlds that would be impossible to build on a physical stage. While MR in theatre is still in its infancy, it has already shown great promise, as seen in the work of Rimini Protokoll. In their show Remote X, audience members are given devices with headsets and guided through a city by a voice that tells them what to do and where to go. The experience is a mix of virtual and real-life interactions, as the audience is given tasks to perform and asked to interact with their surroundings in new and unexpected ways.
An emerging area of technology that holds promise for the development of MR is haptics, in which the senses of touch and proprioception are incorporated into the experience. While visual and auditory sensations have been the primary focus of MR development, haptic feedback is essential to creating a more immersive and realistic experience. Haptic technology allows users to interact with the virtual world through touch, adding a further layer of realism to the experience.
The use of haptic feedback in MR is being explored in a variety of fields to enhance the quality and complexity of the interface with the user. In entertainment, haptic technology is being used to create a more immersive performance experience. D-Box seats with haptic design elements debuted for big-screen cinema in 2009 with the movie Fast & Furious, and home versions are now available that work with both movies and video games. Installation artists like Masayo Ave (Haptic Interface Design Institute, Berlin) are exploring multi-sensory experience using vests that simulate the sensation of water and wind. In 2015, Extant Theatre Company, a company run by visually impaired (VI) artists, debuted Flatland, an immersive theatre experience that took place in a pitch-black environment and used haptic sensory augmentation technology both to facilitate audience navigation and to “level the sensory abilities” of VI and sighted participants. As haptic technology develops and becomes more affordable, its potential applications in MR theatre will expand.
Works Cited
Benford, S., & Giannachi, G. (2011). Performing mixed reality. MIT Press.
Milgram Paul, and Fumio Kishino. “A Taxonomy of Mixed Reality Visual Displays.” Dec. 1994, https://search.ieice.org/bin/summary.php?id=e77-d_12_1321.