XR Network+ white paper reports on the outcomes of the Royal Shakespeare Company R&D Challenge
News
XR Network+ has published a new white paper reporting on the process and outcomes of the Royal Shakespeare Company (RSC) Research and Development (R&D) Challenge.
Led by the XR Stories team at the University of York, XR Network+ provides funding and support to academic researchers and research technicians working in Virtual Production technologies to drive UK-wide research, development and innovation.
The latest white paper from the project presents the technical approaches, challenges and outcomes of the two R&D Challenge projects:
- Centre for the Analysis of Movement, Entertainment Research and Applications (CAMERA), University of Bath: Making “all the world a stage” with dynamic motion capture solutions for theatre production.
- Centre for Performance, Technology and Equity (PTEQ), Royal Central School of Speech and Drama: Theatre virtually everywhere: A new framework for virtual rehearsal.

Selected following an open call and peer review process, the projects received funding and support from XR Network+ to work with the RSC to explore how Virtual Production technologies could contribute to live performance and provide new theatre experiences for audiences across various platforms.
The paper also explores the role of the XR Stories team as expert intermediary in the R&D Challenge process. It outlines the value of this role in managing innovation to de-risk creative R&D and achieve high-quality, relevant outcomes.
Making “all the world a stage”
CAMERA’s project explored how dynamic motion capture could transform live theatre by making the stage a responsive, interactive environment.
Instead of using motion capture solely for post-production in film or gaming, the team developed real-time solutions that put performers at the centre of the technology. This approach allows actors to influence digital effects live, creating a fluid interplay between physical performance and virtual elements.
“We wanted to put the performer at the heart of this technology. Instead of saying ‘this is how the tech must work,’ we asked: how can it serve the actor and the director? Our goal was to make effects responsive—so when a performer moves or gestures, the world around them reacts instantly. That’s where the magic happens,” says Professor Neill Campbell, CAMERA Director.

“By the end of the sprint, we had a working prototype performance where one performer could control lights, projection and spatial sound all in real time using just eight tiny marker clusters,” says Michelle Wu, CAMERA Studio Engineer.
“Having that full creative + technical team co-located was honestly half the magic,” says Eva Martino, CAMERA Innovation Lead. “We were solving problems, testing ideas and translating languages in real time.”
“It’s been a brilliant collaboration, and I think we’ve opened up a genuinely exciting roadmap for where performer-led interaction can go next.”