Virtual Events: Adventures into Extended Reality

Example of a virtual event agency in our XR studio

Over the last few weeks, we have been working on a really challenging and exciting new project. Our colleague, Nick, wasn’t able to work from home, so was self-isolating in our studio in Bristol. This gave us a wonderful opportunity to conduct some R&D and explore the world of virtual events. We got our hands on Stype’s excellent ‘RedSpy’ camera tracking system for Nick to tinker with. This is the first, and most important, part of a complicated integration of technology to create content in an Extended Reality (XR) environment.

XR is, in essence, a mix of all the ‘R’s: VR (Virtual Reality), AR (Augmented Reality) and MR (Mixed Reality). The basic premise is that you can have a presenter or actor on a sound stage and build an extended reality around them that places them in any environment. It might sound quite similar to blue or green screen shoots, but it is very different.

When we shoot on a green screen, post-production is a time-consuming process. We take the footage from the green screen, bring it into the graphics and animation studio, then key out (cut out) the subject. We then meticulously build a world in 3D or After Effects and composite the subject into it. Finally, we render the whole thing out to create a film.

But the key difference with XR is ‘Real Time’.

Technological advancements mean that computers are exponentially faster and, in turn, the software that runs on them is far more capable. By investing in the best hardware and software, everything can happen in real time, which revolutionises shoots. Actors can interact with their environments: the content can be shown on large LED walls and ceilings and move in parallax in relation to the camera. The lighting from the LED walls and ceiling lights the subject, and everything is caught ‘in camera’. This has already been a game-changer for the film and TV industry in shows like The Mandalorian.
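For readers curious about why tracked camera moves make the virtual environment feel real, here is a minimal sketch, assuming a simple pinhole-camera model (this is an illustration only, not the actual software used on our shoots): when the camera slides sideways, near objects shift across the frame far more than distant ones, and that difference is the parallax the real-time engine reproduces.

```python
# Minimal pinhole-camera sketch: why a tracked camera move produces
# parallax between a near prop and a far backdrop in a virtual scene.

def project_x(point_x, point_depth, camera_x, focal_length=1.0):
    """Horizontal screen position of a point, for a camera at camera_x."""
    return focal_length * (point_x - camera_x) / point_depth

near_depth, far_depth = 2.0, 20.0  # a prop 2m away, a backdrop 20m away

# The tracking system reports the camera sliding 1 unit to the right.
near_shift = project_x(0.0, near_depth, 1.0) - project_x(0.0, near_depth, 0.0)
far_shift = project_x(0.0, far_depth, 1.0) - project_x(0.0, far_depth, 0.0)

# The near prop moves ten times further across the frame than the
# backdrop -- the real-time engine redraws both every frame to match.
print(near_shift, far_shift)  # -0.5 vs -0.05
```

The real-time software does exactly this, continuously: it feeds the tracked camera position into a virtual camera and re-renders the whole scene every frame, so every object shifts by the right amount for its depth.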

What we want to do is apply this technology to the world of virtual events.

At our studio in Bristol, we have our own green screen. We got hold of camera tracking hardware and software from Stype, which allowed the camera to track in a 3D space. We then ran the environments through 3D Real-Time software such as Unreal Engine or Notch and then ‘projected’ the environments in virtual space.

Last week we decided to show it all off and conducted a socially distanced shoot in our studio. We hired in long cables, record decks and other kit, and Nick moved upstairs to operate a complex mix of TouchDesigner and Notch, along with an array of other boxes to handle keying, recording and more.

This enabled our founder, Steve, to present downstairs, with just him and a cameraman keeping a safe distance. We hired in a dolly track to move the camera smoothly and show off the camera tracking. This ‘awards ceremony’ look shows it working well.

We can put a presenter in any environment and ‘beam in’ anyone from around the world to interact with them.

As the environment changes, the lighting on the subject changes, making it feel as if the presenter is really there, in the virtual space. 

We can put them anywhere and they can present from any environment. We tried out a conference stage, a football stadium and the Hollywood Bowl in LA. 

You can see the camera tracking working well here. As the camera moves around Steve, the content changes in the real-time software to ‘swing’ the stadium around him. 

All this footage was captured natively in-camera with no post-production. These are the raw clips.

The next stage of the process is to shoot against LED.

In the coming weeks, we’re going to apply this to a purpose-built studio with LED corner walls and floor. This will enable the presenter to see exactly what is around them. So, for example, if we have a panel of Zoom attendees, the presenter can see them on the LED wall and interact directly with them. The added benefit is that the LED wall lights the presenter. So, if the LED wall is showing a desert scene of bright yellows and oranges, that light will fall on their face, body, props and any sets built on the stage. It also means that if they are wearing glasses, or there is a shiny object on set, like a car, the LED content will reflect in those surfaces in real time. In a green screen shoot, you would only see green.

We are extremely excited to bring the latest technology of Hollywood movies to the world of events. Our clients demand the highest level of production value of us. If you are interested in virtual events, please don’t hesitate to get in touch with me on 07711131116 or jon@studiogiggle.co.uk.

