Virtual Production Studios

 

 

Our 8th webcast where Sarah discusses Virtual Production Studios with our friends over at Digital2Physical.

 

Discussion Topics:

– What is Virtual Production

– Equipment and Software

– Virtual set vs. Augmented reality

– Fun and challenging projects from previous work

 

Sarah Marince Host @ www.sarahmarince.com

Evan Glantz Owner & Creative Director @ https://www.digital2physical.net/

Keith Anderson Lead Animator @ https://www.digital2physical.net/

Josh Spodick Technical Director @ https://www.digital2physical.net/

 

Sarah Marince:

Hello, everyone. Happy Wednesday. Welcome to Crew Talk, brought to you by shoots.video. I’m Sarah Marince and I will be your host this evening. Today we are talking about virtual production, and we have Josh, Keith, and Evan here to tell you all about it. As always, I have my list of questions, and if anything pops into your head as you’re listening this evening, feel free to drop it in the Q&A box and we’ll try to answer it during our conversation, or maybe a little later. Feel free to ask or comment on anything you hear today. All right, everyone. Thank you so much to our panelists for being here. I’m excited to learn all about virtual production. So I guess we should just jump right in, if that’s okay with all of you.

Sarah Marince:

So, other than being the most recent buzzword we all seem to be hearing, what is virtual production, and what can it do to spice things up?

Evan Glantz:

Sure, I’ll jump in and answer that one. Virtual production lets you use traditional filming techniques, paired with camera tracking technology, to bring in realistic sets and abstract concepts that you would never have been able to build in the real world without a super massive budget. It also allows for interactive infographics and presentations, and it incorporates the entire spectrum of augmented reality as well.

Sarah Marince:
Anyone else who wants to tag onto anything, by the way, feel free to go ahead.

Keith Anderson:

Yeah, Evan kind of covered the basics, but really, virtual production is any time you’re bringing virtual graphics, digital graphics, into a real live production, and you’re not just doing things in post but actually compositing it all together at once. It could be anything from The Weather Channel overlaying the weather for your city on a green screen while the weatherman points at it, all the way up to high-end CGI in Hollywood. Or it could be as simple as me being here on a Zoom chat with a green screen background, without using a green screen.

Sarah Marince:
What are some of the most popular software packages used, and where can viewers start learning?

Keith Anderson:

There’s definitely a spectrum of stuff to use these days. Probably the most popular software packages are the Unreal and Unity game engines. Notch definitely gets used as well, especially in the live entertainment sector. They all have their specific strengths and weaknesses. For example, Unreal and Unity are both free, so that’s obviously very attractive to most people, but they

Keith Anderson:

take a lot more knowledge and expertise to deliver your final project and get to that end product. Something like Notch is really more aimed at that kind of project, versus a game engine that could be used for a lot of other things, like making a full video game. Notch can be a lot more straightforward for getting to the final end product, but it comes at a much higher price tag, and especially when you start looking at implementing it with a media server like d3, or disguise as it’s now called, you’re definitely getting into a much higher budget that may or may not be accessible to people who just want to get their feet wet. Some other higher-end options are Pixotope and Zero Density, which are basically specialized versions of Unreal that make things like camera tracking, calibration, and keying your talent much easier to manage.

Keith Anderson:

And you don’t necessarily need all the knowledge you would need if you were running Unreal straight out of the box. Like I mentioned before, even something like the OBS streaming software could be used by new users to practice keying out talent and inserting them over a background. Obviously it doesn’t have all the bells and whistles, but for somebody who just wants to play around with green screen, something like OBS could definitely be used. And to add to that, everything I just mentioned is really your final rendering engine; you’re still going to need to create the environments and assets with traditional content pipelines, like Cinema 4D, Blender, Maya, and other 3D modeling software. So there’s still a lot of need for traditional content creation. For people who want to get involved, most of the software I mentioned has a free or trial version. I usually recommend people start learning with whatever they have access to, or whatever excites them, and see where it takes them. There’s not really a one-size-fits-all answer. But the good news is there are tons of new resources, with a lot of new material coming out every day for this field, just in the past couple of months. The virtual production community is definitely providing a lot of info for people who want to get involved.
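[Editor's note: the keying Keith describes can be prototyped without any broadcast gear. As a rough illustration (not any specific OBS feature), here is a minimal NumPy chroma key that replaces green-dominant pixels with a background image; the threshold and images are made up for the example.]

```python
import numpy as np

def chroma_key(frame, background, dominance=30):
    """Replace green-dominant pixels of `frame` with `background`.

    A pixel is keyed out when its green channel exceeds both red and
    blue by more than `dominance`. This is a crude stand-in for a real
    keyer, which would also handle spill suppression and soft edges.
    """
    frame = frame.astype(np.int16)  # avoid uint8 wraparound in subtraction
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (g - r > dominance) & (g - b > dominance)
    out = frame.copy()
    out[mask] = background[mask]
    return out.astype(np.uint8)

# Tiny synthetic example: a 2x2 "shot" with two green-screen pixels.
shot = np.array([[[200, 40, 30], [20, 220, 25]],
                 [[90, 90, 90], [10, 230, 15]]], dtype=np.uint8)
sky = np.full((2, 2, 3), [100, 150, 255], dtype=np.uint8)
composite = chroma_key(shot, sky)
```

The same idea scales directly to webcam frames: only the array shape changes.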

Josh Spodick:

Wonderful. Since a lot of us are home a lot during this time, it’s a great opportunity to take advantage of all that free content being produced right now. Most of us here come from the live events industry; I’m formerly from concerts, touring, large music festivals, and large corporate events, so I’d be spending 10 to 12 months a year on a tour bus or in ballrooms or venues. Now, being home at my computer all day, it’s an opportunity to use a lot of this free software, like Unreal, and to contact a lot of these companies and say: what can we make? What content can we create now that we’re home on our computers? How can we spice this up?

Sarah Marince:

Nice. All right. So in the most basic of setups, what kind of equipment is needed to at least learn and develop virtual scenes?

Josh Spodick:

So honestly, first and foremost, it requires a good computer; that’s where most of the initial investment is going to go. A powerful gaming PC with a good dedicated GPU is really all you need. The better the camera you have, the better your image quality will be: you can do something as simple as a normal little webcam, or you could go up to a full cinema camera and pipe it in with capture cards into your computer. For backgrounds, you can start as simple as a scrap piece of fabric from Walmart; I’m actually in front of a $4 piece of green fabric from Walmart right now, with a couple of cheap lights around me, just to get this set up and running so I could actually move around in this virtual space. All you really need to make that happen is this cheap fabric, a light, a webcam, and a computer.

Josh Spodick:

Of course you can go up and up from there; the more you have, the better it’s going to look, as long as you always pay attention to the fundamentals. Other than green screen, you can also do large LED wall backdrops, and we’ll talk later in the webinar today about the differences between green screen and LED walls. But you can start low and work your way up to bigger and bigger things. First and foremost: a computer with a dedicated graphics card, some kind of camera, and a piece of green fabric are the basics to start.

Sarah Marince:

Okay. And somebody actually asked, as soon as you started speaking, whether all of you are sitting in front of a green screen. You mentioned you’re sitting in front of the fabric, but Evan and Keith, are you also sitting in front of green screens?

Keith Anderson:

No, I do not have a green screen set up at my house; I’m just using the Zoom virtual background. We had originally planned to do this in our studio with a bigger green screen setup, where we’d all be in one space together, but due to COVID and other complications we’re all at home today.

Josh Spodick:

So sadly I may have been exposed two weeks ago. No symptoms, I’m all good, but just to keep everybody safe, we decided to stay home for a couple of weeks.

Sarah Marince:
Okay. That was so cool when you just panned around. That was really cool.

Josh Spodick:
And this is what you could do with a cheap four dollar piece of fabric from Walmart.

Sarah Marince:
Very nice. So virtual set versus augmented reality, what is the difference?

Josh Spodick:

So for a virtual set, the things you really need are a green screen or LED, and everything in the environment around you is computer generated. Augmented reality combines a standard camera shot in a real physical environment with interactive or scenic elements that align with the real world. Usually a lot of camera alignment and camera tracking is needed, but the point is that you can augment the actual reality with any kind of computer-generated element you need. Of course you can combine the two for mixed reality, and the buzzword that’s been going around is extended reality, which encompasses all the different things one could do with this stuff. But basically: a virtual set is a set that can be entirely computer generated, while in augmented reality just a couple of elements are computer generated. Those computer-generated things could be set pieces, 3D elements that the talent actually interacts with, or graphs, charts, and data that you populate in real time.

Sarah Marince:
That’s very interesting.

Evan Glantz:

Yeah. The way I like to phrase it is: VR is an entirely virtual environment. AR is the physical environment layered with graphics. Mixed reality is taking a person and bringing them into a virtual world where they can interact through the lens of the camera. And extended reality is the bucket we’ve decided to put all of those things in, plus the future things, because who needs more two-letter acronyms making this more and more complicated? So XR, extended reality, is kind of the catch-all for things moving forward.

Sarah Marince:
So can you give me some examples of places where I may have seen virtual production in use?

Evan Glantz:

Absolutely. Josh has got some slides to catch your eye. So the big one, the real pusher of this technology in the industry, is going to be the,

Sarah Marince:
I’m sorry, can you repeat that?

Evan Glantz:

Oh, the Mandalorian. Oh yes. Okay. On Disney Plus. Their team has really been a huge driving force in the use of LED screens for virtual production. In some of these pictures here, their talent is actually in what’s called an LED cave, or volume; I believe they call it the Volume. It’s LED screens, ROE Black Pearl panels at a 2.6-millimeter pitch I believe, in almost full 360 degrees, with a ceiling of LED panels on top. They’re able to use the perspective and shooting angle of the camera to drive the environments in the background, as well as having the entire environment actually light their talent. So, as you can see from some of these pictures, they don’t have a ton of lighting in the space, because the talent is essentially lit by the environment itself. You may have also seen this in the past: Minority Report, going back a little bit, used it, as did Oblivion, using projection screens rather than LED.

Evan Glantz:

But some other cases where you may have seen augmented reality: the big one is The Weather Channel. They worked with Pixotope to launch this new version of their channel, which allows them to do not only the traditional things they used to, but also what they call destructive environments, where they’re able to take the entire studio and actually put it through simulated weather conditions.

Sarah Marince:

It’s wild. Last year, as we were all preparing for the hurricane and whatnot, it looked like there was a house there being unfortunately destroyed by the hurricane, showing what was happening. It’s like, that’s crazy.

Evan Glantz:

Yeah. It’s driven by Unreal Engine and Pixotope, which allows them to, one, recreate that entire environment, but also build it out and use it in real time with generative calls for information, letting them drive these studios to do really incredible things. A big part of it is being able to demonstrate, in proper scale, what these disasters look like; that’s one of the big values of augmented reality, that you actually have things in scale with the people that are in there. Another great example of augmented reality, also done with Pixotope, is the Super Bowl, if you watched the Super Bowl last year, as well as the halftime show. That was all driven the same way.

Evan Glantz:

Yeah, the memorial halftime show, the all-time team, the 100-year Hall of Fame team or something like that: all of that was driven and created within Pixotope. What you’re seeing on screen is being shot live through the real camera, with those graphics inserted in real time, and they can actually cut to different cameras and the graphics will still just be floating there, almost like a holographic effect. If we go back to the previous picture: one of the elements that really helps blur the line, where that crossover happens, is that if you look down at the actual field, the numbers and the field are manipulated as well. They do that by creating a model of the space and overlaying it directly onto the real world, tracked to the exact same positioning.

Evan Glantz:

So when they manipulate the virtual model, or put certain textural animations on it, and display it through this technology, it actually looks like the field itself is coming apart and graphics are coming up out of the ground, and it actually passes that lighting pass, those shadows. It’s so cool.

Keith Anderson:

Another good example: last year, I know the NFL did a flyover at a Baltimore Ravens game, where a giant digital raven flew over the crowd. Of course the people at the game don’t see it unless they look at the big screen, but everybody watching at home sees this virtual raven fly over the stadium. Also, I don’t know if any of you have seen the advertisements for the upcoming Major League Baseball season, but they’re showing their new fan system.

Keith Anderson:

It’s going to have virtual fans in the empty stadiums. The fans can do the wave, and they can change out the jerseys and things like that, and all of those fans are virtually inserted into the scene using Pixotope as well. And we’re not only limited to sports broadcast and that large-scale broadcast environment. This stuff was also being implemented, in pre-pandemic times, in live concerts. Specifically, one of the K-pop groups, BTS, was heavily utilizing augmented reality and digital projection within their live show. So as the audience watching the show, when you look up at the IMAG screens, you see these elements that the performers are actually interacting with, moving around. It all happens in real time; it can be done on the spot, right there, and it enhances the experience both for the audience at home and, in this particular instance, the audience in the venue.

Sarah Marince:

Wow. So you have to be moving fast if you’re the person there at the concert doing all of the graphics and everything like that.

Josh Spodick:
Well, it’s a sensory overload to be sure. Yeah.

Sarah Marince:

Yes. So what are some ways in which you see this technology becoming more widely adopted and available?

Keith Anderson:

Definitely in the corporate space: meetings, conferences, and conventions are all shifting to a virtual format for the somewhat foreseeable future. Let’s hope not too long, because as much as we love it, obviously we want to get back to doing events in person and being able to see people in person. But CES just announced they’re not doing their show in Vegas this year; they’re going to do a fully virtual format. Virtual concerts and festivals are becoming incredibly popular. Obviously, things like Coachella have been doing live streaming for years now, but now they’re taking it to the next level, and you experience the full festival digitally. Tomorrowland just did a big event this last week; they’re based in Europe, and they had, I think, six stages and over a million paid viewers. So that’s really setting the bar for live music events in terms of monetization

Keith Anderson:

as well as viewers. Their production was very high level; if you watched it, it almost felt like you were watching a stream from the real thing. They had a virtual crowd, they had pyro, they had the crowd cheering when the music was getting exciting, stuff like that. And I think as the economic and societal impacts of the current pandemic continue, we’ll only see these platforms continue to expand. Additionally, I think the education sector is going to receive a huge boost from virtual production techniques. Augmented reality graphics can already play a huge part, especially in medical training, but I can see that expanding into things like chemistry, math, and biology, everyday classes in grade school,

Keith Anderson:

elementary school, high school. You can imagine taking your [inaudible] graphing calculator from seventh grade, using that in 3D, punching in the numbers, and seeing the graph drawn all around you; it could really expand the way people learn. Another example that came to mind was dissecting a digital frog: you don’t actually have to cut up the frog anymore, which was maybe some people’s favorite day of class and other people’s least favorite. I think another great example would be a history professor walking through an actual, historically accurate model of an environment, maybe a place that doesn’t exist anymore, that has been recreated by artists, and being able to give their history lesson in the space they’re talking about. You can really take your field trips back in time.

Sarah Marince:

That might be one way that I would want to go back to school. That may be like the only thing that would make me want to actually go back to school is something like that.

Keith Anderson:

Well, you don’t have to go back to school to get those kinds of experiences; there are already VR apps that let you do that kind of thing. But yeah, I think for a lot of people that would make school a lot more interesting. Some people are just such visual learners, and staring at a textbook full of math all day is just not for most people, I think. But if you’re seeing things happening in a 3D space around you... I think especially the current generation of kids growing up today, who are used to playing with iPads and iPhones at a very young age, are going to be bored to death when they get to school and a textbook is thrown in front of them. So I think finding unique ways to integrate graphics could really keep people engaged and create more of an interactive experience too.

Sarah Marince:

I absolutely agree. So, can you discuss the pros and cons of, and differences between, green screen shoots and LED wall virtual production?

Evan Glantz:

Absolutely. So with green screen, you’ve probably all seen some of the pictures from Game of Thrones of the actress petting the green stand-in that gets replaced with the dragon. You basically have to plan things out a lot differently, including eyelines and talent interactivity, or you get the weatherman effect: "if you look over here at Tulsa," as they wave at the wrong city. A lot of that goes away with LED, because the talent, the director, everyone involved in the shoot can see the entire environment in real time, especially when you’re using versions of a stage where you actually have an LED floor as well. Josh, can you go to slide three? I think it is

Evan Glantz:

It might be the Katy Perry one. Yeah, that one. A lot of people probably saw this; this was the Katy Perry music video that she put out right at the beginning of the pandemic, which debuted on American Idol or something like that. This is a great example of extended reality production where they used an LED wall, not a green screen. If you look at this little diagram and you watch the video, you can see where the edges of the LED wall are, and you can see her, as the talent, navigate the shoot. She had to stand in very specific places, but without having them pre-choreographed and marked on the ground, and without going through the entire show a bunch of times to make sure she got it perfect.

Evan Glantz:

You know, they say "stand on the cloud" when the room breaks away, and she just physically looks down and steps onto the cloud. It makes things like that a lot easier for production. This one actually used augmented reality as well, where they were able to track some of her positioning, so the virtual elements could work within the scope of the shot. And the other really big thing about using the LED wall is how it creates the environment: with LED, the talent can interact much more naturally with their environment, because they can see what’s around them in real time. That just makes it much easier for untrained talent. Like, if you have a corporate executive doing a product demo, they’re not trained on how to act on a green screen; that’s not their expertise, right?

Keith Anderson:

So putting them in front of a green screen is just asking for something weird to happen, whereas in front of an LED wall they can look and see exactly what they’re working with, or point at the exact part of the product they want to talk about. There’s really just no guessing about what to look at or what you’re seeing. Another huge advantage of LED for cinematographers is getting all those juicy reflections and lighting off of your talent. So if you have your hero looking off into the sunset wearing sunglasses, you actually get the reflection of the sun off his sunglasses, and you don’t have to go into postproduction and ask your compositors to try to fake the reflections you should be seeing. You get those reflections because there’s physically light being cast by the LED. And the director gets that in real time: you can be looking through the preview monitor and say, you know what, I need the sun a little bit brighter. If you look at the animation kind

Evan Glantz:

of playing in Keith’s background, that’s actually an environment that he created, where he built the volumetric clouds. So say the director says, you know what, I want it more cloudy, or less cloudy, or more light on his face. It’s really easy to make those kinds of adjustments on the fly.

Keith Anderson:

Yeah, you can. And that’s an advantage of doing this over shooting on location: you can make adjustments to your environment and the weather, which you can obviously never do in reality. Some studios have to plan their shoots so far in advance and just hope they get the right weather for that day, or they have to book a bunch of extra time on location on the other side of the planet with a whole camera crew, just hoping they get the right weather in however many days they have to shoot. That’s a thing of the past now; you can go from day to night, rainy to sunny, with a few clicks of a button, if your scene is programmed correctly. And the Mandalorian, if you’ve seen it, is a great example of the reflections: if you pay attention to the helmet the hero wears, all those reflections are real time, all happening in camera. If you tried to shoot that in front of a green screen and then asked your compositing team to go back and edit all the correct reflections in, that’s just a nightmare for the compositors, and you’re adding a ton of overhead to the final product.

Evan Glantz:

One of the best ways to say it is that you have to do a bit more work in preproduction; if anything, most of the work is now in preproduction, as opposed to taking it all into post. But this type of technology is going to allow indie-level studios to produce actual Hollywood-quality productions at a third, if not a fifth, if not a hundredth of the cost, just because you can really have those dynamic differences. The game engines and the assets available out there let you develop incredibly high-quality scenes, so you don’t need to build an entire world out of physical props; you just build it in Unreal.

Keith Anderson:

You don’t need to fly a whole team to Egypt to shoot in the desert. That being said, there are a lot of challenges with LED walls. If your hardware doesn’t have genlocking, you’ll inevitably end up with scan lines in your shot. It’s the same thing as when you point your cell phone at your TV and try to take a video of it; especially back in the day it was a very big problem, where any time you tried to take a photograph of a TV you’d see the lines going across it. That’s still a problem if you don’t have genlock. Genlock makes sure that your camera and your graphics are both at the exact same frame rate, so every single time the shutter clicks, the new frame is popping up, and that’s really necessary to shoot in front of LED.
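[Editor's note: the scan-line problem Keith describes comes down to a beat frequency. If the camera shutter and the wall's refresh are even slightly off, the tear line drifts through the frame at the difference of the two rates. A quick back-of-the-envelope sketch, with illustrative numbers not taken from any specific rig:]

```python
def tear_drift_rate(camera_fps: float, wall_refresh_hz: float) -> float:
    """Beat frequency between camera and LED wall, in cycles per second.

    This is the rate at which a tear/scan line appears to roll through
    the frame. Genlock forces the two rates (and their phase) to match,
    so the beat frequency becomes zero and the line disappears.
    """
    return abs(camera_fps - wall_refresh_hz)

# A camera free-running at 59.97 fps against a wall refreshing at 60 Hz:
beat = tear_drift_rate(59.97, 60.0)   # ~0.03 cycles per second
seconds_per_roll = 1.0 / beat         # one full roll roughly every 33 s

# With genlock, both devices share one reference clock:
locked = tear_drift_rate(60.0, 60.0)  # 0.0, so no visible roll
```

The tiny 0.03 Hz difference is why the artifact is so insidious: the shot can look clean for many seconds before the line creeps into view.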

Keith Anderson:

So there are a lot of situations where green screen still holds an advantage for technical reasons. I’d say one of the bigger advantages of green screen is the upfront cost, if you do have the budget for VFX artists, or a smaller visual effects team that can handle that work later on. Also, say there’s something you want to change a little bit in the scenic or the lighting once you’re further down the line: with green screen you can of course go into your compositing software and edit the virtual world. Whereas everything with LED is captured in camera, so if you want to change something, you have to go reshoot it, just like you would with a normal production workflow. With green screen productions, we usually suggest recording both the green screen plate and the virtual production composite, so that you can go back and redo things.

Sarah Marince:

Okay, that was very good information. And now I would like to talk about relating virtual production to more traditional roles on a film set. Could you talk about scenic design and set carpentry as they relate to virtual production?

Keith Anderson:

Sure, sure. Obviously, we now have the virtual world versus building a real set. Your scenic designer and carpenters are now kind of replaced, or augmented, by a 3D modeler, a texture artist, and lighting artists who create your virtual environment. Set painting is now texturing your digital models. And as I mentioned a few minutes ago, you now have an almost limitless ability to modify your props, your environment, and your weather on the fly, which is just something you can’t do on location. That being said, in my opinion the best productions integrate real scenic elements in the foreground, actual set pieces, with similar digital elements in the virtual background. That creates a cohesive look where it becomes incredibly difficult for the viewer to tell what’s real and what’s fake.

Keith Anderson:

The Weather Channel examples are a great example of that: they’re shooting in a real studio, and as the lightning flashes, they actually switch out parts of the real studio for the virtual recreation of the studio. It looks so similar that you would never guess; if you go back and watch it in slow motion, and you know what to look for, you can see it. Good set and scenic design means integrating both the real props and the virtual props. I’ve seen some good shots coming out of the virtual production scene lately where they have a real car in front of an LED wall; the car is not moving, but the graphics on the LED wall are flying by, so it looks like the car is moving, and they’re getting all those reflections in the glass. So there’s a lot to think about in how you build your set and design your props, not only virtually but also in the real world to shoot in camera, and how they’re going to work together to best integrate with and complement each other.

Josh Spodick:

Can you talk about lighting in virtual production and the role of the gaffer in digital space? So, just like before, the role of the gaffer is to light the set, but now, instead of placing a bunch of lights around a physical environment, it’s digital lights within the virtual world. You do always have to make sure that the talent is, one, properly lit for exposure, but also lit to match the environment; whether you’re on LED or green screen, you want to make sure your talent always looks like they belong in that space. With green screen, it’s a matter of setting up your normal lighting on your talent just as you usually would, color correcting everything the way you want, color balancing, lighting any moods and tones, or even atmosphere if you want, as well as giving a nice, even key on the green screen itself

Josh Spodick:

so it’s easier to key out in post. If you’re doing LED, there are a lot of different ways you could do it. You could do it from a traditional lighting standpoint, lighting your talent so they look color-matched to the set itself. Or you could actually take elements of the video from the virtual world and map those to either high-power video tiles or RGB-controlled lighting units, so that your color temperatures and color tones automatically match your real-world space whenever you move the virtual set around. One of the cool things Keith and Evan mentioned before was about the weather and the environment. A lot of the time, your gaffer and your grip team on set would have to work with the sun if it’s an exterior shot, controlling and modifying any lighting coming from the sun, or coming through windows if it’s an interior. Now, if the sun’s not in the right spot for the set, you can literally go into the software and spin the sky to make sure the sun is exactly where you want it to be, and that it matches the orientation of your physical lights.
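[Editor's note: Josh's idea of driving RGB fixtures from the virtual world can be sketched very simply: sample the rendered frame (or a region of it, such as the sky) and convert the average color into a fixture level. The 0-255 output here is a generic illustration, not tied to any particular console or media server.]

```python
import numpy as np

def fixture_color_from_frame(frame: np.ndarray, region=None) -> tuple:
    """Average the RGB of a frame region into one 0-255 fixture color.

    `frame` is an (H, W, 3) uint8 render of the virtual environment;
    `region` is an optional (row_slice, col_slice), e.g. the top of the
    frame where the sky is. Real systems do this per fixture with many
    sample regions; one average is enough to show the idea.
    """
    if region is not None:
        frame = frame[region]
    mean = frame.reshape(-1, 3).mean(axis=0)
    return tuple(int(round(c)) for c in mean)

# A render whose top half is warm sunset orange and bottom half grey ground:
render = np.zeros((4, 4, 3), dtype=np.uint8)
render[:2] = [250, 120, 40]   # sky
render[2:] = [60, 60, 60]     # ground
ambient = fixture_color_from_frame(render, region=(slice(0, 2), slice(None)))
```

Re-running this every frame is what keeps practical lights in step with the virtual sun as it is spun around the sky.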

Keith Anderson:

And all that. A lot of productions are giving those controls to the directors, just on a touch screen. And then you can just, you know, put the sun over here, make it nighttime or whatever, you know, increase the stars, et cetera.

Josh Spodick:

It kind of reminds me a lot of that Truman Show thing at the end of the movie, when they're like, "Hey, let's make it nighttime." Okay, sure — slide a slider across, and all of a sudden it's nighttime. It's pretty cool, pretty futuristic, but it's what can actively be done now. Also, with those things, it's a lot easier now for continuity of lighting on set. Since you're no longer running against golden hour or blue hour, you're no longer chasing the sun or any environmental lighting — you have constant control. It's much easier if you've got to do multiple takes and your timing doesn't work out exactly as planned. And there are also some systems out there, such as disguise and a few others, that allow the digital world to be mapped to lighting

Josh Spodick:

so it's a little bit easier to control. You don't actually need a whole lighting program on site for every little thing — you kind of map video to it, and it matches what you're seeing.
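At its simplest, the video-to-lighting mapping Josh describes is "average a patch of the rendered frame and drive an RGB fixture with it." This little Python sketch assumes a generic 8-bit RGB fixture and invented pixel values; real systems like disguise do this with calibrated, per-fixture pipelines:

```python
def frame_to_dmx(pixels):
    # Average a patch of the rendered virtual set and emit one 8-bit RGB
    # triple -- e.g. for an RGB wash fixture -- so the physical fill tone
    # follows the virtual environment as the set moves or the sky spins.
    n = len(pixels)
    avg = (sum(p[c] for p in pixels) / n for c in range(3))
    return tuple(min(255, max(0, round(v))) for v in avg)

# Toy patch: a warm "sunset" corner of the virtual frame.
patch = [(250, 120, 40), (240, 110, 30), (230, 100, 50), (240, 110, 40)]
assert frame_to_dmx(patch) == (240, 110, 40)
```

Re-running this every frame is what keeps the physical ambience tracking the virtual one when the director spins the sky.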

Sarah Marince:
Okay. What about camera possibilities and things to look out for?

Evan Glantz:

So there's kind of a range. If you start at the simplest point of entry, it's going to be using 2D planes inside of virtual sets — similar to what Josh is doing right now, where he is a 2D plane that's being placed into that 3D world. And then you can actually do 3D camera movements within that world, and as long as you don't go too far to one side or the other — and Josh starts to lose a little bit of weight, which is always a good thing — you can keep him still mapped within that world. The next step up from there is actually tracking the movement of the camera. The entry-level version of doing it, just within basic Unreal, is going to be using something like a Vive tracker. In this shot right here, we have a camera set up — that's our plane that's capturing our talent — and then we actually had a Vive controller on a little mini jib rig, so that we could move the tracker around and it acted as our virtual camera. Actually, do you want to jump to the DJ video real quick? If that plays smoothly —

Evan Glantz:

So this was the setup: we actually had virtual lights controlled within that same environment. We had our DJ on our green screen, and that's what it looks like in the virtual world. We're able to fly the camera around within that world, and as long as we don't go too far to one side or the other of the DJ, we can still fly all around and light him how we want to light him. This shot right here is actually us moving the Vive tracker around, getting a little bit more of that live-camera feel of a person's hand on it — not a preprogrammed animation fly-through — which kind of shows a little bit of the difference. And then you can go all the way up to things like Stype or Mo-Sys, where you're tracking everything from the positioning of the camera to the lens shift, focus, and zoom. Every part of the camera and lens — its action and location — is actually tracked and piped into either Unreal Engine, Pixotope, Zero Density, any of those types of software. And you can get much more dynamic moving shots that have the talent

Keith Anderson:

— you know, the actual camera being able to fly all the way around the talent, turning and shifting focus and zoom levels, while still keeping that integration between the talent and the real world, as well as the virtual camera shot, the same. So you don't wind up with something weird where the talent's in focus, but so is something way back there, you know? So yeah, that whole integration is really the final step.
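For a rough sense of why tracked camera moves sell depth: under a simple pinhole model, near objects sweep across the screen more than far ones as the tracked camera translates — that's the parallax Keith mentions. This 1D Python sketch is illustrative only; real trackers like Stype or Mo-Sys also deliver rotation and lens data:

```python
def project(point, cam_x, focal=1000.0):
    # 1D pinhole projection: world point (x, depth) -> screen x-coordinate
    # for a camera at cam_x looking down +z. A tracking system feeds cam_x
    # (plus rotation and lens data) to the engine every frame.
    x, z = point
    return focal * (x - cam_x) / z

near, far = (0.0, 2.0), (0.0, 20.0)   # same world x, different depth
# Slide the tracked camera 0.1 units to the right:
shift_near = project(near, 0.0) - project(near, 0.1)
shift_far = project(far, 0.0) - project(far, 0.1)
assert shift_near > shift_far > 0     # near objects sweep farther: parallax
```

With an untracked, static camera both shifts are zero — which is exactly why untracked virtual sets "feel a little flat."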

Sarah Marince:

That looks really cool. The video helped — you see him in front of the green screen, and then what it looks like with all the lights and everything. That's awesome. Are there any special items that previously could not have been done, like ways to make the environment or set interactive?

Keith Anderson:

It's always growing. You know, the traditional stuff you think about is, like, overlays, lower thirds, static 2D backgrounds, small infographics, stuff like that — stuff that pops up on the news channel or whatever. But now, with the camera tracking and talent tracking, you can really turn those kinds of flat graphics — or graphs, or whatever kind of data you're plugging in — into 3D elements that you can actually, truly interact with. You can move around them, you can reference them in a much more natural way, and the viewer will perceive the space with more, like, true depth, because of the way the camera can move around something and you get that actual parallax movement. It could be anything — a stock ticker, a Twitter feed, poll results, et cetera. Financial data could all be fed in real time to feed the graphics on screen.

Keith Anderson:

So you could have, you know, sports scores populating a graph or something as they're coming in. And I think the biggest thing is really just the graphic fidelity and overall quality and photorealism that you can get now — it's approaching almost impossible to tell what's real and what's virtual. Whereas even a few years ago, the hardware as well as the software just didn't quite have those kinds of capabilities unless you were, like, a Hollywood studio with a render farm. Now there are tools out there where somebody who is very new to all of this can be creating photorealistic environments that really sell, you know, to the final viewer.
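The real-time data-to-graphics idea can be sketched very simply: take a live feed of values and scale them into the on-screen element each tick. The field names and pixel scale below are invented for illustration; a real rig would push something like this into Unreal or Notch over a network protocol:

```python
import json

def to_bar_heights(scores, max_px=300):
    # Scale live score values into pixel heights for a 3D bar-graph
    # element in the virtual set (names and scale are illustrative).
    top = max(scores.values()) or 1
    return {team: round(max_px * s / top) for team, s in scores.items()}

feed = {"Home": 21, "Away": 14}       # e.g. polled from a live scores source
heights = to_bar_heights(feed)
payload = json.dumps({"element": "score_graph", "bars": heights})

assert heights == {"Home": 300, "Away": 200}
assert json.loads(payload)["bars"]["Away"] == 200
```

The same shape works for a stock ticker or poll results — only the feed and the target graphic element change.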

Sarah Marince:

What are some of the tools that are available for indie development versus professional level production?

Keith Anderson:

I know we've been mentioning the Vive — that's HTC's virtual reality headset system. I'd say that's a great starting point for somebody who wants to get into virtual production. The camera tracking is really the key to virtual production at a high level: if you don't have camera tracking, it's always going to feel a little flat, because your camera never moves and you're always looking from a fixed perspective. So the Vive system is great because it integrates directly into all of the software we mentioned — or most of it: Unreal, Notch, Unity, disguise. There are also several apps for mobile phones that can turn your smartphone into a virtual camera, so you can kind of move around and be filming. You're not actually filming anything real, but you're in the virtual space, taking a shot with whatever motions you want.

Keith Anderson:

And I think those apps are very affordable. Unreal Remote 2 is one of them for iPhone, and there's a similar virtual camera app for Android that enables that capability and takes advantage of the gyroscope on your phone. And there are even apps coming out now that can track your facial motion — you can get facial mocap just with an iPhone and be driving a virtual character's facial expressions in real time. Which is pretty incredible: coming from a traditional content pipeline, facial animation is a huge amount of work, and to be able to drive that in real time now, with something as simple as an iPhone, is pretty incredible. Kind of to build on the mocap side, you can actually use the Vive VR system to get a budget-friendly motion capture, where you strap a tracker to your hands and your feet, wear your headset, and you can control a virtual character. But of course, if you're getting into a more professional

level, you would be using something like a motion capture suit, or a motion capture volume, with dozens of cameras placed all around you.
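Under the hood, driving a bone from a strapped-on tracker is mostly quaternion math: rotate the bone's rest vector by the tracker's reported orientation each frame. A minimal pure-Python sketch — the axis conventions and the example pose are invented for illustration:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (w, x, y, z) using
    # v' = v + 2w (u x v) + 2 (u x (u x v)), with u = (x, y, z).
    w, u = q[0], q[1:]
    t = tuple(2.0 * c for c in cross(u, v))
    s = cross(u, t)
    return tuple(v[i] + w * t[i] + s[i] for i in range(3))

# A tracker reporting a 90-degree turn about the vertical (z) axis
# swings a bone pointing along +x around to +y:
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
bx, by, bz = quat_rotate(q, (1.0, 0.0, 0.0))
assert abs(bx) < 1e-9 and abs(by - 1.0) < 1e-9 and abs(bz) < 1e-9
```

A full rig layers inverse kinematics on top of this, but per-tracker orientation is the raw signal both the budget Vive setup and a pro suit deliver.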

Evan Glantz:

Yeah, and those are some strong brands in there. Rokoko is an inertial motion capture suit, as well as Xsens, and there are a couple of others that use volumetric capturing as well. When it comes to the camera tracking, on that high end of the budget, you're looking at Mo-Sys, Stype, Vicon, or Ncam. They're all going to be your, you know, more top-level or higher-end-budget virtual production kits or camera tracking kits, and each one of those will give you a variety of things that you can do with it — whether it's specifically camera tracking, talent tracking, or augmented reality pieces. It can be —

Keith Anderson:

The one thing I'll add is that there's not a lot of middle ground right now. There are the very cheap entry-level options for camera tracking, like the Vive VR, and then there's the Stype and the stuff that Evan just mentioned, and the price differences are night and day. So, unfortunately, there's not really an indie or mid-level option.

Josh Spodick:

It's kind of like all or nothing. And on that front, the one place that is very scalable is actually your image capture. Like I was mentioning before, you could use a little hundred-dollar Logitech webcam you grab off Amazon, all the way up to hundred-thousand-dollar cameras. Right now I'm using a Sony A7 to actually capture myself, with a couple of lights around me, but we've done shoots with RED cameras, Arri Alexas — really, anything works. My biggest motto, anytime you're dealing with graphics — or really anything, for that matter — is garbage in, garbage out. So the better image capture you have, the better result you can end up with. That's not to say you need that top-of-the-line, high-end Alexa Mini LF to start off with. You could certainly start off with lower-end gear, get yourself used to it, get yourself working, try to develop some client base, develop some projects, and then, as you're able to sell your skills, move up to the gear that you want.

Sarah Marince:

Nice. Okay. So I'm going to ask my last question I have written, but just a side note to our audience here: if you have any questions you want to ask, just type them in our Q&A box and I will get to those. So, for our panel: are there any things that people should be aware of while venturing into virtual production? Like, do you have any pieces of advice or just things that they should know?

Keith Anderson:

It's a lot of work, and there's a lot to learn. It's a lot of fun, but definitely a lot to learn. New things are changing and happening pretty quickly. Yeah, that's actually a really good point — the things that we can teach you today might be totally different, there might be a bunch of new options, six months from now, or even two months from now when the next version of the next software comes out.

Sarah Marince:
Are there ways people can stay up to date with it?

Keith Anderson:

Yeah, there are a lot of great forums. I know the three of us are in a Facebook group called Unreal Engine Virtual Production. It's run by a guy named Matt Workman, who started a YouTube channel called Cinematography Database — he's been examining cinematography behind the scenes for several years now. It's been an incredibly valuable resource: tons of good information, tutorials, templates. If you go down the Unreal Engine learning path, inevitably you're probably going to want to learn something called nDisplay, which is the output module that actually takes the graphics and outputs them to your LED wall or your projector, et cetera. It's not super user friendly — it's definitely one of the most painful parts of the process, so just prepare yourselves for that. I know there have been times in the studio where we've been sitting there looking over lines and lines and lines of code, trying to figure out why we can't get output three working, and it's just one little misspelling somewhere that often ends up being the culprit. But that's not to dissuade anybody from doing it. I am not a coder — I do not know how to code at all. Maybe 50 lines of JavaScript in my entire

life. But a lot of the stuff is still visual learning. It’s really easy to figure out a lot of the stuff. So don’t be afraid of it. Take whatever downtime you have and learn.
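Since one misspelled key in a config file is such a common culprit, even a tiny linter that flags unknown keys can save hours of staring at lines of code. The key set below is made up for illustration — it is not the actual nDisplay schema:

```python
KNOWN_KEYS = {"id", "addr", "window", "viewport"}  # illustrative only

def lint(cfg_text):
    # Flag unknown keys in key=value config lines -- the kind of
    # one-character typo that silently kills an output node.
    problems = []
    for n, line in enumerate(cfg_text.splitlines(), 1):
        line = line.strip()
        if not line or line.startswith(("#", "[")):
            continue  # skip blanks, comments, and section headers
        key = line.split("=", 1)[0].strip()
        if key not in KNOWN_KEYS:
            problems.append((n, key))
    return problems

cfg = "[cluster_node]\nid=node_1\naddr=192.168.0.10\nwinow=wnd_1\n"
assert lint(cfg) == [(4, "winow")]   # the typo, with its line number
```

The same ten lines adapt to whatever key=value format your pipeline actually uses — the point is to let the machine find the misspelling instead of your eyes.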

Sarah Marince:
Okay, Josh, the people want to know: what does your setup look like without digital enhancements?

Josh Spodick:
I'm going to have, like, a green line around me — pay no attention to the man behind the curtain.

Sarah Marince:
That is catfishing — virtual production catfishing.

Josh Spodick:
I’ve noticed I’ve slimmed myself a little bit here too. You know, just a little bit of that.

Sarah Marince:

No, that’s really cool. That is really, really cool. So another question we have is what are some of your favorite projects that you’ve all worked on?

Evan Glantz:

Ooh — I mean, I'll go first. In virtual production, really, we've just been doing a lot of internal projects — the stuff that we showed you. But outside of virtual production, oh man, I did a lot of fun projects. I used to be the projectionist for Camp Bisco, so I mapped the roof of the Scranton amphitheater — this was several years ago. That was a really fun project that I worked on. I used to also projection map a lot of buildings. Those are always fun.

Keith Anderson:

I'll chime in. One fun one that Evan and I both worked on was designing content for a 12-projector theater on board the newest Celebrity cruise ship, Celebrity Edge, which at the time was, I think, the most expensive ship ever built. Spent a lot of days on that boat.

Evan Glantz:

I was on that ship for 52 days. So by day thirty or so, it wasn't quite as nice. You kind of get a little Stockholm syndrome.

Keith Anderson:
I only had to be there for six days, so it wasn't so bad.

Sarah Marince:
Fifty-two days straight? Like, you were on there —

Evan Glantz:

We couldn't disembark and get back on the ship during — well, it was a three-week crossing from France, and then there were many, many sailings back and forth to the Bahamas. It was not fun.

Sarah Marince:

Have you cruised since?

Evan Glantz:

No, I have never —

Sarah Marince:
No desire to cruise again?

Evan Glantz:

No. Well, actually, no, that's not true. I did do one group cruise after that trip — a three-day party on a ship. That was a good time.

Sarah Marince:

Much different than a work cruise. Josh, what about you? What are some of your favorite projects you've worked on?

Josh Spodick:

My favorite projects? Honestly, the ones I can't give specifics on — just NDAs and stuff like that — but those often are the ones with the biggest budgets, the most toys, and the most challenges. One of which, I remember, involved, I think it was ten tons of robots holding video screens that moved around the site, and then mocking up a very large space to kind of transport people from New York City to the Caribbean when they walked into this building, with lighting and video and scenic elements, from the start. Other than that, some of my favorite projects have honestly been in my past — the music festivals. I got my start as a technical director and assistant technical director in New York City, and when I was 20 years old, I was able to work with an incredible team as technical director of the first-ever Governors Ball Music Festival.

Josh Spodick:

It was actually on Governors Island. And I remember having a drink with one of my bosses — at the time I was 20, not 21 — and I remember he puts a beer in my hand and goes, "Wait a second, how old are you?" I said, "Don't worry about it." I was able to share a beer with the bosses at the end of the show. But things like that, coming up in that environment — anytime I get to create a world, a creative space, for anywhere from 5,000 to 150,000 people to have the time of their lives, I'm happy.

Sarah Marince:

That’s awesome. And are there some cameras that are easier to work with than others? And do you have any favorites?

Josh Spodick:

Yes — with cameras, you could go down the rabbit hole for your entire life and still not learn every single one. There are things as simple as little point-and-shoots, which you could capture from; there's mirrorless and stuff. I'm personally a Sony guy — I love my Sony A7 II, I've had it for a few years now, it's amazing. I use Canon glass personally, and Rokinon glass. My favorite camera I've ever used on a set, though, has to be the Arri Alexa Mini and the Alexa Mini LF. That is something I will never be able to afford to own myself — hopefully rent on some projects down the line — but in terms of putting somebody on camera, I find it the most flattering to skin tone, the easiest to light, with the best low-light performance, and just the cleanest and highest-grade image I've ever seen.

Sarah Marince:
Nice. And then Evan or Keith, do you have anything to add on the camera front?

Evan Glantz:

Yeah, Josh is our main camera guy — he makes the camera calls. I'm always like, you know, anything that I can just point and shoot.

Sarah Marince:

All right. Well, this is from Joel — hey, Joel, glad that you're here this evening. So, where is this going? On the future of virtual sets: will LED walls end up replacing most, almost all, sets?

Josh Spodick:

God, I hope not. Yeah, I don't think we'll see that. I think that the real beauty comes from mixing them together — having physical pieces and digital pieces so that you really can't tell the difference. There's always going to be a need for that traditional film. There's also just a ton of shots where

Keith Anderson:

you don't have any need to create something that doesn't exist — where you can just easily shoot. You know, if you're shooting a sitcom in somebody's apartment, you don't need a digital backdrop; you just go to somebody's apartment and set up your camera. The LED wall stuff is really great for CGI environments that just could never really exist, or something that would be so expensive — like flying your whole team to the top of Mount Everest to shoot your final battle scene, or whatever. There's also a ton of cost associated with setting up an LED wall, so it's not always going to be cheaper, you know what I mean? Especially if your CGI elements are not super heavy, you're probably going to create more costs for yourself by trying to do it with virtual production if you don't really need it. But you will see some pretty incredible student films coming out in the next five to ten years as film schools start to integrate this technology.

Evan Glantz:

You'll have kids, you know, working on their senior thesis with this, and it is just going to be really incredible. You're going to see a lot of really great work coming out of that in the next few years, for sure.

Josh Spodick:

And I'd love to shout out one of the people I'm seeing in the attendee list right now — Cole Marcus. Awesome dude. I had the pleasure of meeting him in LA last year when he was doing disguise training; I went to visit the office. He actually was able to take a lot of the Unreal stuff, set up a few iPads and computer monitors on his desk, and shoot, I think it was a Lego Mandalorian — and it looked really awesome. Seeing stuff that starts out like that is great: you're just sitting in your office, at home, in your living room, wherever you have a couple of computer monitors around, and you play. I saw one guy who had set up three iPads built into, like, a corner volume, and then he had — I think it was a Mario action figure, you know — standing on it. It was like his tiny little LED backdrop, and it looked great, and then he mapped it with disguise.

Sarah Marince:

That's cool. Kelly wants to know: how do we think virtual production may be integrated with live events and settings as we return to them, knowing all we can already do now? And what are you planning for?

Evan Glantz:

I think that's a great question. People are already starting to see the integration of augmented reality into live events. People are starting to see that you can create these much more fantastical places and fantastical concepts for events — look at Tomorrowland as kind of a marker for that, or go back to the MTV Video Music Awards from last year, where each one of the live performances had an integration of some form of augmented reality. I think a big part of the future is that you're going to see artists getting to be more creative and expressing their creative visions in a lot more dynamic ways. Augmented reality stuff is going to be really big for that — like at Coachella, where you have people whipping out their phones because, when they look through the phone, they see a special version of what's happening on stage

Keith Anderson:

basically with added graphics. That's only going to expand. And I think, to get more to Kelly's question — you know, let's say COVID's over and we're back to doing real festivals. How is this going to integrate into a real festival, beyond even the augmented reality? You can have screens behind the DJ giving a whole virtual backdrop — more than just your traditional concert visuals, creating more environmental visuals — as these Hollywood techniques expand out into other sectors.

Sarah Marince:
Awesome. Well, that was, that was all of our questions for today.

Keith Anderson:
Shout out, Kelly, thanks for tuning in.

Sarah Marince:

Thank you to everyone who asked questions — they were all wonderful questions, as always. I always learn so much during these webcasts, because a lot of these are topics I'm not familiar with. So thank you to all three of our panelists for being here today and talking all about virtual production. This was super enlightening. And next time I watch the Weather Channel and I see the house with the car and the hurricane coming in, I'll be like, "I know how they do that — I learned all about that." So thank you so much. And if the three of you want to go around and just kind of say your names again — if you have a website or someplace on social media where people can find you, you can go ahead and promote yourself now.

Evan Glantz:

Yeah, so our company is Digital2Physical Design. You can find us on Instagram and the Facebooks and all those things. But Keith also works as an independent artist doing some pretty cool and fantastical things — you can talk about that — and Josh as well.

Keith Anderson:

Yeah, I've been doing live DJing, as well as creating content for festivals and other musical artists, for a few years. My brand and handle on Instagram is Fractal Visions, so you can find me on there, as well as at Digital2Physical Design, where I am the lead animator.

Josh Spodick:

And I'm normally from the live events, concert touring world, but Redship Designs is my company, as well as working with these guys at Digital2Physical. Find me at a lot of festivals once they come back — hopefully soon — and a lot of concert touring. If anybody is LA-based, we have two drive-in shows with Fitz and the Tantrums coming up next month. So stay in your car, stay distanced, stay safe, but let's have some form of events.

Josh Spodick:

It will be awesome. And again, if anybody wants to go check out the Digital2Physical Facebook page, they'll see some of our demo reel content that we have created since this pandemic hit, since the virtual world has really been our focus.

Keith Anderson:

We're posting new updates all the time as we're doing more R&D. We like to share what we're working on, so yeah, tune in and get some looks behind the scenes at what we're doing.

Sarah Marince:

Great. Thank you guys so much. And as always, I’m Sarah Marince at SarahMarince.com. Have a great rest of your Wednesday, and I will see you next time. Bye everyone.

 

Check out our other webcasts:

Hair & Makeup

Coloring

Music Production
