
The Volume and The AR Wall

Published Aug 1, 2022, 9:53 PM

Franchises like Star Wars and Star Trek have recently received a filmmaking upgrade in the form of The Volume and The AR Wall (respectively). What do these systems do and how could they change filmmaking? 

Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? I got a request on Twitter from Bronson Adams to talk about the tech of the Volume, which is extensively used in the more recent Star Wars projects we've seen, particularly on Disney Plus, as well as the AR Wall, which is being used for Star Trek projects. So once again it's Star Wars versus Star Trek. These two technologies are actually quite similar to one another, so I figure we can tackle both of them together. Right at the top, I'll say that these technologies in some ways harken back to ye olden days of filmmaking, and in other ways they're an attempt to address the problems that come with setting a film in a fantastical place, and that includes worlds that just don't exist. How do you bring those to life in a way that's believable and immersive, so that your actors don't have any issues performing within the space and your audience hopefully totally buys into it?

When you're doing that second thing, trying to create a world that doesn't really exist and make it exist on film, you're putting a lot of pressure on yourself and the actors. Maybe you have a massive budget and you can actually build sets so that your fantastical world is more or less realized, but that might not be a practical solution. Maybe instead you film everything against a green screen, or maybe you have an entire set where there are elements that are greened out, and then you use the chroma key process to replace them with whatever is supposed to be there. If you've ever seen shots of the Star Wars prequel trilogy in production, you'll see entire sets where there were just arches and things, all decked out in green to be replaced later. But then imagine the actors in that place, trying to perform in what is essentially a featureless green environment and still do a believable job. That is not easy. If you need any evidence of how challenging this can be, even for revered actors, you can listen to Sir Ian McKellen talk about it in the commentary for The Hobbit: An Unexpected Journey, though I would understand if you didn't want to do that, because it would mean having to watch The Hobbit. I say that as a dude who has a tattoo in Elvish on my left arm. I'm a huge Hobbit fan; it's my favorite book in the world. I am not a fan of the films. Anyway, McKellen said that he was absolutely miserable acting in these green screen environments against no one at all, because all of the elements of the scene he was supposed to be reacting to would be added in later, and he said he actually considered quitting acting if that is what acting was now going to be, which is a big old oof.

The Volume and the AR Wall are both systems that can display stuff that otherwise wouldn't be within the scene on LED screens. That means the actors are not just sitting there trying to imagine what's supposed to be around them and hoping they get it right. You don't have to worry about five different actors having five different eye lines toward some imaginary spot, because they can actually see the thing they're supposed to be looking at. It can be much more immersive, with some restrictions.
Now, before I get into all that, I think it would behoove us to talk about some of the methods that came before, where filmmakers were trying to get around these limitations of reality. Filmmaking has always been about finding ways to push against those limitations, and some of those efforts worked to a degree, and some, especially in the early days, didn't. So let's talk about some very early visual effects, or VFX, techniques that relate to one another. A lot of this material is stuff I've talked about in previous episodes, so I'm not going to go into exhaustive detail, but it really falls under the general topic of compositing.

In film, compositing is the incorporation of two or more images from separate sources into a final shot. For example, a typical green screen application would have a character in the foreground; that would be the image from source number one. You had your primary camera on your main character against a green background. That background would come from a second image source, so that would be image number two. Maybe it's a filmed source; maybe you went out and filmed on location someplace where it wasn't practical to get the actor to, or maybe it wasn't safe, and so you pair them that way. Maybe the background is computer generated, maybe it never existed in the first place. It doesn't really matter. The point is that the finished scene, the finished film, is a composite of these two images into a final image to create the illusion that that character was in that location at that time. So your foreground character appears in front of wherever you wanted them to be.

Now, way back at the turn of the twentieth century, around 1900, a filmmaker named Georges Méliès was pioneering all kinds of techniques. You want to talk about a brilliant filmmaker, Méliès is one of the best. Some filmmakers were just gobsmacked at the idea of being able to record a performance that could be played back endlessly, or at least until the film wore out. Méliès was getting really experimental. Now, remember, in ye olden days everything was shot on film, and some of you might not know that much about film because things have changed so quickly. Back when I was a young lad, hardly anyone had access to digital cameras, but we all had access to film cameras. Film is a strip of plastic coated with photoreactive chemicals, meaning that when they're exposed to light, they undergo a chemical reaction. This opens up the opportunity to do stuff like double exposures. A double exposure is when you expose the same piece of film to light more than once. Sometimes that would happen by accident: if you had a film camera, you might accidentally take a photo on a frame that had already been exposed to light once before, and you would sometimes end up with an unsettling image. It might look like there was a ghostly figure appearing in a separate scene, and it could be really off-putting. Sometimes it was just a mess, depending on what you were taking images of. But if you did it purposefully and carefully, you could achieve really interesting effects. Méliès created a film called The One Man Band that used seven exposures. It's a short silent film, and you can actually find it on YouTube; do a search for The One Man Band and watch it.
It starts with a man walking in front of seven chairs set out side by side on a stage, and he points out those seven chairs in a very exaggerated fashion. Then he sits in the first chair on the left side of the view and stands up, but a figure of him remains seated in chair number one. The standing version of him moves over one, sits in chair number two, then stands up, and a copy of him remains in chair number two. So now you've got copies of him in chairs one and two, and a standing version that moves to three. He does this all the way down, so that there are ultimately seven of him, presumably one original and six copies. Not only that, but each time he stands up, the one left in the chair is holding a different musical instrument, and the one in the middle actually stands up and conducts the other six as they appear to play music. Since it's a silent film, you don't actually hear anything, but it is a clever show of how this multiple exposure approach would allow you to do in-camera effects. Now, that technique is interesting, but it is limited. You can technically clone a person on screen using multiple exposures, but if they cross paths, you get that kind of ghostly effect, and you would have to do it with the right lighting so that no one appeared transparent, unless I guess that's what you were going for, in which case, go for it.

The next example is one that more closely ties into what the Volume and the AR Wall are doing, and it's called background projection. And yeah, it's exactly what it sounds like: it's projecting a background. Here's a very typical use of it. Imagine that you've got a sound stage. You can control everything on the sound stage; it's an ideal environment for shooting a film, since you don't have to worry about being on location and dealing with things like weather or noise or anything like that. So you're on your sound stage and you want to shoot a sequence where your characters are in a car and they're traveling somewhere. Well, clearly you can't actually have the car driving around on the sound stage, at least not more than a few feet. So one way you would get around this is you would set up your car, have your actors sitting in it, and behind the car, or to the side of the car, whatever angle you were shooting from, you would set up screens or plates, and onto those you would project video, or really film, of movement at the right angle compared to the vehicle's orientation. This is something that would require a lot of finagling just to make sure you had it right; otherwise it would be very off-putting. Your camera would need to be at the right angle pointed at the car, the screens would need to be at the right angle, and the footage would need to be shot at the right angle to give the illusion that the people sitting in the car were actually traveling down a country lane or something like that. Meanwhile, you film the whole thing with your film camera, so you're shooting not just live actors but also pre-shot film footage playing in the background. It wasn't perfect. You can definitely tell when this is used. If you watch any old film or old television series, you can pick it out when characters are in a car; you just look and say, yeah, that is clearly background projection.
And obviously, if you didn't have the alignment right, it would be kind of weird, too. Let's say the angle you've chosen to shoot your scene is kind of a three-quarter profile type thing, but the angle of the footage is straight on, going down the street. Well, it would look like the car was somehow traveling out of alignment down the road. So everything had to fit, or else the illusion was completely broken. And even when the illusion wasn't completely broken, it's still pretty obvious. If you watch films from the so-called Golden Age of Hollywood, you're probably going to see a lot of background projection. It was very, very popular, and you can usually tell right away: the background tends to be extra grainy compared to what's going on in the foreground. Film has a grainy quality to it, so shooting film of film amplifies that grain, because you're picking up the grain of the background footage plus the grain of the actual film you're using. It stacks.

There are tons of other camera tricks I could talk about from the classic film age, but most of them don't actively apply to the Volume and the AR Wall. Instead, we should specifically talk about techniques where you're trying to create foreground action against a manufactured background, because early on, filmmakers would create elaborate backdrops to serve as a background. Again, that meant you could shoot on a sound stage, so you had your controlled environment, and it would cut back on a lot of things that could potentially waste your time. You also have to remember that in the film days you had a limited amount of film at your disposal, and if you ran out of film, you would have to purchase more. So you couldn't just shoot and shoot and not care whether or not it was working. You had to care a lot, because it directly related to how much money you had to spend in order to make the movie. Sound stages were preferred in many cases to shooting on location because of that. So if you could create a really detailed background, then, due to how the camera is going to focus on characters in the foreground, it can pass as realistic. You can have a painting in the background and it's fine. Some of these backgrounds, called mattes, were pretty convincing. Some were not. But a great matte painting is something I always enjoy seeing in a movie, and once in a while you'll get a filmmaker who uses matte paintings today, typically because they want to evoke a specific sort of reaction from their audience. It's not used as a common technique today; it's almost more of a stylistic choice. And matte is a word that gets used for a lot of different techniques and equipment, all aimed at the goal of combining image elements from separate sources into a final image. We'll talk more about that after we come back from this quick break.

Okay. So ultimately, mattes can be different things. It can be a little bit confusing when you're talking about mattes in filmmaking, because I don't know about you, but I'm used to using a word to mean something specific, and matte is usually used more as a way to achieve a specific result. A matte could be a mask that sits on the camera and shields the film from being exposed to light beyond a certain point. In other words, you're keeping an area blacked out so that part of the film doesn't get exposed to light.
Then you shoot another scene, the complementary scene, the complementary footage I guess I should say, with the opposite mask. It masks everything else and leaves open the part that had been covered by the first mask. Then you combine these two sources of film to create your final image. Here's an example. Let's say you're shooting a scene in which your hero is in an alley with their back against the brick wall of a shop, and the hero is close to the corner at the front of the shop, so the way you've positioned the camera, you can see the hero in the shadows of the alley and you can also see what's around the corner. And let's say you wanted a really interesting shot of a foggy street. Well, your brick building is a facade inside the sound stage, and what you've done is create an amazing matte painting of a foggy street. So you've got a mask on your camera that blocks out everything around the corner of the wall, so that you can fill it in with something else later. Then you shoot a shot of the matte painting with a mask that blocks out where the brick wall and your hero would be, and you combine these two elements afterward to create the finished effect, where you have your hero against the brick wall in the shade of the alley, and around the corner you see this foggy street.

That's a super simple example. The mattes in that case are static; the camera is locked down, and you can't move it at all in that kind of shot. But let's say you do want to create a shot in which people are moving, or stuff in the background is moving in relation to the people. Say you want a shot where our hero steps beyond the corner of the building to look down this foggy street that doesn't actually exist on the sound stage, or maybe you want the camera to be able to move, to pan and dolly and that sort of thing. Well, in that case, you need what is called a traveling matte. Also, a shout-out to those of you who are familiar with Fraggle Rock: that show used a few different technical film terms in it, and Uncle Traveling Matt was named after this particular thing in filmmaking. Gobo is another example. But I digress.

There are a couple of different approaches to traveling mattes; again, just like the word matte, there are multiple versions of it. But the one I referred to earlier is the one where you use a green or a blue screen. Traditionally, in the old days of filmmaking, it was typically a blue screen. This is called chroma keying, and the basic idea is simple to understand at a very high level, but it was actually really tricky to make it work properly, in the film days in particular. You take a solid color such as blue and use it as the background as you're shooting your principal action. You make sure none of your performers have any clothes or props that are that color of blue, really any blue at all if you can avoid it, and you film your actors in front of this blue screen. Then you pair that with other footage that's meant to be the background, effectively replacing all the blue parts of the frame with this other footage that you've shot or created. Now, doing this was not easy back in the film days. To accomplish a traveling matte, you had to shoot on three separate strips of film simultaneously: one strip for red, one for blue, and one for green.
You would take the negative image from the blue strip of film and the positive image from the green strip of film and combine those to create a solid matte. Then you could composite that with the background footage that had been shot separately, and these separate strips of film would be fed through an optical printer to create the composited final image. The whole process is what you'd call keying out: you'd be keying out the images to replace them with some other footage. And it worked, but you typically could see the effects on screen. If people were really, really meticulous about it, it was pretty effective, but largely you would look at it and think, oh, I know how they did that. That doesn't necessarily pull you completely out of the action, but depending on how bad the effect was, it could. When you see it, there can be a hard line between the borders of an actor and the background, and you can tell that the people in the scene aren't really in that environment, whatever that environment might be.

Back in the day, movies that used chroma key typically used blue screens, because film itself has a blue emulsion layer in it. When the transition to digital cameras happened, there was a switch to green screens, which were more effective with digital cameras, and you could even set up enormous green screens outside without having to worry about the camera catching blue sky and creating an opportunity for the chroma key process to bleed over. In that case, obviously, you don't want people wearing green stuff, because you'd have the same issue where the background footage would replace the green on the character or prop or whatever. Green screens gave filmmakers the chance to transport characters to places that don't exist without having to actually build all the darn fantastical landscapes themselves. But as I mentioned earlier, this also puts a burden on the actors, because they can't actually see whatever it is that's going to show up in the final film. They have to imagine it, and that's not always easy when you're also trying to hit your mark and say your lines in a way that's, you know, good. Obviously, the more unusual the location or the characters surrounding you are, the more it asks the actor to use their imagination, and it could very well be that what the actor imagines is totally off base from what the filmmaker is imagining; those two visions might not line up at all. That creates problems.

Chroma key also has some limitations in that you have to light the shot really evenly or the effects won't look as good, and by that I mean it can limit how much you can play with light and shadow. So you can create incredible backdrops, and you might not be able to tell that the backdrop wasn't actually there, but you might not be able to light the scene in a way that's really compelling. It is possible to do, but it's pretty complicated, because you have to very carefully light your foreground and your background separately, and then you have to keep in mind stuff like reflections and shadows. It's really hard to pull off, and it's very easy to make mistakes, and for the very keen-eyed moviegoers out there, those mistakes end up being obvious. Not to me so much; I typically don't notice unless it's unforgivably obvious.
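To make the keying idea a little more concrete, here's a minimal digital sketch of a chroma key composite in Python with NumPy. The key color, the threshold, and the commented-out file names are all assumptions for illustration; real keyers, and certainly the old optical-printer process, handle edges, color spill, and partial transparency far more carefully than this.

```python
import numpy as np

def chroma_key(foreground, background, key_color=(0, 177, 64), threshold=90):
    """Composite foreground over background wherever the foreground pixel is
    close to the key color -- a crude digital 'traveling matte'.

    foreground, background: uint8 arrays of shape (H, W, 3), same size
    key_color: the green (or blue) screen color to key out (assumed value)
    threshold: how close a pixel must be to the key color to be replaced
    """
    fg = foreground.astype(np.int16)
    key = np.array(key_color, dtype=np.int16)

    # Distance of each pixel from the key color; small distance = "screen".
    distance = np.linalg.norm(fg - key, axis=-1)
    matte = distance < threshold  # True where the background should show through

    # Background where the matte is set, foreground everywhere else.
    return np.where(matte[..., None], background, foreground)

# Hypothetical usage with placeholder frames of matching size:
# fg_frame = load_frame("hero_on_greenscreen.png")
# bg_frame = load_frame("foggy_street_plate.png")
# final = chroma_key(fg_frame, bg_frame)
```

The hard edge a simple binary matte like this produces is exactly the kind of telltale border around actors that gave away the old composites.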
Now, what if you could combine elements like background projection, traveling mattes, and a controlled studio environment? What if your actors could actually see the amazing, perhaps even entirely virtual, locations that their characters inhabit, which eliminates the need for them to rely solely on their imaginations? What if you could leverage the benefits of computer-generated environments and put real-world actors, props, and set pieces inside them? What if these environments were generating light that could interact with the stuff that's really there in front of the camera, so you get reflections and shades of light on the performers? Because it's not that the actors have been virtually put into this environment; it's that the environment is virtually around the actors. Here's where we get to the Volume and the AR Wall.

These are systems that use LED screens, huge LED screens, much, much bigger than your television, positioned around and above a studio area to create a panoramic view that partially surrounds the actors in a scene. The AR Wall is like a very large arc that also has a ceiling with screens in it. The Volume ends up being almost a complete circle, also with a ceiling of LED screens. If you're familiar with the Star Trek: The Next Generation television program, then you're also familiar with the holodeck. The holodeck is a room aboard the starship Enterprise, and presumably other ships too, where the ship's computer would generate an entire virtual world within that room, complete with illusions you could sense beyond just seeing them. You could smell things, you could even pick stuff up, and if you were a Commander Riker, you could hit on AI-generated women in a way that was totally creepy and weird and definitely has not gotten better with age. Also, the holodeck would malfunction in pretty much every other episode it appeared in, which left you wondering why anyone would ever go inside the darn thing in the first place. But the point is the holodeck would create this virtual environment around you so that you had an immersive experience.

Well, the Volume and the AR Wall are kind of like the holodeck, but obviously with huge limitations. The holodeck could move stuff around; it could adjust things so that people weren't going to walk into a wall. Somehow they always happened to know where the wall was, I don't know how, because it looks like you're in this immersive environment, but anyway, you wouldn't walk into a wall. You could actually pick things up, because this virtual world could replicate stuff so that there were physical things for you to contact. Obviously we can't do that in our real world, and we're not at a point where we can manipulate the environment in such a way that you never get to the edge. You will eventually get to the edge, because that's how reality works.

So on a base level, what's going on is that these systems are displaying complex, dynamic, and immersive backgrounds. All three of those things are important. They are complicated backgrounds; it's not just a simple green screen, although they can do that too. They are dynamic in the sense that it doesn't have to be just a static background; it's not like a matte painting. It can incorporate moving things, so you could have shot other footage and used that, displayed within these environments.
So if you wanted to, you could film extras milling around, say, a medieval battleground after a war has been fought, and then just use that as the background behind the action of your characters in the foreground. This has some obvious benefits. One is that you don't need to go shoot on location, and in a world where travel restrictions can pop up quickly due to stuff like COVID and now monkeypox, that's a huge asset to a production company. Just think for a moment about the logistics required to shoot on location, particularly if you want to shoot someplace that's remote, maybe in another country, because you want a place that doesn't have many identifiable landmarks, because you want to create the illusion of this world, whether it's a fantasy world, science fiction, whatever it might be. Just getting access to that location, getting the gear and the crew and the cast all to the right place, requires huge amounts of work; there are entire production departments dedicated to making that happen. Then once you get there, you might have to worry about stuff like weather and lighting and all that. In a controlled studio environment, you don't have those issues. As long as people are testing clear, so they're not coming down with COVID or anything like that, you're good to go, and you don't have to worry about relocating everybody halfway around the world.

And of course a huge thing is that it starts to bring elements of pre-production, production, and post-production together in one environment. I cannot stress what a huge change this is for filmmaking, because it means that ultimately you don't need as much time to complete a project, and y'all, time is money. If you have a system where you can design and then display a background on these screens, one that your actors can actually see and the camera can see, that you don't even have to change for it to appear camera-ready, and then, say, the director wants to change things, it gives you the opportunity to do that. You can actually do visualization really close to production. And because those effects are already there, and the lighting effects and everything are already there, it reduces the amount of work you need to do in post-production. That's phenomenal. Now, obviously this still requires a ton of work. It's not like it all just magically happens. You have to create the background images, and that's going to require a lot of work, both from an artistic point of view and a technical one.

And it's not just that these environments can show digital images and act kind of like an LED version of a matte painting. The AR Wall in particular has a really cool feature in which only the sections of the wall that are in view of the camera are creating high-resolution output, and that's a very clever solution. Just imagine for a second that this is how the world works: imagine that whenever you look around, only the stuff that's actually within your frame of view is well defined. In other words, the stuff you see in focus is the only stuff that's in focus in the world; it's whatever you're looking at. Anything you're not looking at, anything outside your frame of view, is more hazy until you turn to look at it, at which point it instantly snaps into clear focus.
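Just to make that idea concrete, here's a toy sketch in Python of view-dependent resolution: only the wall panels inside the tracked camera's field of view, plus a little safety margin, get the expensive high-resolution render. The panel layout, angles, and quality flags are all invented for illustration, and it assumes we already know where the camera is pointing, which is exactly what the tracking system described in a moment provides. It is not how the AR Wall's actual software is structured.

```python
from dataclasses import dataclass

@dataclass
class Panel:
    """One LED panel on the wall, described by its angle around the stage (degrees)."""
    angle_deg: float
    quality: str = "low"

def angular_difference(a, b):
    """Smallest difference between two angles, in degrees."""
    return abs((a - b + 180) % 360 - 180)

def assign_quality(panels, camera_yaw_deg, camera_fov_deg, margin_deg=10):
    """Panels inside the camera's field of view (plus a margin) render at high
    resolution; everything else gets a cheaper low-resolution pass."""
    half_view = camera_fov_deg / 2 + margin_deg
    for panel in panels:
        if angular_difference(panel.angle_deg, camera_yaw_deg) <= half_view:
            panel.quality = "high"
        else:
            panel.quality = "low"

# A hypothetical 360-degree wall made of 36 panels, 10 degrees apart.
wall = [Panel(angle_deg=a) for a in range(0, 360, 10)]

# Camera tracking says the camera points at 45 degrees with a ~40-degree field of view.
assign_quality(wall, camera_yaw_deg=45, camera_fov_deg=40)

print([p.angle_deg for p in wall if p.quality == "high"])  # panels near 45 degrees
```

Everything outside that wedge can stay blurry or even unrendered, which is where the compute savings come from.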
Now, you can argue that the real world doesn't actually behave this way, that it doesn't only snap into focus where you happen to be looking. But a philosopher would ask: are you sure about that? Are you absolutely sure? Or is it just that every time you look around, the world snaps into focus and everything else fades into fuzziness? Well, that is the way the AR Wall works in relation to its cameras. I'll explain why that's really cool and important after we come back from this quick break.

Okay, so we've seen this approach before, this idea of only rendering in full detail the thing that's actually being looked at. For example, VR engineers will often create headsets and software in which the stuff appearing directly in front of a person's eyes is in high resolution, but the stuff at the edges, in the peripheral view of the user, can be in lower resolution, because that's how our eyes work, that's how our vision works. That saves on compute power, and it means more resources can go toward producing other stuff, like a high frame rate. If you're not looking at it, it doesn't need to be in high resolution; it doesn't even have to be rendered at all. The same can be true for the AR Wall.

The way they do this is by pairing tons of sensors together so that the system knows where the camera is pointing at any moment. And by know, I mean the system is aware of where the camera is looking; it's not a human doing this. Each production camera, the camera people are using to shoot an episode of Star Trek, for example, has a frame mounted on top of it, and that frame holds some little spheres, which act kind of like an orientation key for the sensors that are incorporated all around the AR Wall. These are little markers that can be digitally erased later, but sensors at all these different points within the AR Wall frame are picking up that little series of orientation spheres mounted on the camera, and by interpreting the angles, a computer system knows where the camera is pointed at any given time and thus can dedicate the resources needed to generate high resolution for just that camera's frame of view. It takes incredible compute power to do this right: to detect where the camera is pointed, how far in it's zoomed, how much of the screen needs to be brought into sharp focus. But that's how it works. As you move the camera around, you can see the images on the LED walls come into tighter focus, or fade if the camera has just moved away from them. It's pretty trippy to watch.

And the backgrounds can actually move with the camera too, which helps create the illusion that there's really something back there, as opposed to a flat image. This relates to something called motion parallax. First, let's talk about parallax in general. Assuming you have vision in both eyes, you experience parallax. Parallax is the apparent difference in the position of an object along different lines of sight. That's a fancy way of saying that where something appears to be changes depending on your point of view. Hold your finger up in front of your face, close one eye, and look at your finger; then close the open eye and open the closed eye.
It would look like your finger had changed positions, right? Because you're looking at it from your other eye, which obviously is not in the same physical position as eye number one. Our eyes are located side by side, so if we switch back and forth, closing one eye and opening the other, it'll seem as if our finger is jumping back and forth between two different positions. It's not really doing that, obviously; that's just what you're seeing along that particular line of sight. Our brains combine those two lines of sight to create the image we use to interpret the world around us and to do things like get an idea of how far away an object is.

Motion parallax is slightly different. Imagine you've got two characters, one closer to the camera and one farther back, and let's say they're both walking from the left side of the camera view to the right side at the same speed. Well, the person in front will cross the frame faster, because they're closer to us than the person farther behind. This also applies if you're moving the camera and the various subjects are standing still: if the person in the foreground and the person in the background are standing still and you're panning the camera across, the person in front is going to cross the frame much faster. This is stuff we intuitively know, because it's how we experience the world around us. But when you're creating a digital system that's standing in for a real background, sometimes you've got to move stuff around and cheat in order for the impression to be realistic and immersive. So you can actually have the camera's movements and the background's movements choreographed together, so that as you're moving the camera around, the background is also moving slightly in order to get the right effect. That's something you cannot do with a matte painting, obviously, because a matte painting is just going to be that static image forever and ever.

Now, another benefit of this technology is, like I said, it provides lighting for a scene. So if a character like, say, the Mandalorian is inside an environment that has lots of lights all around, the lights from the screens will reflect off the Mandalorian's shiny beskar armor, which creates a more immersive, realistic effect. If you have a close-up on the Mandalorian's helmet as he's sitting in his spaceship and going into, you know, hyperspeed, you'll see the reflections off his helmet, because they're actually displaying that hyperspeed imagery all around the set that's been built inside the Volume. And again, you don't have to do that in post; it's created right there in the camera shot. This is stuff you would typically have to add afterward, but you've removed that step by including it in the moment when you're actually shooting the scene.

Now, to make all of this happen, there have to be big old computer systems in the background calculating things like the camera's angle of view, the motion of the camera, and how the motion of the background should match that. And obviously you have to actually create the assets that are going to be displayed on those screens. Those might be computer-generated assets, in which case you have artists creating these things, building them in virtual engines, which then get rendered onto the screens.
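Here's a rough back-of-the-envelope sketch of that "how should the background move" calculation, assuming a flat wall, a camera dollying sideways, and made-up distances. It's a simplification for illustration, not how ILM's StageCraft actually computes things, but it shows why scenery that's supposed to be far away has to slide along with the camera on the wall, while scenery "at" the wall's own distance barely moves. That's motion parallax being faked on a flat surface.

```python
def wall_offset_for_camera_move(camera_shift_m, wall_distance_m, virtual_distance_m):
    """How far (meters) a point drawn on the LED wall must slide sideways so it
    keeps reading as a fixed object virtual_distance_m away, after the camera
    moves sideways by camera_shift_m.

    From similar triangles: the drawn point is where the line from the new
    camera position to the virtual object crosses the wall plane.
    """
    return camera_shift_m * (1 - wall_distance_m / virtual_distance_m)

# Hypothetical numbers: the wall is 10 m from the camera, which dollies 1 m sideways.
for label, distance in [("tree painted 'at' the wall", 10),
                        ("hill 30 m away", 30),
                        ("mountain 3 km away", 3000)]:
    shift = wall_offset_for_camera_move(1.0, 10, distance)
    print(f"{label}: slide the image {shift:+.2f} m")

# Output: +0.00 m for the tree, +0.67 m for the hill, roughly +1.00 m for the mountain.
# The farther away something is supposed to be, the more its image has to track
# along with the camera, which is exactly the choreographed cheat described above.
```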
The assets don't all have to be computer generated, either. You might incorporate some old-school stuff like miniature work that's been mapped to a digital background, which is then displayed on these LED screens, or you might include camera footage from real locations, and again, that could be mapped onto the screens. I keep saying "projected onto," but it's really displayed by the screens, because the images are coming from the LEDs themselves, not being projected onto them.

The first-generation Volume, the one used in Star Wars projects like the first season of The Mandalorian, was a seventy-five-foot-diameter, twenty-one-foot-tall environment. They used the Unreal game engine for initial previz, that's pre-visualization, and design, and I think that's pretty darn cool. It's really neat to see filmmakers leverage a tool that was meant for game development and use it to create virtual, digital film sets that can then be displayed on these screens. Obviously, there are a lot of other software packages, many of which were specifically created for the filmmaking industry, many of them created by Industrial Light and Magic itself, the entity that created the Volume, in order to create the graphics that are seen within these environments and ultimately in the final product, whether that's a film or a series. The Volume used seven machines to provide the compute power for the graphics and dynamics necessary to make it all work for that first season. For the second season of The Mandalorian, Industrial Light and Magic created StageCraft 2.0, which handles rendering and projection along with the software and hardware that makes it all work together, and it also works with other products like Helios, which is a video rendering and lighting package specifically made for the film industry.

Beyond that, the nature of the technology means that changes can be made relatively quickly. Computer-generated assets can be manipulated, so if you're looking at a background and you think, you know, I wish that mountain were larger, you can actually make the mountain larger. It's a digital asset; you can change its shape and size. Assets can be enlarged, shrunk, repositioned, changed in various ways, all in near real time. It's not instantaneous, but it's super fast. So if a cinematographer gets in the space, sees the virtual background, and says, you know what, that craggy tree would actually look better if we put it closer to this little hill over here, it would make my shot really work, then the people working on the film can jump into the virtual environment, make those changes, and display the new version within minutes, as opposed to this being something that takes days or weeks to do in post.

Now, these capabilities still have their limitations. While the areas inside the Volume and the AR Wall are pretty large, you still have walls, just like the Enterprise's holodeck, and there's no way for a computer to shift people inside those environments so that they never run into them. For that reason, it can be a little challenging to choreograph epic action sequences within these still-limited spaces, and you might have seen some recent examples of action scenes, I'm thinking specifically of The Book of Boba Fett, that didn't quite live up to the term epic.
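And here's a very rough sketch of what "make the mountain bigger and move that craggy tree" might look like as data: a hypothetical scene description that gets tweaked and re-pushed to whatever is driving the LED panels. The class names, units, and structure are all invented for illustration; this is not Unreal Engine's or StageCraft's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class SceneAsset:
    """A single virtual set piece in the background environment (hypothetical)."""
    name: str
    position: tuple        # (x, y, z) in meters, stage coordinates
    scale: float = 1.0

@dataclass
class VirtualBackdrop:
    assets: dict = field(default_factory=dict)

    def add(self, asset):
        self.assets[asset.name] = asset

    def rescale(self, name, factor):
        self.assets[name].scale *= factor

    def move(self, name, new_position):
        self.assets[name].position = new_position

    def push_to_wall(self):
        # In a real pipeline this would hand the updated scene to the render
        # nodes feeding the LED panels; here we just report the change.
        for a in self.assets.values():
            print(f"{a.name}: scale {a.scale:.1f} at {a.position}")

backdrop = VirtualBackdrop()
backdrop.add(SceneAsset("mountain", position=(120.0, 0.0, 300.0)))
backdrop.add(SceneAsset("craggy_tree", position=(8.0, 0.0, 20.0)))

# The cinematographer wants a bigger mountain and the tree closer to the little hill.
backdrop.rescale("mountain", 1.5)
backdrop.move("craggy_tree", (5.0, 0.0, 14.0))
backdrop.push_to_wall()
```

The point is just that the backdrop is data, so a change like this can be made and redisplayed in minutes rather than handed off to a post-production department.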
These LED background environments represent powerful new tools for filmmakers, but just like other tools, like 3D and CGI, they have to be used properly so they contribute to the experience of watching a series or a film rather than detract from it. I think the Volume and the AR Wall both give filmmakers some of the same toolsets that people working in CG animation have had at their disposal forever. If you look at Pixar and the way they create films, the way they create virtual environments, a lot of the same tools they're using are the ones we're now seeing used in live-action filmmaking, at the point where you're actually shooting and where those elements are there.

Now, you don't have all the advantages that you would with CG. Pixar, for example, could technically create a camera that could do all sorts of things a physical camera never could; they're not limited by the rules of real-world physics. Pixar doesn't typically do that, because they learned very quickly that if you start violating basic rules, audiences get pulled out of the action; they say, well, that's impossible. You have to work somewhat within the rules in order for it to work for audiences. But that doesn't really matter for the Volume or for the AR Wall, because in both cases you're still using real physical cameras to shoot real physical people. You could mess with the background images, but that doesn't make a whole lot of sense either, unless you're doing something really trippy. I could see maybe a future Doctor Strange movie using that sort of thing to be mind-bendy and create something that's absolutely impossible, and yet the actors who are there would be able to see it and react to it. That's somewhere I could see it work.

But otherwise, what I think is really cool is that it takes those tools that CG animators have had at their disposal forever and puts them into real live-action filmmaking, along with that whole concept of bringing pre- and post-production closer to production itself, so that you're able to make these decisions on set. Directors can focus more on getting the performance they want and not worry as much about whether it's going to look the way they imagined once all the effects are finally added in. That's the big issue with shooting against a green screen: you might have an idea in your head of what it's supposed to look like, but until all the effects departments have put in their work and keyed out the shots you got on set with all the green screens, you don't really know. I would argue that's perhaps one reason why I find the third act of most Marvel films to be really messy; they just look too chaotic and video-game-ish to me in almost every case. So I feel like these sorts of tools could potentially help with that. But then you also have to remember that you have to work within the constraints of the space. You can't go beyond the borders of the Volume or the AR Wall, or else you lose the benefit of those tools and you break immersion.
Anyway, it's still really cool, really interesting to see this convergence of technologies, from the super-high-resolution LED screens capable of displaying backgrounds that don't look like screens, that look like real locations, all the way to the real-time computation that measures things like camera angle and updates the background in real time so you get the realistic effects you want. All of that is phenomenal to me. There are, by the way, production features on both the AR Wall and the Volume that you can find online. If you're interested in these systems, I recommend you watch them, because they'll give you a little more information. They come across as a bit marketing-ish to me, especially the Volume one, because it's really Industrial Light and Magic saying, hey, we've got this cool tool, if you're making a movie, call us. But they do give you more insight into what these systems are doing and why it's important.

Anyway, thank you, Bronson, for your request. If any of you out there have a subject you would like me to tackle, you can get in touch and let me know. There are a couple of ways of doing that. One is you can hop on over to the iHeartRadio app, it's free to download, navigate over to the TechStuff podcast, and there's a little microphone icon there; you can leave a thirty-second message for me and let me know that way. Or you can always pop over onto Twitter and use the handle TechStuffHSW to leave your request, like Bronson did. I look forward to seeing those, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
