Inside the Star Wars machine: part two
At the Letterman Digital Arts Center in San Francisco, legendary special effects house Industrial Light & Magic shares its office space with LucasArts, the game development arm of Lucasfilm. Here, the crafts of game-making and filmmaking are closely entwined – the two work together, drawing ideas and technologies from each other as the twin entertainment media converge. It was a gaming event, for example, that helped kickstart a new R&D project at ILM: real-time movie-making using motion capture systems.

As Mike Sanders, the digital supervisor at ILM, explains: "On the first Force Unleashed game, LucasArts called us and said they were doing a press junket and wanted to do the Clone Cam thing with everyone on the tour. So I said, that's going to take a while – why don't we do something different? Let's write some code to port the virtual reality in this room into the game engine live, and then we'll put the press people right into gameplay and they can lightsaber battle against the apprentice in real-time."

Sure enough, a week later ILM had modified a system that could port motion capture footage directly into a game engine. In other words, you could have two actors in mo-cap suits performing moves, and everything they did was fed live and direct into a PC running an environment from the game. "This was the basis for us thinking, 'hey, why don't we use the game engine as a real-time renderer for… anything?'"

As he's speaking, Sanders picks up a camera plugged into his computer set-up and points it at different parts of the room. On a PC screen he's running an environment from Force Unleashed 2, and as he pans the camera, the in-game camera moves accordingly. Of course, ILM has been combining real and CG elements like this for years (including the ground-breaking creation of Bill Nighy's Davy Jones character in Pirates of the Caribbean) – it's just that in the past, all this data would have been fed into a computer to be lit, tweaked and placed within the environment later.
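Conceptually, driving the in-game camera from a tracked physical one comes down to copying the tracked pose onto the engine's camera every frame. Here is a minimal Python sketch of that idea; every class and method name is hypothetical, not part of ILM's actual pipeline:

```python
import math
from dataclasses import dataclass

@dataclass
class CameraPose:
    # Position and yaw of the tracked physical camera, in engine units.
    x: float
    y: float
    z: float
    yaw_degrees: float

class VirtualCamera:
    """Mirrors a tracked real-world camera inside a game engine."""

    def __init__(self):
        self.pose = CameraPose(0.0, 0.0, 0.0, 0.0)

    def update_from_tracker(self, pose: CameraPose) -> None:
        # In a real pipeline this pose would arrive over the network every
        # frame; here we simply copy the latest sample onto the in-game camera.
        self.pose = pose

    def forward_vector(self):
        # Direction the in-game camera is looking, derived from yaw only
        # (a full system would track pitch and roll as well).
        rad = math.radians(self.pose.yaw_degrees)
        return (math.cos(rad), 0.0, math.sin(rad))
```

Panning the physical camera then amounts to feeding a stream of poses through `update_from_tracker` and re-rendering the scene each frame.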
The thing is, not only does such a staged process take more time, it also robs the director of creative autonomy. What Sanders and his team are doing now is evolving and condensing the practice. With the ILM system, directors will be able to 'film' CGI sequences in a real room with real actors, while looking into their cameras and seeing the virtual world.

"It's kind of how Avatar was shot," says Sanders. "That's the idea: real-time animatics, real-time game cinematics, whatever you want. So for VFX shots, rather than doing render passes overnight, you can do base layouts, base lighting tests and animation tests and watch in real-time – this will be a game changer in the future."

This R&D concept is now part of ILM's Director's Toolkit, an array of digital tools designed to help moviemakers working on CG-heavy projects. As ILM spokesman Greg Grusby explains: "This new technology means we can give a director a view of their characters performing in the actual environment they will be seen in for a given shot or sequence, with interactive lighting, shadows and a choice of render looks. It's much more robust, and the system is flexible enough to be placed in a director's home or office. Via a game controller, the filmmaker can scout virtual locations, create storyboards based on actual camera placement, test camera moves, begin choosing lenses and camera platforms, or map out other technical details of a shot."

It's also a very hands-on, intuitive system. "I wanted to show management how easy it was to use the tech," Sanders continues. "So by myself, I shot three pieces of motion capture, processed it, looped it back over the game engine, and with the virtual cameras, I did maybe 30 different shots and 120 camera moves. Then I cut it all together in Final Cut Pro with audio in less than two days.
"The motion system is just broadcasting what's going on in the room, and the animation system's picking it up, retargeting it onto the character and recording it to the game engine. I was just trying to show that you don't need $10m and a team of 30 people to pump out animatics, or even game cinematics, anymore…"

And in this way, the director has complete control over his CGI shots, using cranes, dollies, whatever he wants to achieve a shot, which is then converted straight into CG footage. It's the sort of virtual filmmaking that can be difficult to grasp for conventional helmsmen. As Sanders explains: "We work with a lot of directors – some of them get it, some of them don't, in terms of the spatial understanding. Spielberg's awesome – he'd just grab the camera and go look for a shot that you would never have imagined, but he knew where it was in his head, and when he went to it, everyone would think, 'ah, that's why he's doing it.'"

Sanders is certain that this virtual filmmaking technology will filter across into games, and that we'll start seeing movie sequences that have been constructed in real-time. "I think you're going to get way more dramatic and cinematic quality in game scenes," he says. "It's already happening in the film sector, and in animatics in general – there's a lot more decision-making going on at the front end of the art process."

"The creative process is no longer linear," he continues. "It's more cyclical in terms of iteration. Traditionally, you'd have a script, then a storyboard, then you'd go into asset build, and you'd do your pre-production art, then you'd go into production and post-production… Well, all of that stuff is being considered at the front end now. And nowadays we're still using production tools in post-production – with Star Trek and Transformers 2 we used virtual cameras in post-production, live in the shot, actually adding CG camera moves on the animation, adding camera shake – JJ loves his camera shake.
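The broadcast-and-retarget step Sanders describes (the capture system publishing joint data every frame, the animation system remapping it onto the game character's own skeleton) can be sketched as a simple name-mapping pass. All joint and bone names below are invented for illustration; a production system would also convert between coordinate spaces and bone proportions:

```python
# Mapping from mocap-suit joint names to the game skeleton's bone names.
# These identifiers are hypothetical, not from any real rig.
RETARGET_MAP = {
    "mocap_hips": "pelvis",
    "mocap_spine": "spine_01",
    "mocap_head": "head",
}

def retarget_frame(mocap_frame: dict) -> dict:
    """Remap one frame of joint rotations onto the game character."""
    return {
        RETARGET_MAP[joint]: rotation
        for joint, rotation in mocap_frame.items()
        if joint in RETARGET_MAP  # drop joints the character doesn't have
    }

def record_take(frames):
    """Accumulate retargeted frames, as if recording a take into the engine."""
    return [retarget_frame(f) for f in frames]
```

Because each frame is processed independently as it arrives, the same code path serves both a live preview and a recorded take.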
"We replicate all of that realistically, in real-time, in Maya… it's happening throughout the process. It's changed the way we think."

It is somewhat comforting to note, however, that while digital technology guides everything that's done here, there is still room for techniques and influences garnered from the original movie trilogy. In another part of the building, Matt Omernick talks about how the design of Force Unleashed 2 has been heavily inspired by the grungy, melancholic look of Star Wars and The Empire Strikes Back.

"We go back to the original inspirations: that's true for story – back to the Joseph Campbell stuff – and that's true for art," he explains. "We look at the Ralph McQuarrie drawings a lot. Even though it's a very different art style, there's a spirit to what was done there that you don't see anywhere else."

"We drew a lot of inspiration from the very Earth-based, gritty, lived-in type of design that was new to sci-fi when the first films came out. Tatooine embodies it the most, that's for sure, especially when you get into Mos Eisley and everything is sand-beaten and torn up and chopped away. The cantina, the interiors, we used as visual reference quite a lot – we'd say, look, you can tell this door has had eight different types of droid bump into it seven times a day for 50 years. And then, in contrast, the Empire is so clean and perfect and reflective and shiny… We used those as our spectrum – it should be one step above Tatooine, somewhere around the bad part of Naboo!"

He also talks about how the art team has learned key techniques from Lucas and Irvin Kershner. "The Star Wars films were really smart about using colour carefully and thoughtfully," he explains. "Most backgrounds in the films are quite de-saturated – they reserve colour and saturation for the most important elements on screen. This gives the viewer absolute clarity on what's important in your image."
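Omernick's point about reserving saturation can be expressed as a simple image operation: pull saturation out of everything except the elements meant to guide the eye. A toy sketch using Python's standard colorsys module (the function name and mask scheme are illustrative, not how the game's renderer works):

```python
import colorsys

def desaturate_background(pixels, guide_mask, amount=0.7):
    """Reduce saturation of non-guide pixels so that fully saturated guide
    elements (flags, doorways) stand out against a muted background.

    pixels: list of (r, g, b) floats in the range 0..1
    guide_mask: list of bools, True where the pixel belongs to a guide element
    amount: fraction of saturation to remove from background pixels (0..1)
    """
    out = []
    for (r, g, b), is_guide in zip(pixels, guide_mask):
        if is_guide:
            out.append((r, g, b))  # keep guide elements at full saturation
        else:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            out.append(colorsys.hsv_to_rgb(h, s * (1.0 - amount), v))
    return out
```

With `amount=1.0` the background collapses to pure greys, which is the extreme version of the de-saturated backdrops described above.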
He shows a still from the opening moments of Star Wars, with R2-D2 and C-3PO walking down a corridor in the doomed rebel ship. The stark, featureless white background means our attention is locked on C-3PO, while R2-D2, with some blue colouring, is the secondary priority. There's similar background uniformity on Hoth and on Endor, forcing our attention onto the warring Rebel and Imperial forces.

In Force Unleashed 2, then, the team regularly uses flags and doorways painted in saturated blues and reds to indicate new areas for the player to explore. As Omernick explains: "It works really well, because during gameplay you have experience points popping up at you, you have holocrons, there's all this critical information – but what we don't want to end up with, which a lot of games do, is this sort of confusing visual confetti. Sometimes you just see colour everywhere. So we use colour cautiously, to guide the player's eye and give them information about their priorities."

The team has even produced a chart showing the over-arching game plot and the points at which emotion and action build to crescendos. Each stage of that process is given a relevant colour, matching the intensity of the narrative, and that colour is used throughout the environment. In this way the scenery becomes a subconscious guide to the story.

Omernick also talks about how the Force Unleashed design team has been given access to props from the films. "Up at Skywalker Ranch, there's this giant warehouse of archives, which has everything from Jabba the Hutt to a box that was in the corner of the cantina. Thankfully, a lot of those props still exist, but Lucas has also done a good job of documenting the movie props via photography and making that available to us. And there's all this really cool prop re-use that you only pick up on when you go through the archives.
An example is, in the cantina in Star Wars, behind the bar, there are these weird cylinders with holes punched in them – presumably that's what they pour drinks out of. But when you fast-forward through the movie and you see the droid IG-88, his head is the same object! (A Rolls-Royce Derwent jet engine burner, according to Wookieepedia.)

According to Omernick, even in the area of character design, where human modelling is now uncannily detailed and intricate, the team continually returns to basics. "Many games suffer from a similar visual problem – when you see an enemy way off on the horizon, you're like, 'well, what is that? Is it a stormtrooper? Is this person going to hurt me?' So we spent a lot of time working on each character's silhouette. We weren't thinking about volume, or colour, or whether they have lights or VFX on them – we just thought, let's get the silhouette right, so when you see a guy on the horizon you can immediately say, 'holy shit, that guy's huge and he's got a frickin' flamethrower!'"

To pack in more emotion and atmosphere, the art team (which reached 60 members at the peak of development on TFU2) has built a new deferred lighting engine, which allows multiple real-time light sources, adding an extra sense of Empire Strikes Back-style shadowy menace to everything. And for the vast menagerie of characters, there are new facial models with more polys, better use of shaders and a dramatically improved facial animation system.

There are nice little graphical flourishes, too – when lead character Starkiller ventures outside on the storm-swept planet of Kamino, his clothes get visibly soaked from the top down. When he goes back inside, they gradually dry and lighten.

For any new planets, species or spacecraft that LucasArts adds to its games, it needs to consult Lucasfilm licensing, the Orwellian-sounding department that checks everything is canon. "They understand what is, and what isn't, Star Wars," says Omernick.
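A deferred lighting engine of the kind mentioned above shades each pixel once, accumulating the contribution of every light from stored surface data (a G-buffer) rather than re-rendering the scene per light, which is what makes many simultaneous real-time lights affordable. A toy per-pixel sketch, diffuse-only and purely illustrative of the accumulation idea, not of LucasArts' actual engine:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, to_light):
    """Diffuse term: cosine of the angle between surface normal and light
    direction, clamped so surfaces facing away receive no light."""
    dot = sum(n, ) if False else sum(n * l for n, l in zip(normal, to_light))
    return max(dot, 0.0)

def shade_pixel(position, normal, albedo, lights):
    """Deferred-style shading: one pass over a G-buffer pixel, summing every
    light's diffuse contribution.

    position, normal: 3D tuples read from the G-buffer
    albedo: (r, g, b) surface colour
    lights: list of ((x, y, z), intensity) point lights
    """
    total = 0.0
    for light_pos, intensity in lights:
        to_light = normalize(tuple(lp - p for lp, p in zip(light_pos, position)))
        total += intensity * lambert(normal, to_light)
    return tuple(channel * total for channel in albedo)
```

Adding another light source is just one more term in the per-pixel sum, rather than another full geometry pass.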
"If there's something we're excited about, but unsure about, that's where licensing will come in. They'll be like, 'this is fine, but change this, alter that, the skin colour should be different…' That's the whole reason that division is there, to maintain continuity... They are the Star Wars brain!" Continues tomorrow See part one here