Creating a Sense of Realism in ‘Pokémon Detective Pikachu’

Framestore’s Jonathan Fawkner and MPC’s Pete Dionne discuss the various ways visual effects were produced to help director Rob Letterman bring the smart-aleck yellow cartoon sleuth to the screen.

Legendary Pictures’ and Warner Bros. Pictures’ comedy adventure ‘Pokémon Detective Pikachu,’ a Warner Bros. Pictures release. All images © 2019 Warner Bros. Entertainment Inc. and Legendary.

Despite the cartoon origins of the Pokémon characters, filmmaker Rob Letterman (Monsters vs. Aliens) and production VFX supervisor Erik Nordby (Passengers) were determined to situate the wise-cracking super-sleuth, Detective Pikachu, and his fellow characters in real-life locations and sets for Warner Bros. and Legendary Pictures’ action-comedy, Pokémon Detective Pikachu. “Erik was aware that he was making fantasy characters that were intrinsically artificial, so he wanted to do everything possible to ground them in a sense of reality,” describes Framestore creative director of film Jonathan Fawkner. “The vast majority of our footage was plate-based, shot on film.”

The Third Floor was embedded within the production to provide the postvis needed to help the filmmakers make editorial sense of the mostly empty live-action footage. “The postvis was an interesting thing for us because on our big sequences, there were lots of second-unit shots,” Fawkner explains. “You end up with an awful lot of empty plates, which makes it difficult for an editor to sort through the footage and know what he’s looking for. What was quite new for me on this one was that I persuaded Erik, Rob, and the editor Mark Sanger to hand over all of the rushes for our big action sequences. That enabled us to come up with animation ideas and create plates that would support them. We would recut the sequence with different plates, put the postvis on top and send them back to the editor, who either said, ‘That was a good idea. How about we put it here?’ or, ‘That doesn’t work.’ It was an organic process editorially.”

Pipeline R&D for the production centered around character development. “The big new thing was our fur solver, called Fibre, which we used for the first time on Christopher Robin,” Fawkner notes. “We were also able to use our jet system, which allows us to set up creature effects pipelines that often work right out of the box. We can jet any one shot without having to set it up, and the artists get a great first stab at looking at the creature effect. That stopped creature effects from being a bottleneck as a manual per-shot activity, which allowed us to be a lot more bespoke with the shots that demanded it.”

“Part of the story involves a poison cloud that effectively fills up this big space,” Fawkner reveals. “That was probably our biggest simulation challenge. It’s not just about making it big or making it read correctly. Every shot had to tell a story, so there was a huge amount of art direction that went into designing where the cloud of poison gas was going to fit. With the intricacies of the story, that meant some characters were going to be in this poison gas, but other characters had to avoid it. Every shot had geographic and editorial requirements, so the pictures became very art-directed, and that’s always difficult when you’re dealing with big simulations.”

MPC digitally recreated an entire Scottish valley for giant Torterra to occupy. “Our heroes find themselves in the middle of a forest, which begins to move and shift under their feet as they flee for safety – at the end of the scene it’s revealed that they are actually on the back of a mountain-sized Torterra as it rustles around with its pals in what we previously thought was a natural valley of mountains,” MPC VFX supervisor Pete Dionne shares. “For this, we shot as much plate-based material as possible in the mountains and valleys of the Scottish Highlands, which established the look of our immediate digital forest, as well as served as inspiration for how to scale a normal Torterra Pokémon up to the size of a mountain.”

A team of five people travelled across Northern Scotland to capture location data through a combination of ground, drone, and helicopter photography as well as fine and large-scale LiDAR scanning. “This data capture was crucial, as we had hundreds of assets of all scales and frequencies to create, from high-res clumps of grass to entire CG mountain landscapes,” Dionne explains. “Creating this CG forest with a convincing amount of detail and variety required scattering millions of instances of our assets throughout the terrain. We would begin by composing our terrain from bare sculpted grounds, derived from LiDAR and photoscans, then scatter all of our grass, foliage, tree, and debris assets on top of it, using a complex Houdini workflow to procedurally replicate the natural logic of this Scottish landscape.”   
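The rule-based scattering Dionne describes can be illustrated with a toy sketch. This is not MPC’s Houdini setup; the heightfield, slope thresholds, and asset names below are all invented for illustration. The idea is the same, though: derive per-point attributes from the terrain (here, slope) and let simple rules decide which asset instance lands where, so the distribution follows the landscape’s natural logic rather than pure noise.

```python
import random

def terrain_height(x, y):
    """Toy heightfield standing in for the LiDAR-derived ground."""
    return 0.002 * (x * x + y * y)  # a shallow bowl-shaped valley

def terrain_slope(x, y, eps=0.5):
    """Approximate slope magnitude via central finite differences."""
    dzdx = (terrain_height(x + eps, y) - terrain_height(x - eps, y)) / (2 * eps)
    dzdy = (terrain_height(x, y + eps) - terrain_height(x, y - eps)) / (2 * eps)
    return (dzdx ** 2 + dzdy ** 2) ** 0.5

def scatter(n_points, extent=100.0, seed=7):
    """Scatter instances, choosing each point's asset type by a terrain rule."""
    rng = random.Random(seed)
    instances = []
    for _ in range(n_points):
        x = rng.uniform(-extent, extent)
        y = rng.uniform(-extent, extent)
        slope = terrain_slope(x, y)
        # Invented rule of thumb: trees prefer gentle ground, rocks cling
        # to steeper slopes, grass fills in everywhere else.
        if slope < 0.1 and rng.random() < 0.4:
            asset = "tree"
        elif slope > 0.25:
            asset = "rock"
        else:
            asset = "grass"
        instances.append({"pos": (x, y, terrain_height(x, y)), "asset": asset})
    return instances

points = scatter(10_000)
counts = {}
for p in points:
    counts[p["asset"]] = counts.get(p["asset"], 0) + 1
print(counts)
```

In a production pipeline the same logic runs procedurally over millions of points, and the `"asset"` attribute drives render-time instancing instead of storing full geometry per point.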

“Every single foliage and tree asset was simulated and cached out in a variety of movements, building up a vast library of clips which were triggered based on the level and frequency of movement of the terrain on any given shot,” Dionne continues. “All of this, including the simulations, was designed with an intricate instancing workflow, which then allowed us to render around 100 shots of this dense shaking forest. This was probably the most complex CG build and sequence that I’ve ever been a part of, and the team did an amazing job executing it.”
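The clip-triggering approach Dionne outlines can be sketched in a few lines. This is a hypothetical simplification, not MPC’s tooling: the clip names, motion bands, and thresholds are invented. The core idea is that each asset carries a small library of pre-cached simulation clips, and per shot the terrain’s motion is quantized into level/frequency bands that index into that library, so no per-shot simulation is needed.

```python
# Hypothetical clip library: cached sims keyed by (motion level, frequency).
CLIP_LIBRARY = {
    ("still",   "low"):  "fern_idle_breeze.clip",
    ("still",   "high"): "fern_idle_flutter.clip",
    ("shaking", "low"):  "fern_heavy_sway.clip",
    ("shaking", "high"): "fern_violent_shake.clip",
}

def bucket_motion(amplitude, frequency_hz):
    """Quantize continuous terrain motion into the library's bands.
    Thresholds here are illustrative, not production values."""
    level = "shaking" if amplitude > 0.5 else "still"
    freq = "high" if frequency_hz > 2.0 else "low"
    return level, freq

def clip_for_shot(amplitude, frequency_hz):
    """Pick the cached clip matching this shot's terrain movement."""
    return CLIP_LIBRARY[bucket_motion(amplitude, frequency_hz)]

print(clip_for_shot(0.1, 0.5))  # calm ground -> idle clip
print(clip_for_shot(1.8, 3.0))  # Torterra on the move -> violent clip
```

Because every instance only references a clip name, the renderer can instance the same handful of caches across millions of plants, which is what makes a hundred shots of a shaking forest tractable.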

A climactic third-act battle takes place in Ryme City, with principal photography taking place in London. “For the majority of street-level scenes, set enhancement was limited to skyline embellishments and adding Pokémon-themed CG signage throughout the city,” Dionne states. “Where things got really challenging was for all of the aerial establishing shots and the battle above the street, as the scope of this work required a full CG city. Our goal was to create a modern, bustling city, but with a much denser and taller skyline than what London had to offer.”

A five-block radius of Leadenhall Street was rebuilt for the film’s Pokémon parade scenes. “This required a crazy coordinated effort of street-level photography, street-level and rooftop LiDAR acquisition, as well as both extensive aerial drone and helicopter photography,” Dionne adds. “From all of this data, we were able to recreate a high-resolution CG version of a downtown core to use as a base. Our next layer of development was to scatter around 50 additional custom-designed skyscrapers across downtown, which gave us a much more congested area for Mewtwo and Pikachu to weave through during their aerial battle.”

In order to create CG establishing shots of Ryme City from the outside looking in, the environment needed to be extended. “For this, we accessed a large-scale open-source LiDAR scan of Vancouver and its surrounding environment, and then dropped our embellished London core into the middle of it,” Dionne remarks. “As Vancouver lacks the tower density that Rob Letterman desired, Manhattan was used as an inspiration for laying out hundreds of additional skyscrapers into our digital city. The build took us over eight months to complete. However, with it being so robust and fully digital, it allowed us to then turn around 200 massive shots in about two months, which was a thrill to watch come together.”