Tag Archives: MPC

Ed Koenig has rejoined visual effects and post production studio MPC as an executive producer. Formerly EP of color at MPC’s Los Angeles office, he was one of the first hires the company made when it opened there in 2008. Koenig brings a broad range of post experience to his new role, where he’s tasked with growing the studio’s network of official partner facilities and expanding its remote services beyond color grading. He will be based in New York, but work out of LA as well.

MPC has also announced new additions to its official partner facility roster: 11 Dollar Bill in Chicago and Hero Post in Atlanta. They join Charlieuniformtango in Austin and Dallas, The Work in Detroit and Ditch in Minneapolis on the studio’s list of partner facilities.

“We’re not merely looking to connect with top performers in major markets around the country,” Koenig explains, “but to redefine how these independent companies work with a studio as multifaceted as ours. I’ll also be functioning as a kind of roving advance scout for MPC, finding ways clients anywhere can take full advantage of what we have to offer in a way that works best for them.”

He cites as an example the work they’ve done with Ditch since adding them to the partner roster last year. MPC has not only provided several of the boutique’s clients with high-end color grading, performed by colorists in both its LA and New York offices, but also compositing, finishing, Flame work and a range of other VFX services, all performed by artists based many miles from the Twin Cities. “In these instances, we just point our signal toward Minneapolis and we’re collaborating with Ditch owner and editor Brody Howard and his entire team.”

“A big part of what I’m doing is bringing wide-ranging projects into MPC through our remote partners,” he continues. “While remote color has become an accepted part of the post production mix, our push to expand into a broader roster of visual effects capabilities puts us out in front.”

Warner Bros’ Fantastic Beasts and Where to Find Them is considered by some a Harry Potter prequel and by others an entirely new J.K. Rowling franchise. Filled with nearly 1,500 VFX shots, this live-action, CG and character-driven movie put a huge emphasis on pre-pro and established the increased role of postvis in the film’s visual effects post pipeline.

For the film’s overall visual effects supervisors, Tim Burke and Christian Manz, it was a family reunion of sorts, reteaming with many of the companies and individuals that worked on the Harry Potter movies, including director David Yates and producers David Heyman, Steve Kloves, J.K. Rowling and Lionel Wigram.

According to Manz, one of the most significant aspects of this film was how visual effects were integrated into the story from the very beginning. The direction from Yates was very clear: “Make things fantastic, but not fantasy.” For every creature design presented, Yates would ask, “What would be the story behind that creature? What would that character do if the audience saw it from one moment to the next?” Says Manz, “It all had to work to support the story, but not be the story.”

Manz feels that this movie speaks to a new way of storytelling with VFX. “Visual effects is now a part of that filmmaking and storytelling team rather than being the guys who stick in everything afterwards.”

Starting in January 2015, while Burke was busy as VFX supervisor on The Legend of Tarzan, Manz worked with Framestore animation director Pablo Grillo, a Framestore art and animation team and a group of freelance concept and previs artists doing creature development and scene design. Over eight months of design sessions they created 18 main animal types based on hundreds of variations, and worked with the Framestore art department to conceive the turn-of-the-century New York City sets and set extensions.

“Of course, there were creatures we tested that didn’t make the film,” says Framestore animator Andras Ormos, “but it was about the process of whittling it down, and that was the real joy of working on this project. The creative input stretched beyond the post production stage, deciding what worked and what wouldn’t in the overall film.”

“J.K. Rowling’s wonderful script was filled with characters and creatures,” explains Grillo. “Having seen how animation is such a big part of the process in a film like this, we decided that it was important to be involved from the concept stage onwards.” The character development and scene work sessions were so impressive they actually influenced subsequent drafts of the script.

Burke came on full-time in June 2015, and they split the movie in half. Manz took the lead developing the world inside Newt’s magical case, and Burke did the “Obscurus” and the third act. Principal photography took place from August 2015 to January 2016, and they took turns on set supervising their own and each other’s VFX sequences.

With Framestore and Double Negative taking the lead, the shots were spread out among nine main VFX and three previs/postvis companies, including Cinesite, Image Engine, Method Studios, Milk Visual Effects, Moving Picture Company, Nvizible, Proof, Rodeo FX, Secret Lab and The Third Floor. Burke says they divided the work by “the strengths of the companies and without overlapping them too much.”

Framestore
Framestore took on the majority of the complex character animation: the Niffler, Gnarlack, the Erumpent and Pickett the Bowtruckle, as well as many goblins and elves. Grillo first tackled the Niffler, described by Rowling as “a long-snouted, burrowing creature native to Britain with a penchant for anything shiny.” The creature design was a mash-up of a spiny anteater, platypus and mole, and went through hundreds of iterations and many animated prototypes. Framestore used Flesh and Flex, the skin and muscle rigging toolkit developed for Tarzan, on the Niffler’s magic “loot stuffing” pouch.


The reason the audience is so delighted when this character first appears, explains Ormos, is that “this scene is driven by the relationship between Newt and the Niffler. There was a history we had to get across — the fact that the Niffler was notorious for escaping and pick-pocketing, and that Newt was going through the motions in trying to catch him. They understood each other and there were little looks, a language in their movement.”

Gnarlack, an American, cigar-chewing, snarky goblin, voiced and facially mocapped by actor Ron Perlman, “is one of the best digital humanoids yet,” reports Grillo. Perlman donned a Vicon Cara 3D facial motion capture headset, surrounded by four high-resolution, high-speed witness cameras. According to Framestore VFX supervisor Andy Kind, Perlman also sat in front of 98 cameras for a Facial Action Coding System (FACS) session so the team could sculpt the face directly in 3D.

“We created CG characters for the giants, elves and band ensemble,” says Kind. “Then we gave them crazy instruments, including a sousaphone/trumpet concoction.”

A 17-foot carbon fiber puppet, built by Handspring Puppet, substituted for the amorous rhinoceros Erumpent during the Central Park chase scene. It was switched out with the CG version later and dynamic simulations of shattering ice, explosive snow and water effects were added to the concluding shots. There’s this liquid, light-filled sack on the Erumpent’s forehead that Manz says, “made her slightly more unusual than a normal creature.”

“There was an awful lot of digital environment as well as the beast itself,” continues Manz. “David Yates fell in love with the postvis for this scene. It was great to be able to play with shots and offer up suggestions for the edit. It was a very organic way of filmmaking.”

Newt’s pocket-hiding creature sidekick, Pickett the Bowtruckle, took two months and 200 versions to get right. “We were told that Pickett moved too slowly at first and that he appeared too old. We played with the speed but kept his movements graceful,” explains Manz. “He didn’t really have any facial animation, but he does blow a raspberry at one point. In the end, we added more shots to get Pickett’s story to go through, as everyone just loved him.”

MPC
The Moving Picture Company (MPC) completed more than 220 shots and created the Demiguise, Occamy and Billiwig, as well as 3D set extensions of period Manhattan.

For Demiguise’s long, flowing hair and invisibility effect, MPC used their Furtility groom technology. According to MPC VFX supervisor Ferran Domenech, using Furtility “allows for the hair to move naturally and interact with the creature’s arms, legs and the environment around it.” Demiguise was animated using enhanced mocap with keyframed facial expressions.


MPC built the large feathered dragon-snake Occamy in sections to fill the real and CG extended attic. They used Furtility once more, this time to add feathers, and they augmented the code so that in the climactic fight scene they could scale the giant version of the creature down to mouse-size. MPC’s effects team then used its in-house Kali destruction technology to wreck the attic.

Finally, MPC worked on the Billiwig, a magical bug that can change its flight mode from dragonfly to propeller plane. “This little creature has lots of character and was great fun to bring to life,” reports Domenech.

Previs and Postvis
A major technical advance for Fantastic Beasts can be found in the workflow. It’s been 15 years since the first Harry Potter movie and five years since Deathly Hallows. Over that time Burke had designed a very efficient, streamlined, mostly film-based VFX workflow.

“In the past, we were always stuck at the point where when we shot the film, it was put into editorial, they cut it and then gave it back to us — quite often with big holes where creatures would exist or environments needed to be placed,” describes Burke. “Then we would have to involve the facilities to use their real power to push things through and start blocking out all of the characters. This took quite a bit of time and would always slow the process down, and time is really the key difference with everything we do these days.”

In the past, says Burke, he might wait two months to see an early block of an animated character, “which always then restricts what you can do at the back end or restricts the director’s ability to make creative changes.”

Thankfully this wasn’t the case with Fantastic Beasts. “In the middle of the shoot, Christian and I started supervising the postvis of the scenes we’d already shot,” he explains. They assembled a 50-artist in-house postvis team made up of members of The Third Floor, Proof and Framestore. While some of the movie was prevised, all of the movie was postvised.

“The path from previs to postvis varied from sequence to sequence,” explains Peter McDonald, previs/postvis supervisor for The Third Floor, London. “At one end of the scale, we had sequences that never really survived through shooting, while at the other end we had sequences that were followed shot-for-shot during the shoot and subsequent editorial process.”

Third Floor postvis

“As an example,” he continues, “the Demiguise and Occamy scene in the department store attic was heavily prevised. The final previs was a pretty polished and spectacular piece in its own right with some relatively sophisticated animation and a highly refined edit. This previs edit was taken onto the stage, with printouts of the shots being referenced as the shoot day progressed. What later came back our way for postvis was very similar to what had been designed in the previs, which was very satisfying from our point of view. It’s nice to know that previs can help drive a production at this level of fidelity!”

One of the benefits of this process was having a movie during editorial that had no “holes” where VFX shots were to later appear. The “postvis” was so good that it was used during audience screenings before the VFX shots were actually built and rendered.

“There were a couple of factors that elevated the postvis,” says McDonald. “Probably the primary one was integration between Framestore’s team with our team at The Third Floor London. Having them present and being supervised by Pablo Grillo guaranteed that the work we were putting together was being judged from almost a ‘finals’ point of view, as Pablo and his artists would also be the ones finishing the job in post. It meant that our postvis wasn’t a throwaway — it was the first step in the post production pipeline. This philosophy was present beyond the personnel involved. We also had creature rigs that could be translated with their animation down the line.”

Third Floor’s previs of subway rampage.

One example of a scene that followed through from previs to postvis was parts of the Obscurus rampage in the subway. “Pablo and I worked very closely with artists at both Framestore and The Third Floor on this ‘sequence within a sequence,’” says McDonald. “We started with Bifrost fluid simulations created in Maya by our own senior asset builder Chris Dawson. We then had our animators distort these simulations into the virtual subway set. Through iteration, we developed the choreography of the piece and further refined the overall rhythm and shape of the piece with our previs editor. This previs then became the template for what was shot on the actual set with Eddie Redmayne and Colin Farrell in the roles of Newt and Graves. When the plate edit returned to us for postvis, we were pretty much able to drop the same distorted simulations onto the plates. The camera angles and cutting pattern in the previs edit had been followed very closely by the live-action unit. We then animated a layer of environment destruction and comped it into the shots to help tie it all together.”

During postvis, says Manz, “A character could go into a shot within a day or two. You would track the shot plate, put the character in, light it and stick it back into editorial. That sort of turnaround, that in-house work that we did, was the big, big difference with how the film worked. It allowed us to feed all that stuff to David Yates.”

Yates showed his director’s cut to the studio with every single shot of the film blocked out. There were no empty spaces. “We even got the environments in so he had a representation of every street,” says Manz. They completed a three-hour assembly of the film in about five months.

Creatively, it was very liberating, which enabled them to do additional shoots just to enhance the movie. Burke says they were designing and changing shots right up to the end. The final reveal of Johnny Depp as dark wizard Gellert Grindelwald came together at the last minute.

Fantastic Beasts is like a Harry Potter movie because it exists in the J.K. Rowling story universe and is part of the Harry Potter lore. “Where it’s similar to Potter in terms of the filmmaking,” says Manz, “is in making it feel very real and not fantasy. What I always enjoyed about the Potter films was they really felt like they were set in a real version of the UK; you could almost believe that magical sort of school existed.”

Third Floor previs

How Fantastic Beasts is different, says Burke, is that it is set in turn-of-the-century New York City, a real city, and not the magical school environment of Hogwarts. “We were dealing with adults,” continues Burke, “we’re not talking about a child growing and running everything through a child’s life. We’re talking about a series of adults. In that sense, it felt when we were making it like we were making a film for adults, which obviously has great appeal to children as well. But I do feel it’s more of a film for grown-ups in terms of its storyline, and the issues it’s challenging and discussing.”

Someone told Manz that they “felt like Fantastic Beasts was made for the audience that read the books and watched those films and has now grown up.”

IMDb lists four new Fantastic Beasts movies in development. Burke and Manz are already in pre-production on the second, penciled in for a November 16, 2018 release date. “I think it’s fair to say,” says Burke, “that we’ll obviously be trying to expand on the universe that we’ve started to create. Newt will be there with his animals and various other characters, which are going to bring a lot more interest as the story evolves.”

Manz predicts, “It’s just trying to push the believability of the interactions and that world even further. The first film was all about creating that new world, and now it’s out there. It will be a new city (Paris) so we’ll have that challenge again, but we’ll build on what we’ve learned. You don’t often get an opportunity to work with the same team of people, so that’s going to be great for the second one.”

After starting his career as a runner at MPC in London and working his way up to director of the VFX department, Patrick Davenport has returned to the studio as chief operating officer in the US, managing operations in Los Angeles and New York. He joins MPC during a US growth spurt — the company is moving from its current Santa Monica office to a larger, purpose-built studio in Culver City, slated for this summer.

Davenport left the UK in 1997 to join Digital Domain in Los Angeles, helping to set up and oversee a newly dedicated commercial division. In subsequent years he has held varied roles in VFX and production, but has spent the last six years working as SVP of global operations for Method Studios. He became managing director of CO3 and Method Studios in Los Angeles in January 2015.

“I’m thrilled and delighted to be coming back to MPC, who has had incredible global expansion in both advertising and film over the last 25 years, working at the very highest creative levels,” says Davenport. “I’m excited to be helping with the new space in Culver City and expanding the existing teams in both New York and Los Angeles.”

Disney’s The Finest Hours is based on the true story of the greatest small boat rescue in Coast Guard history. As you can imagine, the film, which stars Casey Affleck and Chris Pine, is chock full of visual effects shots — 900 of which were supplied by London’s MPC. Over a 10-month period, MPC VFX supervisor Seth Maury and producer Felix Crawshaw worked closely with the film’s director Craig Gillespie and production VFX supervisor Kevin Hahn.

MPC’s work primarily consisted of recreating an immersive nor’easter storm, including ocean swells, turbulent seas, blowing rain and snow, and a sequence where a 36-foot Coast Guard boat must cross a digital version of Chatham bar, with 30- to 50-foot waves rolling, swelling and crashing around them.

Shooting began in the fall of 2014 in a 120-by-80-foot, 12-foot-deep water tank built for the shoot at a warehouse in Fore River Shipyard in Quincy, Massachusetts. The filmmakers built a large gimbal set of the Coast Guard CG36500 rescue boat, a full-size mock-up of the Pendleton hull and a replica of the Pendleton engine room that could be flooded with six feet of water. A number of scenes were also filmed at various locations in Cape Cod, and four period rescue boats were restored for production.

The visual effects work started early on in production with the team creating some full CG shots in order to understand what would be required to create a storm of the magnitude required to tell the story.

MPC’s team completed around 300 large-scale water simulation shots (using Flowline), 300 weather and environment shots and 300 shots of the ocean behind actors shot on bluescreen. Digital doubles of the crew members, as well as digital versions of the Pendleton T2 tanker and CG36500 Coast Guard boat were also built by MPC’s team.

For the large-scale full CG water simulation shots, MPC built a library of FX elements and CG renders that were needed to create the ocean, consisting of the base ocean, a foam pass, a mist pass, a fine spray pass, a bubble pass, a water texture pass and refraction and reflection passes for water surfaces. A constantly blowing CG mist pass that the filmmakers named Speed Mist was added to the real footage shot on location in Cape Cod and on set in Quincy to accentuate the storm. For the shots that were panoramic enough to show interaction between the practical or CG boat and the water surface, there was also a suite of elements rendered to connect the boat and water, such as splashes, sprays, bow wakes, tail wakes, an engine wash of bubbles, turbulent water, foam and spray.
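The pass structure described above is a standard layered comp. As a rough illustration (not MPC's actual Nuke setup — the function and pass names here are hypothetical), separately rendered premultiplied passes might be combined with "over" operations, with an atmospheric mist screened on top under an artist-controlled gain:

```python
import numpy as np

def comp_ocean(base, foam, mist, spray, mist_gain=1.0):
    """Illustrative layering of separately rendered ocean passes.

    Each pass is an RGBA float array (H, W, 4) with premultiplied color.
    This simple over/screen math stands in for a real compositing tree.
    """
    out = base.copy()
    for layer in (foam, spray):
        a = layer[..., 3:4]  # premultiplied alpha of the layer
        # "over": layer composited on top of the accumulated result
        out[..., :3] = layer[..., :3] + out[..., :3] * (1.0 - a)
        out[..., 3:] = layer[..., 3:] + out[..., 3:] * (1.0 - a)
    # mist contributes as a screened layer scaled by an artist gain
    out[..., :3] = 1.0 - (1.0 - out[..., :3]) * (1.0 - mist_gain * mist[..., :3])
    return np.clip(out, 0.0, 1.0)
```

In a production comp each of these merges would be its own node with per-layer grading, but the ordering — base ocean first, interaction elements over it, atmosphere last — follows the element breakdown described above.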

Simulations were created in Maya, Flowline and Houdini (including Houdini Engine). Houdini was used for streaming water and environmental effects such as rain, snow and blowing mist — it was chosen because of its strength in handling many shots procedurally. Most of the shots were cached using the Houdini Engine plug-in within Maya. MPC built custom interfaces to simplify their workflow, manage assets and enable artists to handle multiple layers and shots together. They called on V-Ray and PRMan (Katana) for rendering.

The most challenging work involved a sequence where the CG36500 Coast Guard boat crosses the Chatham Bar in 30-50ft waves. The waves were in various states of swelling, crashing, dumping and spilling. MPC developed systems for creating each of these waves, and additional systems for adding layers of waves into the same scene. Many shots in this sequence were crafted as single-solution waves, such as looking down the barrel of a crashing wave, or being pushed along and backwards by a wave that had already crashed. Maya was used for wave rigs, layout, animation, geometry preparation and some of the FX work. MPC built a complex (and quite flexible) wave rig used by the animation department to prototype and design waves. Once size, speed and shape were locked in animation/layout, the FX department brought caches into Flowline for dynamic simulations, sprays, etc.
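A wave rig with size, speed and shape controls like the one described is often built on a parametric model such as the Gerstner (trochoidal) wave. The sketch below is a generic textbook formulation, not MPC's rig; the steepness parameter plays the role of the animators' "shape" control, crowding surface points toward the crests:

```python
import math

def gerstner_wave(x, t, amplitude=1.0, wavelength=10.0, speed=2.0, steepness=0.5):
    """Generic Gerstner wave: displaced (x, y) for a surface point at rest
    position x, at time t. Steepness in [0, 1] sharpens the crests, roughly
    analogous to a wave rig's 'shape' control."""
    k = 2.0 * math.pi / wavelength          # wave number
    phase = k * x - k * speed * t           # traveling-wave phase
    qx = x + steepness * amplitude * math.cos(phase)  # horizontal crowding
    qy = amplitude * math.sin(phase)                  # vertical displacement
    return qx, qy
```

Animators would keyframe these parameters to design a single "hero" wave, after which the locked shape is handed to FX for full fluid simulation, mirroring the animation-to-Flowline handoff described above.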

VR was everywhere at CES earlier this month, and LA’s MPC played a role. Their content production arm, MPC Creative, produced a film and VR experience for CES 2016, highlighting Faraday Future’s technology platform and providing glimpses of the innovations consumers can expect from their product. The specific innovation shown in the CES VR film was a concept car — the FFZERO1 high-performance electric dream car — and the inspiration around Faraday Future’s consumer-based cars.

“We wanted it to feel elemental. Faraday Future is a sophisticated brand that aims for a seamless connection between technology and transportation,” explains MPC Creative CD Dan Marsh, who also directed the film. “We tried to make the film personal, but natural in the landscape. The car is engineered for the racetrack, but beautiful, in the environmental showcase.”

To make the film, MPC Creative shot a stand-in vehicle to achieve realistic performance driving and camera work. “We filmed in Malibu and a performance racetrack over two days, then married those locations together with some matte painting and CG to create a unique place that feels like an aspirational Nürburgring of sorts. We match-moved/tracked the real car that was filmed and replaced it with our CG replica of the Faraday Future racecar to get realistic performance driving. Interior shots were filmed on stage. We chose to bridge those stage shots with a slightly stylized appearance so that we could tie it all back together with a full CG demo sequence at the end of the film.”

MPC Creative also produced a Faraday Future VR experience that features the FFZERO1 driving through a series of abstract environments. The experience feels architectural and sculptural, and ultimately offers a spiritual versus visceral journey. Using Samsung’s Gear VR, CES attendees sat in a position similar to the angled seating of the car for their 360-degree tour.

MPC Creative shot the pursuit vehicle with an Arri Alexa and used a Red Dragon for drone and VFX support. “We also mounted a Red, with a 4.5mm lens pointed upwards on a follow vehicle that allowed us to capture a mobile spherical environment, which we used to map moving reflections of the environment back onto the CG car,” explains MPC Creative executive producer Mike Wigart.
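A spherical capture like the one Wigart describes is typically stored as an equirectangular (lat-long) image, and reflections on the CG car are produced by reflecting the view direction off the surface normal and looking that direction up in the map. A minimal sketch of that lookup math (illustrative only — the actual shading happens inside the renderer):

```python
import math

def reflect(d, n):
    """Reflect incident direction d about unit normal n: r = d - 2(d.n)n."""
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(d, n))

def direction_to_equirect_uv(dx, dy, dz):
    """Map a normalized 3D direction to (u, v) in [0, 1] on an
    equirectangular environment image (y is up, -z is forward)."""
    u = 0.5 + math.atan2(dx, -dz) / (2.0 * math.pi)   # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, dy))) / math.pi  # latitude
    return u, v
```

Because the follow vehicle captured the environment in motion, each frame of the spherical footage supplies a fresh map, so the reflections track the real landscape sliding past the CG body panels.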

How did working on the film versus the VR product differ? “The VR project was very different from the film in the sense that it was CG rendered,” says Wigart. “We initially considered the idea of doing a live-action VR piece, but we started to see several in-car live-action VR projects out in the world, so we decided to do something we hadn’t seen before — an aesthetically driven VR piece with design-based environments. We wanted a VR experience that was visually rich while speaking to the aspirational nature of Faraday Future.”

Adds Marsh, “Faraday Future wanted to put viewers in the driver’s seat but, more than that, they wanted to create a compelling experience that points to some of the new ideas they are focusing on. We’ve seen and made a lot of car driving experiences, but without a compelling narrative the piece can be in danger of being VR for the sake of it. We made something for Faraday Future that you couldn’t see otherwise. We conceived an architectural framework for the experience. Participants travel through a racetrack of sorts, but each stage takes you through a unique space. But we’re also traveling fast, so, like the film, we’re teasing the possibilities.”

Tools used by MPC Creative included Autodesk Maya, Side Effects Houdini, V-Ray by Chaos Group, The Foundry’s Nuke and Nuke Studio and Tweak’s RV.

Fabric Software is launching Fabric Engine 2 at SIGGRAPH 2015 in Los Angeles. The second generation of the development platform features the Canvas visual programming system that was previewed in March, as well as integration with Pixar’s RenderMan, The Foundry’s Modo and the Unreal Engine 4.

Fabric Engine 2 makes it even easier for studios of all sizes to build high-performance tools and applications for VFX, games, virtual reality (VR) and visualization. Fabric Software has more than 400 users on the beta program and expects to release Fabric Engine 2 in September.
On Tuesday, August 11, during a Fabric Software user group at SIGGRAPH, major VFX studios such as Double Negative, MPC, Hybride and Psyop will demonstrate how Fabric fits into their pipeline.

CAN YOU DESCRIBE YOUR COMPANY?
MPC is an international creative studio. You’ll find our work just about everywhere, from the VFX sequences in feature films like Godzilla to a mobile app for the X-Factor series, and everything in between. I work in our motion design studio in London. This is technically our advertising department, and we do a lot of commercial work. But we also make pop promos, virals and film titles.

WHAT DOES THAT ENTAIL?
Motion design covers a vast spectrum of work, from graphic 2D animation to photoreal 3D. Our clients bring us a huge variety of work, so I’m usually jumping between a range of tasks. One minute, I might be in a meeting with clients, bashing through treatments and schedules. The next minute, I could be animating a complex 3D scene. The rest of the day-to-day work usually involves leading a team of animators as we put together a TV commercial, pop-promo or title sequence.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
A lot of us come from fields outside of animation and design. My degree is in American Literature. Right now I share a desk with a former sommelier. We also have a former rock star in our midst, but he’d rather I not mention it.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I think it’s the mix of the creative and technical roles. When I was little, I loved taking apart things in the house and putting them back together, with varying degrees of success. My dad got so fed up. He eventually started bringing home broken appliances, like radios, for me to play with. These usually involved mains electricity. So I really shouldn’t be alive today. Since then I’ve always wanted to take things apart, see how they work, and then put them back together the way I want them to be. I see it as a form of creativity. So using technology to tell stories is a dream job for me. I think motion graphics is full of people like me: people who want to be creative and technical at the same time.

WHAT’S YOUR LEAST FAVORITE?
The gender imbalance. We really need to get more women working in our industry.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Anytime the phone stops ringing and the email stops pinging and I can get down to work.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Good question. Hopefully, something with a little more fresh air, but, honestly, I love this work.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have a learning disability, which makes it very difficult to work with traditional media, like a pencil or brush. This was a real frustration growing up because I’ve always loved design. I remember the rush I got watching Star Wars; all those beautifully designed ships. Even the Star Wars logo inspired me. I desperately wanted to make my own spaceships and title designs, but I couldn’t even draw a straight line. That changed when I started playing with computer graphics. Suddenly my clumsy hands didn’t matter anymore. Even the rudimentary tools in the first Mac computers were enough to convince me that this is what I wanted to do.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’ve just wrapped up a commercial for a cognac brand. The job involved over 10 shots of fluid simulations, so it was a technical challenge, but a rewarding one. We’re pretty confident we can animate any sort of falling booze now. I’ve also worked on the Adidas Predator Instinct commercial (above) and projects for Liberty Human Rights (pictured below left) called Where Do They Go? and Amnesty International called I Have a Name (pictured below right).

Conversely, I’m now directing a series of 2D character animation films for a pharmaceutical brand. It feels like a completely different world. But it’s a great chance to direct something with a bit of humor.

I’m also doing some tests using game engines for our animation work. This could be a new frontier for us.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Many years ago, I made two ads for charities I support: one for Amnesty International, and one for Liberty Human Rights. They just started out as ideas that I wanted to try out, but as I worked through them, I realized they could actually help these groups out. So I showed them both to their target charities. In both cases they picked up the spots and integrated them into their TV, cinema and online campaigns. Looking back now, the ads were pretty simple, and in some ways naïve. But I’m still glad I got them out there. Working in advertising often means creating work that pushes the boundaries of honesty. So I think it’s vital we use our skills to help people, rather than just convince them to buy things.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
It’s probably obvious, but I can’t live without my smartphone. The strange thing is I hardly use it as a phone. It’s kind of a digital Swiss Army knife. I have a long commute to work, so my phone is like an entertainment center. I watch films on it, read books and tweet the occasional rant about my train service.

And the Internet… I think sometimes we forget that it’s a piece of technology, but it’s had such an impact in the way we work. It’s an amazing resource for learning and researching projects, but at the same time, I think it facilitates laziness. It’s so easy now for designers to find something cool online and knock it off as their own project. I see quite a few creative briefs come in that are clearly little more than a trawl through the creative blogs. So I suppose it’s a blessing and a curse.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
The usual suspects: Facebook, LinkedIn, Twitter. I’m also pretty active on forums like CG Talk. I regularly check the motion graphics blogs like Motionographer, Stash and Mograph, but I really prefer the face-to-face meet-ups like See No Evil in London.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I rarely get a chance, but when I do, being an expat American, I like to listen to US online radio stations like KCRW.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’m a father of three. Work is where I go to relax.

One of MPC’s latest commercial productions features the story of a mother and son on an epic trip in search of dinosaurs in the new Nescafé Goldblend spot from Publicis London. The commercial was filmed on location in South Africa’s Royal Natal National Park, and the MPC VFX team slightly augmented the park’s famous rock amphitheater to create part of the scenery.

Jon Park supervised the 3D team in the creation of the Jurassic creatures, with a main focus on the T-rex and her babies. Each dinosaur was treated individually, with the team modeling and rigging herds of brachiosaurs, pterodactyls, stegosaurs and T-rexes. Shooting conditions in the park proved challenging, with a daily two-hour hike to locations and the team carrying all required equipment. Local Zulu guides were employed to help with the more treacherous terrain, including crossing a rocky stream via a rope bridge.

Back in the studio, VFX supervisor Dan Sanders led the MPC team in creating a prehistoric look and integrating the CG dinosaurs. In addition to the matte painted volcano and corresponding reflection seen in the canoe scene, the team added extra waterfalls, clouds and mist, and altered the appearance of the sky and time of day.

The dinosaurs were integrated using multiple techniques. For instance, gobo lighting techniques were used in the T-rex scenes to create a dappled-light effect over the dinosaurs' skin and draw the eye quickly to the animated creatures.

Tools used included Autodesk’s Maya and Flame, Adobe Photoshop and The Foundry’s Mari and Nuke.

Jo Arghiris has joined MPC Los Angeles as executive producer for VFX. In her new role she will work closely with managing director Andrew Bell and alongside executive producer Elexis Stearn to build and strengthen client relationships, oversee operations and manage production teams.

Arghiris began her career in production at MPC in London and has over 15 years of experience in VFX advertising. Before rejoining MPC, she rose through the ranks at The Mill, working across London, New York and, most recently, Los Angeles as an executive producer, where she led projects with leading agencies including Deutsch, Goodby Silverstein & Partners, 180LA, Wieden+Kennedy and TBWA Chiat Day.

Arghiris has won a Silver Lion at Cannes, two Yellow Pencils at D&AD, a Gold Clio and a Gold at the One Show, among others. She has worked with directors such as Traktor, Nicolai Fuglsig, Jake Scott, Stacy Wall, Dante Ariola, Harold Einstein and Jim Jenkins on campaigns for Nike, Budweiser, Adidas, Target and Coca-Cola. She was also instrumental in projects such as Tooheys’ “The Quest,” directed by Garth Davis at Exit Films, and PlayStation’s “Greatness Awaits,” directed by Rupert Sanders at MJZ.

MPC Creative, the content production division of MPC, has hired executive producer Sophie Gunn. She brings extensive experience across agency, VFX and motion design production, and until recently oversaw the growth and development of the MPC Motion Design studio, which specializes in dynamic motion graphics, design, illustration and animation.

In this new role, Gunn has been involved in the production of 19 online films for adidas, including Predator Instinct, celebrating the 20th anniversary of the eponymous boot, and starring German footballer Mesut Özil.

Gunn is now working closely with head of creative strategy for MPC Ben Cyzer, who says, “MPC Creative incorporates concept development, direction, production and VFX all in-house. Sophie’s experience, along with her kooky personality, is perfectly placed to strengthen and develop the MPC Creative brand. The success of our collaborations with adidas is testimony to Sophie’s inventive ways of working and aesthetic eye.”

The news comes alongside the recent appointment of Zak Thornborough to EP of MPC Creative in the US.