A number of robots are happily engaged in a series of repetitive tasks, completing each cyclic movement with clockwork precision. But when the system begins to accelerate, the poor automatons are pushed beyond their operational limits, with grisly consequences …

Vicious Cycle is a personal project by director Michael Marczewski, who works as a motion designer at ManvsMachine in London (the studio responsible for the stunning Versus, commissioned by MAXON for the release of Cinema 4D R18). The film, which was completed entirely by Michael himself over the course of a year in his spare time, was a Vimeo Staff Pick and has been entered into several short film festivals.


"I've always been intrigued by intricate mechanisms," he comments, "and I began to play around and make some in Cinema 4D. Then the idea of connecting them to helpless robots came, and it evolved from there. My initial idea was to make a fake instructional video with the robots acting out various tasks to demonstrate how to do them but then everything starts to go wrong. I think the malfunctioning allows for a lot of comedic moments in the film."

The amazing thing about Vicious Cycle is that it was all done with just a handful of keyframes and not a single IK rig. Instead, Marczewski relied almost entirely on Cinema 4D's rigid body dynamics. "I designed the [mechanisms] to be simple yet effective but also look visually interesting. I set the scenes up, added a small number of keyframes here and there, and then let the dynamics in the scene play out."

Around 80 percent of the motion is dynamic but Michael admits that some moments had to be 'faked' by manually animating part of the scene, or baking the simulation and tweaking the keyframes. He also used cuts in the edit to switch between dynamic motion and faked animation.

Marczewski knew what motion he wanted the robots to perform and started by sketching out how each mechanism worked. "The trick was to link those two things together – it was done through a lot of trial and error! Each scene had to be approached a little differently to make sure the motion looked smooth and natural."

A typical setup starts with a keyframed object, such as a spinning cylinder with a Collider Body tag acting as a motor. However, Michael often used an animated object instead of a motor, as many of the setups use repeating keyframes and it was easier to keep track of the animation. A variety of Connectors were then used to link everything together. "The simplest setup was the waving robot in the intro," he says, "and the most complex was probably the baseball scene."

The film begins with a few rudimentary activities, and then we see a robot with the Sisyphean task of carrying batteries up an endless conveyor belt. The arms are rigged with springs in the shoulders, elbows and wrists, enabling it to carry its cargo. However, while only the legs are being driven, the robot remains suspiciously vertical. "His torso is kept upright with a connector and spring linked to the floor," admits Marczewski. "Looking back, I should have had a visible supporting pole to the pelvis."

Next, a serving robot collects plates of sushi and drops them onto a rotating carousel – and it's all done with physics. "The only thing in this scene that is animated is the sushi bowls coming down the tube to land on the tray one at a time. It was just too tricky to get a mechanism to work that would release a sushi bowl when the tray came under the tube. It would have been possible, I think, if I had had more time."

However, the first real cheat comes in the segment where a robot uses a chainsaw to cut chunks off an endless log. As the blade passes through the log, a Boolean object cuts a slice out, and the new piece is then made dynamic so it falls off. Marczewski also uses some sleight of hand in the scene with the robots hitting and catching baseballs. First of all, despite Cinema 4D's support for Aerodynamics and Wind objects, the balls aren't really being held aloft in the tube by air pressure. "I had to animate this bit," he concedes, "purely because the ball has to travel accurately and directly into the catcher's hand."


The striking of each ball was also animated manually in order to precisely guide the ball into the catcher's hand. Once the ball is caught, a keyframe is used to drop the ball back onto the track at the right moment. "The motion of the catcher's body is keyframed," acknowledges Marczewski, "but everything else in the scene is dynamic: the balls on the runs, the mechanism for releasing a new ball, the little see-saw that allows an extra ball through each time, and the swinging motion of the batter. The scene was quite fiddly because of the ball run – it was a chain reaction, with each part relying on everything else working."

Dynamics was largely responsible for the robots' natural-looking movement – with subtle jiggles and rebounds. "I used springs with the connectors. I just had to fiddle with the rest angle, strength and damping." To get the dynamics to work properly, Marczewski also used the same real-world scale for each setup, as if each was a miniature, desktop-sized scene. Then, for his settings, he used a value of around 15 Steps Per Frame, with Maximum Solver Iterations slightly higher.
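Marczewski's recipe – springs tuned by rest position, strength and damping, solved at around 15 Steps Per Frame – maps onto a standard damped-spring model in which Steps Per Frame controls how finely each frame is subdivided. A minimal stand-alone sketch of that physics (plain Python with illustrative names and values, not Cinema 4D's solver or API):

```python
def settle_spring(strength=50.0, damping=4.0, rest=0.0,
                  x0=1.0, frames=24, steps_per_frame=15, fps=30):
    """Integrate a damped spring with substeps and return its
    final offset from the rest position."""
    x, v = x0, 0.0
    dt = 1.0 / (fps * steps_per_frame)       # one solver substep
    for _ in range(frames * steps_per_frame):
        force = -strength * (x - rest) - damping * v
        v += force * dt                       # semi-implicit Euler
        x += v * dt
    return x - rest

# Damping makes the jiggle die away over time.
print(settle_spring(frames=1), settle_spring(frames=96))
```

More substeps shrink dt, which is what keeps stiff, strongly damped springs from blowing up numerically – the same reason raising Steps Per Frame tends to stabilize a jittery simulation.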

The automatons' recurring duties are interrupted when the driving mechanisms inexplicably grow faster and faster. Unable to cope, the cycles are broken, often with gruesome consequences: limbs are removed, torsos splintered, and heads separated from bodies. Marczewski explains that the little spurts of robot 'blood' were produced using an emitter to generate tiny dynamic cubes. "Sometimes I needed to use a Cloner as an emitter, simply by keyframing the number of the clones, and the Dynamic tag on the cubes had an Initial Velocity to create the spurt."
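The 'blood' spurt boils down to cubes launched with an initial velocity and then left to gravity. As a plain ballistic sketch (the function name and values here are hypothetical, not the actual Dynamics tag):

```python
def spurt_position(frame, v0=(2.0, 4.0), g=9.81, fps=30):
    """Path of one emitted 'blood' cube: launched with an initial
    velocity (the Initial Velocity idea) and pulled down by gravity.
    Returns the (x, y) position at the given frame."""
    t = frame / fps
    return (v0[0] * t, v0[1] * t - 0.5 * g * t * t)

# The cube arcs upward, then falls back below its start height.
print(spurt_position(0), spurt_position(15), spurt_position(45))
```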

One of the trickiest sections was with the pickaxe-swinging robot, which dislodges a chunk of mineral that flies out and hits his colleague in the head. The dizzied robot spins around and ends up on the rock just before the next, fatal blow. "This was a bit of a nightmare to do entirely with dynamics," the artist admits. "I couldn't get the second robot to land on the rock how I wanted. So, in the end, I used a cut in the edit and created two versions, which gave me a lot more control of the situation. In the film, the edit is pretty seamless – hopefully!"

In terms of modelling and rendering, the bodies, arms and legs of each robot were UV-mapped (under duress: Marczewski hates doing UVs and texturing), and then a texture was created in Photoshop to add subtle details and provide worn edges. Each scene was then lit with a traditional three-light setup: key, fill and back lights. Some setups – such as the baseball scene – proved trickier because they had many points of interest to focus on; in those instances, a few smaller light sources were used to add specular details. The scenes were then rendered using Solid Angle's Arnold renderer, with motion blur applied at render time rather than in post. "I find the results much more effective."

For Cinema 4D users inspired to have a go with dynamics themselves, Marczewski suggests they "just play around and keep going! If you enjoy it, you'll get a lot of satisfaction from problem-solving and figuring out how to get objects to act realistically."

"A lot of beginner dynamics videos I see always look too slippery to me," he continues, "so I would advise paying attention to the 'Friction' option in the Dynamics Body tag – I normally have it set quite high, between 60 and 80%. Also, while I'm talking about the Dynamics tag, try to use 'Box' as the shape as much as possible. It's limiting with complex shapes, but in my experience it gives the best results, and it keeps your playback pretty quick. I never use the 'Moving Mesh' shape unless I really need to!"

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Ghost In The Shell (10 Jul 2017)
https://maxon.net/en-us/industries/movies-vfx/ghost-in-the-shell/
Territory Studio Creates Futuristic Cityscape for Ghost in the Shell Using Cinema 4D

Stopping Abuse Before it Happens (27 Jun 2017)
Through films, videos and other media, the Hidden Tears Project aims to raise awareness about youth prostitution and trafficking, domestic abuse and more.

When Jason and Jordan founded the Hidden Tears Project in 2015, their goal was to create high-quality content to raise awareness about issues such as gender inequality, human trafficking and domestic abuse. Working in collaboration with non-profits, experts on women’s rights and other issues, law enforcement, child advocates, psychologists, filmmakers, writers and many others, they have created films, videos and virtual reality productions that make clear the harsh reality experienced by too many women and girls.
Recently, Hidden Tears finished work on No Porn for Kids, a short educational video that aims to help children and their parents understand the need to talk openly about pornography, which is easily accessed using cell phones but difficult for young people to make sense of. Made using Cinema 4D, After Effects and motion capture, No Porn for Kids was commissioned by Culture Reframed, an organization founded by scholar/activist Gail Dines to “address pornography as the public health crisis of the digital age.”
In the video, young boys express regret that their dads never talked to them about sex, girls and how girls want to be treated. “Instead, I learned all of that from porn, but it wasn’t the truth,” the narrator says as a group of boys encircles a terrified and sobbing girl, some of them realizing the horror of what’s happening. The message: fathers and sons must talk about how girls are friends, sisters, wives and mothers to be cherished, respected and loved.
I asked Jason, Jordan and Jason Deparis, the freelancer who served as writer, producer, animator and director, to explain the making of No Porn for Kids, as well as how they hope their work can make a difference. Here is what they said:
Jason and Jordan, talk a little bit about your backgrounds and why you founded the Hidden Tears Project.
Jason: I founded Green Dog Films and have been making films for a long time. I directed a short anti-trafficking documentary called Until They All Come Home, starring Mira Sorvino, as well as The Submarine Kid, a horror movie called Avenged, and other features. For me, Hidden Tears was really born out of the incredible shock of learning, through the trafficking documentary, that this was such a huge problem in the United States. I decided to use the skills and relationships that I’ve developed over the years to create high-level content that can move the needle on the issue in a way that engages viewers to actively get involved.
Jordan: I am a dancer, actress, producer and choreographer for film, television, music videos and live concerts. I started my journey to end human trafficking and sexual assault when the girls were taken by Boko Haram. After doing our work individually on the issue, we co-founded the Hidden Tears Project to tell innovative and real stories about human trafficking, gender inequality and sexual assault. We gathered a team of writers and producers from House of Cards, Empire, Narcos, Sherlock and more, and gave them a three-hour presentation at the LAPD anti-trafficking unit downtown.

Talk about some of the other work Hidden Tears has done so far.
Jason: We’ve made two short films. Unseen Dances is about human trafficking; Jordan was co-director and choreographer, and it was part of the Red Sand Project. Tanya is about child trafficking; it was written by Sam Forman of House of Cards and directed by Monica Raymond from Chicago Fire. We work with some really talented people, but we’re very grassroots. We are continuing to fundraise project by project as we grow.
Jordan: Right now we’re developing a series with a unit of San Diego’s special forces, detectives and human rights attorneys that focuses on rescuing girls – and some boys too – from gangs and pimps. We’re also doing quite a bit of virtual reality work for different companies. We have a narrative series that we’re doing in VR. It’s bilingual, and it’s kind of like True Detective because it’s about two detectives trying to rescue kids. We’re also working closely with an appointee of the Obama administration who is working in the private sector on assault and human trafficking issues.

What was Culture Reframed’s concept for the No Porn for Kids video?
Jordan: Gail Dines speaks a lot about trafficking, which is very connected to the commercial sex industry. A lot of girls in porn are also victims of trafficking, and porn is getting more violent, so what used to be a small group of people’s preference has become normalized. This is what young people are seeing. Culture Reframed wanted to make something that could help stop the cycle of abuse.

Jason Deparis, can you talk about your role as director and animator?
Jason D.: I am a television producer, animator and director, and I’m currently in the process of starting Infiniverse VR, a socially minded virtual reality production company. Not that long ago, I developed and produced a human trafficking documentary for MSNBC, and being on the front lines of that changed my life in ways that I can’t go back from. People think trafficking is not happening in our country, but it is, and I want to do something to help. We got the original concept from Culture Reframed, and I used that to develop a concept that was eventually accepted after it was shown to focus groups and we made changes. It was a very low budget, but all of us believe in this cause and want to make a difference, so we did our best.

How would you describe the look of the video?
Jason D.: They wanted a kind of 2D, hand-drawn look. I figured C4D would be a great platform for that. What we tried to do visually was have something dark, but not so dark that it would make people feel hopeless. There is hope, and we wanted to show that. It’s sensitive subject matter because it involves pre-teens, and this video will be shown during talks with parents.

How did you handle the motion capture on a tight budget?
Jason D.: We got a script approved and then we had child actors come in and do their parts, but we could only do that one time. I had to perform a lot of the motion capture we needed myself, especially the facial movements for everybody. The animation was done in two passes: first the motion capture and cleanup, then additional animation on top of the motion capture. To do that, I brought the motion capture into C4D and animated what was needed.

How long did this take, and what did you find most difficult?
Jason: We worked on it for almost a year on and off. It was definitely a learning process for us.
Jordan: It’s hard when you’re working with clients who don’t understand why something they want, like having a character take a bite out of an apple, will take three days to animate. They’re not upset; they just don’t understand it, so a lot of explaining needs to happen. Also, we are experienced filmmakers, but animation is a whole other world for us and we’re learning. We set up a rigorous approval process and that helped.

How can people help support your work?
Jordan: People can follow Hidden Tears Project on Facebook, Twitter and Instagram. We have a blog, vlog and podcast that they can check out, as we are always adding new content. They can tell their friends and spread the word so that this atrocity ends. We welcome partners and financial contributions so that we can continue to reach more people and support our partner non-profits. Join us in the fight to end trafficking, gender inequality and sexual assault.
We are working on creating a constant flow of innovative content to inspire and educate people around the US about how they can actually effect change in their communities, with local law enforcement, advocates, churches and others. By donating any amount that you can to Hidden Tears, you are directly influencing thousands of people who will be engaged to act across the country.
Click here to support the Hidden Tears Project.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

When Two Monkeys See the Light (27 Mar 2017)
https://maxon.net/en-us/industries/animated-films/shine/
In their thesis work ‘Shine’, students from the Filmakademie Baden-Württemberg create a charming depiction of the courtship between very special primates.

SPOV Helps Doctor Strange Live up to his Name (09 Mar 2017)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/spov-helps-doctor-strange-live-up-to-his-name/
The latest Marvel Studios movie required convincing medical elements – made in Cinema 4D – to sell its lead character's expertise.

While the bulk of the movie is packed with kaleidoscopic visual effects, the beginning of the story remains grounded in reality, with segments showing the good doctor at work and, after the car crash, being operated on himself. For these sequences, the production required some authentic-looking medical displays showing Doctor Strange's groundbreaking neurological work, plus a montage in which we see other surgeons trying to rebuild his broken hands.

The task was handed to SPOV, a design studio in central London whose portfolio includes creative development, user interface and motion graphics work for Call of Duty: Advanced Warfare, Titanfall 2 and Tom Clancy's The Division, among others.

We spoke with Allen Leitch, founder and creative director, and senior designer Adam Roche. Leitch explains how, after SPOV's success with videogame cinematics, the studio started chasing movie work and quickly landed a job on Mission Impossible: Rogue Nation. "We made a few friends on Mission Impossible, and one of those friends, art director Alan Payne, called us and said, 'There's a requirement on Doctor Strange'."

That requirement was a series of medical displays showing images of Strange operating on a patient with brain lesions as well as X-rays and scans of the doctor's own fractured hands. After creating some basic 'block-outs' in Illustrator, the team then began a dialogue with the client to highlight the right hierarchy and pinpoint key elements.

"Because you're in the movies, it's difficult for us to know what's important," says Leitch. "If it was a real-life product for a medical client they would dictate what's important for a doctor to have feedback on. In the blink of an eye you should look at a screen and be able to pull out the information that you need. In the movies, it's 'what does the audience need?'"

The team's initial port of call was various ready-made anatomical models but these proved to be less than ideal. "All the off-the-shelf medical products we were looking at as well as the ones the client was pointing us to were all really clunky when you got close to them – and we needed to get very close," says Leitch.

The models were low-poly and either too smooth or with very low levels of detail, making them not very convincing. The team had little choice but to up-res the meshes, adding detail with Cinema 4D's sculpting toolset. "It took a lot of just very subtle sculpting from Adam and Julio [Dean, SPOV's technical director, who is based in Barcelona]. We've used things like Mudbox and ZBrush in the past but in this case the core Cinema 4D sculpting tools did the job."

"We just worked on top of the mesh," adds Roche. "We didn't bother to retopologize it because we always knew it was going to be in quite a stylized treatment. So for this, it didn't matter too much. Actually, some of the lower fidelity of the mesh kind of worked in our favor to look like an actual scan."

To create the traditional X-ray appearance, the team employed an inverted Fresnel shader in the material's Transparency channel, which instantly gives objects that translucent, ephemeral look. When combined with multiple layers and objects, the effect is thoroughly convincing.
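The inverted Fresnel trick can be sketched as a simple view-angle falloff: flip the usual Fresnel curve so surfaces facing the camera go transparent while grazing edges stay visible. A hypothetical stand-alone illustration (not Cinema 4D shader code):

```python
import math

def xray_transparency(view_angle_deg):
    """Inverted-Fresnel-style falloff: surfaces facing the camera
    head-on (0 degrees) are most transparent, while grazing edges
    (near 90 degrees) stay visible, giving the translucent X-ray
    rim look."""
    return abs(math.cos(math.radians(view_angle_deg)))

# Head-on surfaces are far more transparent than edge-on ones.
print(xray_transparency(0), xray_transparency(85))
```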

Another crucial element of the sequences is the effect of revealing layers of skin and bone, much like a CT scan, which was achieved using a combination of Boolean operations and Cinema 4D's Proximal shader. This lets you apply an effect – such as localized color or an alpha region – controlled by the proximity of another object or null. As the distance between the two is reduced, the effect becomes more pronounced.

"We rendered black-and-white with the Proximal shader," says Roche, "then used it as a matte in After Effects. It was a combination of Booleans and the Proximal shader. We were able to cut away but also bring back some of the transparency in the other layers."

"Proximal's a lot more subtle than hard Boolean," adds Leitch. "I think that's why we employed it here. It's also how you project it onto the mesh as well, the angle of attack. Part of the trick here was to match where Doctor Strange puts his probe – we had to match the same angle."
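The Proximal behaviour described above – an effect that strengthens as the controlling object approaches – reduces to a distance falloff. A minimal sketch (the function name and linear falloff are illustrative, not Cinema 4D's shader internals):

```python
def proximal_weight(dist, radius):
    """Proximal-style falloff: full effect at distance zero,
    fading linearly to nothing at `radius`."""
    if radius <= 0:
        return 0.0
    return max(0.0, 1.0 - dist / radius)

# As the probe approaches (distance shrinks), the effect strengthens.
print(proximal_weight(0.0, 5.0), proximal_weight(2.5, 5.0), proximal_weight(7.0, 5.0))
```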

Another option for the cutaway shots was to use the Cinema 4D camera's Near and Far Clipping planes. Accessed from the Camera object's Details tab, clipping simply tells the camera to render only the objects or parts of objects that lie within that region. And by animating the clipping planes you can cut into an object, revealing the elements within, much like a Boolean operation. All these techniques were used to generate various layers, which were then composited and softened in After Effects.
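Clipping amounts to a per-point depth test: only geometry between the two planes is drawn, so animating the near plane sweeps a cutaway through the model. A hypothetical sketch of that test (Cinema 4D exposes the values on the camera's Details tab; the function here is purely illustrative):

```python
def is_rendered(depth, near, far):
    """Camera-clipping sketch: a point is drawn only if its depth
    from the camera lies between the near and far planes."""
    return near <= depth <= far

# Pushing the near plane past an object (depth 5) culls it.
print(is_rendered(5.0, 0.0, 10.0), is_rendered(5.0, 6.0, 10.0))
```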

Other tools used include Cinema 4D's Hair tools – although not in any conventional way, as Roche explains. "We used it on splines as a very efficient way to render them, as you would with a Sweep, for example. Instead of using a shader on it, we'd normally use a Hair material. But you can also render out a points pass on the geometry, which also gives a nice point cloud texture."

In the latter case it's simply a matter of adding a Hair shader to an object that has a transparent material. With a nice bright shade in the Color channel, and the length set to a very low percentage, you can replicate a cool point cloud effect.

Insydium's X-Particles was also employed, says Leitch, but in this instance it was used to help create the arteries. The arteries were drawn as splines, and xpSplineMesher was then used to mesh them into a form that could be rendered.

Another unusual inclusion was MoGraph, which the SPOV team is starting to use to render interface elements. "Whereas previously everything would be done in Illustrator and After Effects, we're now rendering a lot of UI elements straight out of Cinema 4D," says Leitch, saying that it's an idea they picked up from the UI work done by Bradley Munkowitz on Oblivion. "It was a bit of an eye-opener for me," he continues. "It made me think there's more to the MoGraph toolset than I'd previously thought. If you start rendering MoGraph stuff in an orthographic view, it really adds an element to our UI work that previously we weren't really employing it for."

SPOV's animations were combined with real X-rays provided by the production and it's a testament to their work that it's almost impossible to tell what's real and what's CG. But the project wasn't without its challenges: "For me, it was the modeling," says Leitch. "Adding the detail. There are a lot of tools out there in the world to cut corners and make things easy. But to get the believability here, you had to do the work. You had to put the time in. I think that is what sets it apart and gives it the believability or authenticity that is worthy of a film like Doctor Strange."

Leitch admits to being a big fan of Cinema 4D. "It goes back to us being a design company," he explains. "We're not a VFX company; we're not really a CG company. We tend to work with talented designers and animators; we don't tend to work with CG specialists. We don't bring in a lighting designer. We don't bring in texture artists. Basically, what usually happens in here is somebody is given a shot, saying, 'That's your shot, and you need to generate pretty much everything in that shot – the UI elements and the CG elements.' For the most part, at SPOV, people have a shot or a sequence – and it's theirs. You complete it from start to finish. Cinema 4D allows people who are not specialist CG artists to generate quality 3D content."

The SPOV team would like to give a shout out to art director Alan Payne, production designer Charlie Wood, and the on-set supervision team at Compuhire.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Marvel Studios 2016.

SPOV Website:www.spov.tv ]]>news-6024Tue, 14 Feb 2017 14:08:23 +0100The Last Flight of Astronaut #909With its animated short ‘909 Depart’, Uber Eck let their creativity roam free with Cinema 4D for their most elaborate project yet.38865
Originally, this project started as a test project for the Uber Eck team (Tobias Alt, Niklaus Hofer and Sebastian Schmidt). Whenever they wanted to test new tools or functions in Cinema 4D, ‘909 Depart’ was used as a testing ground. The Uber Eck team also wanted to push their creative limits with this project and prove that they can do much more than just shade, light and render nice 3D objects. ‘909 Depart’ was a conscious departure from their conventional everyday work, which generally had to be completed within a tight deadline. In the end, the trio worked on ‘909 Depart’ over five months in-between client projects.
After the project’s concept had been finalized, the team first created an animatic with a large space station frame and conceptual camera moves. One of the team members continually fine-tuned the camera animation and the remaining team members modeled the rest of the space station.
Uber Eck created the majority of the models in Cinema 4D and took advantage of the speedy workflow the Polygon Pen tool offered, as Niklaus explains: “The Polygon Pen makes polygon modeling much faster and even makes it possible to spontaneously design new objects while you’re working.” Since this tool combines so many important modeling functions, the designers were able to work fluidly and comfortably, without having to constantly switch to a different tool. The team quickly got used to functions such as welding points, edges and polygons and sorely missed these when later working with other 3D applications.
The team used the Take System in Cinema 4D to render the scene from multiple camera angles, each of which showed a different shot of the space station. This made it possible to quickly create preview renderings for fine-tuning the scenes.
This project remained challenging to the end: The film’s premiere, along with its comic rendition by Martin Hager, took place at the Science&Fiction Festival 2016 in Munich, Germany. In order to finish on time, the team made a very precise calculation of the time needed to render the project and were rewarded for their efforts with just four hours to spare before the film was to premiere – which is also when the Uber Eck team saw the final film for the very first time!
Uber Eck website: http://ubereck.com/
Music by http://www.brothersfour.com/
Audio design by http://www.reddawn.audio/

3D Animation and VFX Workflows for “Doctor Strange” (17 Jan 2017)
https://maxon.net/en-us/industries/movies-vfx/doctor-strange/
Leading design studios Sarofsky, Perception and SPOV credit Cinema 4D for inspiring the creation of main-on-end titles, previz for main assets, medical animations and the new Marvel Studios logo.

Immerse Yourself in Galactic Beauty (12 Jan 2017)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/immerse-yourself-in-galactic-beauty/
Motion Designer Takayuki Sato used Cinema 4D to create wonderful worlds filled with animals, plants, waterfalls, shooting stars and numerous other beautiful elements!
Even though it was not planned as a sequel, the first sketches for Beyond the Moment of Beauty showed numerous parallels to The Moment of Beauty. Takayuki Sato remembers that so much of what was being sketched seemed so familiar that he decided to revive the original concept from 2014 with a new view of the world.
It wasn’t easy creating a filigree world that was constantly changing, growing and transforming. “In this work, I wanted to apply the new skills I had attained over the past few years. The challenging part was putting the images in my mind’s eye on film”, says Sato.
In 2015, Takayuki Sato had already tried to realize his vision as a film and although he was able to create a style frame of his vision, the underlying story distracted far too much from the imagery – which was more important for Takayuki Sato. He made a second attempt but was again not able to create the animations exactly as he wanted and ended up not completing this venture either.
“After the project had been restarted twice from the ground up I was finally able to create the imagery and animations that I had imagined and to the standards that I wanted”, explains Takayuki.
“Cinema 4D played the key role in the project’s realization. The floors, stones, plants, birds, the waterfall, the planets and the stellar fog – and much more: with the exception of the characters and several effects, everything was created in and rendered with Cinema 4D. The compositing was done in After Effects”, remembers Takayuki Sato. “I worked with Release 18 and I was particularly impressed with the new Parallax shader, which I used for the floor. I also applied displacement to several materials in the scene and got great results!”
“I used the new Thinfilm shader, which was added to Release 18, to create the crystals’ material”, remembers Takayuki, “and I am very happy with the result!”
“I created many of the effects using XParticles and Turbulence FD. XParticles itself is a very powerful tool for creating large numbers of images and impressions, which is why I used it for many of the scenes”, says Takayuki. “The connectivity between Cinema 4D and XParticles, Turbulence FD and Krakatoa made it possible for me to finally realize my visions. This project was also an excellent opportunity to get to know these tools.”
Takayuki Sato’s website: http://otas.tv/
Beyond the Moment of Beauty - Into the Galaxy: http://otas.tv/work/btmob/

Animated Morgellons and Nanobots (19 Dec 2016)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/animated-morgellons-and-nanobots/
Visualizations about DNA can often appear quite chaotic to the untrained eye. This controlled chaos is often made more spectacular using special effects created in Cinema 4D.

First, the filmmaker gives the audience insights into Morgan’s creation, including shots of cell manipulation at a microscopic level. In order to meet the expectations of both the director and the production designer while maintaining the required realistic look, MadMicrobe worked with an animation studio that specializes in creating precisely these types of effects. Co-founder Joe Dubin explains how the shots were created using Cinema 4D, X-Particles and After Effects:

“The first scene that we wanted to create involved embryonic cells into which modified genetic material is injected using a micro pipette. This key sequence in the creation process basically shows Morgan’s inception. She can be seen on a computer monitor, which had to have the typical look of electron microscopes. We were asked to create a sequence that shows the modified material flowing into the cells, as if they were morgellons. We created the part where the pipette presses through the cell wall, which then returns to its original shape after being penetrated, using Soft Body elements with Dynamics in an FFD and a Deformer in Cinema 4D. The injected DNA had to resemble morgellons, which we realized using X-Particles that we applied like a turbulence effect. We rendered the scene using the Physical Renderer in Cinema 4D, which in my opinion delivers much better results much faster than many people may think!”

In another scene, MadMicrobe had to show how a nanobot infiltrates Morgan’s neural network. The scene shows the nanobot hovering in a weave of neurons and then settling onto a knot, which it then penetrates. “It was not immediately clear how this scene should look. There were only rough concepts for the bot design and how it should penetrate the knot, which made it possible for us to use our own ideas and imagination. Since we only had about two weeks to create the shots, including style frames and test renders, we brought on Jon Bosley, a Cinema 4D artist based in the UK, who helped us with lighting, texturing and particle effects.”

“We used a simple polygon model to create the nanobots and distributed the antennae across their surface using MoGraph. Even though the final scene had to have the monochrome look of an electron microscope, the director insisted that the nanobots be colored orange. After the bot concept had been approved we created a high-res model and added abrasions and wear to the models using Cinema 4D’s Sculpt tools,” Joe explains.

After completing this rather spectacular film project, MadMicrobe returned to its core business of creating medical and scientific animations. “Of course this is what we specialize in, but we’d love to use our skills and expertise for other types of projects in the future!” concludes Joe.

Spotlight on Raoul Marks (November 29, 2016)
https://maxon.net/en-us/industries/movies-vfx/raoul-marks/
Interview with the Emmy-awarded motion designer and Cinema 4D artist

Making 3D Films – a One-Man Show (November 1, 2016)
Create a short film by yourself for your thesis in just eleven months? No problem for Shawn Wang with Cinema 4D as his tool of choice!

The fact that Shawn was – and still is – an enthusiastic Cinema 4D user and wanted to use this opportunity to expand his skills with the software helped shape his concept: he wanted to portray two robots looking for life on a distant planet. Friendship was the motivating factor for the characters, and an asteroid storm was going to put this friendship to the test.

Shawn designed both robots as vehicles with added features that give each a unique personality. “I chose this solution because I was planning on completing the entire film by myself. The vehicles are driven by caterpillar tracks and wheels, each of which was given a special rig. These are in fact two separate systems that control the rig. The first system is based on the Constraint tag’s Clamp function, which lets the vehicle react to the underlying terrain. The tracks and wheels always remained in contact with the ground and rolled when the character was moved. The second system was created using XPresso, which I used to execute a series of mathematical calculations that restrict the tracks and wheels so they only rotate while touching the ground. Both systems together make up the vehicles’ control system, and a series of additional XPresso Expressions and several Python scripts were used to create the actual rig.”
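Shawn's XPresso graph isn't shown, but the core of such a rolling restriction is simple arithmetic: spin advances by distance traveled divided by wheel radius, and only while the wheel is grounded. A minimal sketch of that idea (the function name and structure are illustrative, not Shawn's actual rig):

```python
import math

def wheel_angle(prev_angle, travel, radius, grounded):
    """Advance wheel spin (radians) by arc length, but only while the wheel touches the ground."""
    if not grounded:
        return prev_angle  # wheel in the air: no forced rotation
    return prev_angle + travel / radius

# Moving a radius-1 wheel forward by 2*pi units produces exactly one full revolution;
# lifting it off the ground freezes the spin.
```

In an XPresso-style setup, the same expression would be evaluated per frame, with the ground test driven by the terrain-clamping constraint.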

Another challenge for Shawn was the high number of assets he needed to populate his scenes. He created a custom library in the Content Browser for his assets, which made it possible for him to quickly and easily access all the rocks, asteroids, materials and other elements he needed. All he had to do was drag and drop them from the Content Browser into his scene.

Shawn used the Octane renderer, which offered render times of 4 to 15 minutes per frame. He primarily rendered overnight and had finished sequences waiting for him the next morning that were ready for compositing. Rendering took a total of about four months to complete.

Shawn worked on the project for eleven months and it was well worth the effort: his film has won more than half a dozen prizes at various international short film festivals and was chosen as a Staff Pick on Vimeo, where it has received more than 140,000 views. Meeting and mastering the challenges that this project offered paid off for Shawn: his Cinema 4D skill level increased dramatically. “In the end,” he explains, “it’s the part of the iceberg that you don’t see right away that really pays off: the tools and usability concept that never let you down when you’re working on a highly complex project!”
Rig demonstration: https://vimeo.com/147696468
https://vimeo.com/137055688

Ugly: A Single Word Sums it Up (September 8, 2016)
https://maxon.net/en-us/industries/animated-films/ugly/
A short film about dynamic simulations with characters so ugly that they inspired this project’s name. This is a project that explores new horizons – and uses Cinema 4D to do it!

Monitoring Spectre (February 24, 2016)
https://maxon.net/en-us/industries/movies-vfx/spectre/
How Vincent London used Cinema 4D to deliver a host of displays and infographics for the latest Bond movie

Bringing the Martian to Life (January 11, 2016)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/bringing-the-martian-to-life/
Discover how Territory Studio used Cinema 4D to create convincing on-set technology for Ridley Scott's Golden Globe-awarded sci-fi thriller.
By Duncan Evans

Creating on-screen graphics for actors to interact with and relate to is a specialty of London-based Territory Studio. However, when those screens have to be scientifically accurate or coherent in terms of existing NASA technology and practices, it becomes a whole different ballgame. That was the challenge facing David Sheldon-Hicks, founder and creative director of Territory Studio. Fortunately, having already worked on massive sci-fi projects like Jupiter Ascending and with Ridley Scott on Prometheus, David had a good idea of what was going to be required and how Cinema 4D could help make it happen.

In the seven months it took to create The Martian, five artists at Territory created around 400 screens across eight different sets; Mission Control alone featured almost 100 screens. The displays and animations were all played back live on set for the actors to engage with, and most of them were interactive as well.

The film was based on Andy Weir's best-selling novel about stranded astronaut Mark Watney. It's set 20 years in the future and tells the story of NASA's third manned mission to Mars. The key concept was that the plot, and thus the on-set screen displays, was grounded in real science rather than fanciful popcorn sci-fi. This meant that Territory had to research the science being used before getting started with any of the designs.

David explains what was involved: "It was an incredible amount of research, from poring over reference photos of Mission Control and JPL at NASA provided by Dave Lavery, Program Executive for Solar System Exploration at NASA, to vehicle schematics, and the Pathfinder transmission specs. Plus, how NASA received and decoded messages. And, knowing that NASA is always one step ahead of current technology, we had to imagine ways to represent things, thinking about technologies they are testing now or have not even started developing. We also needed to understand what the on-screen data feeds meant and why they were necessary and when they came into play. Sometimes it felt like a crash course in rocket science and space exploration – it was definitely interesting but it also made the work more challenging to try and get our heads around."

Art Director Marti Romances pointed out how everyone was on a learning curve: "I have to say I learned a lot about things and how they work that I hadn’t known before this project. For instance, knowing what material the camera is pointing to by shooting a laser at it and analyzing the light reaction from it. Learning how all those things work to be as authentic as we could was a big challenge."

Peter Eszenyi, head of 3D, explained what he had to do: "My role was to design and deliver 3D plates, renders and animations to be used in the on-set screens. Initially, the brief was to develop a look and create animations that show the location of the HAB on Mars in different lighting conditions and states. One of the crucial moments in the film is when NASA figures out that Watney is alive by looking at satellite images of the area and realizing that the solar panels have been cleaned. This initial brief was extended to show the giant storm around the HAB, the rover driving through various regions on Mars, creating the pictures that the Pathfinder beamed back from the surface of Mars, creating various 3D elements for the screens such as launch simulations, potatoes, soil samples and so on."

The team used Sketch and Toon plus the standard cel renderer for some of the wireframe meshes, and they used X-Particles with the standard renderer to render the more particle-based assets like the potato displays, soil sample screens and the dust storm.

One of the more challenging areas was the actual NASA Mission Control set because of the need for realism but in a slightly futuristic setting. David Sheldon-Hicks explained how Territory approached it: "With a brief to remain authentic to current data conventions, we researched all the screens that Dave Lavery had forwarded. We studied what data was prioritized and when, how that was organized and depicted on screen and in the Mission Control space, and how the crew interacted with it. It included what commands were given and how that changed the data display. We also talked to NASA about how they think that will evolve over the next 20 years. Once our research had been finished, we created a visual language that was very true to the data requirements and the spirit of NASA's current Mission Control. The backgrounds we chose were black and dark blue with white fonts and light blue indicators. Red was used to highlight mission critical data and indicate warning status. The overall look of the interface is serious and authoritarian but the hierarchy of information is clearly readable to tie in with story points."

On the screens where the terrain is displayed, along with the dust storm, the look steps away from the technical, scientific style to show actual landscapes. To create these, Territory obtained a very low-resolution model of the area in Jordan that doubled for the Mars surface, using the DEM Earth plugin. However, since the available dataset was not detailed enough, they remodeled the area based on satellite and location photographs. Sub-polygon displacement was used for the mountains. Proxy models were used for the HAB, the solar panels and the Rovers, and those were swapped with the high-resolution models supplied by the production. Since the exact coordinates of the actual location as well as the shoot dates were known, it was possible to create accurate lighting setups for the relevant hours. X-Particles was used to simulate the base of the dust storm, and a smoke simulation on top of that was created with TurbulenceFD.
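The article doesn't show how those lighting setups were derived, but given a latitude and a date, the sun's elevation can be approximated with textbook formulas. A rough sketch of that calculation (using Cooper's declination approximation; this is an illustration, not Territory's actual pipeline):

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation (degrees) from latitude, date and local solar time."""
    # Cooper's approximation for the sun's declination over the year
    decl = -23.45 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = (math.radians(v) for v in (lat_deg, decl, hour_angle))
    sin_el = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_el))
```

At roughly 30°N (the latitude band of Jordan's desert), the noon sun in late June sits near 83° but only around 37° in late December, so knowing the shoot dates matters as much as knowing the coordinates.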

As the screens were live on set for the actors to interact with, last-minute changes were sometimes requested. One example was to change the size of a crater, which was initially around a kilometer or so in size, to be 100 kilometers wide. Cinema 4D made it easy to remodel the crater and when the size was approved, details could be quickly added to the base mesh. The standard renderer was used, so it was very fast to create the final frames with the changes applied.

For the interactive screens, most requests from the production were for image sequences for Territory's on-set engineering partners at Compuhire to program. They had to be programmed to display images that refreshed within a realistic time-frame, and the animations and simulations were usually reduced to one image every three seconds. Other screens were programmed to simulate typing scenes from the crew's laptops or Mission Control computers, so no matter what the actors were typing, the right message appeared on screen.
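Reducing a rendered simulation to one refreshed image every three seconds is effectively frame decimation. A quick sketch of the idea (the frame rate here is illustrative, not Compuhire's actual spec):

```python
def playback_frames(total_frames, fps=25, interval_s=3):
    """Indices of the frames kept when the on-set screen refreshes once per interval."""
    step = int(fps * interval_s)
    return list(range(0, total_frames, step))

# At 25 fps, a 12-second simulation (300 frames) collapses to just four images.
```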

David explains the workflow required to create the interface graphics for the MAV simulator that astronaut Martinez uses from the main ship (the Hermes): "It was all designed in Adobe Illustrator and featured some interactive buttons that had to be pressed on-set following a sequence. We isolated all the interactive buttons to be programmed for the on-set performance. In the middle of that screen we had to render a visualization for each stage of that remote-controlled probe. So we animated and rendered each different stage of the lift off of the MAV in a way that could be triggered on-set, connected with those buttons. The MAV 3D renders were done in Cinema 4D using a wireframe pass to achieve a more realistic look."

With all the design work being done during the day, most of the rendering work was done overnight on individual 3.5GHz, six-core Intel Macs. After seven months of intense science, Peter Eszenyi, head of 3D, summed it up with, "As always, Cinema 4D was a reliable tool that gave us all the options we needed to create the huge number of screens for Ridley Scott's The Martian."

Territory was naturally delighted when The Martian won 'Best Motion Picture Musical Or Comedy' at the 2016 Golden Globe Awards, held at the Beverly Hilton in Los Angeles, California. It was the second win of the night for the film. The Martian actor Matt Damon took home 'Best Performance by an Actor in a Motion Picture Musical or Comedy.' Ridley Scott was also nominated in the 'Best Director - Motion Picture' category. See more at: www.goldenglobes.com.

Duncan Evans is the author of Digital Mayhem: 3D Machines, from Focal Press

Mockingjay Part 2 is the fourth and final installment in the epic dystopian science fiction series The Hunger Games, and the second of two films based on the novel Mockingjay, the final book in Suzanne Collins' Hunger Games trilogy. The film follows Jennifer Lawrence, who reprises her role as Katniss Everdeen, the reluctant leader of the rebellion against the Capitol in the post-apocalyptic nation of Panem, who must bring together an army while all she holds dear hangs in the balance. The film was released on November 20, 2015 in the United States.

Hristova explains that the studio’s primary focus on Mockingjay Part 1 was generating interactive on-set monitor graphics that streamed live during shooting. For Mockingjay Part 2, the team was given a more extensive range of post-heavy challenges, including the exclusive use of Cinema 4D to create two main hologram effects and to composite hero monitor screens that carried story-specific information and animations to advance key plot sequences.

“Our aesthetic reach in this film was much broader and required us to deliver a more advanced design style as we were asked to create detailed 3D hologram content for the Capitol – marking the first time that The Hunger Games film audiences could get a close-up look at the capital city of Panem and the Capitol, its chief legislative building, as well as a collection of other Panem districts,” says Grunfeld. “We were also tasked with generating the hologram for the “Nut”, a strategic military compound located in an underground mountain bunker which houses all the military equipment for the Capitol.”

“Although Lawrence and the Lionsgate team had very clear ideas in regards to the aesthetic needs of each corresponding environment in Mockingjay Part 2, the collaboration gave us plenty of liberty to experiment with style and feel,” said Grunfeld. “To meet the technical complexities of a movie of this scope demanded that we rely on the various toolsets in our Cinema 4D pipeline which provided us the creative flexibility to explore and tweak story notes throughout the project, critical in delivering a number of beautiful storytelling graphics on some very challenging design sequences. In combination with After Effects, we were able to generate external compositing tags and compositing project files throughout all of the hologram sequences that further resulted in an efficient workflow.”

The “Nut” hologram, in particular, challenged the Cantina Creative team to deliver a fully detailed, photoreal and precise 3D mountain range with only a single 2D matte painting as reference. 3D artist Jayse Hansen, who worked with Cantina on Mockingjay Part 2 in preproduction and on location, designed the initial hologram concept, including the entire interior of the Nut (crew quarters, armory, hovercraft hangars, etc.). “Leveraging the landscape generator in Cinema 4D to block off shapes, together with the expansive sculpting tools in the software, provided great creative latitude, so I could hand off a number of very precise hologram looks and models of the mountainous terrain surrounding the ‘Nut’ and the Capitol from different perspectives,” said Hansen. These concepts were then further refined by Stephen Morton, who created a final workable version that helped the team work out positions for important story elements across several hologram sequences.

“In addition to the sculpting tools, which proved critical in allowing artists to further customize and organize the project in layers and work nondestructively to churn out dozens of iterations of mountain topography, we used Cinema 4D to render a number of passes, namely a few variations of cel renders, layered procedural shaders, and a depth map. We also exported light and null data to use for additional elements in comp,” adds Morton.

“The time effector tool was also extremely useful on the mountain hologram to rotate topography designs uniformly across nearly 30 shots in the sequence and made it easy to explore rotation speeds and successfully arrive at something that both the client and we were pleased with.”

Cantina Creative website: www.cantinacreative.com/

Merging Traditional and Modern Animation Techniques (November 9, 2015)
The team at Tiny Inventions combines traditional puppeteer techniques with modern 3D animation to create a highly unique look.

Artists in Residence

After successfully completing several projects, including highly acclaimed music videos for the band ‘They Might Be Giants’, Ru and Max applied to the Netherlands Institute for Animated Films to participate in its Artists in Residence program. They were accepted, moved from Brooklyn to Holland, and started working on their short film ‘Between Times’. The film’s story is about time in general and each individual’s subjective perception of it. They had been interested in including 3D animation in their process for some years, and they thought the visual style of ‘Between Times’ would work really well as a combination of stop motion and 3D animation.

Several years ago Ru had taken a 3D animation course during her studies, which she describes as “frightening” at the time. However, when she got to know Cinema 4D she was surprised at how intuitive the application was to use. “The user interface and icons made it easy for me to quickly and easily learn how to get things done. I didn’t have to learn hundreds of commands because the icons were basically self-explanatory!”

From Puppet to 3D Model

As with Ru and Max’s other projects, work on ‘Between Times’ began in the analog world: scenery, characters, accessories – in short, all elements required for the film – were created using clay, paint, plywood, paper and other materials. While the sceneries were in fact used later, the characters and character animation were created using different methods. After Max had photographed all characters from all sides, these images were used to create authentic-looking, animatable 3D models of the characters in Cinema 4D. The photos were also used in conjunction with projection painting in Mudbox.

The scenery was set up and lit in the studio and Max created the camera movements frame-by-frame while Ru animated the digital characters in Cinema 4D. In order to combine the elements, the scenes’ basic geometry was re-created in Cinema 4D so Ru was able to reference the entire scene when animating. Only the characters and shadows were rendered in the end.

Real Lighting Digitized

Max simulated real lighting as accurately as possible in Cinema 4D in order to integrate the animated characters into the filmed sequences as seamlessly as possible. Instead of using GI, a set of colored surfaces was arranged in the 3D scenes to simulate indirect lighting effects. “We determined that we could achieve believable-looking results even with colors that were only 90% accurate. A very crucial element was the direction and color of the shadows, which had to be correct,” remembers Max, “especially since the shadows were what linked the analog world with the digital world.” Cinema 4D’s Standard Renderer was used to render the animations, and compositing was done in After Effects.

No HDRI lighting and no special render engine were used – yet nowhere in the film can you see that this is a combination of analog and digital elements. Ru and Max’s attention to detail and the effort they put into this project have been rewarded with half a dozen prizes from various animation film festivals.

First, the appetizer: the opening sequence. Opening sequences are not only used by movies and television series to entice viewers – design conferences worldwide also commission openers from well-known motion graphics artists, who see them as an opportunity to explore their own creative and more experimental ideas. When Semi-Permanent contacted Raoul about creating an opener for the Semi-Permanent Conference 2015, he was instantly on board. “I was very excited to take a month off my usual work to tackle something new. It was a great opportunity to explore some new techniques and images I’d had floating around in my head,” remembers Raoul.

A photographic series from a 1968 issue of LIFE magazine gave Raoul the inspiration for the opening sequence: an astronaut falling through the darkness of outer space. This, together with Raoul’s interest in the movie ‘2001: A Space Odyssey’, was one of the key influences for the titles. “2001 has a stunning ability to hint at something surreal, something undefinable. I’ve always responded to that unnerving feeling in films and loved the awe it was able to evoke. I’ve found myself revisiting 2001 every few years and it always seems to offer up some new form of inspiration,” says Raoul.

“I’ve worked with Cinema 4D for a number of years now and find it a very malleable and forgiving piece of software. I had a tight timeframe of four weeks for concept work and production. Cinema 4D's versatility allowed me to work intelligently and focus on key elements of the titles.” The main spacesuit was modeled using a combination of Cinema 4D and Marvelous Designer. The high polygon count of the model meant a different approach was needed for animation: Raoul animated low-res versions of the models and transferred the movements to the higher-resolution model using the Mesh deformer. “For close-up shots I would use the simulations coming out of Marvelous Designer, and for mid-range to distant shots I would use the Mesh deformer process,” explains Raoul.
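The Mesh deformer's low-to-high-res transfer is conceptually a cage binding: each dense-mesh vertex is expressed in the coordinates of the low-res cage, then re-evaluated after the cage moves. A toy sketch of that idea for a single cage triangle (real deformers handle whole meshes and off-surface offsets; the function names are illustrative):

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def bind(point, tri):
    """Barycentric weights of `point` with respect to triangle `tri` (point in the triangle's plane)."""
    a, b, c = tri
    v0, v1, v2 = sub(b, a), sub(c, a), sub(point, a)
    d00, d01, d11 = dot(v0, v0), dot(v0, v1), dot(v1, v1)
    d20, d21 = dot(v2, v0), dot(v2, v1)
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return (1.0 - v - w, v, w)

def deform(weights, tri):
    """Re-evaluate the bound point against the moved cage triangle."""
    u, v, w = weights
    a, b, c = tri
    return tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3))
```

Binding is done once against the rest pose; afterwards, animating only the light cage drags every bound high-res vertex along for free, which is what makes the low-res workflow fast.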

The astronaut’s free fall and the various elements drifting around him in space posed a series of problems for which Raoul had to find solutions. “I relied heavily on source photography from the Apollo era to reproduce the quality of light and grading reminiscent of that period. I sourced some very high-resolution height maps of Mars to help create the rocky surfaces in many of the shots,” says Raoul.

Steam effects for gasses and clouds of dust can be seen throughout Raoul’s film. These effects were created using Cinema 4D and the TurbulenceFD plugin, which is specially designed for such effects. Raoul also focused on finer details such as the uneven structures on asteroids: “In particular I had a lot of light and shadows shifting over very complex sub-polygon displaced surfaces. As the sun shifted over the cratered surfaces of the face I wanted realistic GI light to bounce off all the small details and create a believable lunar surface.”

For the final shots in the sequence, camera projection is used to create 3D shots from still photography. Raoul used this same technique in his work on the opening titles for ‘True Detective’, where Cinema 4D was used to create impressive visuals from photos by Richard Misrach, amongst others. “We really wanted to use these images in the titles but we wanted to push them beyond being purely stills. I did some research into the camera projection technique. This allowed us to build some low-resolution geometry and then project Misrach’s shots onto it in 3D space. We could then move a camera through that environment, recreating an equivalent to a crane-style shot. There’s a little work involved with painting in the occluded surfaces so the camera doesn’t see double images, but it’s an extremely effective way to add life to a still image.”
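Under the hood, camera projection boils down to giving each vertex of the proxy geometry the texture coordinate at which the projector camera sees it; once bound, any new camera path reveals the photo draped over the 3D surfaces. A minimal pinhole-camera sketch of that mapping (the focal length and image size are illustrative):

```python
def pinhole_uv(point, focal=35.0, width=1920.0, height=1080.0):
    """Normalized UV at which a pinhole camera at the origin (looking down -Z) sees a 3D point."""
    x, y, z = point  # z must be negative (point in front of the camera)
    u = width / 2.0 + focal * x / -z
    v = height / 2.0 - focal * y / -z  # image v runs downward
    return (u / width, v / height)

# A point on the optical axis always lands at the image center (0.5, 0.5),
# regardless of its distance from the camera.
```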

The images in the final sequence of the Semi-Permanent titles were supplied by a colleague of Raoul's who had just returned from a vacation in Iceland. Cinema 4D's Camera Calibrator feature was used to help create the shots. “I used to just do the projections by eye, which took a lot of patience and guesswork, but since the introduction of the Camera Calibrator it’s super simple to set up this type of shot.”

After the modeling and animation had been completed, the opener had to be rendered. “The usual trade-off with render engines has been between speed and lighting accuracy. Unbiased render engines would give you beautiful photorealistic results but would take hours to render, while fast renders could be achieved with the biased renderers but often couldn’t quite capture light in the way I wanted. Octane for Cinema has found a way through that conundrum and given us the best of both worlds. By tapping into the relatively unused power of modern GPUs we can now render beautiful unbiased imagery at speeds much faster than the purely CPU-based solutions,” explains Raoul. “The apparently limited quality settings of an unbiased renderer are actually a complete blessing. In the past I found working with GI problematic. You’d spend a lot of time dialing in quality parameters. You’d hit render on a sequence, come back a few hours later and find a level of flickering throughout the sequence that you hadn’t noticed in the tests. You would then need to up the quality and start again, hoping you had pushed it up enough. Because Octane is unbiased there is no flickering and there are no AA settings; it’s actually a very pared-back set of controls for the render.”

When asked how Raoul finds the right settings when using the Octane renderer, he replies: “There’s really only one setting to be concerned with: samples. The image starts very grainy and resolves over time to remove that grain. So you would set one image to render to a high level of samples and after it’s reached a level of noise you’re happy with you make a note of the sample count and set the entire sequence off to render to that number.“
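Raoul's workflow of dialing in a sample count on one frame and applying it to the whole sequence works because unbiased Monte Carlo noise shrinks predictably as 1/√N: halving the acceptable noise costs four times the samples. A back-of-envelope sketch of that relationship (the per-pixel variance figure is illustrative):

```python
import math

def samples_for_noise(target_noise, pixel_sigma=1.0):
    """Samples needed for the standard error (sigma / sqrt(N)) to drop to target_noise."""
    return math.ceil((pixel_sigma / target_noise) ** 2)

# Halving the target noise level quadruples the required sample count, which is
# why judging the count once on a representative frame and reusing it is economical.
```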

The response to the Semi-Permanent opener has been overwhelming. “The titles seem to have been doing pretty well online and getting a fair bit of exposure, which is nice. Interestingly, a lot of people have been in touch about the prospects of VR in relation to the titles. There really seems to be a buzz out there for all things VR. A number of studios and software developers seem to be racing to create tools and experiences for the new medium. That excitement and enthusiasm is creating a really great atmosphere around the new possibilities in storytelling, visuals and a whole gamut of applications we haven’t even thought of yet. So I’m quite excited to start exploring what can be done out of Cinema 4D for VR.”

Creating the Dark Side of Marvel (June 11, 2015)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/creating-the-dark-side-of-marvel/
Discover how Territory Studio used Cinema 4D to help meet director Joss Whedon's desire for a gritty and darker story in 'Avengers: Age of Ultron'
By Duncan Evans

No sooner had Territory Studio, helmed by founder and Creative Director David Sheldon-Hicks, finished using Cinema 4D for the user interface graphics in the smash-hit 'Guardians of the Galaxy' than the call came from Production Designer Charles Wood and Art Director Alan Payne to get involved with Marvel's big-budget 'Avengers: Age of Ultron' project.

The task this time was to help support Joss Whedon's artistic and creative vision for the film to be darker and more challenging, exploring the humanity of each of the superheroes. To do this, David was asked to create a visual brand for each character that accurately represented their personality and role within the film.

On top of this, there was an extra requirement in that Charles Wood wanted the screen user interfaces to have a more realistic look and feel, to be futuristic, but with a modern plausibility to them. It sent the Territory team researching the realms of such topics as advanced clinical technology. Of course, it's all well and good creating something from a blank canvas, but the Iron Man persona and Tony Stark within the suit had already laid down a template for the kind of graphics the audience would expect. David explained that Territory wasn't as constrained as you might imagine, "Our task was to approach the UI from a fresh perspective that supported Joss' vision for this film. So, while we worked with established color palettes for Stark and Banner, we were free to create a new look for the UI. For Tony Stark we researched state-of-the-art architectural engineering, avionics and military technology, and crafted a series of screens for Stark Lab that worked together to reflect a more rounded character. Similarly, we researched cellular plant biology and biotechnology and fed that into Banner's UI. Again, the idea was to add a visually rich layer of imagery and animation to the background of Banner's lab that supports his role as biologist. Finally, for Dr. Cho, a newly introduced character with a strong Marvel history, we crafted a UI that reflected her interests in biotechnology innovations and incorporated near-future technology references that we extrapolated from current advances in clinical applications for 3D printed organic matter."

The process of creation flowed both ways because the initial conversations about the designs took place before the script had been finalized. That meant Territory was able to advise on how best to use the UI to illustrate certain narrative points. From there the team refined the creative vision for the visual language of each character and environment with research and references from the art, costume and props department. It was important to both Joss and Charlie that Territory's screens not only gave visual depth to the sets but anchored the Avengers technology in real-world references that the audience could relate to. As David elaborated, "This level of realism was a new approach to Marvel Universe's highly stylized future-tech aesthetic and it was fun for the team to bring the two together."

Needless to say there were a number of technical challenges that had to be mastered. David revealed what was required for one of the new characters to the franchise: "For one scene with Dr. Cho, this involved extrapolating potential clinical applications from current reconstructive biotechnology and 3D printing to create the UI for a clinical technology that 3D prints skin onto a patient. Having 3D screens that looked holographic on set was a real challenge to create from the depth perspective. The temptation is that if you hear that it's going to be a background screen you don't put the quality into it but ever since we worked with Ridley Scott we knew to deliver the best quality and to give the director the option to use it as part of the narrative with a close-up."

Marti Romances, Art Director for Territory, worked on the Iron Man screens and made full use of the speed of Cinema 4D: "With the Iron Man screens I didn't have much time to spend on renders and for that reason I rendered many simple and quick passes from Cinema 4D. This gave me the option of being able to change and play with these passes in After Effects until the directors were happy with the result. Things like Cinema 4D’s External Compositing tag and the HAIR and cel renderer were very handy for these quick turnarounds."

One of the other technical challenges was receiving hi-res models from other VFX vendors who were working on the film and integrating them into the workflow. The Poly Reduce function was used to take a highly detailed Iron Man model from ILM and repurpose it for Territory's needs. Peter Eszenyi, Head of 3D for Territory, had a similar challenge when creating the Leviathan screens. Peter described the process: "We used a very dense and detailed CGI model we received from one of the main vendors. Before we were able to do anything with it, the asset needed a bit of love, stripping out the overly detailed parts, and generally organizing the mesh into manageable chunks. After this, we started to set up some crazy X-Particles magic. We used the mesh as an emitter as well as placing several other emitters around it. For other passes we used the mesh as a collider object, and of course we did use quite a number of different setups, turbulences, wind, follow surface modifiers, the whole works."

David elaborated on how his team of artists kept pushing the envelope creatively: "Often we're finding and exploring new ways of displaying 3D holograms. Thinking Particles, MoGraph and Sketch and Toon are all critical to our ability to be creative in how we approach this challenge. The team here gets tired of just showing wireframes so it's important for us to innovate and test new ways of doing things - MoGraph in particular gives us a toolset to do this."

Peter also used Sketch and Toon on the Leviathan screens he was working on, "We started the Sketch and Toon explorations, experimented with different contour settings, played with lights and shadows influencing line weights – which is an excellent and not so well known feature of Sketch and Toon – generally just trying out as many different ideas as time permitted. Once we rendered all these passes we layered some more traditional passes on top such as shadow passes and ambient occlusion. Then we tried out different tessellation methods, projection textures and so on. We ended up using 10-15 different passes, which were tweaked further in After Effects. This R&D time let us set up a structure that could be deployed on different parts of the project, so once someone got familiar with how things were working we could start producing the different screens."

In previous films like Guardians of the Galaxy and Ridley Scott's Prometheus, Territory used UI screens live on set. Only when details or filming restrictions made live screens impractical were the effects added afterwards. That experience came to the fore on 'Avengers: Age of Ultron,' where over 200 screens across 11 sets were implemented mainly as live on-set props with the help of Compuhire, while only a few screens were delivered as VFX shots. It meant the actors were working with live screens that gave them a focus when appropriate and supported their line of sight. As is often the case on projects like this, there were last-minute changes, sometimes just minutes before a screen was needed on set. David explained how that was possible: "Speed is key to our projects. Cinema 4D allows fast rendering and pulling things back into After Effects, while MoGraph offers an artist-friendly, procedural way of animating. We can create complex structures and make last-minute changes very quickly."

All told, Territory committed a core team of six artists for 11 months to the project, often scaling up to 10 artists, rendering out the 2k-resolution screens and creating over 80 minutes of unique animations for Marvel's blockbuster franchise movie.

Duncan Evans is the author of 'Digital Mayhem: 3D Machines,' recently published by Focal Press. All images courtesy of Territory Studio, Marvel Studios. You can see more of the on-set screen graphics from Avengers: Age of Ultron here: www.territorystudio.com

Jellyfish From Hell
Tue, 09 Jun 2015 11:14:00 +0200
https://maxon.net/en-us/news/case-studies/movies-vfx/article/jellyfish-from-hell/
Mutant jellyfish terrorize beachgoers in Patrick Longstreth's sci-fi short, Hellyfish

On February 5, 1958, a B-47 bomber dropped a 7,000-pound nuclear bomb into the waters off Tybee Island, Georgia after an F-86 fighter jet accidentally crashed into it while on an Air Force training mission. That bomb, which contains a much-debated quantity of radioactive material, has never been found, and the Air Force wishes that people would stop looking for it. Left undisturbed, they say, the bomb is harmless. If it is disturbed, well, things could get ugly.

And that’s exactly what happens in Patrick Longstreth’s short sci-fi horror spoof, Hellyfish, in which the lost bomb is leaking radioactive material and mutant, bloodthirsty jellyfish terrorize and devour beachgoers. “I saw a documentary about the lost bomb and thought it was so fascinating that I considered making my own documentary,” recalls Longstreth, who used MAXON’s Cinema 4D to create his film. “But then I thought about how there really was an actual jellyfish problem in Savannah and how much I love beach monster and horror movies, and the idea just came to me.”

Hellyfish is Longstreth’s first independent film. After earning a business degree, he quickly changed directions and worked for NBC Network News as a motion graphics artist for three years. Next, he got his graduate degree in visual effects from the Savannah College of Art and Design (SCAD) and moved to Los Angeles where he has worked for well-known production studios, Psyop and Imaginary Forces, while also freelancing as a director and VFX supervisor for commercials, corporate videos and independent films.

Longstreth and co-director Robert McLean, who brought his experience working with actors on film sets to the project, shot Hellyfish in Savannah, Georgia. Shooting there cost them much less than the same shoot would have in Los Angeles. In addition to being able to use some of SCAD's equipment for free, they were also able to readily find local actors and friends to fill over 20 different roles in the film. Rehearsals were done in Longstreth's backyard, and the shoot spanned 16 days over the course of six months.

Animating the Creatures

Longstreth modeled the killer jellyfish in Cinema 4D, and additional sculpting was done in ZBrush. To create a fully digital environment with 3D camera movement, photos and video of sky, sand and ocean were projected onto geometry in Cinema 4D. But when it came time to create and rig a full run cycle for a creature with five tentacles, fur, long stringy hairs and soft body dynamics, the team brought on experienced character animator Pryce Duncalf.

For the final shot, in which the giant Hellyfish destroys the Tybee Pier, they created an exact model of the pier in Cinema 4D and then projected an image of the actual pier onto the 3D model. Because the interaction with the character proved to be too intricate for a full dynamics simulation, each individual piece of the pier was keyframed. “The Cinema 4D deformers were essential to this animation process,” Longstreth explains.

For dramatic effect, some shots were slowed down to 33 percent. This meant it was especially important to eliminate any imperfections in the animation because they would be easier to spot in slow motion.

Creating Realistic Water Scenes

The opening nighttime scene was shot on a green screen to allow for more control over the lighting and camera movement. Shots were taken from every angle, including wrap-around dolly shots that were tracked with Pixel Farm's PFTrack. The night sky was a 360-degree matte painting augmented by moving clouds, the shoreline on the horizon and a blinking lighthouse.

Creating realistic-looking water was one of the biggest challenges, says Longstreth, who laughs when, in all seriousness, he says “we used every trick in the book.” For shots that required CG water, Cinema 4D and RealFlow were combined to get the best results.

To create the feeling of being underwater, floating CG algae, dirt and bubbles were added with the Trapcode Particular plug-in for After Effects, which allowed full control over density, size and animation. Video Copilot's Optical Flares and Action Essentials were used to help tie shots together. All told, over 20 artists ended up helping with the post-production.

Funny Scary

When the film was finished, they had an advance screening in Savannah for cast, crew and friends. "Even the kids were cracking up laughing and that was really rewarding," Longstreth recalls, "because if they get the humor, that's exactly what we were going for. We didn't want something gruesome. We wanted it to be Halloween material in the category of Gremlins or Ghostbusters."

Into Orbit With Jupiter Ascending
Fri, 17 Apr 2015 13:03:00 +0200
https://maxon.net/en-us/news/case-studies/movies-vfx/article/into-orbit-with-jupiter-ascending/
Duncan Evans talked to David Sheldon-Hicks of Territory Studio about working on the set of the Wachowskis' sci-fi epic

By Duncan Evans

When legendary directors the Wachowskis wanted screen graphics and displays for their spaceships in Jupiter Ascending, their first port of call was London-based Territory Studio. The selection process was made easier for everyone because Creative Director David Sheldon-Hicks had already worked on the iconic futuristic sci-fi screens for Ridley Scott on Prometheus. Territory was duly hired to create user interfaces for screens that would feature as part of the navigation systems in a number of spacecraft scenes. Invisible forces such as gravity, wormholes and cloaking devices needed illustrating, with Jupiter Ascending production designer Hugh Bateup suggesting that 3D weather maps would be a good starting point. Then there was the concept art: an entire room of beautiful art with lifts, spacesuits, environments and spaceships. It had already been created and was to serve as inspiration for the visual look and feel of the film, right down to the bespoke typeface that Territory created for the film.

David and his team of five artists got to work investigating how best to use isometric lines, which are normally used to describe weather fronts, to represent 3D energy fields as animated organic forms. Most of the effects were generated in Cinema 4D using Thinking Particles, with XPresso used to control them. The basic idea was to generate anywhere between 100 and 1,000 particles and then use effectors to move them around. This way, force fields could be created like weather maps, with morphing used to describe specific story points. This kind of animation usually produces chaotic elements, but here the team tried to harness those effects. The real problems started with getting certain animations to loop. Senior Motion Designer Nik Hill explained how they solved the issue: "Cinema 4D's MoGraph tracer and Hair shader settings were key to helping us figure out the looping issues with the swirly wormhole graphics."
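
Territory's actual fix relied on MoGraph tracer and Hair shader settings, but the underlying problem is general: a particle-driven animation rarely ends where it began. One common workaround is to crossfade the motion signal with a copy of itself shifted back by one loop period, so the first and last frames of each cycle match. A minimal Python sketch of that idea (the function and the sample signal are our own illustration, not from the production):

```python
import math

def make_loop(signal, period):
    """Wrap a non-periodic signal so it repeats seamlessly every `period`.

    The result is a crossfade between the signal and a copy shifted back
    by one period: at the start of the cycle you see the original, and by
    the end you have fully blended into the shifted copy, so the cycle's
    last value equals its first.
    """
    def looped(t):
        u = (t % period) / period            # blend factor, 0..1 over the cycle
        a = signal(t % period)               # original signal
        b = signal(t % period - period)      # copy shifted back one period
        return (1.0 - u) * a + u * b
    return looped

# Hypothetical "swirly" channel: a sum of incommensurate sines, not periodic
wobble = lambda t: math.sin(0.7 * t) + 0.3 * math.sin(2.3 * t + 1.0)
loop = make_loop(wobble, period=10.0)
```

The same crossfade works for any procedural channel (turbulence strength, offsets along a tracer spline); the trade-off is that late in each cycle the value is a blend of two evaluations rather than the raw signal.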

The other problem was how to compensate for the irregularities of the physical screens in the spacecraft bridge environments. Unlike most VFX projects, the majority of these screens weren't added in post-production but were projected in real time onto glass screens on set. This process was developed in partnership with Compuhire, the engineers behind getting the graphics on set.

The projectors were mounted either in the floor or overhead. This is where Territory's experience creating the same kind of effects for Prometheus paid off. On the bridge of the spaceship there were five main consoles with glass sheets hung at slight angles. The graphics themselves had to be opaque enough to convey information, yet have enough transparent areas that the actors could be seen through them. David explained how it all worked: "When you project onto glass, which is a specialized acetate with imperfections, it creates tiny refractive beams and bounces light back, so you get light spill. Ridley Scott liked the light spill and used it in Prometheus."

On Jupiter Ascending they wanted the animated graphics to be placed perfectly, running along angled edges, but the screens were tilted so that they weren't perpendicular to the lens of each projector, which warped the projected image on the screen. Through trial and error they worked out a compensating distortion within After Effects. David clarified the process: "We were inverting the distortion that was physically taking place on set and it worked really well. Some of the panels had geometric designs etched onto them as well, so our kinetic projections mingled with physical glass etchings. It turned out to be a clever merge of 3D set design and animated projections."
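
Territory arrived at their compensation by eye, but the geometry behind it is well understood: projecting onto a plane that is tilted relative to the projector lens applies a planar perspective warp (a homography), and pre-distorting each frame with the inverse homography cancels it out, essentially an inverted corner pin. A small sketch of the math in pure Python (the tilt matrix is a made-up example, not a measurement from the set):

```python
def homography_inverse(H):
    """Invert a 3x3 homography via its adjugate.

    For projective transforms the adjugate equals the true inverse up to
    a scale factor, and that scale cancels in the perspective divide.
    """
    (a, b, c), (d, e, f), (g, h, i) = H
    return [[e*i - f*h, c*h - b*i, b*f - c*e],
            [f*g - d*i, a*i - c*g, c*d - a*f],
            [d*h - e*g, b*g - a*h, a*e - b*d]]

def warp(H, x, y):
    """Apply a homography to a 2D point, with perspective divide."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Hypothetical warp introduced by a tilted glass screen
tilt = [[1.0, 0.1, 0.0],
        [0.0, 1.2, 0.05],
        [0.0, 0.3, 1.0]]

pre = homography_inverse(tilt)  # bake this into the graphic before projecting
```

Pre-warping a frame with `pre` means the physical tilt then maps the image back to its intended shape; in practice a corner-pin distortion in After Effects, with the four corner positions found by trial and error, plays the role of `warp`.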

The advantage for the actors and directors was that they could physically see the screens, rather than having to imagine everything against a green screen. You also got reflections, light bleed and spill into the environment, making it all appear more real.

Before starting, the Wachowskis had assumed all graphics would be green screen and inserted in post, which is how they had worked before. When they saw the tests they wanted the projected graphics on everything, which meant making changes and designing by the seat of their pants. David revealed: "The Wachowskis were a delight to work with and got really energized by working with us on set. Motion designers Nik Hill and Ryan Hays would be perched with laptops designing, animating and rendering as Mila Kunis and Channing Tatum were being shot. Lana and Andy would take a look and then get us to make changes on the fly for the next take. It was a demanding way of working but a lot of fun and very satisfying to see it come together for the actors and directors."

A lot of this was only possible because Cinema 4D is so designer-friendly. It meant that Territory's team, which largely came from a design background, didn't have to get into the technicalities all the time. Cinema 4D's tools are accessible, so they could create the basic look quickly, get it approved by the director and then have a day or so to put in enough detail to make the effects look polished and beautiful. David detailed how Cinema 4D made it all work: "The software is perfect for creating broad brushstrokes and then adding fine details. We do an awful lot of UI creation and there needs to be a good handover process between it and Adobe products like Illustrator and After Effects. We were swapping between Cinema 4D and After Effects with cameras going back and forth. With this constant overlap of software processes we can generate up to 30 to 40 screens for the next day of shooting. Normally you only get into that position after you have spent a couple of weeks creating one or two hero screens. The director approves the look of those and then we roll out 40 screens based on them. It's not fun, but you make quick decisions and find out what your limits are."

In the end, Territory spent four months working on Jupiter Ascending, rendering an estimated 20 minutes of visuals at 2k-resolution with its system of three Mac 3.5GHz six-core workstations. Nik concluded, "Cinema 4D is a great tool for getting good results quickly. By using the right blend of tools we managed to keep up with the high pace environment of film production. When the actors turn up you have to be ready to go and, thankfully, Cinema 4D is really robust, so you can do it."

Duncan Evans is the author of 'Digital Mayhem: 3D Machines,' recently published by Focal Press.

You can see more of the graphics from Jupiter Ascending here: www.territorystudio.com

The Mothership
Tue, 31 Mar 2015 11:43:00 +0200
To celebrate their 5th anniversary, studio FutureDeLuxe wanted to create an animation that stood out from the usual studio look – and Cinema 4D helped them do exactly that.

FutureDeLuxe has made a name for itself over the past five years with a rather unorthodox approach: experimental use of graphics programs generates know-how that is in turn used to create spectacular results for clients. Cinema 4D is one of the team's favorite programs for experimenting, not least because of the endless and diverse possibilities its features offer. Once the decision had been made to create a fitting animation to celebrate their five-year anniversary, the FutureDeLuxe team had a treasure trove of experimental material and techniques to draw on for the project. The film they created – Mothership – is colorful and trippy and full of flying shapes, swarms, strange creatures and fantastic formations that fly through a magical world using dynamics.

The team at FutureDeLuxe, together with artist Nejc Polovsak, used instances in Cinema 4D to make it possible to handle this vast number of objects in a single scene. "In particular for the part where the camera flies through the tunnel, it was especially important to determine the total number of instanced objects we could use without exceeding our memory limit. We resorted to using low-poly objects that used less memory, which let us add even more objects to our scene," remembers Nejc. "When viewed from a distance you can't tell the difference between high-res and low-res proxies." The most elaborate scenes had about 1.5 million polygons and ate up 650 MB of memory.
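
As a rough sanity check on that kind of budgeting: the figures quoted (about 1.5 million polygons consuming roughly 650 MB) work out to a ballpark of around 450 bytes per polygon once normals, UVs and caches are counted. A back-of-the-envelope sketch (the per-polygon constant is our own derivation from the article's numbers, not a Cinema 4D specification):

```python
def estimate_scene_mb(total_polys, bytes_per_poly=450):
    """Rough scene-memory estimate from a total polygon count.

    bytes_per_poly is a ballpark inferred from the article's figures
    (~650 MB for ~1.5 million polygons); real usage varies with UV sets,
    normals, deformer caches and renderer overhead.
    """
    return total_polys * bytes_per_poly / 2**20

# 5,000 visible objects at render time:
full_res_mb = estimate_scene_mb(5_000 * 30_000)  # high-res copies everywhere
proxied_mb = estimate_scene_mb(5_000 * 300)      # 300-poly proxies instead
```

With numbers like these the trade-off is stark: five thousand hypothetical 30,000-poly heroes would need tens of gigabytes, while the same count of 300-poly proxies lands right around the scene sizes the team reports.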

“Numerous Cinema 4D tools and features ranging from modeling to BodyPaint 3D, character tools and the Motion Camera for the soft camera moves were used – without which it would have been impossible to create this film. But what was really indispensable were Dynamics and the simulation features, which were used to create large parts of the animations.”

Most of the FutureDeLuxe team was involved in the initial creation phase of Mothership. As the animation phase began, fewer team members were needed and Nejc Polovsak was essentially the one who animated the short film. “I could still count on the entire team at FutureDeLuxe who also supplied me with plenty of great feedback and outstanding art direction,” remembers Nejc. “It took ten weeks until the scenes were completed and rendered. Rendering was done using VRay for Cinema 4D on a single PC (6/12-core i7 3930, 32 GB RAM). After all the work had been completed on the scenes I was really ready for a vacation. I started rendering before I left and when I got back after 5 days the rendering was done and Mothership was ready to be edited and finalized in After Effects.”

“This project definitely provided a lot of valuable experience that made it evident why I love working with Cinema 4D”, explains Nejc. “It’s the ideal software for generalists like me who want to single-handedly complete entire projects with a single software package. Artists want to realize and experience their visions without having to worry about technical issues. This is exactly what Cinema 4D does,” concludes Nejc.

Process video: www.vimeo.com/109013667

Utrecht Goes Hollywood: Oscar Nomination for "A Single Life"
Tue, 10 Feb 2015 09:04:00 +0100
Animation studio Job, Joris & Marieke centered their workflow around Cinema 4D.

A young woman, a slice of pizza and a mysterious vinyl single: these are the only ingredients needed for an Oscar® nomination – at least for the small Utrecht, Holland-based creative studio Job, Joris & Marieke! The small team relied on Cinema 4D as the primary application for the creation of "A Single Life" and used its toolset to handle the complexities of 3D content creation, including modeling, rigging, character animation, lighting, rendering and more.

Many of Job, Joris & Marieke's past works are characterized by their charming, imaginative stories, their unusual style, interesting music and lovingly designed characters (Mute, The Daily Drumbeat). Job, Joris & Marieke were guests at last year's MAXON user meetings in Düsseldorf and Munich, and their work was praised enthusiastically by the entire Cinema 4D community.

In the typical Job, Joris and Marieke style, “A Single Life” tells the story of Pia, a young woman who finds a mysterious vinyl single. As she spins the record she is suddenly able to travel back and forth through the years of her life. The three-member filmmaking team of Job Roggeveen, Joris Oprins and Marieke Blaauw managed all aspects of “A Single Life”, including the original idea, writing, directing, design, animation, as well as creating the sound effects and composing the music.

The film was originally created as a part of Ultrakort, a special project of the Dutch Film Fund and the Pathe cinema group in which select short animations are screened preceding blockbuster movies. In the Netherlands, nearly a million people already had the chance to view “A Single Life” on the big screen.

Job, Joris & Marieke was established in 2007, initially as a 2D and stop-motion studio creating short films, videos and commissioned work, and gradually began to push its artistic capabilities into the 3D animation realm to handle projects with rich layers of detail. “One of our main storytelling challenges in ‘A Single Life’ was to capture Pia throughout the five stages of her life – from a little girl through old age – in five different settings within a very abbreviated timeframe,” says Oprins, one of the film’s directors. “When we launched the studio we had no prior experience in 3D and decided to bring Cinema 4D into the production pipeline because we found it to be user-friendly, it integrates well with other professional applications and it offers powerful creative capabilities.”

Cinema 4D afforded Job, Joris & Marieke the creative flexibility throughout the filmmaking process to realize the unique animation style in “A Single Life”, which was shaped by the studio’s background in stop motion. Cinema 4D’s Jiggle Deformer was applied to add bounce and follow-through effects to the animations, while XRefs enabled a flexible and streamlined workflow to organize elements, work with placeholder props and animate characters in simultaneous collaboration.

“Pia ages a few times in the film, so her clothes, size and hair style change but her shape doesn't so we were able to use the same character rig,” says Oprins. “This saved us significant time because we could start animating a scene in Cinema 4D before the character was finished.”

The powerful Displacement Map feature in Cinema 4D was also an essential workflow feature for adding detail. “To achieve a dramatic and stable lighting that worked well with Pia's hair we chose not to use Global Illumination and instead built the lighting with a dome and cluster of very soft spots,” adds Oprins. “We rendered everything in 32-bit and were able to stylistically light aspects of the scenes to great success without unreasonable render times."

“A Single Life” has been selected to screen at film festivals worldwide, including the Toronto International Film Festival, Dutch Film Festival, Bay Area International Children’s Film Festival, New York International Children’s Film Festival and numerous others. The Academy Awards for outstanding film achievements 2014 will be presented on Oscar Sunday, February 22, 2015, at the Dolby Theatre at Hollywood & Highland Center, and televised live on the ABC Television Network.

The 2015 Oscars award ceremony took place on the 22nd of February and the Oscar for best animated short film went to Disney Animation Studios' "Feast" by Patrick Osborne and Kristina Reed.

3D Worlds in the Shadow of Isolation
Thu, 29 Jan 2015 13:55:00 +0100
Originally planned as a short film to serve as a reference, this project ended up taking three years to produce and, thanks to a great deal of creativity and Cinema 4D, became much more than was planned …

Neither Annegret nor Till had previous experience with 3D animation films before they started work on this short film. In the end, they decided to use Cinema 4D for their project. Among the tools needed for the realization of ‘Rue de Fleurs’ were those for character animation, particle effects, hair, technical animation, modeling, dynamics and cloth simulation and more. Annegret and Till had seen colleagues use Cinema 4D, and with their help they made their first attempts at using the software themselves. After they saw how easy Cinema 4D was to use, it quickly became their software of choice for the project.

A lot of research was done on the topic of social isolation among the elderly. An entire year was spent preparing for filming, during which the topics of loneliness and isolation were worked into the script. After the script had been completed, work began on the actual film early in the summer of 2013. Annegret first created a detailed storyboard, which was used as a base for animatics, including sound, which were in turn used to assess camera movement, cut sequences and general visuals. Next, Annegret created a clay model of the film’s main character, Gustave, in order to get a better feel for his facial expression.

The clay bust was then photographed from three sides and these images were used as a template for 3D modeling. As soon as the first figures had been built, the team started to animate them. XRefs were used from the beginning. “This made it possible to make changes to character models even after they had been animated. Thanks to XRefs, making subsequent changes didn’t hold up other workflow tasks,” explains Till Giermann. This also made it possible to conduct exhaustive tests with shaders for Gustave’s skin and with his hair, and still have the newest version of the character model for rendering.

In addition to Gustave, another 300 objects had to be modeled and textured, which was all done in BodyPaint 3D. The textures themselves were photographed by Annegret and Till and edited for use as textures. MoGraph and Dynamics also played important roles: MoGraph was used whenever leaves flew through a scene, trees had to be added, water drops ran down window panes or debris had to be strewn across the ground. When Gustave’s world literally breaks apart, Cinema 4D’s Dynamics was used to create the effect.

Being inexperienced as they were, Annegret and Till first filmed themselves moving like the main character in order to get a better feel for his movement. Annegret then used this footage as a rotoscoping background to create the animations. Since this film revolved around a single character, the digital version had to be created so that its facial expressions and gestures conveyed the necessary emotion. This led to Gustave’s character being modified and improved throughout the first year of production.

Of course everything has to be rendered in the end, which was another aspect of 3D production that the team had to learn. The first attempts showed the quality that can be achieved with the Physical Renderer – and the realistic-looking depth of field really impressed Annegret and Till. The team only had three mid-range computers available for their render farm, which meant that they had to do without the features of the Physical Renderer and the depth of field had to be added in After Effects during post-production.

The new Team Render on the other hand really helped in the production phase. “Cinema 4D R15 was made available in the middle of the production phase. We quickly got to know the new Release and everything worked great. Team Render was the most effective solution for our small network,” declared Till.

Schmalbreit Film website: www.schmalbreit-film.de/

Galactic Encounter of a Different Kind
Thu, 18 Dec 2014 13:52:00 +0100
Los Angeles-based Imaginary Forces is always on the lookout for tools that meet or exceed the standards set forth by their demanding team of artists. This creates an ideal environment for testing Cinema 4D Release 16.

Imaginary Forces is a well-known name among LA studios. A look at the IF website reveals a wide variety of notable projects on which the team has worked. One of the tools that plays a major role in their everyday work is Cinema 4D. As soon as the beta version of Cinema 4D R16 had been made available, the team at IF was very curious about its new features. This was particularly true for Creative Director Ryan Summers, who was very interested in the variety of new possibilities Cinema 4D R16 had to offer: tools like the Polygon Pen, improved sculpting and the impressive materials created using the new Reflectance channel made a lasting impression on him.
Even before the final version of R16 had been made available, Ryan and his team decided to create a short film to put the new version through its paces under nearly realistic conditions. The new R16 features in particular had to be put to the test. Ryan also decided to let the tools themselves star in the film. Functions such as Polygon Pen, Reflectance channel and Solo button were given starring roles.

The film is a science-fiction story in which a small, somewhat rickety spaceship is attacked by a large battle cruiser. Despite the cruiser’s apparent superiority, it is always outwitted by the small, rickety underdog, which first conjures up a more powerful booster engine using the Polygon Pen tool to escape the first attack and then uses its own reflectivity, created with the Reflectance channel, to stave off a laser attack. As a last – and finally successful – resort, the small ship uses the Solo button to make the cruiser disappear.

Presented as an instructional film from the Hyperspace Travel Security Authority, the film is somewhat reminiscent of classics such as ‘Duck and Cover’ or ‘Hitchhiker’s Guide to the Galaxy’ and brings smiles to the faces of those who watch it.
With this Cinema 4D pilot project, Ryan Summers not only wanted to put new features to the test, he also wanted to find out just how quickly newcomers can learn how to use Cinema 4D. For example, the ZBrush and Maya expert Amir Karim was asked to do the modeling and sculpting. This was Amir’s first project using Cinema 4D exclusively. Amir said, “I thought that it would definitely be an interesting experience using Cinema 4D instead of my usual software for this project. Soon after getting started I was very surprised to find all the tools that I also know from Maya, and I was able to get rolling without any major interruptions. Especially the fact that I was able to apply the Sculpt tools to normal geometry that hadn’t been subdivided in any special way turned out to be very helpful.”

Character artist Richard Deforno also took part in the project and was fascinated by Cinema 4D’s comprehensive set of functions and how seamlessly it could be integrated into his existing workflow: “Thanks to the wide range of available functions you almost never have to leave Cinema 4D, which means you can complete almost every phase of work directly in Cinema 4D!” While working, Richard constantly discovered new advantages and also ran into stereotypes: “Many times, colleagues would watch me work and tell me they thought Cinema 4D was only a motion graphics tool. They had animated fonts and flying letters and were pleasantly surprised and excited when they saw me animate the character – especially when they saw how easy it was to work with Cinema 4D.”

What was particularly interesting according to Glen Snyder, who is responsible for pipeline development at Imaginary Forces was: “The Python interface is clearly documented and makes it possible for me to program tools for everything that Cinema 4D does not offer as a standard tool. This means that you don’t have to leave Cinema 4D when performing various tasks in your everyday work.”

Everyone who worked with Cinema 4D on this project had a positive reaction: the possibilities offered by the new features in Cinema 4D R16, its stability and its expansive range of uses mean that it can hold its own in just about any production situation. Ryan Summers was very satisfied after completing this “first run”: “We can hardly wait to use Cinema 4D R16 for our commercial production. I think that the way Cinema 4D is perceived will change in the near future. We are encountering more and more artists who previously swore by another software package and are now achieving spectacular results in just a short time with Cinema 4D. The entire 3D production landscape will change significantly.”

Planetary Encounters (Mon, 27 Oct 2014)

Normally, Andy Lefton works in the field of advertising, but with ‘Two Worlds’ he ventures into new territory – with the help of Cinema 4D. Andy’s online portfolio includes a wide range of extremely high-end fantasy works, which put him in the top league of professional 3D artists. As such, he doesn’t need to create advertising for himself. Nevertheless, he’s been working on a short film for the past ten years that’s slowly beginning to take shape. He’s completed a trailer that offers a first glimpse into the project.

A pile of junk that obviously used to be a spaceship lies in the desert. Elements added after the crash – simple mobiles, a canopy and an antenna built from spare parts – indicate that someone lives here. The stranded inhabitant can only be seen in the reflection on the monitor of his communication computer, which constantly displays the message that no connection can be established. But suddenly the communication system receives a signal from an aircraft that is landing on the planet …

With this trailer, Andy builds a dramatic tension that makes the viewer want to see more. And the rendering is so realistic that you have to look twice to see that it was created using computer graphics. “You could say that just about every Cinema 4D function was used for this project. In particular, dynamics, hair and cloth made it possible for me to create exactly the look I wanted. I was able to concentrate completely on the film and didn’t have to worry about how I was going to realize specific parts. For me, Cinema 4D is the most important tool for all aspects of 3D, VFX and motion design!” says Andy.

Andy did a lot of research for his ‘Two Worlds’ project and created numerous production sketches. “Drawing is not my greatest strength but, together with the images I researched, it helped a lot in creating the various elements, characters and scenes as I envisioned them. After the first sketches had been made I created a complete storyboard, which was used to define the speed of the camera movement and the action.” Andy then began creating all elements and scenes in Cinema 4D. “Cinema 4D’s and BodyPaint 3D’s seamless connectivity to Photoshop was invaluable during the creation phase,” exclaims Andy.

The character was not very complex, which made it possible for Andy to animate it using several keyframes, Morph tags and Expressions for facial expressions. Animation will become more relevant at a later stage, when fully rigged characters will be active in the film – of course all done in Cinema 4D. In the meantime, the ‘Two Worlds’ teaser will continue to whet everyone’s appetite for the final film. Andy is planning on completing the film by the end of 2014. We can’t wait to see it!

Andy Lefton’s website and “Two Worlds” blog: www.andylefton.com/blog/

Stellar Visuals for Guardians of the Galaxy (Tue, 21 Oct 2014)

Made with Cinema 4D: intergalactic interface design for an old spaceship, a crew of cosmic criminals and a virtual Walkman ... By Duncan Evans

The jump from acclaimed graphic novel to successful movie tie-in is a tricky one. For every successful adaptation there’s always one that leaves a sour taste in your mouth. A project as ambitious as Marvel’s 'Guardians of the Galaxy' required the efforts of half a dozen VFX companies to do the culture-spanning tale justice. Fortunately, the film did just that and was one of the biggest draws at the summer box office.

Playing a vital part was Territory Studio, which was asked to create holograms, screens and interfaces for the Marvel Studios film. Territory was set up in 2010 when Creative Director David Sheldon-Hicks got together with Lee Fasciani and Nick Glover to found the company. Having already created a lot of the screen and graphic interfaces for Ridley Scott’s Prometheus, Territory was the natural choice for the Guardians project.

The brief called for hundreds of designs and animations, with each set requiring its own look and feel to fit the vast number of sets, environments and cultures. There were screens for spaceships, street scenes, gambling dens, communication hubs and a prison. Each one had to reflect the specifics of a particular culture, whether it was human or alien. They also needed to reflect a sense of the wear and tear of function, history and backstory.

Territory took advantage of the massive Cinema 4D toolset using diverse tools such as those for character animation, motion graphics, particle effects and cel rendering. Cinema 4D’s MoGraph tools and workflow with Adobe After Effects were key in simplifying the creation of the complex screen interfaces. "The complexity and variables of on-set screen graphics work requires us to be fast, flexible and creative, and Cinema 4D allows us to do this in bucket loads," states David. "The tight integration with After Effects was essential."

David explained how they approached the task: "We had conversations with Director James Gunn who was very script-focused, with a clear vision of what he wanted. He was also very supportive of our work and gave us a lot of freedom. We also worked very closely with Production Designer Charles Wood and Art Director Alan Payne and the art team. We had bi-weekly meetings with the art team in which they would show us their concepts and visuals for scenes and environments that were further down the line than we were. This really helped us understand the look and feel of the many locations, be they planets, prisons, spaceships, street scenes, gambling dens and so on, which we would reference and support."

The first challenge was receiving high-poly-count assets from the art department, typically around five million polys, and creating usable assets for animation purposes, especially the Nova prison and Milano ship screens. These had to be reduced to 500,000 polys or less. To do this, some of the assets were remodeled while others used the Polygon Reduction tool within Cinema 4D to achieve the same result. The streamlined workflow between After Effects and Cinema 4D certainly helped the process along.
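To give a sense of what a reduction from five million to 500,000 polygons involves, here is a minimal sketch of one decimation strategy (vertex clustering) in plain Python. Cinema 4D's Polygon Reduction tool uses a more sophisticated approach, so this illustrates the idea rather than the actual algorithm.

```python
# Vertex-clustering decimation sketch: snap vertices to a coarse grid, merge
# the ones that land in the same cell, and drop triangles that collapse onto
# fewer than three distinct vertices. Coarser `cell` -> fewer polygons.

def cluster_decimate(vertices, triangles, cell=1.0):
    """vertices: list of (x, y, z); triangles: list of vertex-index triples.
    Returns a (new_vertices, new_triangles) pair with merged geometry."""
    remap, new_vertices, keys = {}, [], {}
    for i, (x, y, z) in enumerate(vertices):
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in keys:
            keys[key] = len(new_vertices)
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        remap[i] = keys[key]
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if len({a, b, c}) == 3:  # skip triangles that became degenerate
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```

Clustering is fast but blunt: it can destroy fine detail, which is why some of Territory's assets were remodeled by hand instead of reduced automatically.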

There were a lot of typefaces on show for each area and culture. David went through the design process: "We referenced artwork from the original Marvel comics and style guides and concept art from the art and costume departments. Then Lee Fasciani, my co-founder here at Territory and a specialist in crafting fonts and icons, created each typographic and icon style to reflect the character and personality of a particular culture or location, drawing on colors and shapes to provide a visual language to convey this. Each one was subject to approval and, because of our collaborative relationship with the art department, we didn’t run into any problems."

One of the main areas that saw action throughout the film was the spaceship Milano that the main characters used. The Milano’s backstory was that it’s seen some action and is clearly a bit dated, but still beautiful with many clever modifications. David described how their designs fit in: "Our UI needed to reflect both this engineering sophistication and main character Peter Quill's can-do attitude to hacking the system to get the extra performance he wanted. He was more interested in effective modifications than in perfect code, so our screen graphics for the ship's navigation, weapons and entertainment were a bit rough and ready to reflect this. We were also able to have a lot of fun with details such as the music interface that Quill hacked to simulate a 1980s tape deck."

In the film, the '80s mix cassette tape was a key personal memento for Quill and he took it, and a Sony Walkman, wherever he went. This connection to the '80s ran through the whole film, in terms of the soundtrack, in some of the styling and in the look and feel of Territory’s graphics. On the Milano, the team wanted to create an interface that was based on an authentic deck that visually converted the media into a cassette tape as it was inserted and ejected, with tape that could be seen to roll as the music was playing.

David concluded: "In terms of the user interface, we referenced the classic reds and oranges against a black background of 1980s UI, we looked at the paintwork styling of '80s sports cars and we explored the effect of degradation on screens by looking at how airplane windows show the effect of stress and age – the clouding and scratches that their screens show with time. Our designers then took on board this research and created a look that felt true to the spirit of the film."

Territory also worked directly with a company called Compuhire, who were the wizards behind programming on-set playbacks and getting the graphics in front of actors and directors. Territory would often have to turn around several screens in a day, and with Cinema 4D working so well with After Effects they were able to do this. Sharing nulls and camera data between applications and rendering object buffers were key workflows that the team used on a daily basis. XPresso was also called into play on a sequence on the Dark Aster ship, to make the animation process easier for actions such as finger rolls where David only wanted to animate one slider.
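The "one slider drives a finger roll" setup David describes with XPresso can be sketched in plain Python: a single 0–1 value is remapped into staggered rotation angles so the joints curl in sequence rather than all at once. The joint names, the 0.2 stagger and the 90-degree limit are illustrative assumptions, not values from the production.

```python
# One control value drives several joints with a slight delay per joint,
# so the finger curls from the base outward as the slider increases.

def finger_roll(slider, joints=("base", "mid", "tip"), max_angle=90.0):
    """Map one slider value in 0..1 to staggered per-joint angles (degrees)."""
    angles = {}
    for i, joint in enumerate(joints):
        delay = i * 0.2  # each joint starts curling later than the previous
        t = min(max((slider - delay) / (1.0 - delay), 0.0), 1.0)
        angles[joint] = t * max_angle
    return angles
```

Halfway through the slider's travel the base joint leads, the mid joint follows and the tip trails – the one-parameter animation convenience XPresso provided on the Dark Aster sequence.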

With a large number of VFX teams working on the project, producing work from Cinema 4D that everyone else could use was paramount. David recalled how Production VFX Supervisor Stephane Ceretti played a key role here: "He really understood our way of working and what we were able to do and because he oversaw all the VFX vendors, they were able to ensure the consistency of our concepts, from screen to post."

In the end, Territory had between two and seven people on the project over a ten-month period, putting up to 18 Mac OS X 3.5GHz six-core machines to work rendering out the various elements.

Duncan Evans is a freelance journalist, photographer and author.

You can see more of the graphics from Guardians of the Galaxy here: www.territorystudio.com

3D, Constellations and Flying Fish (Mon, 25 Aug 2014)

Artist Murat Sayginer uses Cinema 4D to create a mysterious flood of images with a deserted island as a backdrop. Volans is the name of a constellation in the southern sky; the name means ‘flying fish’. This fish, a deserted island and the starry sky inspired artist Murat Sayginer to create a spectacular short film, which he made entirely by himself. Murat began his career as an illustrator and photographer. Looking at the work featured on his website, the evolution to animation was the next logical step: many of his photos told stories and were often shown as photographic series. Murat was introduced to Cinema 4D about five years ago. He wanted to tell stories, and photography alone was too restrictive for what he wanted to do.

As an internationally known artist who has won numerous awards, Murat has released four films on the online platform Vimeo – of which Volans is the most impressive. In a style of fantastic realism it tells the story of a flying fish’s voyage of discovery as it frees itself from a stone sphere and explores the world outside – a trip that quickly turns into a cosmic search for meaning.

The fully saturated colors of the scenes are often reminiscent of tantric Hindu meditative illustrations. In order to realize these visions, Murat took advantage of Cinema 4D’s full range of tools. He used the Hair feature to create the carpet of grass that covers the island, and the water that seems to flow weightlessly from the island’s pond towards the sky was created using RealFlow and rendered in Cinema 4D. Splines and Dynamics were used to create the numerous floating light particles.

The music that accompanies this unique film was also created by Murat and is timed perfectly to the animation. According to Murat, the influence actually ran both ways: sometimes the music was made to match the animation, and at times the animation was made to match the music.

Thanks to its popularity on Vimeo, Murat’s ‘Volans’ animation was chosen as a Staff Pick even though it wasn’t submitted to any animation festivals. Murat plans to continue along this path. Although he has a strong desire to create traditional film projects he will continue to realize his visions and ideas using Cinema 4D. After all, according to Murat Sayginer, Cinema 4D is “the most user-friendly animation software that was ever developed!”

Murat Sayginer’s website: www.muratsayginer.com/

Happy Drum Beats (Fri, 04 Jul 2014)

A slightly macabre but thoroughly enjoyable music video created by Job, Joris and Marieke using Cinema 4D. A bunch of cute, musically inclined creatures with zebra stripes race from rooftop to rooftop, accompanied by ‘The Daily Drumbeat’, the new song from the Dutch band Happy Camper. Soon the race evolves into a macabre version of musical chairs – with drum sets instead of chairs. Whoever’s not sitting behind the drums when the refrain starts is pushed off the building by an invisible hand – and the game continues with one less player …

Job, Joris and Marieke are a team of Dutch filmmakers with a love of cartoons and short films. Their projects are created using Cinema 4D, which offers them the flexibility and power they need to successfully realize their complex visions. The trio had already established a reputation with past work, including the well-known short ‘Mute’, which recently garnered a lot of attention worldwide. That film also used highly stylized characters and a very dark humor, which found its way into ‘The Daily Drumbeat’ as well.

“The music is very upbeat but the lyrics have a dark touch: ‘The drum beat will get you!’” explained Job. “We wanted to express this somewhat foreboding message in our film. This is how we came up with the idea of the invisible hand that eliminates one character after the next from the game.” The visuals had to have a crude, rough and grainy style and look like they were filmed using a handheld camera. This is why the film was made in black-and-white and a lot of motion blur and depth of field was used. “These were all effects that we could easily create using Cinema 4D’s standard renderer without generating exorbitant render times, even though the scene was made up of a comprehensive cityscape. We also used a few tricks,” admits Job. “We created Sky objects to which not only the sky was applied but parts of the city as well. This helped convey the impression of a big city without inflating render times.”

The zebra suit-clad characters were modeled, textured, rigged and animated in Cinema 4D. Since the characters’ physiology was very compact, the Jiggle deformation object was used to let their noses whip about nicely and make their movements more dynamic, which in turn produced a very organic look.

After their success with ‘Mute’, ‘The Daily Drumbeat’ also earned the title Staff Pick at Vimeo. Not bad for Job, Joris and Marieke, who actually come from the field of graphic design. “Despite our professional backgrounds, we all had a strong desire to switch from design to animation. Cinema 4D is perfect for everyone wanting to get the best possible start in 3D animation. Not only does everything work great, it’s also very intuitive to use,” says Job.

Floating Metal Key (Fri, 27 Jun 2014)

Music and images are often intertwined in new media – and CINEMA 4D often plays an important role in adding impressive imagery to music. Ahead of the release of his newest album, musician Mathew Wilcock wanted to use an EP single to raise awareness for it; attention for the single would in turn be generated by a video. Standing out in the crowd is not an easy task these days, considering how spectacular music-video visuals have become. The video had to hold its own against the rest, which is why Mathew hired Tony Zagoraios as art director/concept designer and Dan Kokotalijo as director for the project.

All three had often worked together in the past so convincing Tony to take over the role of art director was the easy part. Mathew explained his concept of what visuals he wanted for his music: visuals that are inspired by the underlying music and underscore its ambience and mood – and combine to create a pictorial story. Based on this information, Tony began to create a storyboard.

Tony used CINEMA 4D to create most of the visuals and had to bring in several elements from external applications for the animation – which “… was not a problem thanks to CINEMA 4D’s comprehensive import options,” as Tony states. “Especially the file exchange with After Effects and its seamless integration with CINEMA 4D sped up the workflow enormously!” Otherwise, much of the work on ‘Floating Metal Key’ was interdisciplinary and involved numerous applications – with CINEMA 4D at the core of the project.

Much of the work in CINEMA 4D was done using the MoGraph feature, which was used to animate the numerous object fragments floating in the scene. Hair and spline dynamics as well as Thinking Particles, and almost all Deformer objects, were used to really put things in motion. Features not offered by CINEMA 4D were covered by plugins such as Thrausi.

The animation was rendered primarily using CINEMA 4D’s Standard Renderer. The Physical Renderer was used to render a few scenes that required render settings not included with the Standard Renderer. By strategically splitting the render process between both renderers and with the right settings, the team was able to accurately plan the required render time, which was split up across three render clients.

The team members who worked on this project were spread out across the globe and collaborated over a network to complete the project. Their hard work quickly paid off as ‘Floating Metal Key’ was selected as a Vimeo Staff Pick only a few weeks after release. Mathew and his team are also planning to enter the animation, which already has over 110,000 views, in various festivals. All team members benefitted from working on this project: “The coordination between the various team members and fine-tuning various aspects such as environment, scene setup and timing were challenging and resulted in a vast exchange of know-how amongst everyone involved. Everyone was able to gain a great deal of technical and production experience, and learn a lot about CINEMA 4D in particular,” concludes Tony Zagoraios.

Floods, Fish and a Flooded Valley (Thu, 27 Mar 2014)

Austrian satire is often mordant and exaggerated. The movie Bad Fucking, pronounced ‘fooking’, takes it to the extreme – and CINEMA 4D helped create the effects for this film in truly Austrian satirical manner. It’s a fact that a word with a perfectly respectable meaning in one language can be of quite objectionable nature in another. The Austrian author Kurt Palm took advantage of the fact that an Austrian town carries the name Fucking. For his story he elevated its status to that of a spa town, renamed it ‘Bad’ Fucking (Bad is the German prefix for spa) and relocated it to the Alps. He then wrote an outrageous whodunit that satirically pokes fun at the Austrian character. The result was a brutally mordant Alpine satire!

The book went on to become a great success and it didn’t take long until the decision was made to turn it into a movie. The seasoned director Harald Sicheritz, known for his work in both film and television, was selected to make the movie. Since the story included several fairly spectacular scenes, which could never have been realized using live footage, the Vienna-based computer graphics and animation specialists Cybertime were brought on board.

The road that leads to the tourist town of Bad Fucking winds along the edge of a narrow valley. At the very beginning of the movie, a huge avalanche blocks the road and makes it impassable. The avalanche, made up of boulders, mud and debris, crashes through the valley and flattens a car on its way. The team at Cybertime created this scene with the help of CINEMA 4D. “It was particularly important to be able to control the avalanche’s path,” explains VFX artist, supervisor and producer Günther Nikodim. “The Dynamics feature’s follow position settings were extremely helpful. They made it possible for us to use simple keyframe animation to define the avalanche’s path relatively precisely; the motion of individual elements was controlled using dynamics.”

The simulation was then baked and converted to keyframes. The keyframes were then reduced, and Alembic was used to export them to RealFlow 2013, where the large boulders were used as collider objects for the fluid simulation. The meshes created in RealFlow, which contained up to 25 million polygons, were then rendered in CINEMA 4D together with the boulders and all the debris in the avalanche using the Physical Renderer. Final compositing was done in Nuke.
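The "bake, then reduce the keyframes" step can be sketched as follows: once a simulation is baked to one key per frame, any key that linear interpolation between its neighbours would reproduce (within a tolerance) can be pruned. This greedy one-pass Python version is an illustration of the principle, not Cinema 4D's actual reduction algorithm.

```python
# Prune redundant keys from a baked animation curve: a key survives only if
# removing it would change the linearly interpolated value at its frame by
# more than `tolerance`.

def reduce_keys(keys, tolerance=0.01):
    """keys: list of (frame, value) pairs in frame order."""
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (f0, v0), (f1, v1), (f2, v2) = kept[-1], keys[i], keys[i + 1]
        # value linear interpolation would give at frame f1 without this key
        t = (f1 - f0) / (f2 - f0)
        predicted = v0 + t * (v2 - v0)
        if abs(predicted - v1) > tolerance:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept
```

On a perfectly linear stretch of motion this keeps only the endpoints, while sharp direction changes (an impact, a bounce) retain their keys – which is what makes the baked curve light enough to hand-tweak.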

The rest of the movie is no less spectacular: At one point, a police officer is swept away by a tidal wave several meters high that’s teeming with hundreds of eels. The eels were created entirely in CINEMA 4D using the integrated Sculpt feature in addition to other modeling tools. A simple rig was then created using Deformer objects and MoGraph was used to duplicate a total of about 3,000 eels. The water was simulated in RealFlow via Hybrido2 and was imported into CINEMA 4D as a baked mesh. “We first experimented with transparencies and absorption for the water’s shading, which unfortunately resulted in extremely long render times,” stated Günther Nikodim. “In the end we omitted transparency entirely and instead used Subsurface Scattering with a long path length. To our surprise we were able to use this method to quickly create the muddy water we needed and in the desired quality.” In addition, reflections were rendered separately as Multi-Passes and then combined with the SSS shading in Nuke. RealFlow was used to create splash and foam particles, which were then rendered in CINEMA 4D.

In the end, the tidal wave leads to the closing scene in which the entire flooded valley of Bad Fucking can be seen. This live footage was edited extensively in post-production: mountains were removed and the slightly overcast sky was replaced by dark storm clouds. Color correction was used to make the scene even more sinister and foreboding. Most of this editing was done in Nuke and Photoshop. The edited material was then projected onto rough geometry in CINEMA 4D using the Camera Projection feature so that careful camera movements could be created to enhance the scene’s three-dimensionality even more.

A CG water surface was then created in CINEMA 4D that consisted of a simple plane with a bump map and a highly reflective material. The unique challenge presented by this shot was the seamless integration of live footage of a swimmer in a lake into the CG water. The waves created by the swimmer in the live footage continued outside of the picture and therefore had to be supplemented digitally. CINEMA 4D’s Formula deformer proved to be a fast and easy to control solution.
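A Formula deformer evaluates an expression per vertex, and the supplemental ripples amount to a radial wave damped with distance from the swimmer. The sketch below shows such a displacement function in Python; the constants and the exact formula are illustrative assumptions, not the expression used in the film.

```python
import math

# Radial ripple displacement: a sine wave travelling outward from the origin
# (the swimmer), damped exponentially with distance so the waves fade out.

def ripple_height(x, y, t, amplitude=0.05, wavelength=0.6, speed=1.2, falloff=2.0):
    """Height offset of the water plane at position (x, y) and time t."""
    r = math.hypot(x, y)                                  # distance to source
    phase = 2.0 * math.pi * (r / wavelength - speed * t)  # outward travel
    return amplitude * math.sin(phase) * math.exp(-r / falloff)
```

Because the function is analytic, the digital waves can be made to continue seamlessly past the edge of the live plate at any frame – exactly the property that made a deformer faster to control than a simulation here.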

Cybertime’s team of three artists completed the main scenes and innumerable smaller corrections and additions within only five months. A total of about seven minutes of CGI scenes were created and rendered, which made it possible to add the desired exaggerated satirical components of the story line – which are surely a matter of taste. After all, satire is designed to provoke a debate about its content. There’s no question about the quality of the CGI effects, on the other hand – they’re excellent!

Movie website: http://badfucking.at/#

Ben 10: Adventures in the Third Dimension (Mon, 03 Mar 2014)

Vincent London transports the 2D cartoon into a futuristic 3D world. Ben Tennyson is a ten-year-old boy whose wrist-worn Omnitrix device enables him to change into alien creatures and fend off attacks from extra-terrestrial forces.

The Ben 10 animated series first aired in late 2006 and ran for four seasons until 2008. This was followed by three seasons of Ben 10 Alien Force and two seasons of Ben 10 Ultimate Alien. The fourth instalment, Ben 10 Omniverse, has been running since 2012 and is now entering its fifth season, in which Ben comes face to face with Albedo, an evil clone created by the alien Galvan race.

To promote the new episodes, Turner Broadcasting and Cartoon Network turned to Soho-based studio Vincent London, which produced three spectacular slots for the campaign. "The project was intended to showcase Ben and his alter-ego within a dual cityscape environment," explains Creative Director, John Hill. "We wanted to create a flexible futuristic world to host the epic fight scenes between Ben 10 and his new nemesis."

The work is a slick combination of 3D environments created with CINEMA 4D and 2D characters, which were drawn and animated in-house using Adobe Flash. The design was one of the key challenges, states Hill. "The 3D world needed to complement the iconic Ben 10 artwork and style and allow us to flexibly composite 2D character animation. The buildings and cityscapes had to reflect the good and the bad renditions of Ben 10 and also create awesome spaces for the fight scenes." Accordingly, the green and red outfits of Ben and his clone were used to color-code the environment.

With all the dynamic camera moves and shifts in perspective, the melding of 2D and 3D required thorough planning to ensure that all the elements matched up. "We worked up fairly detailed hand-drawn animatics," says Hill, "followed by more considered pre-vis animatics for each shot. This enabled us to sync up the interactions between Ben and the 3D scenes as well as plan overall shot composition and timings."

"CINEMA 4D was easy and quick to use when realising our hand-drawn cityscape designs," he continues. "It bought us time when we wanted to experiment with various sets and environments for the action."

A few of the 2D character shots were combined with the backdrops in post-production but Hill suggests that for the most part it was all done in the 3D realm. "Although some shots were composited in Adobe After Effects, we also rendered them out with the 3D scenes to get additional GI and shadow passes. The more you can do in 3D, the better the composite and final result, we find."

This workflow makes sense when you realize that the characters often needed to produce subtle reflections on the floor and cast shadows (like when Ben is running through the explosions). To achieve this, the team applied alpha-mapped animation to a flat plane positioned within the scene and animated accordingly. They then rendered Reflection, Global Illumination, Material, Diffuse and Shadow passes for the final composite.
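Per pixel and in float, recombining such passes amounts to simple arithmetic: diffuse light is tinted by the material color, the shadow pass modulates it, GI adds indirect light and reflection is added on top. The weighting in this Python sketch is a simplified illustration of the principle, not the exact formula the team used in their composite.

```python
# Recombine render passes for a single pixel/channel (all values 0..1):
# shadow darkens the directly lit surface, GI adds bounce light, and the
# reflection pass is purely additive.

def combine_passes(diffuse, material, gi, shadow, reflection):
    lit = diffuse * material * shadow  # direct light, attenuated by shadow
    lit += gi * material               # indirect (GI) contribution
    return min(lit + reflection, 1.0)  # add reflection; clamp for display
```

Keeping the passes separate until this last step is what lets a compositor rebalance shadows or reflections in After Effects without re-rendering the 3D scene.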

Stylized explosions were handled similarly: "After animating them in Flash, we added them to the 3D scenes to help add GI lighting and shadows," says Hill. "We mixed 3D explosions with the hand-drawn 2D explosions to merge the styles."

An abstract cityscape of monolithic buildings makes an ominous backdrop to the action. "We sketched out quite a few cityscapes and building designs before settling on the self-illuminating futuristic style," Hill says, "as this worked best for atmosphere and premise. We used mainly geometry for the neon-style wireframes, as the render is always cleaner than using textures."

He adds, "The simple cityscape complimented Ben 10's graphic style well; we had to be careful not to add too much detail as this moved the design too far away from Ben's 2D artwork."

The angular buildings are lit by area lights to add subtle glows, with occasional flashes to make the city feel alive. However, the main light sources come from a network of flying cubes, which also give the city a sense of scale. These were simple boxes containing lights, combined with self-illuminating textures to add detail.

Lighting proved the hardest part of the project, admits Hill. "Atmospheric lighting with flicker-free GI is always time-consuming," he says, adding that the challenge was to "create subtle mid-tone detail in a night-time dystopian environment so as to not end up with a flat graphic look."

Naturally, CINEMA 4D's MoGraph tools were used for the floating cubes, with a little manual animation and keyframing to get the right results. "MoGraph is probably my favorite tool," declares Hill. "It's so flexible, and so much complex animation can be achieved with very simple scenes."

In one sequence, evil Ben's influence spreads through the city, creating giant cracks in the buildings and causing explosions. The team employed CINEMA 4D's integrated Dynamics to create the tumbling debris, then hand-animated ripples and deformations in the floors and walls for extra detail around the impact areas.

The Tron-style environment is full of bright objects and details that were required to glow, and it's usually best to add these in post-production for more control. "We used the material illumination pass as an EXR file sequence, and 32-bit linear compositing," explains Hill. "The After Effects glow effect does a pretty good job with high color depth. Blurred overlays in After Effects can also prove helpful for glows when treated well."
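The glow-in-post approach Hill describes can be sketched like this: in 32-bit linear light, values in the illumination pass above a threshold are blurred and added back over the beauty render. This one-dimensional Python version with a box blur illustrates the principle only; After Effects' glow effect is considerably more elaborate.

```python
# Linear-light glow sketch: threshold the illumination pass, blur the bright
# remainder so it bleeds into neighbouring pixels, and add it back over the
# beauty render. Shown on a 1-D row of pixels for clarity.

def add_glow(beauty, illumination, threshold=1.0, radius=2, strength=0.5):
    """beauty, illumination: equal-length lists of linear float pixel values."""
    # keep only the part of the illumination pass above the threshold
    bright = [max(v - threshold, 0.0) for v in illumination]
    # box-blur the bright areas
    n = len(bright)
    blurred = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        blurred.append(sum(bright[lo:hi]) / (hi - lo))
    # add the blurred highlights back in linear space (no clamp: HDR)
    return [b + strength * g for b, g in zip(beauty, blurred)]
```

The high color depth matters: with 8-bit values the threshold would clip everything to the same white, whereas EXR floats preserve how much brighter than white a light source is, which the glow then spreads proportionally.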

To sweeten the shots, lens flare effects were added to bright light sources and explosions. Again, these were added in post using After Effects plug-ins, although some were actual lens flares that were photographed by Vincent's team, or hand-made textures.

Depth of field is used to great effect, too, adding drama and helping to tie the 3D and 2D elements together. "Most depth of field was applied in After Effects but some shots were rendered using Physical Renderer's decent Depth Of Field function," says Hill.

The three promo slots for Ben 10 Omniverse were completed over the course of six to seven weeks. With various projects on the go at the same time, team numbers varied, but usually consisted of six to eight people: two to three Flash animators, two CINEMA 4D artists and two working on compositing.

The team were able to call on four 12-core Mac Pros and a handful of i7 iMacs as workstations and an overnight render farm. "We could usually render two to four shots in one evening depending on complexity and render engine," comments Hill, but says he has no idea about total render time: "We were rendering every evening for six weeks, but probably reworked each shot a few times, so I'm not sure how long for the final shot renders."

Certainly the end result was worth all the effort. The flawless combination of 2D and 3D elements produced three thrilling sequences with a unique aesthetic. Creative Director Hill is understandably satisfied: "The design process, from a hand-drawn cityscape design and a fight scene moment, to a full 3D animated scene, was really gratifying."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Thu, 20 Feb 2014 11:39:00 +0100

Layer Media Uses CINEMA 4D for Promo Parody

It's hard to imagine what a humorous take on Ubisoft's new third-person shooter game, Tom Clancy's Ghost Recon: Future Soldier, might look like.

A Student of CINEMA 4D

Because he considers himself to be a "student of CINEMA 4D" with much more to learn, Ieyoub wasn't planning on using the software to create the visual effects for the video. He changed his mind, though, when a few things went wrong on shoot day – most notably the scene in which the helicopter flies over the shoulder of the Future Intern. The plan was to use his smart phone to fly a Parrot AR.Drone quadricopter into the shot and over the actor's shoulder.
But even though Ieyoub had practiced quite a bit in his living room, on shoot day the drone took off and flew straight up into the ceiling before crashing to the floor in pieces. "I was like, 'Oh my God! I'm going to have to somehow do this effect with CINEMA 4D now,'" he recalls, laughing. After finding a free model of a Parrot AR.Drone online, he "Frankensteined together" the Parrot AR.Drone model and a toy mechanical claw model to create the drone UAV (unmanned aerial vehicle).
CINEMA 4D also came in handy when Ieyoub needed to create some effects to make the scene in which the UPS boxes stack up on people's desks look more comical and dramatic. After first trying to make the scene work by having a production assistant throw boxes onto the desks from off camera, he decided to use CINEMA 4D to model the boxes. "And then I used proxy geometry and physics to shoot boxes out of an emitter and land in a stack in a cool, funny way," he explains.
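The emitter-plus-physics setup Ieyoub describes can be caricatured in a few lines. This is a toy 1D stand-in, not C4D's rigid body solver or anything from his actual scene: each emitted box falls under gravity until it hits the top of the stack, then settles.

```python
def drop_boxes(n_boxes, box_height=1.0, emit_height=10.0,
               gravity=9.8, dt=0.01):
    """Each emitted box falls from the emitter and comes to rest on top
    of the stack below it. Returns the resting height (bottom face) of
    each box, lowest first."""
    resting = []
    for _ in range(n_boxes):
        y, v = emit_height, 0.0
        floor = len(resting) * box_height   # top of the current stack
        while y > floor:                    # simple Euler integration
            v -= gravity * dt
            y += v * dt
        resting.append(floor)               # settle cleanly, no bounce
    return resting
```

A real rigid body sim adds collision shapes, restitution and friction on top of exactly this loop, which is why "shoot boxes out of an emitter and let them land" requires so few keyframes.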
Ieyoub attributes much of his success using CINEMA 4D for this project to the many helpful tutorials he was able to find on YouTube. And, as an experienced After Effects user, he was grateful for the smooth integration between the two software packages, which allowed him to share a camera and jump back and forth quickly and easily. "I'm sure there are smarter ways I could have done things," he says. "But at the end of the day, I thought it turned out really well and the people at Ubisoft were really happy with it, too."
Read the full-length feature on this project at Renderosity:

http://www.renderosity.com/ghost-team-cms-16798

Layer Media website: http://layer-media.com

Thu, 13 Feb 2014 09:58:00 +0100

Wonder Woman

Rainfall Films' new short film puts a modern spin on a beloved comic book heroine.

By Meleah Maynard
In his introduction to Rainfall Films’ new Wonder Woman short on YouTube, Director Sam Balcomb says that he thinks “quite a number” of viewers will agree that “Wonder Woman is a character just as vital and crucial to our understanding of humanity as any other superhero…if not more so.” Judging by the fact that the film got over 4 million views in the first four days, making it the top featured video on YouTube’s home page and triggering an avalanche of exuberant media coverage, he’s probably onto something.
Wonder Woman is by far the most popular short film that Los Angeles, Calif.-based Rainfall Films has produced internally since the launch of the company in 2008. Relying on CINEMA 4D for all of the environments and After Effects for compositing, they worked on the film for most of 2013, fitting it in between paying jobs. Rileah Vanderbilt (Frozen, Team Unicorn) stars, battling bad guys in a modern-day city, as well as minotaurs in Themyscira, Wonder Woman’s island homeland – also known as Paradise Island.
Why Wonder Woman
Rainfall tries to do some kind of short film on their own every year, says Balcomb, the production/post company’s director, writer and producer. But they’re also commissioned for projects such as their first short, a trailer for The Legend of Zelda, which was created for IGN.
“We’re all huge gamers and comic book fans, and our original vision for the company was to be a production studio generating content online or for videos or feature films, so these projects help show that we love to tell stories and we can create our own stuff,” Balcomb explains (watch their new show reel here).
Because they didn’t have time to work on a project of their own in 2012, Rainfall decided to go all out and do something “really cool and fun that would push us as far as we could possibly go,” Balcomb recalls. The hope was to make a short that would be good enough for their show reel. Balcomb proposed that they do something based on Wonder Woman, and the team eagerly agreed.
"My wife is a huge Wonder Woman fan, so we have comic books all over the house and she has an encyclopedic knowledge of the character," he explains. "We were talking and I realized that Wonder Woman's background is infused with a lot of Greek mythology, and that got me thinking about how we haven't seen any of that in live action before." Storyboarding began immediately, with CINEMA 4D used to create the 3D animatic; everyone wanted the story to be told through action in a way that would be interesting to long-time fans.
Creating Themyscira
Knowing that they were working with a small team and a limited budget, Rainfall Films opted to shoot much of the footage of Wonder Woman (Diana Prince when she’s in her world) and other Amazonians fighting an advancing army of minotaurs on greenscreen. Prior to the shoot, Balcomb and the other artists modeled the mountains and buildings on the island of Themyscira from scratch in CINEMA 4D (watch a VFX breakdown here).
Modeling the buildings was the most fun and most challenging part of the project, Balcomb says, because the background models were highly detailed and took a long time to render using global illumination. Modeling those elements first allowed Rainfall to use CINEMA 4D’s camera to find the most interesting angles ahead of time. Some shots were rendered before the shoot so they could be used as a reference for lighting and placement of the actors.
An Anonymous City In Flames
The next big hurdle was figuring out how to shoot the city scenes. It would have cost too much money to shut down a whole city block to shoot an action scene. So they opted to go the green screen route again – this time on a sound stage in Burbank. “We watched the edit and it was nothing but green, but when we showed people the rough cut with music, people really responded well to that and wanted to see it over and over again,” Balcomb recalls. “That was great, but we couldn’t help thinking, ‘Oh God, we hope it’s as good once the visuals are in place.’”
To create a believable city, Balcomb headed into downtown Los Angeles at 5 a.m. one Sunday, set up his camera and tripod on a street with the "least LA look to it," and shot hundreds and hundreds of stills. After altering some of the buildings and adding others in CINEMA 4D, he built rough geometry for the city block and camera-mapped his stills onto it, giving real parallax whenever the camera moved.
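Why camera mapping onto rough geometry beats a flat backdrop comes down to simple projection math. As a hedged illustration (a bare pinhole model, nothing from Rainfall's actual pipeline): when the camera translates, points on near geometry shift across the frame much more than points on far geometry, and that differential shift is the parallax the audience reads as depth.

```python
def project(point, cam_x, focal=35.0):
    """Pinhole projection: screen-x of a 3D point (x, y, z) seen by a
    camera at (cam_x, 0, 0) looking down +z."""
    x, y, z = point
    return focal * (x - cam_x) / z

# Camera-mapping stills onto rough geometry buys you real parallax:
# translate the camera one unit and the near facade sweeps across the
# frame far more than the distant skyline does.
near_shift = project((0, 0, 10), 0) - project((0, 0, 10), 1)   # near facade
far_shift = project((0, 0, 100), 0) - project((0, 0, 100), 1)  # far skyline
```

A photo mapped onto a single flat card would shift uniformly, instantly reading as a backdrop; even crude per-building geometry restores the depth cue.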
Balcomb, along with digital artists Jason Schaefer and Nick Viola, composited the film in After Effects and the job took much longer than expected because many of the shots included tricky elements like a lot of hair and motion blur. “We had to fine-tune each shot because each one of them was so completely different – we couldn’t just copy and paste,” Balcomb recalls. Wonder Woman’s costume, which was designed by Heather Greene (Tron: Legacy), was also problematic in post because it was made of so much metal and it was hard to get the grain out and paint out wires.
Wonder Woman’s Revival
Balcomb says everyone at Rainfall has been overwhelmed by the positive response to the film and the exposure it received in Hollywood. Shortly after Rainfall’s video was posted, Warner Bros. released a statement saying how important it is to give Wonder Woman the big-screen respect she deserves and that fans are hoping for. And not long after that, they announced that Wonder Woman would be in the upcoming Batman movie.

For his part, Balcomb just hopes that anyone who brings Wonder Woman to the big screen pays attention to what people love about her character and they don’t dumb her down. “Our company just had the best year yet, thanks to this project,” he says. “Anytime we get to work on something awesome like Wonder Woman, it makes me so thankful to have this job. I'm really proud of what we've done so far, but it's safe to say the best is yet to come.”
Meleah Maynard is a freelance writer and editor in Minneapolis, Minnesota.

Credits:
Starring: Rileah Vanderbilt • Clare Grant • Alicia Marie • America Young • Kimi Hughes • Christy Hauptman
Director of Photography: Andrew Finch
Key Grip: Duy Nguyen
1st AC: Nick Roney
Gaffer: Ryan Walton
Stunt Coordinator: Surawit Sae Kang
Stunt Team: Surawit Sae Kang • Joe Perez • Billy Bussey • Jason Brillantes • Kerry Wong
Costume Designer: Heather Greene
Costumer: Sarah Skinner
Consultant: GoldenLasso.net
Hair & Makeup: Anissa Salazar
Assistant: Yulitzin Alvarez
Music & Sound Design: Jeff Dodson
Vocalist: Raya Yarbrough
Sound Mastered at: Runsilent
Editor: Jesse Soff
Visual Effects Supervisor: Sam Balcomb
Compositors: Jason Schaefer • Nick Viola
Producer: Jesse Soff
Directed by: Sam Balcomb

Thu, 21 Nov 2013 11:18:00 +0100

Speechless – but not Because the Cat's got Your Tongue

Creating a horde of characters and animating them is generally a job for a large studio – but the talented team at Job, Joris & Marieke studios did it themselves using MAXON CINEMA 4D.

Creativity can work in mysterious ways, which can also be said about the short film 'Mute', based on a cut that Joris suffered while swimming: the cut looked like a mouth, so he started joking around and made his toe "talk". Job and Marieke thought it was disgusting – but it gave them a great idea for a short film!

Job, Joris and Marieke are actually designers, but all three were yearning to do animation. After the basic storyboard had been finished, they turned to CINEMA 4D and started character development. They needed numerous characters, each of whom would create their own mouth. In fact, the entire cast is made up of only three different characters whose look was varied using textures. "To find the right wardrobe for our characters, we researched people named Dieter, Klaus, Marcy and Cindy online. Surprisingly enough, people with these names had the exact look we wanted for our characters," states a smiling Marieke.

Everything from a chainsaw to a kitchen knife, a hand mixer and a record was used to perform the operations. "Since we only had three base models for all characters, modifying the geometry to create each of these effects was not possible. We used sub-polygon displacement maps instead. This was perfect for creating individual mouths for each character. Even the guy who uses a chainsaw to create his mouth turned out just as we planned," said Joris.
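The displacement-map trick Joris describes boils down to a simple operation. As a hedged sketch (a rough stand-in for sub-polygon displacement, which actually subdivides the mesh at render time; here we only move the vertices we already have): each vertex is pushed along its normal by the matching sample from a grayscale map, so three shared base models can grow completely different mouths.

```python
def displace(vertices, normals, height_map, strength=1.0):
    """Push each vertex along its normal by the matching height-map
    sample. One shared base mesh plus a per-character map yields a
    per-character shape without touching the geometry itself."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), h in zip(vertices, normals, height_map):
        d = strength * h                      # displacement distance
        out.append((vx + nx * d, vy + ny * d, vz + nz * d))
    return out
```

Because the map, not the mesh, carries the detail, swapping a texture is all it takes to turn the same character model into the chainsaw guy or the hand-mixer guy.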

'Mute' looks very elaborate – and indeed it was not trivial, considering that the team had never before created such a complex animation: four minutes of "cutting edge" animation with a heavy dose of dark humor. "Thanks to CINEMA 4D's intuitive interface, we were able to quickly learn what we needed to know to animate our characters. It only took us about three months to finish the film," remembers Job.

Since its debut, 'Mute' has been nominated for the 'Golden Calf' award at the Dutch Film Festival and was awarded top honors in the Dutch Playground Festival's 'Best Independent' category. Surely this is only the tip of the iceberg, since 'Mute' will be shown at numerous other festivals in the coming months. We wish Job, Joris & Marieke the best of luck!
"Making of" video: www.vimeo.com/78911041

Thu, 29 Aug 2013 13:34:00 +0200

Pyrrhosoma

A dark burrow, a toad, a luminous dragonfly. These are the ingredients for a new film project by Alex Lehnert and Josh Johnson, created with MAXON CINEMA 4D R15!

Josh Johnson and Alex Lehnert are freelance 3D artists who both have an affinity for CINEMA 4D and use the program on a daily basis. In fact, they have never met personally but worked together very closely while creating David Lewandowski's film Tiny Tortures. While working on that project, Josh and Alex decided to create a short film of their own, which ended up being the cornerstone for Pyrrhosoma.

Although they had never met in person (before or since), online communication made a successful cooperation easy. Based on an idea they developed together and a script written by Josh, the artists set out to create their gloomy fairytale, in which a toad in a burrow and a dragonfly that occasionally flies by play the main roles. In addition to being a promotional film for both artists, it was also meant to showcase the new features in CINEMA 4D R15.

The initial wave of enthusiasm was followed by a small shock: all of the assets the script demanded had to be created from scratch – the toad, the burrow, stones, mushrooms, spider webs and a lot of vegetation. On top of all the modeling, an eye had to be kept on render times, since only three weeks were planned for production.

An essential part of the production was sculpting, which was used to create the wide range of organic shapes. Because Josh and Alex were cooperating via the internet, hurdles with texture paths and materials kept coming up, but these were quickly cleared using the new Texture Manager. The new Bevel tool also saved the day more than once, making it possible to quickly create optimized geometry at critical spots on the models. Because neither artist really sees himself as an animator, the animations for the toad were created using the new PoseMorph options.

The Physical Sky object was used in conjunction with the Physical Renderer to achieve the right mood. They first rendered to the Picture Viewer where they made liberal use of the filter options. This made it possible to quickly see if the lighting was as they wanted and if the balance of light and shadow was correct. Alex commented, “This is one of CINEMA 4D’s finer features: you can create images with a dynamic color spectrum without having to worry much about the technical aspects. Josh and I both have experience working with live film and we both know how it is when you’re constantly thinking about which lens should be used for the next shot in order to achieve the desired effect!”

About working with CINEMA 4D, Josh adds: “The program is not only completely intuitive to use, but is also a complete package in itself. Regardless of whether you’re modeling, painting textures, sculpting or animating, CINEMA 4D offers all the features you need in a single application. For example, Alex had never sculpted before. When it was time to fine-tune the various assets, he began sculpting for the first time and was able to achieve great results.”

So far, only the Pyrrhosoma teaser has been finished, which was rendered in time for this year’s MAXON user meeting using the new Team Render. The entire short film will be ready by the end of this year.