OnePlus Reaches for the Stars
Fri, 22 Feb 2019 12:51:00 +0100
https://maxon.net/en-us/news/case-studies/advertising-design/article/oneplus-reaches-for-the-stars/

Cinema 4D blurs the lines between reality and sci-fi in OnePlus’ 5T spot.

Never settle. That has been the motto of smartphone manufacturer OnePlus since its founding in 2013, with a mission to continuously improve its products according to customers’ wishes. Just how seriously the company takes that motto was proven with the premiere of the OnePlus 5T, which won rave reviews from the press for its nearly edgeless display, modern design and outstanding performance, thanks to its very powerful processor.

Wanting to promote their flagship product with a visually impressive spot, OnePlus turned to Shanghai-based motion designer and director Somei, in whom they had full confidence, having worked with him on a previous project. OnePlus’ direction for Somei was straightforward: include the product slogan, use a door as a key visual and create roller coaster-like camera movement.


Inspired by the product’s slogan “A new view,” Somei’s team personified the smartphone using a robot. After waking up in a gloomy factory hall, the robot sprints out into the open and leaps from a great height to freedom. OnePlus’ positive response to the idea surprised Somei: “I’ll never forget the moment our client approved our unusual concept with the running robot.”

A few months before starting work, Somei met Taiwanese concept designer Mark Chang at a conference and really wanted to work together with him. Since Somei’s team didn’t have any experience with characters, Chang’s skills were a perfect fit and he took over design and modeling of the robot. Having an extensive network of artists was a real advantage for this project, Somei says. “In China, the number of motion graphics artists is small, and I’m in touch with most team members online.”


After OnePlus approved the concept, Somei created an animatic defining each shot and camera perspective. Next, motion designer Zaoeyo, who served as art director, began creating the design concepts for the factory hall, the robot’s escape route and the city into which the robot leaps to freedom. To speed up the process, Zaoeyo used pre-built 3D models and arranged them in the scene. Cinema 4D’s MoGraph Cloner and Instance objects were very helpful for quickly and easily filling the scenes with a large number of models.
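For readers unfamiliar with the Cloner, the core idea, one source model instanced across many transforms, can be sketched outside Cinema 4D in plain Python. This is a hypothetical illustration of the concept, not the MoGraph implementation:

```python
from itertools import product

def grid_clone_positions(count=(5, 1, 5), spacing=(200.0, 0.0, 200.0)):
    """Mimic a MoGraph Cloner in grid mode: return one position per clone,
    centered on the origin, so a single source model can be instanced many
    times instead of duplicated in memory."""
    cx, cy, cz = count
    sx, sy, sz = spacing
    positions = []
    for ix, iy, iz in product(range(cx), range(cy), range(cz)):
        positions.append((
            (ix - (cx - 1) / 2.0) * sx,   # offset each axis so the grid
            (iy - (cy - 1) / 2.0) * sy,   # is symmetric around the origin
            (iz - (cz - 1) / 2.0) * sz,
        ))
    return positions

# A 5x1x5 grid yields 25 clone positions that all share one mesh.
clones = grid_clone_positions()
print(len(clones))   # 25
print(clones[12])    # the middle clone sits at the origin: (0.0, 0.0, 0.0)
```

Because every clone references the same geometry, filling a factory hall with hundreds of props costs little more than storing their transforms, which is why Cloner and Instance objects speed up set dressing so dramatically.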


Somei hired his friend Peter Zhang to animate the robot. Because Zhang didn’t have a copy of Cinema 4D, the team needed to find a way to transfer his character animation to their model of the robot. After a few tests, they opted to import Zhang’s model in T-pose into Cinema 4D as an FBX. That way, the artists were able to begin texturing right away while Zhang continued to work on the animations for the individual scenes before exporting them as FBX. Since the setup of the textured model still matched Zhang’s, the artists were able to use the Retarget tag to easily transfer his animation to their model.

Somei used the Motion System in Cinema 4D to adapt the timing of the robot animation. Since the animations were baked, a keyframe sat on every single frame after the import into Cinema 4D. Somei simplified the animation curves using the Key Reducer and then adjusted them against a snapshot of the original curve, which made it possible for him to edit the animation far more comfortably in the F-Curve Editor. Octane was used for the lighting and the final rendering.
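The reduce-then-compare pass described here can be approximated with the classic Ramer–Douglas–Peucker simplification: keep a key only if the curve would deviate too far from the straight line between its neighbors without it. This sketch illustrates the principle behind a key reducer, not Cinema 4D’s actual algorithm:

```python
def reduce_keys(keys, tolerance=0.1):
    """Simplify a baked animation curve (a list of (frame, value) keys, one
    per frame) by recursively dropping keys that lie within `tolerance`
    of the chord between the span's endpoints (RDP simplification)."""
    if len(keys) < 3:
        return list(keys)
    (t0, v0), (t1, v1) = keys[0], keys[-1]

    def deviation(key):
        t, v = key
        chord = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return abs(v - chord)

    worst = max(keys[1:-1], key=deviation)
    if deviation(worst) <= tolerance:
        return [keys[0], keys[-1]]          # whole span fits the chord
    i = keys.index(worst)
    left = reduce_keys(keys[: i + 1], tolerance)
    right = reduce_keys(keys[i:], tolerance)
    return left[:-1] + right                # drop the duplicated split key

# A baked linear ramp (25 keys) collapses to just its two endpoints.
ramp = [(f, 2.0 * f) for f in range(25)]
print(reduce_keys(ramp))   # [(0, 0.0), (24, 48.0)]
```

After a pass like this, only the keys that actually shape the motion remain, which is what makes hand-editing a formerly baked curve practical.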

It was an ambitious project to complete in just a few weeks and Somei credits Cinema 4D for helping his team deliver on time: “The great stability of Cinema 4D, and its excellent connectivity with other software packages, makes it easy for small teams like ours to meet tight deadlines.”

Game Time for Xbox Game Pass
Mon, 21 Jan 2019 12:02:34 +0100
https://maxon.net/en-us/news/case-studies/advertising-design/article/game-time-for-xbox-game-pass/

Blind explains the 3D adventure they created for Xbox Game Pass’ E3 show.

In 2018, for the fourth year in a row, Microsoft and the Ayzenberg Group tapped Los Angeles-based design and brand strategy studio Blind to create content for their high-profile E3 (Electronic Entertainment Expo) show. While last year’s project was devoted to the launch of the Xbox One X, this year’s goal was to highlight Microsoft’s monthly subscription service, Xbox Game Pass.


Blind’s challenge was two-fold: Create something fun and engaging that amps up excitement about Xbox’s subscription service while also announcing current and upcoming games in some kind of creative way. Collaborating closely with the Ayzenberg Group, Blind’s team, led by Creative Director Matthew Encina, used MAXON’s Cinema 4D together with After Effects, Octane and Redshift to create a narrative that builds on Xbox Game Pass’s current drive-thru-themed marketing campaign.

The result brings the drive-thru to life, taking viewers on an animated, 3D adventure past glowing neon signs and game art featuring some of Xbox Game Pass’ current offerings, as well as action-packed dioramas showing off games coming soon to the catalog. I asked Matthew Encina and John Robson, who served as associate creative directors on the project, to explain how they dreamed up and executed the captivating project, which took intense coordination among many different teams.

Matt, can you talk about your process for dreaming up the kinds of things you do at Blind?

Encina: Sure, my process is definitely something I’ve developed over time. I think that anytime you’re designing you are problem solving. And it’s not an art: It’s a science. I mean, there’s artistry in it, of course, but design is an objective process and you have to drill down to solve key problems. Blind’s Executive Creative Director, Chris Do, has taught me that you need to make sure you understand things like: What is the goal? What are the challenges? Who is the audience? Once we understand those things, we know that whatever we make will be useful for the client. Then we move on to define the creative parameters.

From the very first creative kickoff meeting with the client, I try to box in the creative. I find creativity in limitation, so I try to determine a very small box defining what the creative can and cannot be. I don’t like it when anything is possible because then everything is possible, and that is too daunting. Narrowing down allows me to focus 100 percent of my attention on one idea, rather than 1 percent on 100 ideas. (Watch Encina’s video on creating his presentation for MAXON’s 2018 SIGGRAPH booth.)


Once you decided on a narrative for this, where did you go from there?

Encina: To visualize the aesthetic, I first designed frames in Cinema 4D and Octane so I could get an idea of the overall look and also some of the key moments. After that I worked with a storyboard artist to create the sequence, and then I teamed up with John Robson to do the animatic. I’ve been working with John since 2006, and the good thing about him is that he has more of a film background, and I have more of a design background and we really push each other throughout the process.

If I can dream up a story, John can come up with ways to do it using C4D and third-party plug-ins. He’s always the guy who pushes me to try something new. I can be pretty simple in terms of the tools I use but John is always experimenting on the fringe and pushes us to adopt new tools and techniques. From there, I just scale up and build the team and hardware according to the specific needs of the project.

John, talk about your role in this project and what you tried that was new.

Robson: There’s definitely been an evolution of how Matt and I work together. We push each other, and with this project, the client knows our potential and wanted to push the bar further too. Matt did some really beautiful style frames and we worked together to come up with ways to expand the drive-thru theme into more of a roller coaster ride. That’s how we came up with the idea for the dioramas. Those were fun, beautiful moments that felt like they were trapped in time. I really like the sense of realism and scale, which makes them feel less CG and more tactile.

The new technology we tried for this project was Redshift. Working in 4K, we were able to optimize our scenes and get through everything really fast, and we were able to add so much more detail and volumetric lighting. But another big thing was that we didn’t have pre-rendered elements from the client. We were able to bring in all of the assets from the different game companies and texture and light them to the level we needed on a really tight deadline. Back in the day, it took so many more steps and technical knowledge to create something that was in your head. Some of these shots only took a week from start to finish and they turned out beautifully.


Describe how you brought your team together and how you worked with others.

Encina: Our process is often the same for a lot of what we do. We try to keep the team as small as possible during development. Once the idea and technique are clear, we build the team based on what we need. For this project, it was me and a few designers at the beginning, and then I brought on John to flesh things out. Then, in the last half of the project, we brought on six Cinema 4D artists and a Maya artist to get us to the finish line.

We like to work that way because it allows us to spend our money very wisely. Our team always stays objective, always tries to do right by our clients and we serve the best creative we can cook up.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

Spyscape
Tue, 11 Dec 2018 17:15:45 +0100
https://maxon.net/en-us/news/case-studies/visualization/article/spyscape/

At New York’s new spy museum, the experience begins in the elevator.

New York’s Spyscape is not just a contemporary museum devoted to all things espionage, it’s also a place where visitors can find out what type of spying best suits them and get a little taste of what spies really do. In keeping with the plot of most spy dramas, the adventure begins with a briefing, only in this case the briefing room is the largest passenger elevator in the world.

Knowing that the elevator ride needed to be immersive and theatrical enough to captivate visitors from the start, Spyscape turned to Territory Studio, which is known for designing compelling screen graphics and VFX for Hollywood spy thrillers and military dramas. Using MAXON’s Cinema 4D together with X-Particles, After Effects and Redshift, Territory created a three-minute briefing that plays at a slow 8 feet per minute on three of the elevator’s walls on floor-to-ceiling screens augmented by surround sound.


Territory’s Nick Lyons was the creative lead on the project. Here he explains their process from creating the initial storyboards to designing a convincing spy narrative, including a peek at the wood-paneled corridors of an MI6 outpost tucked into an ordinary office building and a view of surveillance satellites from space.

Were there particular projects that helped Territory get this project?

Nick Lyons: I was creative lead on this project and supported creative director John Sunter and senior producer Alice Ceresole in bringing the client’s concept to life. In addition to providing creative and technical direction, I worked closely with the rest of the team, including Roland Lukacsi, Dorian Thomas and Melanie Keyzor throughout the project. Once we had sign-off on the concept and direction, we were given free rein to build the story, choose the tools we would use and figure out how it would work, including each scene and the transitions between them.

How long have you been at Territory?

Nick Lyons: I’ve been working with Territory now for three years. I currently call London home, but I moved here from LA where I used to work at Blur Studio. Blur is known for their video game cinematics and film work. I feel lucky to have worked with them and had the chance to be a part of the great culture and work they are known for. It allowed me to open up a dialogue with Territory once I moved to London.

Describe the brief that you got at the start of the project.

Nick Lyons: We were asked to produce the final “briefing” video and audio for the lift. The brief was really loose. They liked our pitch to create an immersive experience where visitors would feel as if they were part of a spy film from the minute they put on their Spyscape wristbands. Our job was to design something with a sense of realism that introduced people to the world of spying and explained how we are all spies now, thanks to social media and the kinds of surveillance that go on routinely, like CCTV cameras and cell phone monitoring.

How did creating this for a crowded, moving elevator complicate your process?

Nick Lyons: This project had several unique technical hurdles to overcome during production, as the elevator was being built and while we were designing and animating, so it was important that our concept was flexible. We approached this much differently than a typical project where the audience is looking directly ahead at the screen. As the content is wall high and wraps around the elevator, a member of the audience could be focusing on a different section of the screen at a given moment. We decided to try to focus the audience towards the center of the middle screen but if they happened to be looking around at the two side screens, we added relevant storytelling content to make sure any direction they looked helped move the story forward for them.

That was probably our biggest technical challenge. They sent us a schematic of the elevator and measurements of the screens so we could essentially create three projections: one out front and the others on two sides. The illusion was that the front screen was moving forward or up and down while the side screens were always moving in unison with the depth of the front screen, so they wouldn’t look flat or distorted. To do that, we used Cinema 4D’s CV-VRCam to customize a 360 rig. The forward, left and right cameras were each a separate render. If you looked at the actual video file we sent them, the front center third looks normal but the two side thirds look quite skewed on a flat screen. But when you watch it in the moving lift, the two side screens look correct in comparison to the front.
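The geometry behind a three-camera rig like this can be sketched with basic trigonometry: each wall gets its own camera, yawed toward it, with a field of view wide enough to exactly cover that wall from the camera position. The elevator dimensions below are hypothetical and the model is deliberately simplified (camera centered, flat walls), an illustration rather than Territory’s actual rig:

```python
import math

def wall_fov_deg(wall_width, distance):
    """Horizontal field of view (in degrees) a camera needs to exactly
    cover a flat wall of `wall_width` seen from `distance` away:
    half the wall subtends atan((w/2)/d), so the full FOV is twice that."""
    return math.degrees(2.0 * math.atan((wall_width / 2.0) / distance))

# Hypothetical lift: 3 m wide, camera centered 1.5 m from every wall.
fov = wall_fov_deg(3.0, 1.5)
print(round(fov, 1))   # 90.0 -- each wall subtends a quarter turn

# One render per screen: front, plus side cameras yawed +/- 90 degrees.
cameras = [("front", 0.0, fov), ("left", -90.0, fov), ("right", 90.0, fov)]
```

Rendering each wall from its own correctly-yawed camera is what makes the side footage look skewed on a flat monitor yet perspective-correct once it wraps the physical walls.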

We really had to nail that illusion, so we did a lot of animation tests. We built the lift in Cinema 4D using the specifications they sent us. Next, we funneled our test animations through it to make sure they wrapped around the walls correctly, gave the right feeling of depth and didn’t have issues with visible seams between the renders. Once we finished building the lift, it was time to test our footage to see if the animations would work correctly with the projectors, walls and space. They trusted us, and all our testing paid off, and the installation went off without a hitch.

Marble Madness from 1984 is a classic video game. The aim of the game is to guide a marble through an obstacle course, which is not an easy task when you add the marble’s own inertia. A German team of artists and developers, inspired by the original game, created their own innovative mix of marble labyrinth and endless runner and named it Marbloid.

For the obstacle course, the game selects from a pool of individual level elements to create an exciting live course with curves, jumps, portals and speed pads. The intuitive modeling tools in Cinema 4D were the primary tools used by the development team when designing the course. For example, the Subdivision Surfaces object was used intensively for the construction of the winding curves, and the Spline tools and the new Knife tool introduced in Cinema 4D R18 were used to precisely increase the resolution of the level geometry.


To keep the iteration cycle as short as possible for level design the team decided to take a new path to avoid constant exporting to the Unity3D game engine: “We used the Bullet Physics Engine in Cinema 4D and a plug-in that reads game controller input to control our sphere directly in Cinema 4D. This made it possible for us to quickly design and test new level elements,” explains Andreas Gaschka. “We used XPresso expressions and a bit of Python to reflect the input design of the Unity project. We were even able to implement complex game elements such as our speed pads in Cinema 4D and test our level design live!” “Cinema 4D uses Bullet Physics while Unity uses NVIDIA PhysX. Each offers a slightly different physical result. With some fine-tuning of the various parameters we were able to adapt the Cinema 4D simulation to give Marbloid the same feel as the original game,” adds Johannes Deml.
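Matching the “feel” of two physics engines is ultimately parameter tuning: the same push produces different trajectories until values like friction, restitution or damping are adjusted to agree. A toy illustration of that idea, with hypothetical damping values rather than Supyrb’s actual settings:

```python
def roll_distance(v0, damping, dt=1.0 / 60.0, steps=300):
    """Toy marble roll: integrate position for 5 seconds at 60 fps while
    linear damping bleeds off velocity each step. Two engines with
    different damping defaults send the same marble different distances;
    tuning `damping` until the distances match is the essence of
    harmonizing a Bullet (Cinema 4D) and PhysX (Unity) simulation."""
    x, v = 0.0, v0
    for _ in range(steps):
        v *= (1.0 - damping * dt)   # simple exponential-style velocity decay
        x += v * dt
    return x

# Same initial push, two hypothetical engine defaults, two "feels".
engine_a = roll_distance(10.0, damping=0.10)
engine_b = roll_distance(10.0, damping=0.05)
print(engine_b > engine_a)   # the lower-damped marble rolls farther: True
```

Once a metric like this agrees between both simulations, level elements tested live in Cinema 4D behave the same after export to Unity.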


“As an MSA customer we have access to the Cineversity Smart Export Plugin, which adds helpful functions to the already seamless export from Cinema 4D to Unity. Thanks to the excellent Python integration in Cinema 4D we’ve also developed several scripts with which we were able to prepare the level segments for the exchange,” remembers Andreas Gaschka. “The scripts converted game elements such as spawners, power ups, emojis and decorative objects to Null objects. The naming conventions and an import script in Unity were then used to replace the Null objects with the respective elements.”
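A naming-convention bridge like the one Gaschka describes can be sketched in a few lines: the exporter writes Null objects with structured names, and the importer parses those names to decide which prefab to substitute. The pattern and type names here are hypothetical, not Supyrb’s actual convention:

```python
import re

# Hypothetical convention: "<type>_<index>", e.g. "spawner_03".
KNOWN_TYPES = {"spawner", "powerup", "emoji", "deco"}
NAME_RE = re.compile(r"^(?P<kind>[a-z]+)_(?P<idx>\d+)$")

def classify_null(name):
    """Map an exported Null object's name back to the game element it
    stands for, so an import script on the Unity side can swap in the
    matching prefab. Returns (kind, index), or None if the name falls
    outside the convention and should be imported as plain geometry."""
    m = NAME_RE.match(name)
    if not m or m.group("kind") not in KNOWN_TYPES:
        return None
    return m.group("kind"), int(m.group("idx"))

print(classify_null("spawner_03"))   # ('spawner', 3)
print(classify_null("camera"))       # None -> not a convertible element
```

Keeping all semantics in the names means the FBX itself stays dumb and portable; only the two scripts on either end need to agree on the convention.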

For the game’s design, the team at Supyrb drew inspiration from Vaporwave: Marbloid collages marble, cherry blossoms and 90s technology into a fresh and colorful game world.

“Cinema 4D’s professional tools and its flexibility made it a core element in our development pipeline. Cinema 4D is perfect for quickly finding the right solution and it also offers the flexibility to find solutions to complex problems. Thanks to Cinema 4D we were able to concentrate on the spirit of the game instead of haggling with technology.”

Marbloid is now available at Apple’s App Store and an Android version is planned for spring 2019. The Control4D plug-in, which was used by the team at Supyrb to control the sphere in Cinema 4D, is now available for free at the developer's website!

Welcome to LUMA
Wed, 21 Nov 2018 14:40:41 +0100
https://maxon.net/en-us/news/case-studies/advertising-design/article/welcome-to-luma/

A small town’s projection arts festival draws artists and crowds from around the world.

Binghamton isn’t known for being a major tourist destination. But since 2015 the small town in upstate New York has increasingly attracted crowds, as well as international attention, for the LUMA Projection Arts Festival.

Founded by a street photographer, a film editor and an event planner who had no idea how popular the festival would become, LUMA is currently the only visual arts festival in the U.S. focused primarily on projection mapping. This year’s fourth-annual event will be held September 7 – 9 in downtown Binghamton where the city’s historic architecture makes the perfect canvas for the show.

Billed as a celebration of immersive storytelling, LUMA attracts crowds almost as large as Binghamton’s population of 45,000. Artists from around the world submit proposals to participate in the three-day event, and those who are chosen craft their tech-infused animations using a combination of elements, including Cinema 4D, Houdini, MotionBuilder, high-powered video projectors, live music, moving lights and more. “This is the biggest festival we’ve had so far, and we’re really pushing the boundaries of using new technology to tell stories in brand new ways,” says Tice Lerner, who co-founded LUMA with his friends, Nick Rubenstein and Joshua Bernard.


An Unexpected Success

The three of them never had an elaborate plan to start a festival. Instead, the story went more like this: It was a cold early spring night about five years ago and Lerner, a mechanical engineer turned photographer, asked Bernard if he knew anything about projection mapping. He didn’t, but being a photographer, designer and videographer, he found the idea intriguing enough to do some research. “I was blown away by the possibilities the technology offered,” Bernard recalls. Working as a local event planner, he’d long been trying to come up with a unique attraction that Binghamton could call its own. Maybe this was it?

Without the money to hire a production company they reached out to friends and colleagues for help. “Luckily, we knew a lot of geeks who could build computers, run software that had to be operated in specific ways and fake their way through using technology they'd never touched before,” Bernard recalls before adding that, luckily, they also brought on their friend Nick, an experienced art director and motion graphics artist. “That’s how LUMA really started out,” he says, “us hanging out in my apartment with some office projectors, mapping my kitchen cabinets.”


It wasn’t long before they realized that, beyond projection mapping, what they really wanted to explore was the intersection of art and technology. And figuring that there had to be others who felt the same way, they pulled off the first LUMA festival by having an electrician do some rewiring at a friend’s place and set up three projectors in his living room to map the show onto the buildings across the street. “We never say, ‘that can’t be done,’” Bernard says. “There’s always a way.” (See LUMA’s 2018 Kickstarter video here)

But while the whole community had been very supportive, nobody really knew how this whole festival thing was going to turn out. Lerner, Bernard and Rubenstein imagined they’d get a crowd of about 3,000. When nearly 30,000 showed up, they were just as shocked as the local police, who had to quickly block off some of the surrounding streets. “We were so surprised by our success,” Rubenstein recalls. “I think it really showed how hungry people were to see something new and vibrant. By the second year we had grown considerably and nobody could move so, last year we relocated the festival to downtown Binghamton. We also got a lot more sponsorships and community support.”

Bigger than IMAX

Pulling together 3D projection-mapped animated stories that will be shown on sizable buildings is no easy feat. The motivating factor for artists, Rubenstein says, is the chance to work on ideas that have often been percolating for years. “This is art on an enormous scale, much bigger than an IMAX screen, and that’s so exciting for them, and for the crowd. The expressions on people’s faces as they’re watching are amazing.”

While some artists have shown their work at LUMA in past years, each event also features a few newcomers. For 2018, the main lineup includes Maxin10sity, a world-renowned projection mapping company based in Budapest, Hungary; Light Harvest, a New York City studio known for their award-winning projection mapping work for clients including HBO’s Game of Thrones and Burning Man; Barcelona, Spain’s Onionlab, specializing in projection mapping and VR, they introduced stereoscopic mapping to LUMA in 2017; and Binghamton, New York’s own design studio and production company, Favorite Color.

This will be Onionlab’s second year participating in the festival. Jordi Pont, co-director of 2017’s project, Axioma, will be returning in the same role for this year’s projection-mapped animation, Transfiguration. Both pieces offer very different experiences, Pont says. While Axioma was a red/blue anaglyph animation that explored geometry by taking viewers on a journey through different stages of dimensionality, Transfiguration is an immersive show that will be accompanied by 44 members of the Binghamton Philharmonic Orchestra.

“Through the interplay of lights and music, the United Presbyterian church will turn into a magical place,” he says, explaining that, just for the occasion, Onionlab designed a “complex WYSIWYG (what you see is what you get) lighting control system that uses Cinema 4D to modulate different sensations in real-time throughout the show, creating a delicate and hypnotic choreography of lights that will be elegant and solemn at the same time.”


Collaborating with Artists

Artists are always welcome to submit ideas for future festivals, and all three organizers spend time looking at demo reels and reaching out to people whose work seems like a good fit. “It’s a very collaborative and organic process,” Lerner explains. “I keep a library of the buildings downtown, so I have measurements and photos to send to artists. I also encourage them to go on Google maps and look at the street view to see if there are other buildings they like.” Blueprints are rarely available, so once a building is chosen and an idea has been approved, photogrammetry is used to take more exact measurements and generate point cloud data that’s used to create a 3D mesh of the building’s façade in Cinema 4D.

Light Harvest will also be participating in LUMA for the second time this year. Ryan Uzilevsky, Light Harvest’s founder and creative director, says the studio got involved after Lerner contacted them in 2017 because they liked the “grassroots spirit” of the festival and the fact that the event is driven by artists’ creativity.


This year’s immersive film, The Truth Shall Set You Free, is a follow-up to their 2017 project, Shoulders of the Past. Both feature motion-captured dance performances, with the first film telling the story of how older generations can inspire the young. “This time, the story is about information overload in modern life,” Uzilevsky explains. “It’s about the struggle we all have when deciding what information is important in our lives, and what is just distracting noise.”

In addition to being one day longer than past LUMA festivals, this year’s event also includes a Theremin concert, opera and beer, an orchestral light show, exhibitions by local artists, a late-night after party and much more. How the festival will grow from here remains to be seen. But the three founders feel committed to making it happen. “I moan and groan and stamp my feet sometimes when I have to go find the money for one more light fixture that somebody needs,” Bernard says. “But I know this year I’ll be standing in the back of the church watching Onionlab’s Transfiguration, and about 10 minutes in something extraordinary will happen, a magical moment, and Tice will turn to me and say, ‘That’s what the extra light was for.’ Our artists are extraordinarily good at detail, and we wouldn’t have it any other way.”

As the host of Hulu’s A Curious Mind, Dominic Monaghan (Lord of the Rings, Lost) leads the way as viewers travel through outer space and explore intriguing questions about the universe and science. Inspired by Carl Sagan’s Cosmos, the series’ content is captivating all on its own. But the fact that it gives viewers an immersive 360° VR experience puts it in another realm altogether.


Toronto-based CreamVR masterminded A Curious Mind after talking with Hulu about creating content for Microsoft’s MR headset launch. Because everything except Monaghan would be CGI, CreamVR brought on animation and VFX studio Thought Café to lead creation of the show’s 360° environment using a variety of tools, including MAXON’s Cinema 4D, After Effects, Canvas 360 and CV-VRCam, which enables Cinema 4D to render in 360 degrees using the standard renderer.

Direction for the project was fairly open-ended but CreamVR made it clear that they wanted something extraordinary, something that allowed viewers to really feel like they were inside the adventures unfolding on-screen. Thought Café was up for the challenge but problems arose when the team tried to render some of the stereoscopic footage out of Cinema 4D.

Here, Thought Café’s Creative Director Jon Corbiere and Technical Director Tyler Sammy, as well as Aden Bahadori, CEO of Torus Media Labs, explain how the team created the breathtaking series and solved their technical problems with a little help from developers at MAXON.

Why did CreamVR choose Thought Café, and have you worked with them in the past?

Corbiere: Thought Café is a design, animation and digital effects studio and we focus almost entirely on factual content for education, documentaries or cause-related projects. Over the last few years, our collective of artists, animators, illustrators and producers has been exploring VR work because a lot of industries are trying to figure out how VR fits for them. Virtual Reality is a natural fit for educational content, so we’ve been doing pieces here and there. We’ve worked with some of the creatives from CreamVR and they reached out to see if we wanted to work with them on this project.

I imagine Canvas 360 has been a part of CreamVR’s process in the past too, right?

Bahadori: Yes, we have a long relationship with Andrew MacDonald, one of Cream’s creative directors. They contacted us early on because they know stereoscopic camera tracking can be difficult. People talk about how 360° video is a confusing format, and it can be. That’s why Mike Sevigny and I founded Torus Media Labs and started out by making tools for ourselves. There were no tools available to do camera tracking in 360 degrees, so we made Canvas 360 and then made it available for others to use.

Talk about how you worked together to create the show’s immersive feel.

Corbiere: From the start, we decided to make this series different from what’s out there. Instead of static shots, we wanted lots of moving cameras and crane shots. We didn’t want the host to just stand in front of a static 360 camera where it felt like nothing was in his environment. Anyone can take a locked-off shot and put a character in another background. With a moving camera, what you’re seeing feels more authentic. Most action sequences are shot with a hand-held camera, so we wanted to shoot Dominic with a hand-held camera on a crane, like a Steadicam shot. We also shot from various angles and distances so he would be framed well with the graphics around him.


Sammy: We are consumers of VR content, so we have been making ourselves sick watching it from the early days. Every time we think, ‘This is really good man, but why does the execution have to be so terrible?’ We wanted to make this a mind-blowing experience that wouldn’t make people sick. In our first discussions with Cream, we decided to really push things technically and do something that hadn’t been done in 3D space and 360. We were planning on using a techno dolly so we could have the actual data from the camera. But due to hardware availability and talent scheduling that didn’t work out, so Andrew suggested we contact Torus Media Labs.

Talk a bit about Canvas 360 and how it was used for this project.

Bahadori: We know that Canvas 360 can move scene data from After Effects to Cinema 4D. But things got tricky when we introduced the stereoscopic aspect. Canvas 360 was not designed for a stereoscopic workflow but Mike Sevigny, our co-founder, was already developing S3D360 tools internally at Torus Media Labs. The difficult task, at that time, was to match the stereoscopic rectilinear camera motion seamlessly with the stereoscopic 360 renders. Since then, our stereoscopic tools, Canvas STK, have been released.

You mentioned that the camera tilt caused some complexities. Say more about that.

Sammy: Early on in the testing phase, Thought Cafe identified a bug in CV-VRCam that was preventing them from rendering the scenes with accurate camera tracking data. The problem was that scenes didn’t render properly if the camera was tilted, and a lot of the footage was shot with the camera intentionally tilted down. We contacted MAXON and they put us in touch with some of their developers working on those plug-ins, so we called and said, ‘Hey, we have this crazy project and literally everyone says we’re crazy trying to do this.’ It was great because they understood what we needed and let us use an early beta of CV-VRCam to render the tilted footage. This isn’t a problem anymore since Cinema 4D R19.

Did this project inspire Torus Media Labs to develop new products?

Bahadori: The R&D on this project is what solidified the foundation for our new stereoscopic tools, Canvas STK. One of the major features that sets Canvas STK apart is that it allows you to preview in several different formats, including those supported by 3D monitors and 3D TVs. This means you can point the camera in any direction and preview the stereoscopic image on a real 3D medium without having to put on and take off the HMD while trying to position elements.

What is Thought Café working on now?

Sammy: We continue to regularly produce traditional and interactive digital content for the web, broadcast, and feature films while also developing new projects in the VR, AR and XR (cross reality) space. Recently, we built on some of what we learned working on A Curious Mind to create two new stereoscopic 360 experiences that combine live action and computer graphics to deliver unique educational experiences.

]]>news-7898Tue, 23 Oct 2018 11:30:00 +0200Creating the Semi Permanent 2018 Titleshttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/creating-the-semi-permanent-2018-titles/Joyce N. Ho and Nidia Dias offer a behind-the-scenes look at a global collaboration. For 16 years, Semi Permanent has been bringing together artists from all over the globe for a three-day design conference devoted to sharing experiences and ideas. The event’s title sequence is a key part of the action. Hong Kong-born, New York-based designer Joyce N. Ho directed the 2018 titles, the theme being “creative tension as inspiration.” The first female director in the conference’s history, Ho assembled a team of 11 collaborators who worked remotely from around the globe, including designer and art director Nidia Dias, who used MAXON’s Cinema 4D, Redshift, After Effects and X-Particles for Act III.

63645

We asked Ho and Dias to break down the widely praised, otherworldly titles, which were made in partnership with Dropbox, including explaining how the titles were inspired by the work of Ernst Haeckel, a German professor and biologist known for discovering and naming thousands of new species while also helping to visually promote Charles Darwin’s theories of evolution.

Joyce, how did you get the job of creating the 2018 titles?

J.N.H.: I actually reached out to the founder of Semi Permanent, Murray Bell. Since going freelance and moving to New York from Australia, where I grew up, I’ve made it a personal goal to try to direct something every year. Last year, I did titles for Likeminds, and Semi Permanent has always been a part of my design life. Every year they do such amazing title sequences by a lot of designers I really look up to, so I thought, ‘Wouldn’t it be cool if I could direct the titles?’ So I sent Murray an email last year, explaining who I am and that I’d love to do the titles. I didn’t expect to hear anything back but he got back to me a few weeks later to say that Framestore was doing them for 2017 and he would like me to do them next year. I was just over the moon.

Talk a bit about your background, and how you became a motion designer.

J.N.H.: I was born in Hong Kong and my family moved to Brisbane, Australia, when I was 4. I’ve always loved watching TV, and I think that’s how I got introduced to animation, just watching a bunch of cartoons and also drawing and making things. After high school, I went to the Queensland University of Technology and majored in animation. Once I discovered motion design, though, I realized that was for me.

Nidia, how about you?

N.D.: I’m a designer and art director. I grew up in Portugal but I’m living in Toronto right now. I’ve always liked to draw but I went to engineering school for one year before realizing it was completely not what I wanted. So I studied graphic design for three years and then discovered motion design. I moved to Canada in 2017 to work at Tendril.

Joyce, what kind of direction did you get from Murray Bell?

J.N.H.: The titles needed to speak to the idea of creative tension, and Murray explained what that theme meant to him. After the initial call, I broke down his explanation, always thinking about how to tell the story while allowing enough room for other people’s ideas. We only had three months, so it was a big puzzle to solve. Murray also wanted the 2018 titles to be a little bit different. Semi Permanent had a bigger partnership with Dropbox, so they asked me to organize a grand collaboration between artists who work in 3D, design, motion graphics and music. I would be the director and we would use Dropbox to work together remotely. I had free rein to choose my team, which is such a rarity and a wonderful opportunity to work with designers I really admired.

I knew I wanted to center the direction around nature and science because they’re such a huge inspiration to me. While doing research the old-fashioned way at the New York Public Library, I found Art Forms in Nature by Ernst Haeckel. He did beautiful illustrations of micro-organisms and algae and creatures from the sea. I was so inspired, and I knew right away that those would be my key references. I wrote a treatment with three acts: Pushing and Pulling, Friction and Release. I thought about how I wanted to visualize each section and then broke down each act into style frames.

I gave those style frames and a base music track I was working with to the collaborators I reached out to. Lucky for me, about 90 percent of the people I asked to participate said yes. They had the latitude to interpret what I gave them, and I was super open to them bringing in their own sense of style, as long as it kept with the color palette and my idea of what the shot needed to be.

Describe what happens in the titles.

J.N.H.: It’s a microscopic world inspired by Haeckel. You follow the organisms from birth to death and the three acts are really stages of creative tension. The beginning is all about the time when creative concepts are forming. We shot in 4K, so we were able to zoom in and follow the sphere, which was half a Styrofoam ball. I composited a digital sphere later. Act II represents the aspects of a project that can cause conflict, whether it’s the timeline, the budget or the project itself; it’s where things are evolving and sometimes not working out the way you want them to, that little bit of friction in the creative process, which I showed using the principle of cell division. And yes, I felt like I was living out all these stages through the creation of the titles in real life! Nidia created the first half of Act III, which represents the micro-organism, or idea, in its most complex form. At this point, I wanted the organism to have tentacles, which turned out to be challenging. She definitely had one of the toughest 3D shots.

63647

Nidia, explain what you created to visualize “Release” for Act III.

N.D.: Joyce knew what she wanted from the start, which was great, and the music was done so it was easy to put things to the beat. My job was to make something that looked like a living organism. Joyce gave me a bunch of Ernst Haeckel references, and I combined a lot of what I saw into one fake 3D organism, which I wanted to look a bit like a jellyfish. I started by doing an animatic and motion tests to get the movement right before I modeled something. I used MoGraph, the bevel deformer, and a few other deformers to make the organism look like it was alive, but my main challenge was getting the motion right. (Watch Dias’ breakdown video below.)

63646

J.N.H.: The difficult thing with the tentacles was that they looked great from the side, but when we placed the camera above them, they didn’t have the depth we wanted. I animated them in 2D using a rigged Newton joint [a 2D physics simulator], and then Nidia comped those into the 3D shot in Cinema 4D.
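The “rigged Newton joint” Ho mentions is a 2D physics setup, but the follow-through it produces can be illustrated with a much simpler idea: each joint in a chain eases toward the joint above it, one step behind, which creates the whip-like overlap tentacles need. A toy Python sketch of that lagging-chain principle (this is not the Newton plugin, and the coordinates are made up for illustration):

```python
# Illustrative only: a toy version of the follow-through motion a 2D physics
# rig gives tentacles. Each joint lags behind and eases toward the joint
# above it, which produces the whip-like overlap.

def step(joints, target, stiffness=0.5):
    """Advance a joint chain one step: the root chases `target`,
    and each child joint chases the (already updated) joint before it."""
    new = []
    prev = target
    for x, y in joints:
        nx = x + (prev[0] - x) * stiffness
        ny = y + (prev[1] - y) * stiffness
        new.append((nx, ny))
        prev = (nx, ny)
    return new

# usage: a three-joint tentacle hanging straight down, root pulled right
chain = [(0.0, 0.0), (0.0, -1.0), (0.0, -2.0)]
chain = step(chain, target=(1.0, 0.0))
print(chain)  # [(0.5, 0.0), (0.25, -0.5), (0.125, -1.25)]
```

Run repeatedly per frame, the lower joints keep trailing the upper ones, which is the secondary motion a physics rig gives for free.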

N.D.: I also collaborated with Joyce on the tunnel shot where everything feels like it is being pulled into a vortex. I used X-Particles for a few variations of the tunnel, which was emitting little circles from my previous shot. I layered my C4D file in After Effects so that Joyce could use it to make the tunnel look painterly. I had a great time working on this project. Joyce’s art direction was beautiful, and I loved the color palette and that everybody was able to bring in their own style.

How do you see your career evolving?

J.N.H.: Currently, as a freelancer, I do a wide range of stuff. I specialize in art direction for motion, and I really like to direct my own projects. Title design is something I’m very interested in and want to keep doing. I enjoyed the puzzle aspect of this project so much. Completing such a mammoth project and being able to work with so many designers whom I’ve really admired from afar was so amazing. It was a rare opportunity and very much a dream project that means a lot to me personally and as a designer.

]]>news-7901Mon, 22 Oct 2018 15:39:23 +0200Addicted to Social Media?https://maxon.net/en-us/news/case-studies/movies-vfx/article/addicted-to-social-media/This short film about the impact of social media relied on Cinema 4D’s valuable ability to harness and exploit assets from a variety of sources. Brighton-based motion graphics designer Chris Cousins became interested in the effects the scale of social media was having on society, and he decided to embark on a personal project to examine this. “I was exploring the idea of social media as packaged, industrial pharmaceutical products, created to respond to our weaknesses and insecurities,” he says. “The current situation seems similar to a bunch of experimental drugs, straight out of the lab, without any testing or idea of the consequences.”

This notion led to the creation of Side Effects, a striking short film featuring an austere pharmaceutical-style factory churning out social media tools neatly packaged for delivery to the global audience.

One of the starting points for the project was the lighting, which needed to produce the airy, white-on-white clinical feel that Chris was after. “The lighting is one setup that I repeated in each scene except the final one,” he explains. “A simple dome light with a couple of subtle area lights for highlights. No G.I. was needed; the dome did all the work.”

63648

Chris describes how he used a non-visible dome light for diffuse shading and highlights, and another purely for the environment, without any light contribution, which gave the metallic components something to reflect. He then just dialled up the camera exposure to get the exact balance he required.

A key element of the production was the choice of Redshift, a biased GPU-accelerated renderer, for final output. The longest render times were reserved for the final warehouse sequence, due to the more complicated lighting, but even these were never more than three or four minutes per frame. “The renderer was never a bottleneck; it was always scene-building that slowed me down.”

Chris notes that although the project wasn’t technically complicated, it needed to emulate real-world manufacturing techniques for the comparison to be effective. “One challenge was rigging the machinery and creating believable mechanical motion that could still be ‘read’ by the viewer. There was also a tricky section involving boiling fluid, and directing the flow of vapour through tubing.”

These elements were created using SideFX Houdini and Jawset’s TurbulenceFD plugin. “The volume tools in Houdini were used for the bubbling fluid,” he says. “They enable you to create complex arrangements of intersecting, dynamic forms but in a way that lets you easily smooth and tweak the result, and produce clean geometry that renders consistently.”

Chris found the workflow from Houdini to Cinema reasonably painless, using the Alembic format to transfer assets. “I export one frame midway through the sequence to check that the geometry and attributes all transfer, and then just export the whole sequence. As well as geometry this supports velocity data and any arbitrary attributes like density or position within a volume, all of which can be sampled with Redshift data nodes.”
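Chris’s spot-check habit, validating one mid-sequence frame before committing to a full export, is easy to automate in any pipeline. A minimal Python sketch of the idea (plain dicts stand in for Alembic frame samples; the attribute names `P`, `v` and `density` and the frame structure are illustrative, not a real file format or API):

```python
# Hypothetical sketch of the "check one mid-sequence frame first" workflow:
# confirm the attributes you need (positions, velocity, arbitrary data like
# density) survive the transfer before exporting the whole sequence.
# Frames are mocked as plain dicts; a real pipeline would read Alembic files.

REQUIRED_ATTRS = {"P", "v", "density"}  # positions, velocity, volume density

def check_frame(frame):
    """Return the set of required attributes missing from one frame sample."""
    return REQUIRED_ATTRS - set(frame)

def export_sequence(frames):
    """Spot-check the middle frame, then 'export' the full range."""
    mid = frames[len(frames) // 2]
    missing = check_frame(mid)
    if missing:
        raise ValueError(f"mid-sequence frame missing attributes: {sorted(missing)}")
    return [f["frame"] for f in frames]  # stand-in for writing every frame

# usage: three mock frames, all carrying the needed attributes
frames = [
    {"frame": i, "P": [...], "v": [...], "density": [...]} for i in range(3)
]
print(export_sequence(frames))  # [0, 1, 2]
```

Checking a frame from the middle of the range, as Chris does, catches attributes that only appear once a simulation is in motion (velocity, for instance, is often zero or absent on the first frame).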

For the glowing Bunsen burner flame and resulting columns of steam, Chris turned to TurbulenceFD, the GPU-accelerated fluid dynamics plugin. The blue flame was a very small TFD simulation, using a simple, high-output heat source emitting under a collider sphere to create the right shape. The steam effect was driven by carefully placed fans and low-pressure regions, and any leaks were masked out of the render by a glass material matte. “The steam in the glass tube is TFD smoke pushed into the pipe shape,” adds Chris. “It’s all in the simulation; it needed a much thicker-walled tube to constrain the smoke, but the movement is natural, not manipulated in post.”

For the precise mechanical movement, Chris used Cinema 4D’s constraints to orient and position elements. “They’re simple to set up, solid and intuitive to read,” he says. “I’d suggest MAXON move constraints out of the ‘Character’ menu – they have so many uses outside of character work.”

Alongside the repetitive, staccato movement of the machinery, the film also uses a lot of natural motion, which was achieved using MoGraph dynamics. “I make a simple version of each setup – cylinders falling into a tube, balls into a pile, etc. – and then bake the result out as PLA (point-level animation) data. That way I can easily control the speed and timing of dynamic motion, and also do tricks like ‘rewinding’ movement to have repeated versions. So where pills fall into the bottles, this is repeated for each bottle – the pills themselves fall again and again.”
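Because a baked PLA cache is just per-frame data, the speed, rewind and repeat tricks Chris describes become cheap list operations rather than re-simulations. A generic Python sketch of the idea (plain lists stand in for baked point data; this is not Cinema 4D’s actual PLA API):

```python
# Once a simulation is baked to per-frame data, retiming needs no
# re-simulation: it is just index manipulation over the cached frames.
# Frames here are generic placeholders for baked point positions.

def retime(frames, speed):
    """Resample baked frames at a new speed (0.5 = half speed, 2 = double)."""
    count = int(len(frames) / speed)
    return [frames[min(int(i * speed), len(frames) - 1)] for i in range(count)]

def rewind(frames):
    """Play the baked cache backwards — the 'rewinding' trick."""
    return frames[::-1]

def loop(frames, times):
    """Repeat a bake so the same pills can fall 'again and again'."""
    return frames * times

# usage with a stand-in cache of 4 baked frames
frames = list(range(4))
print(retime(frames, 2))   # every other frame: [0, 2]
print(rewind(frames))      # [3, 2, 1, 0]
print(loop(frames, 2))     # [0, 1, 2, 3, 0, 1, 2, 3]
```

This is why baking first pays off: one simulation yields arbitrarily many speed variants and offset repeats for free.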

As well as Houdini and TurbulenceFD, Chris also did some modelling in Autodesk Fusion360 to create machine components, and used X-Particles for the powder that gets sliced off the white blocks. “The strength of Cinema 4D in this project was its underrated ability to act as a solid ‘hub’ application, holding mesh data from multiple apps, integrating particle and fluid effects, materials from Substance resources, animated meshes and Redshift proxy objects without scenes ever becoming unmanageable.”

The film took around eight weeks to complete, but the result speaks for itself, and quickly became a Vimeo Staff Pick. A personal project like this is always a great way to challenge yourself and acquire new skills. “The fluid and smoke effects were great techniques to learn,” says Chris. “But the big takeaway for me was that an animation doesn’t necessarily need to be driven by flashy or complicated 3D techniques, as much as I like them – keeping an eye on the message and narrative are enough sometimes.”

Despite the complexity of the work, one of the biggest problems was simply how the film should end. “I really struggled with this,” admits Chris. “Then the final pay-off line – ‘Now wash your hands’ – came to me in a flash, while staring into a mirror in a toilet.”

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Chris Cousins. Chris Cousins’ website: https://chriscousins.myportfolio.com/]]>news-7710Wed, 29 Aug 2018 15:38:29 +0200Let Fun Rulehttps://maxon.net/en-us/news/case-studies/advertising-design/article/let-fun-rule/ How Already Been Chewed Combined Live Action and Animation for a Magical Mammoth Mountain Spot. Portland-based brand strategy firm Nemo doesn’t ordinarily use 3D or animation in their work. But after seeing some recent projects by design and motion graphics studio Already Been Chewed (ABC), they decided to try something different when making a new spot for their longtime client Mammoth Mountain. The result is a fun, action-packed departure from the usual TV spot that mixes live action and animation to illustrate Mammoth Mountain’s motto: Let Fun Rule.

61531

Relying primarily on Cinema 4D and Octane, ABC spent about four months working on and off to create the 30-second spot, which founder and creative director Barton Damer describes as “a big accomplishment” from a team perspective. “We ended up having about 9 or 10 people working on this, including bringing in three different freelancers at various points,” he says. “It’s unusual for us to need such a large team but, this time, it was great to have a lot of people to help with some of the workload.”

61530

Here Damer explains the making of the spot, which is somewhat of a departure from ABC’s recent, more photorealistic style, but it’s not new territory for the studio’s seasoned team.

Have you worked with Nemo before?

Barton Damer: This was our first project with them, but we’ve been in contact for a couple of years because whenever I’ve gone to Portland for Nike projects, I’ve done capability presentations at Nemo and talked with them about opportunities to work together. I know some people complain about having to do those kinds of presentations, but I like talking to people about what we can offer as a studio, and we tailor our presentation for the needs of the agency or client we’re talking with. It’s all about how you frame things and, in Nemo’s case, they thought of us because they’d seen what ABC could do.

61539

Describe the direction you got at the start of this project.

Damer: They did a great job of storyboarding out the entire spot for us with sketches and things like that. Our job was to turn those sketches into reality, so we created a look that combined live action and animation and got their approval. We made revisions along the way, but the process went really smoothly. They were great to work with and we’ve already worked with them on another spot.

61540

What do other artists ask about most after seeing the spot?

Damer: One of the things that makes this spot unique is the seamless integration of low-poly graphics with video footage. People see things like the paper look of the mountains as they animate up behind a skier and they want to know how ABC shot that. We actually didn’t shoot any of the video clips. They were all pulled from a library of clips shot by Mammoth Mountain’s videographer. We got that seamless look by 3D tracking all of the video clips and then rotoscoping out all of the backgrounds that we wanted to replace.

61538

We were unsure of how it was going to look once it all came together, but once we started dropping the 3D elements behind the rotoscoped footage inside Cinema 4D, we could see it was going to work really well. (Watch the behind-the-scenes video here!)

61537

People also ask about the timeline a lot, wondering how long it took to do this. It’s a tough one because we had a long, extended timeline so the client could have plenty of time for feedback. In the end I’d say we had about four months, but we didn’t work on it for four months straight. During that time we also created a huge library of individual assets that could be rendered out and used for print. I think we ended up rendering about 50 different high-res assets, including characters, trees, cars, clouds, suns and snowflakes.

61536

Talk a little bit about how you worked with the Mammoth Mountain clips.

Damer: Nemo ran the clips by us to confirm that we could use them. We did all kinds of background replacements. For example, about 19 seconds into the spot there is a little town scene where you see lots of little shops and apartments. There was signage all over those buildings, so we used Cinema 4D to model replacements that we could integrate easily so we didn’t violate anyone’s copyright.

61535

How did you create the animated characters?

Damer: Our lead animator, Bryan Talkish, who is also a great designer, created the look of the characters. Like the rest of the spot, we wanted them to have a low-poly kind of paper feel to them. Based on the first concept, we thought we would be flying a camera down the mountain and be following the characters as they skied.

But the more we worked, we realized those would need to be far-away shots, so we didn’t end up seeing a lot of the characters, which was too bad. “All of the parts started from either a cube and/or a simple humanoid mesh, extruding, pulling and pushing out sections to form the characters, from shirts and jackets to pants and accessories,” Talkish explains. Dirt shaders were used around the phong angles to further the look of the low-poly creases, giving sharp edges a different texture than the flat areas.

61534

This looks a lot different from what you’ve been doing. Is it?

Damer: To me, doing a piece that involves low-poly work isn’t new for us because we have done it in the past. But this was definitely the most prominent project we’ve used low-poly animation for. Another good example would be a spot we did for Huawei about seven years ago. Our recent work is much more photorealistic. It was refreshing for us as a team to do something more playful, and I notice that we get a lot more comments when we work in this style because I think people can relate to it more than some of the crazy photorealistic stuff we do.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota. ]]>news-7699Thu, 23 Aug 2018 14:01:58 +0200Dark Beautyhttps://maxon.net/en-us/news/case-studies/advertising-design/article/dark-beauty/ Inspired by what others may find scary, Billelis has built a thriving career all his own. People who see Billelis’ work for the first time sometimes struggle to describe it. A blogger once came close by calling it “gothic chic,” but that sounds trendy when, in truth, his work is much more soulful. “It’s true that I am inspired by dark subjects, so people get confused and think my work looks evil,” the U.K.-based 3D artist and illustrator explains. “Really, I just want to create a beautiful look that’s a little bit softer than typical dark art that you see on the internet.”

Though he goes by the alias he’s had since he was a teenage tagger and graffiti artist growing up in Greece, Billelis was born Billy Bogiatzoglou, and most people call him Billy. Just a few years ago, Billelis was in his fourth year as the lead 3D designer doing all the boards for a motion graphics agency. Days were long but, inspired by Beeple’s Everydays, he tended to be “a bit obsessive about doing personal work” whenever he could. All those hours paid off and, today, he is a sought-after freelancer who specializes in using MAXON’s CINEMA 4D and Photoshop to create fantasy book covers for U.S. publishers, key visuals for TV shows and other types of projects. (Follow him on Instagram for daily updates on what he’s working on.)

Here, Billelis talks about his work, what inspires him and how he continues to build a business doing what he wants to do.

How did fantasy book covers become your specialty?

Billelis: It started about a year ago when I was asked to illustrate the cover for Everless, Sara Holland’s teen novel. It was a New York Times bestseller, so it got a lot of attention and then a good friend of hers, who is also a top-selling author, had a book coming out so I did that cover too. That turned into a lot more commissions, and now I do covers for Harper Collins, Random House, Penguin, Macmillan and Scholastic. I really enjoy doing them.

What do you like most, and what is the process like?

Billelis: You get a lot of creative freedom. They usually give me a short paragraph about the book and a couple of reference images, like a 2D photo or something. Then, I get to do what I think seems right and they pick from two or three concept mockups that I give them. There’s a lot of trust in the artist once they approve the concept, so I can just take it all the way to the end from there. I use CINEMA 4D about 80 percent of the time and Photoshop and Illustrator the rest.

Did you study illustration in school or are you self-taught?

Billelis: I went to the University of Plymouth in England for coding and art, but I wasn’t very good at it. One day, we had a tutorial on 3D Studio Max and Blender, and I was also getting into Photoshop and Illustrator. Somebody mentioned CINEMA 4D, so I started playing around with it and liked it. I was a bit jealous of friends who were doing interesting work and being featured online and in magazines, so I decided to focus on creating my own style, which was more graphic design at the time.

What do you seek out when you’re looking for inspiration?

Billelis: I really like medieval and renaissance art. I’m interested in crests of armor, shields and religious art with a classic look that I can infuse into a digital medium. I find a lot of inspiring things on Pinterest, like classic paintings, and I also look at anatomy books a lot. There are so many things, really. I follow local tattoo artists and loads of 2D illustrators, and I have a concept book from World of Warcraft as well as darker comics like Spawn and Batman.

Describe a recent project that you really like.

Billelis: With my personal projects, I always like to create a series with an array of illustrations that tell another story. I really like my Blossom project. I wanted to create floral growths on a decaying form and I had fun playing with Octane Scatter. It was a series of personal illustrations inspired by nature, the circle of life and the idea of organic decoration for the deceased on their journey to the afterlife. Flowers, death and life all come together in these 3D artworks inspired by Edvard Munch’s quote, “From my rotting body, flowers shall grow and I am in them and that is eternity.”

How important has doing your own personal work been for your career?

Billelis: I’ve always thought personal work was very important, and I really think it’s the reason I have the career I have now. As a freelancer, I have so many commissions, I don’t have to look for work anymore because work comes to me.

Also, I have a rep and a producer now, and they are always trying to get me into new territories. I’m quite lucky because, let’s be honest—skulls are a personal thing. I did an album cover for Bliss n Eso in Australia and that turned out to be one of the biggest projects I’ve worked on because people saw the skull and wanted to know who did it. That really got things started for me, and I eventually ended up doing a magazine cover for a story about Ozzy Osbourne after the art director saw my work on Instagram and got in touch. I make sure to have personal work everywhere: Twitter, ArtStation, Behance, Pinterest, Facebook, Instagram, Dribbble. You never know who’s going to see it, so it’s important to have a presence everywhere.

What advice would you give artists who are just starting out?

Billelis: Everybody learns more about themselves and their art throughout their career, but try to have even a rough idea of what you like and want to do starting out. Even if you haven’t got a style or want to be more of a generalist, do as much personal work as you can. It will improve your skills, but also help you figure out what you want to do. Freelance illustrators need to have their own style, and people will get to know the kind of work you do. It takes time to develop a client base, but you can do that if you really work on it.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota. ]]>news-7692Tue, 21 Aug 2018 16:21:08 +0200Jesus’ Psychedelic Space Journey: Immortalys https://maxon.net/en-us/news/case-studies/advertising-design/article/jesus-psychadelic-space-jouney-immortalys/ For the trailer for Ivan Torrent’s new studio album, David Ariew sends iridescent sea creatures into space. David Ariew, a San Diego-based 3D generalist, is someone who likes to work under pressure and also has enough imagination to master all kinds of creative challenges. Spanish artist Ivan Torrent emailed David wanting to know if he would create a trailer for his second studio album Immortalys (www.vimeo.com/248504302). Since David had missed all of Ivan’s previous emails, he only had three weeks’ production time left – but he still didn’t hesitate to take the job.

Fortunately, the cooperation between Ariew and Torrent, who had already gathered excellent reference images, including fantastic cover art by the Venezuelan graphic artist Carlos Quevedo, worked well. Ariew primarily used Cinema 4D and Octane as well as a pre-animated whale from Turbosquid to create an aesthetically interesting trailer in which an iridescent whale travels through a magical universe to a futuristic city in the clouds.

We spoke with Ariew about the trailer’s creation and how his long hair, together with his growing passion for publishing Octane tutorials, has earned him the name Octane Jesus.

OK, let’s start with the Octane Jesus thing. How did that develop?

David: I was creating Octane tutorials together with EJ Hassenfratz and they somehow became increasingly popular. More and more people saw me and some started calling me Octane Jesus. Some also said that I looked like Grizzly Adams. I had no idea who this was, so I decided to stick with Octane Jesus because I am definitely an Octane evangelist. I like to explain things and be creative at the same time, which goes hand-in-hand for me. Just take a look at this tutorial: http://arievvisuals.com/tutorials

What was the next step after you saw Ivan Torrent’s reference images?

David: Ivan had thousands of ideas swirling in his head even without any reference images, and an entire Dropbox folder full of videos that he had collected. He then went on to explain to me how he wanted the whale to float so it appears gigantic. I was thrilled that he had so many ideas but was also a little intimidated. I thought he would expect the same image quality and thought, ‘OK, what you’re showing me here was done by a thousand artists over the course of an entire year and I’m a lone ranger with three weeks’ time.’ His expectations, however, proved to be really down-to-earth, and the tight deadline turned out to be an advantage because we could let our imaginations run wild and reduce them to the essential elements.

Instead of having the whale burst out of the water, I suggested to Ivan that we should let its journey begin in space. This was much simpler and still looked really cool. He agreed. Ivan was the perfect client. He had a bunch of ideas but also had realistic expectations – and when I created something he liked he was really enthusiastic! He had worked two years on his album and making the trailer was a very emotional experience for him.
We both had the same goal: to make the coolest video possible in three weeks. I wish I could have worked on it for a whole year – he was a great art director!

Tell us a little about the scene in which the whale reaches the city in the clouds.

David: As soon as it reaches the city, you can see a type of signal buoy in one of the buildings that emits the same energy as the whale. Ivan wanted to create a connection with the cyan-colored glow on the album cover, so I applied it to the whale, the buoy, the windows and the particles in the final shots. The scene with the transition from space to the city in the clouds was one of the biggest challenges, and I’d like to mention that my fiancée Chelsea Starling was a great help. She was very interested in the project and her suggestions inspired me to work much more intensely on many shots. I had stitched two camera moves, for example, and she said that, although it looked good, it would be much better if I added the city on the horizon so the viewer can get a taste of what’s to come.

Where did you create the cloud scenes?

David: For the first half of the trailer I used VDB Mega Pack, which I had borrowed from my buddy Mitch Myer, for all the clouds. The only exception was the thin layer of fog in the background, which I had created several months earlier for a tour intro for Katy Perry. In the second half of the trailer, the golden city over the clouds, I used Octane’s Fog Volume objects to create the clouds. To prevent the clouds from looking computer-generated, I created a second Volume object with very patchy clouds positioned just outside of the depth of field to break up the contours of the clouds and make them look more realistic.

How did you create the whale's magical shimmer?

David: Because of the tight deadline, I was forced to take several shortcuts – starting with buying a whale at TurboSquid. This ended up working really well. I didn't use the whale's texture maps. Ivan wanted the glow to come from beneath the whale's skin, so I did a lot of experimenting and finally decided to use Trapcode Form in After Effects to apply an animated texture to the whale. I created the gold flakes and the iridescent look in Octane, and I used the Octane Thinfilm shader to create the shimmer. I love the extremely green or cyan-colored highlight on the whale's contours. It's a really magical, mermaid-like look and also reflects the colors on Ivan's album.

How did you create the city?

David: I used a kit from KitBash3D – the Art Deco kit, whose elements all share the same style. The kits are very affordable for what you get: a few hundred dollars will get you a complete city that you can assemble, texture and light. I made my city almost symmetrical, with the bridge and main building as core elements through which I could fly the camera and then pull up. The buildings in the background are asymmetrical and scattered rather than placed manually, but I really liked the irregularity. This mass of background buildings added a nice level of additional detail and helped make the city look huge.

Here you can take a look at David's KitBash3D tutorial: https://www.youtube.com/watch?v=rFhmgqykEes

How did you create the closing credits?

David: The closing credits had to be legible, of course. The way I had lit the scene, however, didn't make this possible, so I had to find a way to set the typography off from the background. I ended up widening the aperture a lot to create a very shallow depth of field, which let only the text appear in focus. At the same time, I reduced the background illumination, which brought out the text even more. If you look closely at the top of the scene, you will see that I used several small light sources in front of and between the text elements.
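Ariew's trick leans on the standard thin-lens relationship between aperture and depth of field. As a rough standalone sketch (the function and numbers below are our illustration, not values from his scene), the near and far limits of acceptable sharpness can be computed like this:

```python
def dof_limits(f_mm, n_stop, s_mm, c_mm=0.03):
    """Near/far limits of acceptable sharpness for focal length f_mm,
    f-number n_stop, focus distance s_mm, circle of confusion c_mm."""
    h = f_mm * f_mm / (n_stop * c_mm) + f_mm        # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# An 85mm lens focused at 2m: opening up from f/8 to f/1.4 collapses the
# in-focus zone from roughly 25cm to under 5cm, leaving only the text sharp.
for n in (8.0, 1.4):
    near, far = dof_limits(f_mm=85, n_stop=n, s_mm=2000)
    print(f"f/{n}: {far - near:.0f}mm in focus")
```

The same math explains why reducing the background illumination helps: anything outside that narrow zone is both blurred and dimmer, so the type pops.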

I wanted the closing credits to have the same epic, deep mood as the album's music. The piece with the deep rumbling and the whale song was so inspiring. It was a lot of fun to animate! Ivan's music is really unique and it was really cool working with someone who can add fantastic sound design elements and adapt the music to match the imagery. This was a special collaboration and it made the project something very special as well.

Meleah Maynard is a freelance journalist and author based in Minneapolis, MN.

The Numbers Game
Mon, 23 Jul 2018 09:49:48 +0200
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/the-numbers-game/

To illustrate how numbers are used to bend the truth, broadcast designer Sophia Kyriacou turned to Cinema 4D's flexible deformers.

Numbers can be misleading: when a company reports a five percent loss, that doesn't sound too bad. But when the actual figure is £10 million, that's a different matter. This strategic use of numbers to hide facts or massage a story is the subject of BBC School Report, which required an animation sequence to accompany the script.

Broadcast designer Sophia Kyriacou – who has more than 20 years' experience working alongside the BBC – was commissioned to create the animation, which runs for just over two minutes (or 124 seconds, if we stick with the theme!).

61229

The idea of the report, aimed at 11- to 14-year-olds, is to help children identify 'fake news' by showing how numbers and statistics can play tricks on you. "In the animation, the numbers shouldn't be taken at face value," says Sophia. "Just because a number is large doesn't mean it's big and important. For some, numbers can be quite a dry subject. Therefore, it was important that the sequence not only explained but also entertained in an engaging and humorous way."

Early in the concept, Sophia decided to represent the numbers as animated characters. The long, multi-digit numbers would appear on stage, while the single-digit figures were to be individually branded with catchy names such as Skateboarder8, SuperHero4 and Unzippy9. This also provided scope to create individual animated GIFs to promote the animation via social media.

61248

As usual, time was a limiting factor, with just three weeks allotted for the full character animation sequence. Initially, the characters' animations were going to include walk cycles, enabling them to march on and off the stage, but due to the time constraints the decision was made to lock them to the ground, use a trap door and wires to bring characters in and out of the scene, and employ the theatre curtains for scene changes. Also, to keep things manageable, the numbers were built parametrically using a Text object and animated using Cinema 4D's deformers – most notably Bend, Twist and Shear.

"Time had influenced how everything was modeled," adds Sophia. "By doing this I had a template that included all the deformers in place, and if I needed to change the number value, I simply typed a new number in the text box. The tools in the Deformers menu became my main source for body language expression. While I've always used deformers in my workflow throughout the years, I hadn't previously explored in detail how they could be used in character animation, expressing mood and body language behavior. A simple bend downwards gave the impression the character was feeling despondent; a twist expressed humorous and naïve interaction with other characters. These are all very simple gestures, but most of us think that character animation is only made up of complicated character rigs, when this doesn't always have to be the case."

For simplicity, the arms were made using an FK chain with a separate rigged hand attached. The legs were built using Cinema 4D's Character Builder tool on Biped setting but only using the lower half of the body. The use of deformers required some planning in the way the model was going to be built, but once one figure had been created, all the others were based on the same workflow.

61250

"I think deformers are sometimes underrated, yet they are very powerful and relatively simple to use," says Sophia. "Just think about where in your object you want the deformation to take place. Having a deformer facing the wrong axis can give you an undesired effect but this is easy to correct. Animation isn't just created by moving an object from A to B or using Transform or Scale to animate… You can resize or warp objects with such power using simple deformers."
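Under the hood, a bend deformer is essentially a mapping of points onto a circular arc. This standalone sketch (our illustration of the general technique, not Cinema 4D's actual implementation or API) shows the core idea for 2D points:

```python
import math

def bend(points, height, angle_deg):
    """Bend (x, y) points: the straight spine y in [0, height] is wrapped
    onto a circular arc spanning angle_deg, preserving arc length."""
    a = math.radians(angle_deg)
    r = height / a                       # radius of the bending arc
    out = []
    for x, y in points:
        t = y / r                        # angle travelled along the arc
        out.append((r - (r - x) * math.cos(t), (r - x) * math.sin(t)))
    return out

# A straight vertical spine leans over by 90 degrees: the base stays put
# while the tip swings out sideways, like a despondent number character.
spine = [(0.0, 0.0), (0.0, 50.0), (0.0, 100.0)]
print(bend(spine, height=100.0, angle_deg=90.0))
```

Because the deformation is a pure function of each point's position, it works on any mesh placed inside it, which is why a single deformer template could drive every parametric number.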

For Sophia, the Unzippy9 character proved the most fun to put together. "I knew I wanted its leather jacket to collapse off it, leaving the character looking bare and a little pathetic. That character had the most involved workflow, with painting in Substance Painter, cloth dynamics for the sides falling off, animating an interior zero spline to full size and a Melt deformer for the front of the jacket. Choreographing the timings took a few attempts to get right for the slapstick humor, but I was really happy with the result."

61257

For the project, Sophia's renderer of choice was Arnold. "I like Arnold as a renderer and the hyper-real look and feel it gives," she says. "I wanted the characters to look like miniature plastic toys."

However, while the single figures set against infinity curves rendered quite quickly, the theatre scenes were proving something of a bottleneck, even on her custom-built PC workstation. Sophia attributes this to the number of characters, all separately rigged with multiple deformers. To speed up the process, she opted to use the commercial render farm, Pixel Plow. "This worked really well and removed the additional stress of having to wait before seeing any final results so I could start to grade."

Because of the short schedule, Sophia admits she didn't get too hung up on multiple render passes, relying on just a depth pass for any depth-of-field effects, and object buffers for alpha masks for the final stages of post-production. "I graded everything using Red Giant Magic Bullet in After Effects. It's very powerful and simple to use. It's a combination I tend to use a lot now within my post-production workflow."
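In compositing terms, a depth pass is simply a per-pixel distance that can drive the amount of blur. A minimal sketch of that mapping (our illustration with made-up parameter names, not the actual After Effects or Magic Bullet API):

```python
def coc_radius(depth_pass, focus, max_radius=8.0, falloff=200.0):
    """Map a depth pass (distances in scene units) to per-pixel blur
    radii: zero at the focal plane, growing and clamped elsewhere."""
    return [min(abs(d - focus) / falloff * max_radius, max_radius)
            for d in depth_pass]

# Toy one-row depth pass: near wall, in-focus character, far backdrop.
print(coc_radius([100.0, 300.0, 900.0], focus=300.0))  # [8.0, 0.0, 8.0]
```

Object buffers work the same way in spirit: each buffer is a per-pixel mask, used here as an alpha channel to isolate characters during grading.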

Despite the time pressures, the final result is testament to Sophia's ingenuity and Cinema 4D's comprehensive toolset. However, she does admit there are several things used in this workflow that she would avoid in a bigger character animation project. "The character builds were slightly cumbersome and having them in Nulls wasn't ideal," she says. "But because they were characters fixed in position without walk cycles and full animations, using the deformers to give them character worked. The Twist and Bend deformers are what brought this little animation to life."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Passing By
Fri, 13 Jul 2018 10:36:58 +0200
https://maxon.net/en-us/news/case-studies/movies-vfx/article/passing-by/

Dutch trio Job, Joris & Marieke's new film artfully portrays more than 100 years of Amsterdam's history in just over a minute.

Job Roggeveen, Joris Oprins and Marieke Blaauw were all studying product design at the Design Academy Eindhoven, the Netherlands, when they met and soon realized they really ought to team up and use their animation skills to tell stories. Since founding their studio, Job, Joris & Marieke, in 2007, they have collaborated on a wide range of projects, including the award-winning short film, A Single Life. Nominated for the Academy Award for Best Animated Short Film in 2015, the film tells the story of Pia, a young woman who finds a mysterious record on her doorstep and discovers that, by playing it, she can travel back and forth in time.

The trio's latest animated short, Passing By – 110+ Years of Damrak, Amsterdam, also focuses on time. Only in this case, time moves in a linear fashion, offering a one-minute history of Amsterdam as seen through the large front window of the Amsterdam department store De Bijenkorf. Using MAXON's Cinema 4D and Octane, Job, Joris & Marieke created ever-changing street scenes showing everything from important historical events to the evolution of cars and fashion designs.

61230

I asked them to explain the making of their new film, which was made as part of De Bijenkorf's artist-in-residency program, Room on the Roof. Here is what they said:

How did this artist-in-residency opportunity come up?

Marieke: The department store invites national and international artists to work in their special tower above their main building in Amsterdam. And then they try to find the right spot in the store to exhibit the work of the artist. For us, they chose the shop window and we came up with a concept that was perfect for that spot—an animation that takes the viewer on a rollercoaster through time. De Bijenkorf's shop window displays have always been quite famous. We wondered what the view must have been like from the shop window throughout time. How did the street change? How did the passers-by change? And what historical events took place?

How did you decide what to portray as you moved through more than 100 years?

Joris: We are nostalgic people. We love looking at old pictures. We are all around 38 years old, so we have good memories from the eighties and nineties when we were young. For instance, Gabber culture was part of our youth so we really wanted that to be in this film. We also included other things we feel nostalgic about that have disappeared, like the telephone booth, the good reputation of our Dutch soccer team and the Australian track suit the gabbers used to wear.

We decided to show every decade from the opening of De Bijenkorf in 1914, so we did some research on which historical events took place during that time. We chose the ones we wanted to portray and then we made a list of props and fashions for each decade. We also researched how the street changed over time. How did the street lamps change? What happened to the houses across the street from the store? When were cars allowed on the street and what did the trams look like? We tried to tell the history of this street, but also the Netherlands in general. Each period had a different emotion to it: the 40s were dark, the 50s and 70s were vibrant, the 80s were grim and the 90s were also a bit dark.

How did you go about modelling all of those props and characters?

Job: With so many characters and props moving around, we had to model everything with as few polygons and details as possible. We needed to work fast but we also had to make sure that Cinema 4D and Octane could handle the files. When we work with props and characters, we always start by searching for a lot of reference images. Then, we stylize those to fit our style. Textures were really important in this film, especially those of the characters and their clothes because fashion was such an important part of portraying different periods in history.

Talk a bit about the film's wonderful sound design and music.

Job: I am our studio's composer and sound designer. Each period has a specific sound. For example, the 30s are pretty quiet because you don't hear cars yet. By 2000, it's very chaotic with loads of cars, scooters and bikes. Finding all of the sounds for each atmosphere was a nice challenge.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

Second Glance
Thu, 05 Jul 2018 12:11:01 +0200
https://maxon.net/en-us/news/case-studies/advertising-design/article/second-glance-1/

Art knows no limits: for an exhibition, a series of images is given a new virtual dimension with Cinema 4D that can only be seen through the camera of a smart device!

Images, whether photos or paintings, are always snapshots of a particular moment. The moments before and after this snapshot can only be shown using film. The goal of the artist group Achter April (8th of April) is to use new technologies to break the bonds of still images. To this end, they used an augmented reality layer to bestow a temporal dimension on images by graphic artist Carolin Jörg.

The idea was the brainchild of Michael Fragstein, an avid digital illustrator who often sketches his projects using this medium. During one of his presentations he got to know artist Carolin Jörg, and soon the idea for this project was born: graphics inked on white paper to which, when viewed through the camera of a tablet device, a virtual animated element is added that continues the image's dynamic. Together with texts from Marie-Alice Schultz and sounds from Marc Fragstein, Second Glance is an outstanding example of what can be created with the help of augmented reality.

60788

The animations on the augmented reality layer were created by Michael using Cinema 4D and X-Particles. He paid particular attention to correctly simulating the images' dynamics and the flow of ink.

“With Cinema 4D and X-Particles, smoke, fluids and wind can be simulated, which in turn makes it possible to add extra content to the illustrations for the viewer. This makes it easier to interpret the piece or find clues to its origin,” says Michael.

The fact that these images were created by Carolin Jörg especially for this exhibition opened up new possibilities for Michael's animations for each and every image. Michael and Carolin worked very closely together: while creating the images, Carolin thought about how the animations could look, and Michael developed these inspirations further in the realization of his animations. This also involved bringing the ideas into Cinema 4D, where he found ways to realize each individual animation. In three months, a total of ten different videos for ten of Carolin's works were created. Content created in Cinema 4D was converted to an augmented reality layer in Metaio.

Second Glance premiered at the International Trickfilm-Festival (animation festival) in Stuttgart, Germany, where it was shown in the Galerie Ebene0 and enthusiastically received by the public. Second Glance was then exhibited in the Horst-Janssen Museum in Oldenburg, Germany, and the Kunsthalle in Hamburg. Second Glance can be visited from June 23 to October 7 in the city gallery of Offenburg, Germany.

Technical Gadgets for Marvel's Black Panther
Fri, 22 Jun 2018 16:17:00 +0200
https://maxon.net/en-us/news/case-studies/movies-vfx/article/technical-gadgets-for-marvels-black-panther/

Perception on the 18 months they spent working on Black Panther's future tech.

Holograms are impressive sci-fi mainstays, but for Marvel Studios' Black Panther, futuristic technology had to be much more than "light in air," says Jeremy Lasky, co-founder of New York City-based Perception. For more than a decade, Perception has created much of the advanced technology seen in Marvel films, including Iron Man 2, Thor: Ragnarok, Captain America: Civil War and The Avengers. Experts at creating gadgets, interfaces, screen graphics and other future tech, Perception knows just how far to push innovation without letting it become unbelievable.

57821

But Black Panther presented new challenges because the African nation of Wakanda – Black Panther's home world – is far more technologically advanced than any society in the Marvel Cinematic Universe. In this case, Perception collaborated with Marvel Studios for 18 months and used Cinema 4D, X-Particles, Houdini and Redshift to design, develop, animate and render the visionary technology seen in the film and end title sequence.

57822

Here, Lasky and Perception’s creative director, John LePore, explain their process for creating Black Panther’s future tech, which was used alongside the work of other collaborators on the film, including Trixtor, Framestore, Storm VFX and Method Studios.

What kicked off Perception’s involvement with Black Panther?

Lasky: In the summer of 2016, I got an email from Nate Moore, who has produced several Marvel films; we have a great relationship with him. He asked if we wanted to work on Black Panther and, of course, we did. We're Marvel fanatics, so we already knew Black Panther's story and mythology. Marvel already knew they wanted tech to play an important role in the film because Wakanda is such an advanced society. We've been working with them so long, they trust us to be technology consultants. When something like this comes up, the initial brief isn't just, 'Hey, we have a scene that needs a gadget.' They wanted a whole presentation for director Ryan Coogler and the rest of the Marvel executive team on technology that was not simply more advanced than anything we've seen – they wanted the technology to have a distinct character, almost a soul, something that fit with the secluded world of Wakanda.

57823

What do you mean when you say the technology needed a soul?

LePore: We wanted the technology to be physical matter, something that could be touched. Vibranium is a natural resource in Wakanda. The metal is the country’s main source of energy, and it also powers Black Panther’s bullet-proof suit. Vibranium is in the comic books and it was used in Captain America’s shield, but there isn’t any rulebook to explain its properties or what it’s capable of. We spent a lot of time thinking about those things. Whenever we work on a film like this, we want to make sure the concept feels like it has a lot of depth to it. If you dig into the Marvel archive, you’d get a lot of scientific detail about it. It also has to be capable of capturing an audience, so they feel like, ‘Oh, wow! I want to travel to that universe and experience that myself.’

57824

Explain how vibranium works technologically.

LePore: Knowing that vibranium is bulletproof and absorbs sound, we looked at a lot of research on things like sonic weapons, the concept of echolocation and various tests and experiments by Carnegie Mellon and MIT Media Lab. Jeremy is a Carnegie Mellon alumnus, so he is always popping his head in there to see what's happening. And with our experience working on emerging technologies with companies like IBM and SpaceX, we already have a lot to draw from.

Also, while working with some hardware related to mid-air haptics, we discovered some interesting possibilities. Researchers used a pad with an array of transducers on it emitting ultrasonic soundwaves into the air. When they held their hands over the pad, they could feel the sensation on their palms. The idea is that one day we will be able to have the sensation of touching an object that isn’t really there. And we found out that the University of Tokyo is using that same pad to levitate Styrofoam particles. That led us down the path of having vibranium particles, which are a lot like sand, morph and reassemble into different shapes and forms.

57825

Lasky: The sand starts moving in response to sound waves, and you see it appear in several different forms in the film, like T’Challa’s (Black Panther played by Chadwick Boseman) vibranium suit, the tactical strategy table in T’Challa’s Royal Talon Fighter jet and as a communication device. T’Challa can actually pull energy from the vibranium in the ground when he is in Wakanda, so the smart sand echoes the film’s emphasis on the land.

LePore: Before those scenes were even filmed, we used Cinema 4D and X-Particles to create various animations depicting vibranium sand interactions. That allowed us to explore different possibilities and ultimately create references for the filmmakers and cast to use on set.

Talk about how Perception’s ideas and concepts were also used in the end title sequence.

LePore: Because we had already contributed to the film in many ways, we didn't have to do a competitive pitch for the title sequence, so we were able to focus on a single idea that felt appropriate for the film. We told them that we thought the titles should show events from the film, rendered in vibranium sand, being entered into Wakanda's history. They liked that idea and let us see a rough cut of the film. That was terrific because we were able to think about iconic images and how we could graphically present themes of the film in the title sequence. When each actor's name appears on screen, we also show something that represents them. We do the same thing with writers, producers, the VFX supervisor, everyone. And we depict everything using this flexible, beautiful medium of vibranium sand.

57826

We were honored to have a music track to work with by Kendrick Lamar. He created a song specifically for the end titles and having that sonic bed by this legendary musician made it possible for us to use Cinema 4D’s sound effectors to make the sand particles pulse and undulate to the music. Everything is slow and sensual and flows beautifully with the music. For us, creating this end title sequence was a great conclusion to our 18-month contribution to the film.
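Conceptually, a sound effector samples an amplitude envelope from the track each frame and maps it to a clone or particle parameter. A minimal standalone sketch of that mapping (our illustration, not Cinema 4D's actual Sound Effector API):

```python
def pulse_scale(envelope, base=1.0, gain=0.5):
    """Map per-frame amplitudes in [0, 1] to scale multipliers, so
    particles swell on loud beats and settle back in quiet passages."""
    return [base + gain * a for a in envelope]

beats = [0.0, 0.2, 1.0, 0.4, 0.0]   # toy amplitude envelope, one value per frame
print(pulse_scale(beats))
```

Lowering the gain or smoothing the envelope gives the slow, undulating motion described here rather than a hard strobe on every beat.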

How did you ensure that what you created fit with the Wakanda universe?

Lasky: Marvel’s team really kept us grounded throughout the project. Nate [Moore] was the reality police and if he thought something felt too much like magic, he would bring us back down to the rulebook that was established. The studio provided us with what they called the Wakanda Bible, a 500-page document that was assembled by the production department. All of the visual cues were in there and every component of Wakanda’s culture and history was explained in great detail.

A Thousand Words
Wed, 06 Jun 2018 17:30:02 +0200
https://maxon.net/en-us/news/case-studies/advertising-design/article/a-thousand-words/

Welcome to the world of 3D artist Josef Bsharah, where the scene tells the story.

As a teenager growing up in Saudi Arabia, 3D artist Josef Bsharah spent most of his time playing video games and watching movies like The Matrix, The Lord of the Rings and Blade. Intrigued by what he saw, he was curious about every part of the creative process and wanted to learn more. His older brother introduced him to MAXON's Cinema 4D and Adobe's Photoshop, and a high school teacher helped Bsharah understand the basics of 3D. He took it from there, landing his first job in 2013 as an architectural modeler at a small, local studio.

Architectural visualization was the next thing he threw himself into while also working diligently on his 3D and computer graphics skills. In 2017, Bsharah went freelance and as he takes on new and different projects, he also dedicates a lot of his time to personal work that has not only helped him get better but has also attracted new clients offering the kind of work he really enjoys. “My work is constantly growing and shifting into different mediums but I would say it has evolved into making hyper-realistic environments and short stories that often have surreal and abstract elements,” he explains.

Here, Bsharah discusses his distinctive style, what inspires him and how experimenting with different styles, software and tools is helping him sort out how to shape his career. We’ll also take an up-close look at some of his projects and the stories they tell.

Your website features collections of personal work titled Tungsten, Jade and Voyager Zero. Is this recent work, and how would you characterize it?

J.B.: I’ve done all of those projects in the last two years. Often, I am referencing a mood or style of a render that I’ve created in the past and I’m creating something that fits within it. I think of it as environmental storytelling, but I am also learning. The tools we use today are rapidly changing and evolving, and each kind of tool or software comes with its own challenges. Cinema 4D has helped me a lot with keeping the momentum of ideas going because it’s easy to make changes quickly, especially when you combine it with Octane Render.

Tell us more about environmental storytelling. And what project most reflects that?

J.B.: That would be Tungsten. That was my second attempt to learn more about environmental storytelling. It’s a busy alleyway and there are a lot of small details everywhere that illustrate the overall chaos of the scene. Everything was conceived and modeled in Cinema 4D. I also used Moment of Inspiration 3D, Marvelous Designer and ZBrush for additional detail. It can be really hard to capture the story behind a place or area without much narrative. I try to capture a lot of small things you wouldn’t notice normally, like a dust patch above a pipe, an old spider web or a spot where paper has been mostly ripped off of a wall.

57540

Describe the other two personal collections on your website, Jade and Voyager Zero.

J.B.: Jade is one of my first attempts at visualizing a small part of the world. It’s kind of a cyberpunk narrative. I’m always fascinated by the incredibly talented Japanese artists who worked on remarkable animated films such as Ghost in the Shell and Akira. The dedication it takes to truly visualize a world like that, on the level of detail that they achieved, is simply mind-blowing.

Jade was also a great learning exercise for me on the technical limits of a full 3D pipeline with a GPU renderer. I started learning about asset management, organization and levels of detail, along with other things like real-world reference study and matching. Building and texturing everything was a really exciting challenge. Today, I approach massive projects, or any project with many components, differently thanks to Jade.

I spent a year on Voyager Zero. I wanted to discover what I could do with computer graphics, and where I truly wanted to go with my work over time. As I worked, I found myself changing a lot of my workflows and tools, which helped a lot in understanding what kind of work I wanted to spend the rest of my career doing. Voyager Zero has reached a lot of people on social media, and I've received some really heart-warming letters about it.

57596

Describe a recent client project and how clients are finding you through your personal work.

J.B.: So far, most of my clients have found me through the personal projects I’ve done over the past two years. Often, they are looking for something similar, so it’s a really good idea to have some work out there, even if it’s mostly personal. It still shows your overall style, and there is a good chance someone is going to like what you do and be interested in working with you.

Olena Shmahalo from Quanta magazine reached out to me recently to illustrate some articles on quantum computing. She said she had been following my work on Instagram for a while and liked the versatile style of the 3D visualizations. She wanted to see if I could work with her on illustrating quantum computers in an abstract or surreal way. It was a really interesting project. And I want to say a big thank you to everyone who has supported my work over the years because I've learned a lot on my own, but the online Cinema 4D community and interacting with artists on social media has really helped me get over the anxiety of 'bad work' and helped me just put my stuff out there.

Death Comes in Dreams
Tue, 24 Apr 2018 12:05:41 +0200
https://maxon.net/en-us/news/case-studies/movies-vfx/article/death-comes-in-dreams/

South Korea-based motion graphics artist Taehoon Park created a dystopian short for Pause Fest 2018.

As always, Australia's annual Pause Fest conference inspired artists and studios from around the world to contribute animations based on a theme. For 2018, that theme was Journey = Destination, which got young South Korea-based motion graphics and animation artist Taehoon Park and his friends thinking about life's final journey and its destination: death.

Using MAXON’s Cinema 4D, Octane and After Effects, they spent three months making a short film called Dreaveler. Dark yet eerily calming, the film tells the story of how humans of the future must deal with death once advanced technology has dramatically extended lifespans. Focused on soothing bored, exhausted and disillusioned souls, death-management company, Dreaveler (a combination of the words dream and travel), offers a type of euthanasia that allows people to once again embrace the past by traveling through their dreams and memories during REM sleep before taking a last breath.

57400

I asked Park to talk about how he made Dreaveler with two college friends, Hyunsup Ahn and Jihoon Roh, and also his current work and plans for the future.

What inspired you all to make Dreaveler?

T.P.: The concept and visuals were inspired by the film Ghost in the Shell, and the story was inspired by Sleep, a novel by Bernard Werber that shows an interesting aspect of dreams. My best friend, screenwriter Jihoon Roh, came up with our main story. We were intrigued by the possibilities and limitless usability of dreaming, which seems like a very efficient and beautiful way to travel to an imaginary world. And unlike a memory, a dream reflects a person's desires and can express anything they want. The story imagines how traveling through dreams and memories could be a gift in the last step of life.

57401

Once you had your inspiration, what was your process like?

T.P.: Pre-production lasted about a month and then we moved on to the practical work. I managed all aspects of the film, except for the storyline and character modeling. There really wasn't enough time because I was also working full-time. I usually worked on it at dawn on weekdays and then the whole weekend. It helped that we followed really concrete steps – storytelling, storyboard production, storyboard animation, 3D animation, rendering and compositing. Doing that really helped shorten production time. The most important part of this was collaborating with my friends, who work in different fields. Working with them made this project much more meaningful to me.

What scene in Dreaveler do you like most and why?

T.P.: I like the lab scene with the main character in the beginning. It was inspired by the character of Hideo Kuze in Ghost in the Shell. I always prefer a dark, serious atmosphere, and this scene was made with particular care because of the layout of all of the cables. I spent a lot of time designing and setting up the lighting of the main character. I got some amazing sci-fi models [under Creative Commons] from Beeple's project Zero Day.

You aren’t long out of college. Where are you working right now?
T.P.: We graduated three years ago. I used to do a lot of illustration but I’ve been concentrating on motion graphics now that I work at GIANTSTEP, a well-known post-production company in South Korea. We mainly do TV commercials, game cinematics, title design and VR-interactive content. I’m busy working at GIANTSTEP but when I have time I study animation and other things. I am looking forward to doing some title sequence work with Heebok Lee, our talented creative director.

Tell me a little bit about yourself, including where you all went to college.
T.P.: We went to Hansung University in Seoul. I earned a bachelor’s degree in graphic design and I never imagined that I would be a 3D artist. But my friend Gryun Kim created a motion graphics study group at school, so we started studying Cinema 4D together, mostly through online tutorials. I painted a lot when I was young. My father was a cartoonist and I think I inherited his talents. At first, I used C4D only for fun and for assignments. But then I started uploading my work to Behance. The responses I got were so nice. I never expected that, but it made me want to use C4D more and study harder.

Explain a bit more about the roles your friends played in making the film.
T.P.: Jihoon handled most of the pre-production process, and Hyunsup did character modeling and design. We had already worked together on a short animated film called The Frog when we were in college. It was really poor quality compared to Dreaveler, so we can see how much we’ve improved since we graduated. Participating in a big event like Pause Fest was a big challenge as a motion graphics artist. To be honest, I was worried that the reaction to our film would not be good. But, thankfully, many people liked and shared our Vimeo link.

What are your plans? Do you think you’ll be making more films?
T.P.: I will study motion graphics more, and I want to one day be able to do more of the filmmaking on my own from beginning to end, like Raoul Marks. He is a great CG artist. Or I would like to become a great creative director like Heebok Lee, Patrick Clair and Ash Thorp. I have so much to learn from people like them and I respect them so much.

Meleah Maynard is a freelance writer and editor in Minneapolis, Minnesota.

Journey = Destination
https://maxon.net/en-us/news/case-studies/advertising-design/article/journey-destination/
Thu, 12 Apr 2018 14:39:33 +0200

Pause Fest 2018’s opening titles take viewers on a journey around the world.

New York City-based art director and designer Toros Kose has always been interested in maps. Captivated by their beauty as a kid, he thinks it was probably all of the intricate details that make up maps that really caught his eye. Though Kose has used maps as imagery in a few projects, he’d never made them a primary visual focus until recently, when he was asked to create the opening titles for Pause Fest 2018.

Pause Fest founder George Hedon was unaware of Kose’s affinity for maps when he contacted him about the project. He just explained that this year’s theme was Journey = Destination and asked that Kose keep that in mind when designing the titles. Using MAXON’s Cinema 4D and Octane Render, Kose spent nearly seven months working on the titles in between other jobs.

Meleah Maynard asked Kose to explain his process for creating the distinctive title sequence, which takes viewers on a journey around the world while subtly introducing Pause Fest’s speakers in just two minutes and 27 seconds. Here is what he said:

How did you come up with the design?
T.K.: Other than needing to relate to the theme, the brief was very open. I was really excited to be offered this opportunity and wanted to do something that was good enough for the conference. I had a lot of ideas, and once I decided on what to do, I showed George a PDF presentation that included an explanation of what I wanted to do, reference imagery of different sorts of maps and the work of Japanese artist Noriko Ambe. Her work is just stunning and inspired the look I ended up with for the final piece.

Also, early on I found a little piece of a T. S. Eliot poem [Little Gidding] that spoke to me and fit the idea of exploration and journey. I looked it up and thought it would work as a wonderful voiceover for the project. It didn’t end up in the final piece because George already had a copywriter on the project who wrote a custom script. We recorded both scripts and he chose the custom one, which did work out really well.

How would you describe your style and how did that fit with this piece?
T.K.: I wouldn’t say that I have my own distinctive style, and that has been okay for me. You can see from the work on my website that I like to get into the details of imagery. I like making futuristic user interfaces and holograms. I’ve used maps a few times, particularly when I worked on the game Destiny: Rise of Iron. In some ways, this project was a culmination of the work that I have done.

What’s your background? How did you get into art directing and design?
T.K.: I was born and raised in Stockholm, Sweden, to Turkish and Kurdish parents. I drew a fair amount as a kid and I was interested in cameras and computer games. I think going to a Waldorf school from first grade all the way through high school might have made some difference in what I wanted to do because there is such a focus on arts and crafts. In my last couple of years of high school, I got into video editing software and made my own trailers for video games and upcoming movies. It was those little animation tools in video editing software that introduced me to After Effects, and later on I discovered Cinema 4D.

I started out studying media technology at Södertörn University, but when I discovered motion graphics I applied to Hyper Island in Stockholm, which is a vocational school focused on digital technology. I got my degree in motion graphics in 2011 and did a four-month internship at Tronic Studio in New York City and then another internship in Hamburg, Germany. I was mostly an animator at that point, but once in a while I did a bit of design work. I learned a lot from tutorials and the C4D community, which really seemed to flourish around 2010.

How did you end up back in New York?
T.K.: I missed New York, and I had met my then-girlfriend there and I missed her. So I moved back and started working at a place called Super Fad, which is closed now. I had already been thinking of freelancing, but that forced me to do it sooner. I’m really fortunate because everything has been going so well. My girlfriend and I got married in 2012. And I’ve been freelancing since 2013 and I’ve worked for a lot of great people and studios, like Blur, Prodigal Pictures/Danny Yount, Gmunk Inc., Imaginary Forces, Trollbäck and Perception. I used to work at home, but now I rent a studio with a small team of other artists I met on a project in 2014. I still do most of my work alone with companies in New York or on the East or West coast, but sometimes a few of us collaborate on bigger projects.

Talk about your process for making the titles once you got the green light.
T.K.: The first thing I did was start playing around with procedural noise textures in Cinema 4D to generate the line work for the maps. By rendering out an image of the noise and treating it a certain way in After Effects, I was able to get it to look like topographical line work and see how that would translate into animation. George liked the look, and it was a simple way to go because I used the same noise texture to generate both the flat line work and the topology. They are a little offset in width, but they follow each other.
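
Kose’s exact After Effects treatment isn’t shown, but the underlying idea, quantizing a smooth noise field into discrete bands and keeping only the band boundaries, can be sketched in NumPy. The random field below is a stand-in for Cinema 4D’s procedural noise, and all names are illustrative:

```python
import numpy as np

def contour_bands(height, levels=8):
    """Quantize a grayscale height field into discrete bands,
    then mark the band boundaries: the 'topographic lines'."""
    banded = np.floor(height * levels) / levels          # flatten into terraces
    edges_x = np.abs(np.diff(banded, axis=0)) > 0        # boundaries between rows
    edges_y = np.abs(np.diff(banded, axis=1)) > 0        # boundaries between cols
    lines = np.zeros_like(height, dtype=bool)
    lines[1:, :] |= edges_x
    lines[:, 1:] |= edges_y
    return lines

# Stand-in for a procedural noise texture: a smooth-ish random field in [0, 1)
rng = np.random.default_rng(0)
coarse = rng.random((8, 8))
field = np.kron(coarse, np.ones((32, 32)))               # blocky upsample to 256x256
lines = contour_bands(field, levels=6)
print(lines.shape)  # (256, 256)
```

The boolean `lines` mask is the 2D analogue of the flat line work; feeding the same `field` into a displacement channel would give the matching 3D topology, which mirrors his point about one noise texture driving both.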

The 3D topology was rendered out of Cinema 4D, and then I used that sequence of noise as a texture in Octane Render to create the elevations. It worked, but it was time consuming because I had to render out large 8K textures. So, if I wanted two seconds of noise I had to render out 48 frames of 8K textures and bring that back into Cinema 4D to use in Octane.
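
The render-cost arithmetic he describes checks out: 48 frames for two seconds implies a 24 fps sequence, and each uncompressed 8K frame is heavy. A quick sketch (the 8192×8192 RGBA, 8-bit format is an assumption; the article doesn’t specify):

```python
fps = 24
seconds = 2
frames = fps * seconds
print(frames)  # 48 frames of texture for two seconds of animation

# Rough uncompressed size per frame (assumed 8192x8192 RGBA, 8 bits per channel)
width = height = 8192
channels = 4
bytes_per_frame = width * height * channels
total_gb = frames * bytes_per_frame / 1024**3
print(f"{bytes_per_frame / 1024**2:.0f} MB per frame, {total_gb:.1f} GB for the sequence")
# 256 MB per frame, 12.0 GB for the sequence
```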

Designing Distinctive Sports Graphics
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/designing-distinctive-sports-graphics/
Thu, 15 Mar 2018 15:39:49 +0100

Cake Studios describes their latest work for the Baltimore Ravens.

Eight years after founding Cake Studios, Jim Steinhaus and Mannix are now known for creating memorable, visually captivating sports packages that stand out from the pack. That’s why the Baltimore Ravens recently called on them to design something unconventional and unmistakably unique to the team.

Steinhaus and Mannix did not disappoint. After making a trip to Baltimore to meet with the client and tour the city for inspiration, they used Cinema 4D and After Effects to create a comprehensive, stylized stadium and broadcast package. Rooted in the gothic feel of what was once Edgar Allan Poe’s hometown, the elements include exploding cobblestones, flocks of flying birds and an array of smoke and water effects.

How did you and Jim decide to focus on sports graphics?
Mannix: Jim and I worked together at another company before we founded Cake Studios in 2010. At the time, we were right on the heels of the recession and a lot of budgets were still constricted. But we had a client who gave us a presentation on how, despite the economy, budgets for sports packages were actually increasing rather than decreasing like other areas of the entertainment world. So we decided to concentrate on making that our business. It was kind of a happy accident but it’s worked out really well for us because we’ve gotten better and better at what we do. This past summer was the busiest we’ve had yet.

How did you collaborate with the Ravens on this?
Mannix: The whole project was very collaborative. They were energized and excited about the major upgrades at the stadium and how the giant new video board could totally transform the game-day experience. We had never worked with them before but they knew our work for other football teams so they felt comfortable working with us. They didn’t want anything slick. They wanted more of a dark, gothic look – something kind of mean and intimidating. We showed them some concepts based on Edgar Allan Poe’s work and other references to the city but not any specific landmarks.

Once we established the foundation for the design, we built everything off of that, which led to a lot of back-and-forth with the client. One thing we’ve really learned is that it’s best to go for the master look from the start and then reverse engineer elements. That’s especially important in a package this large because, if you don’t have that master look down and you hit a wall, you can’t return from that very easily because you don’t have the time.

How did you create the ravens?
Mannix: We had a lot of reference images for them to choose from. They chose one that looked mean because they are a hard-hitting football team, and ravens aren’t friendly birds. The model was made and animated in Maya and we ported it over to Cinema 4D, which is the base that we work from. We wanted to find a way to have the birds interact with the clouds. We tried a few different things and liked the look we got by using X-Particles and Turbulence in Cinema 4D.

How was this project different from other sports packages you do?
Mannix: Every team has its own personality and attitude, which is also a reflection of the city they call home. But from a technical perspective, this project is different because it’s a lot more 3D-heavy and we’ve received a lot of great feedback on that.

Describe the gothic/modern interstitial visuals.
Mannix: The vibe is mid-19th century, the era of Edgar Allan Poe, but there are also modern elements, like the manhole cover. We worked to create a hybrid between the two and focused on creating a dark, gritty, ominous look. It was literally a black canvas, so we brought in purple pockets to tie everything in more closely with the colors of the team. To emphasize the point of camera impact for the interstitials, we used X-Particles and rendered them through Turbulence in MAXON’s Cinema 4D. This is the first time we used both plug-ins for a venue package and we couldn’t be happier with the results.

Talk about using exploding cobblestones and water to dramatize touchdowns.
Mannix: We’d never done anything like exploding cobblestones before. Outside of our imaginary Baltimore cemetery environment, we built a long entryway made of 19th century-era cobblestones. It’s that long cobblestone path that’s being ripped up by the energy of the text in that interstitial. To do that, we built the geometry for a single brick, used MoGraph within Cinema 4D to duplicate it, and then used dynamics to create the realistic interaction and impact between the bricks and surrounding environment.

For the water, we used the dock area of Baltimore as a design element. Crashing waves reveal the text interstitials. It took a while to work out the dynamics but we ended up bringing in a separate layer and treating it like stock footage. We used Houdini to animate the water effects and then composited everything together with our Cinema 4D layers using After Effects. If the client wants to use the water visuals differently, they can swap out the ‘Interception’ text for ‘Touchdown’, or whatever they prefer, really easily.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

Getting Started is Easy
https://maxon.net/en-us/news/case-studies/visualization/article/getting-started-is-easy-1/
Mon, 12 Mar 2018 17:24:00 +0100

For his first attempts at 3D, Cam Guest got his inspiration from childhood memories and takes us back to 1989.

When starting a new job, many people see it as a valuable opportunity to learn new skills and expand their professional profile. Cam Guest was no different when he started his new job as associate creative director at a media company in 2016. Since the company uses Cinema 4D to create product visualizations, he got acquainted with the software very quickly. Even though his job didn’t require experience with Cinema 4D, Cam wanted to gain more in-depth knowledge so he could better assess his team’s work, so he decided to learn Cinema 4D in his spare time.

Since he had never worked with 3D software before, he started by searching online for beginner tutorials and discovered Greyscalegorilla.com. Cam used their videos to learn how the various tools in Cinema 4D work and how to create an efficient workflow. The more he learned, the more he wanted to finally start his own project. He dug an object tied to one of his fondest childhood memories out of the closet, one whose shape was perfect for a maiden modeling project: his old Nintendo Entertainment System. Released in the USA in 1985, this gaming console not only stirred Cam’s memories but also sparked his creativity: “I really wanted to capture a feeling of that care-free childhood when you finish school for the summer and all of a sudden have all this free time on your hands and no restrictions. I had spent a lot of time with friends playing Nintendo until the wee hours of the morning and it's something I look back on fondly,” remembers Cam.

Cam excitedly started modeling the NES case, which was quite a challenge for him as a beginner. He looked for an easier way to model and decided to create a schematic illustration of the NES module in Illustrator and extrude it in Cinema 4D. This did the trick and Cam had his very first ‘aha’ experience with Cinema 4D, and he also used this method to model the NES case. Cam bought a caliper so he could scale the console to the correct size. Using photos, he even built the motherboard and the entire inner workings of the NES, including all components, to scale. His attention to detail can also be seen in the console’s wiring, which is exactly like the real NES. The game modules and the controllers and their circuits are meticulously modeled. Many of these finely modeled details aren’t even shown in the final rendering. For example, a completely modeled antenna setup is hidden behind the television!

The next big challenge was texturing the models: after quite a bit of experimentation, Cam stumbled across a tutorial from School of Motion that helped him understand how the Cinema 4D UV tools work so he could texture his models to make them look realistic. By combining various material channels, he added a range of details such as scratches, finger smudges and other irregularities.
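
The idea behind combining material channels, layering several grayscale wear masks into one surface map, can be sketched outside Cinema 4D as well. A minimal NumPy version (the blend mode, values and names are illustrative assumptions, not Cam’s actual setup):

```python
import numpy as np

def layer_wear(base, scratches, smudges, scratch_amt=0.6, smudge_amt=0.3):
    """Combine grayscale wear masks (0..1) into a single roughness-style map.
    A 'screen' blend only brightens, so overlapping details never blow out."""
    out = base.copy()
    for mask, amount in ((scratches, scratch_amt), (smudges, smudge_amt)):
        out = 1.0 - (1.0 - out) * (1.0 - mask * amount)   # screen blend
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(1)
base = np.full((64, 64), 0.2)                             # mostly smooth plastic
scratches = (rng.random((64, 64)) > 0.97).astype(float)   # sparse bright scratches
smudges = rng.random((64, 64)) * 0.5                      # soft fingerprint haze
rough = layer_wear(base, scratches, smudges)
```

Each mask drives a different kind of imperfection, which is the same reasoning as stacking scratch and smudge layers in a material’s channels.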

Cam needed other objects from the past to round off the scene and engulf the viewer in the 1980s. He searched the Web for reference material such as televisions and video recorders of the time and modeled these in Cinema 4D. Cam modeled every object himself, with the exception of the Coke can – an exercise that taught this Cinema 4D newcomer quite a lot!

“In the end, this was an exercise in pure joy. I absolutely loved the problem solving, the attention to detail and trying to make my final scene as realistic as possible” says Cam about his work on the project. “I spent endless nights until 4-5am modeling on one screen while having Twitch open on the other. I’ve played virtually no video games in this time, even though I have a GeForce GTX 1080Ti under the hood!”

Cam Guest’s website: http://www.camguest.ca/

360° Content for Immersive VR App “Vegas: Alter Your Reality”
https://maxon.net/en-us/news/case-studies/visualization/article/360-content-for-immersive-vr-app-vegas-alter-your-reality/
Mon, 12 Mar 2018 16:28:35 +0100

The new ‘Vegas: Alter Your Reality’ (VAYR) 360 virtual art experience, produced for the Las Vegas Convention and Visitors Association (LVCVA), is a unique destination brand marketing program that features an interpretation of the Las Vegas experience – the attractions, night life, restaurants, and more “only in Las Vegas” scenes – in virtual reality (VR) through the lens of five of the world’s top artists: Beeple, Adhemas Batista, Fafi, Insa and Signal Noise.

John Fragomeni, creative & media entertainment executive, and Andrew Cochrane, immersive creator and director, were contracted by R+R Partners to handle the VR production. Throughout the six-month project, the team, working in conjunction with Sunset West Media production studio in Los Angeles, relied on MAXON Cinema 4D and the Cineversity plug-ins CV-ArtSmart and CV-VR Cam to create five individual VR videos totaling 12.5 minutes of immersive 4K stereo animation.

The virtual art collection debuted last December at the prestigious annual Art Week and Art Basel festival in Miami and can be experienced now on the Vegas VR app. The activation is presently scheduled at select locations around the world.

Fragomeni and Cochrane have been at the forefront of VR production since 2013, and together have more than 30 years’ experience in live action, design, animation, post-production and high-end visual effects. Fragomeni has extensive experience in film, TV, music videos, commercials, interactive and emerging media, including feature-film work on “Pacific Rim,” “Terminator Salvation” and “Pirates of the Caribbean.” Cochrane has directed top immersive experiences for brands such as Google, Intel, and USA Today.

MAXON recently caught up with Fragomeni and Cochrane to speak about the 3D creative experience in producing Virtual Reality content for this massive production.

MAXON: According to the LVCVA press materials, "Vegas: Alter Your Reality” is described as a “…creative interpretation of the Las Vegas experience, direct from the minds of five distinguished international artists. Transformed into vibrant and immersive content pieces, their personal narratives are given the ultimate presentation as a virtual reality art collection...”. At what point were you and the Sunset West Media team brought into the project?

JF: It all began in early 2017 with a call from Executive Producer Vaitari Anderson and Creative Director Dwayne Grech at R+R Partners for the VAYR project. From the start we knew that to bring this experience to its full potential meant we had to break away from the traditional client/vendor paradigm. This was not only a collaboration but a partnership where art meets technology.

MAXON: You commented that 360º production is still relatively new, with very few projects of any scale attempting the amount and level of work that this project required. What made this project so unique?

AC: Most 360º content is either a tie-in with an entertainment property or a product – marketing has driven most of this emerging medium thus far. This project is rare in that there was no mandate other than for each of the artists to create. R+R Partners and the LVCVA were uniformly focused on maintaining the purity of this goal. In every way, this is an art installation project, with Las Vegas as its patron.

MAXON: What were your pre-production discussions like on VAYR? More specifically, how did your CG animation team work with each of the artists in bringing their ideas and impressions into the virtual world?

JF: Our team of CG design and animation experts worked directly with each artist to guide them through an asset creation phase where they made individual images and style frames in 2D for our team to use as the basis for our 3D work in Cinema 4D.

Each two to three-minute immersive video is a direct translation of the artist’s work into VR, built from assets they created and that our production team converted into 3D using Cinema 4D and CV-ArtSmart, then animated and rendered into omni-directional stereo using CV-VR Cam.

AC: The process was fluid and involved back-and-forth discussions and interactive sessions. We worked from production storyboards and artists’ mood boards with references to their 2D animations, or in some cases even a 2D file that we could extrude and turn into a 3D asset in Cinema 4D for animation and rendering.

MAXON: What software tools were used in the production pipeline to ensure each artist’s vision of Las Vegas would be interpreted uniquely in VR?

AC: Artistically styling 3D simulations and animations to match a two-dimensional aesthetic in a fully immersive 360º world required a multi-faceted team and a flexible production approach. We used Cinema 4D as our primary tool for rigging, animation, environment creation, particle effects, and lighting, with OTOY Octane for rendering. Adobe After Effects and Blackmagic Fusion were used for compositing, and Mettle's Skybox plug-ins were used to create content and to make WIP edits in Adobe Premiere so that we could check progress in the VR headsets.

MAXON: Tell us about the 3D VR workflow challenges your team faced creating such a huge volume of immersive content for VAYR.

JF: The sheer volume and variety of needs that these five films presented is, as far as we are aware, unprecedented in 360º animation. Using Cinema 4D, with the addition of complex simulations from SideFX Houdini and Next Limit RealFlow, we successfully modeled, textured, lit, rigged, simulated, and rendered over 500 2D assets that had to be converted to 3D or wholly created in CG.

AC: We relied on Cinema 4D techniques from pretty much everything we have ever worked on. Converting Signal Noise’s and Batista’s Adobe Illustrator files would have taken months if not for CV-ArtSmart. Making Insa’s desert shatter pushed the MoGraph and Rigid Body systems beyond anything we have ever attempted in Cinema 4D to date. Additionally, we used CV-VR Cam heavily on Batista’s film, which proved ideal for realizing his 2D/illustrated style in VR. Beeple is a prolific Cinema 4D power user and his VAYR piece was created almost entirely using the software, with Octane as the renderer. Uniquely, he would hand off his scene files to our team to do all final lighting, finishing and rendering.

JF: We enjoyed a close technical partnership with Paul Babb and Rick Barrett of the MAXON US team, which helped us confidently process all VAYR shots through Cinema 4D. We received R&D support and did things that pushed the boundaries without having an engineering group on the project. The MAXON team was a huge help, being just a phone call or email away to get quick answers to any questions and find solutions to any technical needs, which enabled a very smooth workflow.

MAXON: The VAYR video project’s focus is on individual artists capturing their experience of Las Vegas so that it can be viewed on VR headsets. Were there special considerations in setting up the production pipeline?

JF: With a clear vision of the scope, leaning on our experience building, managing and running post/VFX studios, and through extensive pre-planning, we assembled an entire production facility in just a week.

Every workstation was a quad-GPU beast, a total of 68 Titan XPs boasting a combined 243,712 CUDA cores, with a total capacity of 816 teraflops (816,000,000,000,000 floating-point operations per second). This gave us the seamless and efficient throughput to handle the demanding amount of data and workload. The team was relatively small given the sheer volume of work – we generated over 2 million frames in the process of creating the final 12.5 minutes of 4K stereo animation.
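
The quoted hardware numbers are internally consistent, which is easy to verify. 68 cards with 3,584 CUDA cores each (the Pascal-era Titan X core count, an assumption here) matches the stated total, and the combined throughput works out to 12 teraflops per card:

```python
gpus = 68
cores_per_gpu = 3584           # Pascal-era Titan X CUDA core count (assumed)
total_cores = gpus * cores_per_gpu
print(total_cores)             # 243712, matching the article

total_flops = 816_000_000_000_000
tflops_per_gpu = total_flops / gpus / 1e12
print(tflops_per_gpu)          # 12.0 TFLOPS per card

# Render volume: frames rendered per finished minute of animation
frames_total = 2_000_000
minutes = 12.5
print(frames_total / minutes)  # 160000.0
```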

MAXON: What did you and your team most enjoy about working on the VAYR project?

AC: The creative collaboration between our team, R+R Partners, and the artists was a dream come true, and the initial brief to “just make art” was never abandoned.

JF: This project came down to the creative scope and freedom to make fresh content on a relatively new story platform as we pushed creative and technical limits. We’ve worked with Cinema 4D across hundreds of projects over the years, but this one required us to genuinely push it harder than we ever have before, which led to almost daily discoveries of techniques that we’ll be using for years to come.

# # #

All trademarks contained herein are the property of their respective owners.

Busy Bees in Overalls
https://maxon.net/en-us/news/case-studies/advertising-design/article/busy-bees-in-overalls/
Wed, 14 Feb 2018 10:39:25 +0100

Television and commercial advert specialists Taiyo Kikaku Co. Ltd and the VR pros at Hololab Inc. teamed up to create a mixed reality application called HOLOBUILDER™. This application features little workers in overalls created in Cinema 4D. The application was presented at the Contents Tokyo Exhibition where visitors could see these virtual busy bees at work.

Taiyo Kikaku is always looking for new trends and technologies that make content more interesting and entertaining for the viewer. With HOLOBUILDER™ they entered uncharted territory that gave them the opportunity to test the possibilities of mixed reality.

The application was developed together with Hololab Inc. and is designed to enhance video or live footage with additional virtual elements to create a mixed reality. This extension of reality is primarily made up of 3D graphics which can, for example, be used to make technical manuals easier to understand – such as using a mobile phone’s camera to navigate around a car’s engine and highlighting items such as the oil dipstick, filler cap, radiator, etc. Taiyo Kikaku’s goal was to explain the basic concept and make its possible applications intuitive and easy to understand for potential customers.

The first thing they had to do was explain what augmented or mixed reality is, what it can look like and what can be done with it. Taiyo Kikaku’s senior director Ryo Ihara thought that this could best be done using humor. The demo app that he created used five simple cardboard panels with which the user could interact in the real and virtual worlds. When the user looks through the VR headset, all they have to do is place a card on the table to trigger a swarm of virtual construction workers that fan out from a virtual start card to other cards on the table. On their way they avoid real-world obstacles and finally start building the objects as defined by the cards’ graphics. As soon as they complete their project, the virtual characters jump for joy and the user can pick up the card along with the virtual object that was built and view it from different angles.
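
The article doesn’t say how the workers navigate; a classic way to route agents around real-world obstacles on a tabletop is a breadth-first search over an occupancy grid. A minimal, illustrative sketch (the grid and all names are assumptions, not the HOLOBUILDER™ implementation):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on a tabletop occupancy grid.
    0 = free cell, 1 = a real-world obstacle the workers must avoid."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                      # also serves as the visited set
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:                    # walk back along predecessors
            path = []
            while (r, c) != start:
                path.append((r, c))
                r, c = prev[(r, c)]
            return [start] + path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                               # goal unreachable

# A real object (the 1s) sits between the start card and a build card
table = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 0, 0],
         [0, 0, 0, 0]]
route = shortest_path(table, (0, 0), (2, 2))
```

BFS guarantees the shortest route on an unweighted grid, which is why it is a common baseline for this kind of agent movement.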

The individual objects and the animations were created in Cinema 4D, with heavy use of the MoGraph feature. “Thanks to MoGraph’s ease of use I was able to experiment a lot to perfect the construction process for each object,” explains Ryo Ihara. At the time, the team also had to make sure that the animations were kept relatively simple for export to Unity – a restriction that has since been removed in the current version of HOLOBUILDER™. “In my next HOLOBUILDER™ application I will definitely use more assets for even more interactivity. This first simplified demo version is designed to offer a preview of what HOLOBUILDER™ will offer and show the creative possibilities it has for a wide range of projects,” Ryo Ihara continues.

Ryo Ihara’s miniature virtual construction crew is a precursor of the potential that this burgeoning technology has to offer together with Cinema 4D, Unity and HOLOBUILDER™ – and can also be the start of something big for other mixed reality concepts!

Blade Runner 2049
https://maxon.net/en-us/industries/movies-vfx/blade-runner-2049/
Fri, 09 Feb 2018 13:06:20 +0100

Into the future with Alcon Entertainment’s Blade Runner 2049, Territory Studio and Cinema 4D.

Exorbitart Puts Edifices in the Right Light
https://maxon.net/en-us/industries/architecture/exorbitart/
Fri, 09 Feb 2018 10:34:55 +0100

In order to spark and pique the interest of potential buyers, property developers turn to Cinema 4D specialists to create impressive visualizations.

VarCity – City Surveyed in 4D
https://maxon.net/en-us/news/case-studies/visualization/article/varcity-city-surveyed-in-4d/
Tue, 06 Feb 2018 10:23:25 +0100

A project by the ETH (Technical University) Zürich sees the city as a research object and uses Cinema 4D to visualize the data it gathers.

Cities as living spaces play an important role in our society and offer a wide range of aspects that can be researched in order to better understand them and make city life better. Buildings, streets, parks, rivers, residents, traffic and much more combine to comprise an average city. The VarCity Project, an ambitious project by the ETH Zürich, was initiated by Professor Luc Van Gool and is directed by Dr. Hayko Riemenschneider. The project’s fundamental idea was to create an accurate reconstruction of the city using freely accessible images from social networks and public webcams and to use the continuous stream of images in numerous ways.

To begin with, proprietary algorithms were applied to a series of high-resolution aerial images to create a point cloud that was in turn converted to an accurate, textured 3D model. The resulting quality would have been sufficient for applications such as navigation systems or even for city planning, but the makers of VarCity wanted more. Specially-equipped cars drove through the city, meticulously photographing their environment. The data was combined with additional image material and existing 3D material to create a detailed model of the city.

The scanned images from the webcams were used to fine-tune the 3D model, and the data collected was also used to determine the amount of traffic in the city. The system is aware of how many vehicles a particular section of the city has and how many parking spots are available, which can be useful if VarCity is used for more than just navigation, e.g., for city planning or even a parking guidance system.

When Carlos Eduardo Porto de Oliveira joined the project, a lot of data had already been collected that could only be understood by those with a scientific background. Carlos was given the job of making this information understandable for the average user. A proprietary script was used to convert the original .PLY files to .OBJ so they could be opened in Cinema 4D. The massive amount of geometry meant that Carlos had to test the scripts carefully, especially since UV maps were generated automatically. In the end, the team had a tool that made it possible to open the converted data in Cinema 4D.
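
The team’s converter is proprietary, but the core of an ASCII PLY-to-OBJ translation is small. A minimal sketch handling positions and faces only (real scan data also carries color and normals, and binary PLY needs a different reader; all names here are illustrative):

```python
def ply_to_obj(ply_text):
    """Convert a minimal ASCII PLY (positions + faces) to OBJ text.
    OBJ face indices are 1-based, while PLY's are 0-based."""
    lines = iter(ply_text.splitlines())
    n_verts = n_faces = 0
    for line in lines:                        # parse the header
        if line.startswith("element vertex"):
            n_verts = int(line.split()[-1])
        elif line.startswith("element face"):
            n_faces = int(line.split()[-1])
        elif line.strip() == "end_header":
            break
    out = []
    for _ in range(n_verts):                  # vertex records -> 'v x y z'
        x, y, z = next(lines).split()[:3]
        out.append(f"v {x} {y} {z}")
    for _ in range(n_faces):                  # 'count i j k ...' -> 'f i+1 j+1 k+1'
        parts = next(lines).split()
        idx = [str(int(i) + 1) for i in parts[1:1 + int(parts[0])]]
        out.append("f " + " ".join(idx))
    return "\n".join(out)

ply = """ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
element face 1
property list uchar int vertex_indices
end_header
0 0 0
1 0 0
0 1 0
3 0 1 2
"""
obj = ply_to_obj(ply)
print(obj)
```

Running this on the single-triangle example prints a three-vertex, one-face OBJ, which Cinema 4D can import directly.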

When the aerial images were converted to models, the buildings’ shadows were interpreted incorrectly by the algorithms used to generate the models. To solve this, an application was developed that automatically corrected the models based on additional image material. This is how the graphics were created for the video that explains what the VarCity project is.

In the end, this system can serve as a data resource for numerous applications: navigation systems, automated tourist guides, particle systems and much more can be run using this data. Since the feasibility of using Cinema 4D to convert the image data is still being determined in a test phase, Carlos has yet to find out the extent to which Cinema 4D can be used for such applications.

news-6910 | Thu, 18 Jan 2018 08:51:00 +0100 | Damian Swiderski – Artist and Cinema 4D User
https://maxon.net/en-us/news/case-studies/advertising-design/article/damian-swiderski-artist-and-cinema-4d-user/

It can be challenging for artists to find a fitting medium for expressing their art. Damian Swiderski was no exception – until his search led him to Cinema 4D!

Damian’s career has taken many different paths. At first, he wanted to study graphic design but soon realized that this was not the right choice. After a friend introduced him to the profession of theater painter, he saw that it sparked his interest much more than what he had planned to study. During his vocational training as a theater painter he learned how to use painting tools and media, and Damian also realized how limited the field of painting really was: it literally had no depth. All the ideas he had were yearning to be brought to life, and after a brief stint as a furniture designer, Damian discovered 3D. After trying several 3D applications, he realized that they either didn’t have the features he needed or were very complicated to use.

Only after trying Cinema 4D did Damian discover an application that offered the features he needed and was also easy to learn. His idea was to create real physical objects from virtual sculptures. Attempts to create these sculptures using plaster, clay or even bronze casting proved to be too expensive and time-consuming. Damian decided to produce his designs as 3D printed objects and to finish these manually to give them the final look.

Damian uses the Sculpt function in Cinema 4D because it comes very close to real-world sculpting. He develops his sculptures using basic shapes from the Cinema 4D Content Library. Damian uses the 3D print tools in Cinema 4D to prepare the models for printing and checks for unwanted openings, correct scaling and any polygons that might be flipped.
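Cinema 4D’s 3D print tools perform these checks interactively. Purely as an illustration of what “unwanted openings” and “flipped polygons” mean for a triangle mesh, a minimal sketch might look like this (`print_check` is a hypothetical helper, not MAXON code):

```python
from collections import Counter

def print_check(faces):
    """Rudimentary printability checks on a list of triangles (vertex-index triples).

    A closed (watertight) mesh has every undirected edge shared by exactly two
    faces; consistent winding (no flipped polygons) means every directed edge
    appears exactly once, because adjacent faces traverse shared edges in
    opposite directions.
    """
    directed = Counter()
    for a, b, c in faces:
        for e in ((a, b), (b, c), (c, a)):
            directed[e] += 1
    undirected = Counter(frozenset(e) for e in directed.elements())
    watertight = all(n == 2 for n in undirected.values())
    # A flipped triangle duplicates a directed edge (both neighbors traverse it the same way).
    consistent = all(n == 1 for n in directed.values())
    return watertight, consistent
```

For example, a correctly wound tetrahedron passes both checks, while flipping one of its faces keeps it watertight but breaks winding consistency.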

Since Damian’s virtual sculptures far exceeded the 3D printer’s maximum print size, he used the printer’s software to subdivide the objects so each part could be printed separately. The Drone sculpture, for example, was printed in separate pieces, which made it possible to create a final object much larger than the printer’s build surface. Damian used two Zortrax printers simultaneously to speed up printing.
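The printer’s own software handles the actual subdivision; the arithmetic behind deciding how many pieces are needed is simple. A sketch, assuming axis-aligned cuts (`split_counts` is hypothetical):

```python
import math

def split_counts(model_size, bed_size):
    """How many pieces per axis an oversized model must be cut into
    so that each piece fits inside the printer's build volume."""
    return tuple(math.ceil(m / b) for m, b in zip(model_size, bed_size))
```

A 300 × 150 × 90 mm sculpture on a 200 × 200 × 180 mm bed, for instance, needs to be halved along one axis only, giving two printable pieces.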

The final issue Damian had to deal with was the surface of the printed objects, which clearly showed the layering produced by the FDM (Fused Deposition Modeling) printer during the printing process. He sanded the surface smooth and then painted it. To better showcase the printed objects, he created photo-realistic renderings of them in unique environments in Cinema 4D.

About his work in Cinema 4D, Damian says, “Cinema 4D gives me the tools I need to actually bring my ideas to reality without having to be a rocket scientist! All functions and features are very intuitive to use, which means that I can concentrate completely on the project at hand and don’t have to fret over how I can make the next function work.”

news-6932 | Tue, 16 Jan 2018 15:07:00 +0100 | The Ultimate Stage Show
https://maxon.net/en-us/news/case-studies/advertising-design/article/the-ultimate-stage-show/

Los Angeles-based design and animation house Possible explains the over-the-top stage design for Justin Bieber’s Purpose tour.

Justin Bieber’s Purpose tour ran for 18 months, with more than 150 shows on six continents. The third world tour for his fourth studio album, the show featured captivating environments and visuals created by Los Angeles-based Possible.

For the Purpose tour, Possible, led by director Michael Figge, teamed up with creative director Nick DeMoura and show producer Chris Gratton to create looks and cinematics. Using a combination of Cinema 4D, After Effects, X-Particles, TurbulenceFD and Octane, Figge’s team designed, shot, animated and edited 22 full-song scenics, as well as three intros and interstitials.


I asked Figge, Possible’s co-founder, and producers Roy Chung and Ryan Chung to talk about Possible’s role in the tour. Here’s what they had to say:

How did you get this project? Michael: We were approached by Justin’s team to handle the visual components of the tour. Possible is known for producing high-end stage visuals for the biggest artists and shows around the world.

How did they explain the look and feel they wanted for the stage environment and visuals? Michael: The show’s creative director, Nick DeMoura, gave us a deck that illustrated the themes and key moments of the show’s arc, and we all worked to key off of the deck, investigating related aesthetics. Working with surfaces on stage differs from traditional broadcast formats (TV, tablets, phones) because the scale of each piece of art needs to have a relationship with the performers. We used Cinema 4D to help sketch out concepts quickly during the design process and provided stage renderings to illustrate how certain moments could play on stage.

What did you look to for inspiration when designing the show? Roy: When we begin researching design looks for our shows, we start with the artist’s existing aesthetic and feel out how much or how little we need to adhere to it. We always want the show to feel cohesive with the artist’s brand but if they’re willing to take risks it can lead to some really fulfilling creative departures. For this show, we drew inspiration from Justin’s album art, music videos and a robust exchange of design references between Nick, our team and all the major stakeholders.

At one point, Bieber sings “Mark My Words” while hanging over the stage in a glass box. Talk about how you created that unique environment. Michael: With all of the cutting-edge technology we had on stage, we thought it would be an interesting point of entry for us to start with symbols that have endured the test of time, like classic sculpture and hieroglyphics. We modeled, textured and lit the environments with Cinema 4D and Octane Render.

Many of the stage environments include dancers. Could you talk about ways you created content that dancers interacted with? Ryan: Once we had references for certain segments of choreography, we built environments for those movements. For example, when dancers floated at various elevations in front of the upstage video wall, we wanted them to have an ethereal quality that was enhanced by combining their motion with high-density X-Particles emissions that were distorted by TurbulenceFD plumes. That gave the particles a movement that dynamically interacted with the edges of the LED wall.

How did you get the many things happening on stage to come together so seamlessly? Michael: We worked closely with the choreography and lighting teams to coordinate moments. With an LED wall that size, it’s easy to drown out other things that are happening on stage. Talking about when and where we needed negative space helped us make sure we were working in concert with everything else that was happening.

Explain a bit about how you used scale and forced perspective to at times make the stage appear much larger than it was. Roy: Scale is always important to us. If you get it wrong, the content winds up having a Spinal Tap effect [where miscalculations mean the band in the mockumentary winds up with a miniature Stonehenge]. We always take care to make sure that scenic content is built to scale.

Dancers sometimes use set pieces, like a half pipe for skateboarders during “What Do You Mean.” Talk about how you extended those into the screen content. Ryan: Seamlessly merging scenic with LED content is a tool we’ve employed before on some of the award shows we’ve worked on. With limited budgets and timelines, scenic pieces sometimes need to be supported by content in order to feel truly transportive. In this particular case, having a half-pipe on stage was just such a cool and fun idea, and we knew we wanted to create an environment that made sense while also helping the half-pipe feel more dimensional by giving it a context. To make sure the scale was correct, we first brought the fabricator’s model of the half-pipe into Cinema 4D.

What are you working on now? Michael: We’ve got about a half dozen shows launching in the next month, so stay tuned.

news-6904 | Tue, 19 Dec 2017 11:06:07 +0100 | A Gaming Experience Like No Other
https://maxon.net/en-us/industries/advertising-design/xbox-one-x-trailer/
Blind explains pixel threading and other aspects of the Xbox One X release trailer.

news-6858 | Fri, 08 Dec 2017 12:23:46 +0100 | Artist Portrait: Humberto Rodrigues
https://maxon.net/en-us/news/case-studies/advertising-design/article/artist-portrait-humberto-rodrigues/

At his Petit Fabrik studio, this very talented artist uses Cinema 4D to create impressive character animations for adverts and industrial projects.

As a partner and head of CGI at the Brazil-based interactive studio Petit Fabrik, Humberto Rodrigues is well known for exceptional character animation. Recently, he and his team wrapped up work on a new app called 18 Histórias [18 Stories], an interactive book of Bible stories for the South American Division of the Seventh-day Adventists.

Several animated, 3D trailers were created to promote the app, which took nearly two years to complete and features more than 180 images made and rendered in Cinema 4D. Here Rodrigues, who has worked on many international projects, including leading a 3D team for Samsung’s first game, Galaxy 11, explains the making of one of the most ambitious trailers about the story of David and Goliath.


Describe Petit Fabrik and how you got involved with the studio. H.R.: We are a small company with five partners, including me. It was started about seven years ago and I joined in 2011. All of us work together on the creative side, but we also have our specialties. I met my partners when I visited the studio to say hello and they asked if I wanted to work on a 3D project. It was a message about the importance of organ donation, and one of my professors saw it and told me that she’d lost her nephew that week and the video helped her see the need to donate his organs. People often think the work we do is just for entertainment, but I think that we can do much more.

How did you become an animator and director? H.R.: I’ve always liked art. I grew up on a small island in the Amazon called Parintins. There were no formal art schools there, but there was a folklore festival every year and artists who lived there taught each other. I learned a lot from those people, including a few artists from Italy who had classical art training. I got my interface design degree from FUCAPI Technical College in Manaus in 2008 and then went on to do post-grad work in animation at Pontifical Catholic University in Rio. I got into CGI because of my relationship with one of my professors, and I went to France for an intensive, two-week summer program at Gobelins, a great school for animation.

That experience really changed the way I saw my work and CGI. It’s easy to get very technical with computer graphics and worry about how to achieve things. That is important. But I realized how important it is to let ideas mature and understand the feelings that need to be communicated. I saw how movement communicates feeling and how colors and shapes can be feelings—like angles are more aggressive than curves. That’s been really important in my work.

How did Petit Fabrik get the job of making the 18 Histórias app? H.R.: We’ve worked with Adventist Hospital in the past, including on an animated film that tells the story of how the hospital started out as a boat with doctors on it before moving to a small house and growing from there. We actually proposed this project to them, explaining that we would make 3D illustrations that would behave like e-books for an app. They thought it was a great idea. At the time, we didn’t know it would take two years, but there was a lot of development time involved with making hundreds of 3D characters and working with illustrators to paint over everything in 2D to make things look more organic.

The characters in the David and Goliath trailer are very striking. Talk about how you made David. H.R.: This trailer was very special. We really put a lot of time into modeling and texturing the characters. David is the most human character we’ve ever made, and it was very challenging. Our modeler gave me the mesh he made of David in Cinema 4D and I used the sculpting tools to create the details. I gave him the look I wanted by painting dirt layers over his skin and creating a lot of hair on his face, head and chest. Even his pores are showing sometimes.

How did you create the bear and the lion?H.R.: The animal characters were challenging too. We did a lot of testing for every aspect of the bear. Its teeth have a little bit of reddish yellow in them, and you can see they are also translucent. The rig we made was very good because we needed to control the tongue and lips. Bears have very squishy lips.


I used C4D’s Hair for the lion, and I had to get the physics right and really style its hair so it points in different directions, out and away from its eyes. The design is unique and beautiful, and we used reflections inside the eye to get that red, cat’s-eye effect.

Cinema 4D is not often used for character animation. What was your experience like? H.R.: I have a friend who works at ILM [Industrial Light & Magic] in San Francisco. He says that when people say C4D isn’t usable for characters, he tells them about me. I would say, though, that C4D is fast and easy for character animation and would work for anyone because the tools are all there whether you are working in Cinema or using a pipeline that includes other software. I like how you can watch what’s happening in real time when you’re animating rigs. We optimize our rigs so we can have many characters in the viewport and still have it play smoothly when we are animating. And it was easy to teach to animators who were working with us.

What is Petit Fabrik working on now? H.R.: We’ve got a few different things going, but one is a longer-term project, a show called Lupita. It’s about a little baby girl who is discovering the world of grownups. It’s a challenge because the budget is low and we are doing 13 episodes, each seven minutes long. Our goal is to keep things simple while still looking very good. She learns something about how the world works in each episode, and it will be on the public education channel in Brazil in the spring.


Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

news-6684 | Wed, 01 Nov 2017 10:31:49 +0100 | Studio Website in 3D
https://maxon.net/en-us/news/case-studies/advertising-design/article/studio-website-in-3d/

Web design doesn’t always have to be 2D: Hybrid Forest uses Cinema 4D and WebGL to create a picturesque 3D landscape for its own website!

The Stockholm, Sweden-based creative studio Hybrid Forest specializes in the production of VR worlds and 360° videos and deals with highly topical subjects that offer the young team endless design possibilities and creative freedom.

To test state-of-the-art methods of displaying 3D graphics in web browsers, the team experimented with WebGL. Still in its infancy, WebGL uses the graphics card’s power to render 3D graphics directly in the browser, so users don’t need any extra browser plug-ins. Since Hybrid Forest needed an innovative website that properly reflects its field of specialization, the team built on the results of these WebGL experiments to create the official company website.

For the design, the team at Hybrid Forest used Sweden’s idyllic nature and the team’s own fascination for 3D as inspiration: the site shows a 360° view of a picturesque landscape with a lake surrounded by mountains and a typical Swedish wooden house at the edge of a forest. If you look closely enough you will even see an animated bear.

The entire scene and all assets in it were modeled by 3D artist Ashley Reed using Cinema 4D. He made sure to keep the number of polygons low to create a stylized low-poly look and also optimize the scene’s performance for less powerful end user devices. The Polygon Pen tool was an important tool for the quick and easy creation of geometry. The Polygon Reduction tool was also used to reduce the complexity and triangulate the geometry of the mountains, which were created using the Sculpt feature. This step also prevented issues that can arise if non-triangulated assets are used in WebGL. By deleting the Phong tag, Ashley was able to accentuate the low-poly look because the polygons were no longer rounded by a soft surface shading.
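The Polygon Reduction tool handled the triangulation itself; purely as a toy illustration of what that step means, convex polygon faces can be fan-triangulated like this (`triangulate` is hypothetical, not the C4D tool):

```python
def triangulate(faces):
    """Fan-triangulate convex polygon faces (lists of vertex indices) into the
    triangles that WebGL requires: each n-gon yields n - 2 triangles sharing
    the face's first vertex."""
    tris = []
    for f in faces:
        for i in range(1, len(f) - 1):
            tris.append((f[0], f[i], f[i + 1]))
    return tris
```

A quad thus becomes two triangles and a pentagon three, which is also why deleting the Phong tag makes each facet read as a distinct flat surface.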

Thanks to Cinema 4D’s ease of use, the team was able to concentrate entirely on expressing their creativeness: “Cinema 4D was the perfect tool that allowed us to create a great symbiosis between the modeler and me, the programmer,” explains Daniel Mayor, game engine and web developer at Hybrid Forest.

A particular challenge was integrating the assets created in Cinema 4D in WebGL on the Website. They were first exported as OBJs from Cinema 4D and a special tool was used to save them as JSON files. These files contain text that is readable for Web applications and can be integrated into the Website with the help of the Three.js JavaScript library.
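The article doesn’t name the conversion tool, but the idea of the OBJ-to-JSON step can be sketched roughly like this: flatten a triangulated OBJ into plain vertex and index arrays of the kind a Three.js BufferGeometry can consume (`obj_to_json` is a hypothetical helper handling only a minimal OBJ subset):

```python
import json

def obj_to_json(obj_text):
    """Convert triangulated OBJ text into a JSON string holding flat
    position and index arrays for use by a WebGL/Three.js front end."""
    positions, indices = [], []
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            positions += [float(x) for x in parts[1:4]]
        elif parts[0] == "f":
            # OBJ indices are 1-based and may carry /uv/normal suffixes.
            indices += [int(p.split("/")[0]) - 1 for p in parts[1:4]]
    return json.dumps({"positions": positions, "indices": indices})
```

On the browser side, such arrays map naturally onto Three.js position attributes and index buffers.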

When the scene was finally up and running in the browser with WebGL, the team ran into a limitation of the technology that prevented realistic shadow casting in the browser, so they deactivated the shadows in the scene entirely. Coincidentally, the light in combination with the scene’s low-poly look generated shading that gave the impression of cast shadows. As an Easter egg, they made the position of the sun change according to the time set on the user’s computer. Visitors to the site can also adjust the position of the sun using an interface and switch between day and night lighting.
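The site’s actual time-to-sun mapping isn’t described; a minimal sketch of one plausible approach is a stylized sine curve over the day (not an astronomical model, and `sun_elevation` is hypothetical):

```python
import math
from datetime import datetime

def sun_elevation(now=None):
    """Map local clock time to a stylized sun elevation in degrees:
    -90 at midnight, 0 at 6:00 and 18:00, +90 at noon."""
    now = now or datetime.now()
    hours = now.hour + now.minute / 60
    return 90 * math.sin(math.pi * (hours - 6) / 12)
```

The same elevation value could drive a directional light in the WebGL scene and toggle the day/night lighting when it drops below zero.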

The launch of the innovative website was a complete success for the team at Hybrid Forest: the project was presented at an inspirational event for designers and was nominated for a design award at Awwwards.

Hybrid Forest was impressed by Cinema 4D, and an increasing number of staff members now use the software. The team also conducts Cinema 4D workshops on a regular basis. 3D artist Ashley Reed is a big fan of Cinema 4D’s ease of use: “Working with Cinema 4D is incredibly fun because it lets me concentrate entirely on the realization of my ideas.”

news-6669 | Fri, 20 Oct 2017 10:07:02 +0200 | Switchblade – Motorized Gladiators
https://maxon.net/en-us/news/case-studies/games/article/switchblade-motorized-gladiators/

A game in which gladiators in a future world battle each other in vehicles: ten differently armed vehicles battle each other in a vast arena, and all ten were designed with Cinema 4D.

A battlefield filled with obstacles and unexpected elements in a not-too-distant future. Vehicles armed to the teeth compete explosively. Switchblade is a game from the English publisher Lucid Games that mixes familiar themes with an explosive range of action. Of the 15 fantastically designed vehicles, 10 were designed entirely by Michael Tschernjajew, a talented German artist who uses Cinema 4D to bring his imagination and design talent to fruition.

Michael and Lucid Games came together quite unexpectedly. Michael’s work on ArtStation and Behance reflected the mechanical style Lucid was looking for, and after ordering a few samples of his work, they gave him the job of designing vehicles for Switchblade. The briefing was simple: the game needed armed, four-wheeled vehicles for motorized combat sport with a unique, heroic look.

To avoid building each vehicle from scratch, maintain stylistic consistency and speed up his workflow, Michael created an assembly kit made up of various components. “Things such as antennae, suspension and sensors were created according to the client’s wishes and I had more time to work on the actual design,” remembers Michael.

“While creating the designs I worked a lot with the new SSAO and with OpenGL. The viewport renderings looked just like the models displayed by the actual game engine, which gave the client an outstanding impression of the final models at a very early stage.”

“The ideas were realized using Cinema 4D’s modeling tools, which offered the right tool for basically every task: The new Knife tools, the Magnet tool, Edge Smooth and Polygon Shift features all gave me absolute control for modeling. This made it easy to turn concepts into reality!”

Since Lucid works with Maya, an exchange format had to be agreed upon that would let the models be imported seamlessly. “We decided to use the FBX format since Cinema 4D has an integrated exporter for this format. The team at Lucid was able to easily import and edit the models I had created in Cinema 4D,” says Michael.

Michael worked for about a year on Switchblade and, now that the project is complete and the game is heading to retailers, his assessment of Cinema 4D is that it’s “… fast, efficient and indispensable for bringing my designs to life!”

A number of robots are happily engaged in a series of repetitive tasks, completing each cyclic movement with clockwork precision. But when the system begins to accelerate, the poor automatons are pushed beyond their operational limits, with grisly consequences …

Vicious Cycle is a personal project by director Michael Marczewski, who works as a motion designer at ManvsMachine in London (the studio responsible for the stunning Versus, commissioned by MAXON for the release of Cinema 4D R18). The film, which was completed entirely by Michael himself over the course of a year in his spare time, was a Vimeo Staff Pick and has been entered into several short film festivals.


"I've always been intrigued by intricate mechanisms," he comments, "and I began to play around and make some in Cinema 4D. Then the idea of connecting them to helpless robots came, and it evolved from there. My initial idea was to make a fake instructional video with the robots acting out various tasks to demonstrate how to do them but then everything starts to go wrong. I think the malfunctioning allows for a lot of comedic moments in the film."

The amazing thing about Vicious Cycle is that it was all done with just a handful of keyframes and not a single IK rig. Instead, Marczewski relied almost entirely on Cinema 4D's rigid body dynamics. "I designed the [mechanisms] to be simple yet effective but also look visually interesting. I set the scenes up, added a small number of keyframes here and there, and then let the dynamics in the scene play out."

Around 80 percent of the motion is dynamic but Michael admits that some moments had to be 'faked' by manually animating part of the scene, or baking the simulation and tweaking the keyframes. He also used cuts in the edit to switch between dynamic motion and faked animation.

Marczewski knew what motion he wanted the robots to perform and started by sketching out how each mechanism worked. "The trick was to link those two things together – it was done through a lot of trial and error! Each scene had to be approached a little differently to make sure the motion looked smooth and natural."

A typical setup starts with a keyframed object, such as a spinning cylinder with a Collider Body tag acting as a motor. However, Michael often used an animated object instead of a motor, as many of the setups use repeating keyframes and it was easier to keep track of the animation. A variety of Connectors were then used to link everything together. "The simplest setup was the waving robot in the intro," he says, "and the most complex was probably the baseball scene."

The film begins with a few rudimentary activities, and then we see a robot with the Sisyphean task of carrying batteries up an endless conveyor belt. The arms are rigged with springs in the shoulders, elbows and wrists, enabling it to carry its cargo. However, while only the legs are being driven, the robot remains suspiciously vertical. "His torso is kept upright with a connector and spring linked to the floor," admits Marczewski. "Looking back, I should have had a visible supporting pole to the pelvis."

Next, a serving robot collects plates of sushi and drops them onto a rotating carousel – and it's all done with physics. "The only thing in this scene that is animated is the sushi bowls coming down the tube to land on the tray one at a time. It was just too tricky to get a mechanism to work that would release a sushi bowl when the tray came under the tube. It would have been possible, I think, if I had had more time."

However, the first real cheat comes in the segment where a robot uses a chainsaw to cut chunks off an endless log. As the blade passes through the log, a Boolean object is used to cut a slice out, and the new piece is then made dynamic so it falls off. Marczewski also uses some sleight of hand in the scene with the robots hitting and catching baseballs. First of all, despite Cinema 4D's support for Aerodynamics and Wind objects, the balls aren't really being held aloft in the tube by air pressure. "I had to animate this bit," he concedes, "purely because the ball has to travel accurately and directly into the catcher's hand."


The striking of each ball was also animated manually in order to precisely guide the ball into the catcher's hand. Once the ball is caught, a keyframe is used to drop the ball back onto the track at the right moment. "The motion of the catcher's body is keyframed," acknowledges Marczewski, "but everything else in the scene is dynamic: the balls on the runs, the mechanism for releasing a new ball, the little see-saw that allows an extra ball through each time, and the swinging motion of the batter. The scene was quite fiddly because of the ball run – it was a chain reaction, with each part relying on everything else working."

Dynamics was largely responsible for the robots' natural-looking movement – with subtle jiggles and rebounds. "I used springs with the connectors. I just had to fiddle with the rest angle, strength and damping." To get the dynamics to work properly, Marczewski also used the same real-world scale for each setup, as if each was a miniature, desktop-sized scene. Then, for his settings, he used a value of around 15 Steps Per Frame, with Maximum Solver Iterations slightly higher.
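To see why more solver steps per frame help, consider a toy explicit-Euler free fall over a single frame: subdividing the frame into more sub-steps brings the result closer to the exact ½gt² distance. (A simplified illustration of sub-stepping in general, not Cinema 4D's actual solver; `fall_distance` is hypothetical.)

```python
def fall_distance(frame_time, steps, g=9.81):
    """Integrate free fall from rest across one frame with explicit Euler,
    subdividing the frame into `steps` solver sub-steps."""
    dt = frame_time / steps
    x = v = 0.0
    for _ in range(steps):
        x += v * dt   # advance position with the current velocity
        v += g * dt   # then update the velocity
    return x
```

Doubling or tripling the sub-step count shrinks the gap to the analytic result, which is the same trade-off (accuracy versus computation) behind raising Steps Per Frame for fiddly chain-reaction scenes.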

The automatons' recurring duties are interrupted when the driving mechanisms inexplicably grow faster and faster. Unable to cope, the cycles are broken, often with gruesome consequences: limbs are removed, torsos splintered, and heads separated from bodies. Marczewski explains that the little spurts of robot 'blood' were produced using an emitter to generate tiny dynamic cubes. "Sometimes I needed to use a Cloner as an emitter, simply by keyframing the number of the clones, and the Dynamic tag on the cubes had an Initial Velocity to create the spurt."

One of the trickiest sections was with the pickaxe-swinging robot, which dislodges a chunk of mineral that flies out and hits his colleague in the head. The dizzied robot spins around and ends up on the rock just before the next, fatal blow. "This was a bit of a nightmare to do entirely with dynamics," the artist admits. "I couldn't get the second robot to land on the rock how I wanted. So, in the end, I used a cut in the edit and created two versions, which gave me a lot more control of the situation. In the film, the edit is pretty seamless – hopefully!"

In terms of modelling and rendering, the bodies, arms and legs of each robot were UV-mapped (under duress: Marczewski hates doing UVs and texturing), and then a texture was created in Photoshop to add subtle details and provide worn edges. Each scene was then lit with a traditional three-light setup of key, fill and back lights, although some setups – such as the baseball scene – proved trickier because they had many points of interest to focus on. In those instances, a few smaller light sources were used to add specular details. The scenes were then rendered using Solid Angle's Arnold renderer, with motion blur applied at render time rather than in post. "I find the results much more effective."

For Cinema 4D users inspired to have a go with dynamics themselves, Marczewski suggests they "just play around and keep going! If you enjoy it, you'll get a lot of satisfaction from problem-solving and figuring out how to get objects to act realistically."

"A lot of beginner dynamics videos I see always look too slippery to me," he continues, "so I would advise to pay attention to the 'Friction' option in the Dynamics Body tag – I normally have it set quite high, between 60-80%. Also, while I'm talking about the Dynamics tag, try to use 'Box' as the shape as much as possible, It's limiting with complex shapes but it gives the best results from my experience, and it keeps your playback pretty quick. I never use the 'Moving Mesh' shape unless I really need to!"

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

news-6620 | Fri, 13 Oct 2017 12:13:55 +0200 | Blood Road
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/blood-road/

Johnny Likens talks about art directing a new film that follows a mountain biker down the Ho Chi Minh Trail.

Blood Road (https://www.youtube.com/watch?v=XCzVqiN950M), a Red Bull Media House production that premiered at film festivals earlier this year, follows ultra-endurance mountain biker Rebecca Rusch as she bikes 1,200 miles through Vietnam, Laos and Cambodia along the Ho Chi Minh Trail. Accompanied by her riding partner, Huyen Nguyen, a Vietnamese mountain bike racer, Rusch hopes to find the site where her father’s plane was shot down 40 years ago during the Vietnam War.

Art director Johnny Likens (www.johnlikens.com) worked closely with director Nicholas Schrunk on the documentary. He also used Cinema 4D combined with other software to create the film’s opening graphic sequence, as well as all of the information graphics, the custom map and terrain visuals, and the historical archive footage treatments.

Over the years, Likens has worked at many top studios in New York, including The Mill, Framestore and, currently, Method Studios. His many credits include creating opening title sequences for HBO’s The Night Of and FX’s Taboo, as well as the feature film Collateral Beauty. He has also earned a Daytime Emmy nomination for Outstanding Achievement in Main Title and Graphic Design, and recently made Shots Magazine’s 2017 list of Rising Stars Under 30.

Here Likens talks about his role in the making of Blood Road. See his motion design reel here:


How did you get involved with the Blood Road project? JL: Just out of school, I entered a motion design competition called Cut&Paste. That was where I first met Nick Schrunk, who would soon go on to land a job at Red Bull. Over the years we’ve collaborated on many Red Bull projects together, so when Nick was up for directing his first feature-length doc, I was one of the first people he brought on board.

Describe the film’s emotional opening sequence. JL: I spent the majority of my time on this project crafting the opening sequence. It was such an important part of the film because it needed to communicate so much of the backstory, as well as introduce Rebecca, her father and the timeline of events. We built the opening and tore it apart probably eight to ten times. There were a lot of versions that were scrapped before we found the one that worked best.

How did you work with the director, and what was his vision? JL: There are a lot of hard-hitting statistics that many people don’t know about the Vietnam War. Nick and I talked in depth about how important it was to communicate some of those things without being biased, disrespectful or inaccurate. As I developed the graphics, it was clear that some visuals needed to be more literal whereas others could be more abstract and representational. For example, I created a rising or falling US flag inside an architectural space evocative of a war memorial to communicate that America was deeply divided over that war and in quite a state of turmoil during those years. Conversely, to make the death toll statistics truly hit home, I made sure there was a clear and accurate visual showing the difference between the 58,000 US soldiers killed in the war and the 3,000,000 Vietnamese killed.

How long did you work on this? JL: On and off for more than two years, and the project was a long time in the making because there was a ton of planning and preliminary design work that started even before the shoot. In the last couple of months of production, I enlisted a talented friend, Wes Ebelhar (http://www.wesleyebelhar.com/), to help me out with the historical map sequences and other animated shots. Wes is a total design and animation powerhouse, and I couldn’t have been more satisfied with the quality of work we were able to achieve.

What did you enjoy most about working on this? JL: My favorite part of this project, and all of my projects, really, is the initial design phase. That’s when most creative decisions get made, and it’s very exciting thinking about all of the different ideas and creative possibilities. I remember Nick and I going back and forth about what role the design and animation would play in the film, and then I’d go away for a few weeks and come back with a ton of design exploration to show him and we’d go from there.

Meleah Maynard is a writer and editor in Minneapolis, Minnesota.

]]>news-6487Wed, 09 Aug 2017 17:42:57 +0200When a Little Nudge Makes a Big Impacthttps://maxon.net/en-us/news/case-studies/advertising-design/article/when-a-little-nudge-makes-a-big-impact/Two 3D artists, four weeks’ work and maximum creative freedom: these are the right working conditions for creating a VR simulation of a Rube Goldberg machine in Cinema 4D. Liberty Global is a global media and telecommunications company and one of the world’s largest broadband providers, with more than 40,000 employees. Consumers are more familiar with subsidiaries such as Unitymedia. In 2011, Liberty Global embarked on a search to find out how it could expand its innovative strength through greater employee participation. The company decided to create an innovation platform named Spark where employees could share and discuss their ideas with the entire company.

Liberty Global asked the motion design studio Kreukvrij to produce a promotional film for their informational events and PowerPoint presentations to make Spark more appealing to their staff. The only requirement was that the spot be innovative and finished within four weeks.

43297

Kreukvrij’s Art Director Olaf Gremie and Executive Producer Martijn Gademan from Liberty Global worked together with 3D artist Lars Scholten to develop a creative concept for visualizing the idea behind Spark: Cinema 4D was used to animate a Rube Goldberg machine that solves a simple task in a complicated way, illustrating how a little nudge can make a big impact. What’s special about this project is that it’s a physically correct simulation of the nonsense machine in VR, created with almost no keyframe animation. To get the most out of VR, the team created the scene differently than for a traditional film. The viewer would have to rotate their angle of view in order to follow the entire chain reaction, which meant the artists had to make the complete environment visually interesting for the viewer.

The project’s biggest challenge was to make the Dynamics simulation work in a single go. To achieve this very ambitious goal, the 3D artists broke the machine down into individual sections that were saved in separate scene files and added to the master scene using XRefs. They then varied the settings countless times in the Dynamics’ expert settings for each scene until the machine worked as desired in each section. As soon as they had the result they wanted they baked the simulation as an Alembic file to make sure the fruits of their endless labor were saved. This ensured that the animation would still work if the Dynamics cache was accidentally deleted, and also reduced the size of the master scene by more than 10 GB to a slim and trim 71 MB! Also, if the simulation had not been baked, even the smallest change to the scene would have caused Dynamics to behave unpredictably. “Alembic really saved our lives,” says Lars Scholten about the time-consuming work on the Dynamics simulation. The extensive use of XRefs made it possible for the 3D artists to reduce the scene’s complexity and achieve better performance for the simulation in the Viewport. This meant that they were able to work efficiently on the simulation and let the iterations run faster until the desired result was achieved.

To speed up the modeling process, the team used plans of various objects from the Internet for reference and used plant models from Turbosquid. They also observed each model’s distance from the camera to avoid modeling unnecessarily detailed objects.

Cinema 4D’s MoGraph toolset was also indispensable for the project; its Cloners were used to position numerous objects throughout the scene, including the holes in the wooden calendar and the colored discs, which were placed with the help of a Random Effector. The MoGraph Cloners’ render instances were a great help for the CPU-intensive Dynamics simulation of the marbles, as Lars Scholten describes: “We have about 5,000 marbles in the scene that would have overburdened any computer without instancing.”

To speed up the calculation of Dynamics for complex objects such as the wooden airplane, the team used low-poly models that were added to the scene as invisible proxies.

“The project was good fun, because everything works just like a real Rube Goldberg machine and there were many variables that could completely mess up the entire animation,” says Lars Scholten looking back at his work for Liberty Global.

For him, Cinema 4D turned out to be the perfect tool for the project, which was completed in only four weeks: “Cinema 4D is a powerful tool for CG artists who want to accomplish fast and great-looking results, even with a very small team and limited time.”

]]>news-6446Thu, 27 Jul 2017 10:39:29 +0200Audio Creatureshttps://maxon.net/en-us/news/case-studies/advertising-design/article/audio-creatures/Tim Clapham and Mike Tosetto explain how Luxx helped Director Ash Bolland turn the Sydney Opera House into a natural wonder for Vivid Live 2017. Every summer, the Vivid festival briefly transforms Sydney into an enchanting wonderland of light, color and music. At the center of the event is Lighting the Sails, the Sydney Opera House’s tradition of turning its unique architecture, which resembles sails or shells, into a breathtaking canvas.

This year’s Lighting the Sails production, Audio Creatures, was directed by Sydney’s Ash Bolland with soundscapes by renowned Brazilian sound designer Amon Tobin. The elaborate projection-mapped show, which required 16 projectors, reflects Bolland’s interest in the relationship between nature and humans and offers viewers a glimpse of abstract creatures, otherworldly plants and a futuristic chrome world.

43070

Tim Clapham, creative director of Luxx, a Sydney-based motion graphics and 3D animation studio, says Luxx was very privileged to be invited by Spinifex Group to create visuals for Audio Creatures. Working with Mike Tosetto, creative director of Sydney motion design studio Never Sit Still, and using Cinema 4D, After Effects, V-Ray and X-Particles as well as custom code, Luxx collaborated with Spinifex to create about 8 minutes of the 15-minute production in seven weeks.

Clapham: It was a huge honor to be asked to do this. It gets millions and millions of views, and is probably the most viewed projection mapping in the world. It was a once-in-a-lifetime opportunity. We had a couple of other great artists working with us and everybody put their all into it. You could tell people really knew it was a privilege. The Spinifex Group and Ash Bolland approached Luxx independently to collaborate on the project. It was a wonderful coincidence and great to know that both the production company and the director had plenty of faith in our abilities. The project was pretty vast and Luxx produced around eight minutes of the 3D content, all rendered at 4K. That’s a lot of material to create in seven weeks, which required many long days and nights to craft.

What kind of direction did you get from Ash Bolland?

Clapham: Ash gave great direction and was a pleasure to work with. He had some very specific ideas but he was open to our interpretations, too. If there was a cool way to execute his concept, he was open to that. Ash also had some wonderful references for how he thought the creatures should look, and it was our job to interpret them. He created style frames for each creature, some montaged in Photoshop using 3D renders, and others were hand-drawn sketches. We worked together to rebuild those fantastic creatures in Cinema 4D.

Tosetto: He wanted the creatures to feel big and heavy, like Godzilla stomping through a city. The opera house is huge, and the creatures needed to convey weight through slow and substantial movement.

What kinds of creatures did you make?

Clapham: We built many of the creatures and components, including a butterfly, a chrysalis, a jellyfish, ice pods, an octopus, parasites, shells and a magnetic core. Each creature presented its own unique challenge. One shot in particular starts with a chrysalis tearing open to reveal a butterfly. Ash wanted to reveal the butterfly’s wings as individual segments from the wing patterns. We ran Cloth simulations on each segment, creating flags flapping in the wind as they unfurl to reveal the butterfly wings.

Talk about the challenges of getting everything to fit on the opera house’s sails.

Clapham: The opera house’s sails are an incredible canvas but it is quite an unusual shape, kind of like pieces of an orange. We needed the animation to be constrained within that shape so it didn’t break out and kill the illusion. We had a 3D model of the opera house and the hero projection in position but we needed to make sure all of the elements fit perfectly. We did a lot with deformers in C4D, and also used reshaping tools in After Effects to ensure everything was pixel perfect to the shape.

We explored multiple avenues in Cinema, which gave us the flexibility to experiment with the ideas we had. For example, when we were working on the magnetic core, which is more like geometry and swirling shapes than a creature, we thought it was going to be a simple shot but it turned out to be really challenging. We used a custom Python Effector that allowed us to arrange objects using the Fibonacci sequence. [A series of numbers wherein the next number is found by adding up the two numbers before it.]
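Cinema 4D's Python Effector lets artists drive clone placement with arbitrary code. Luxx's actual effector isn't shown in the article, but the kind of golden-angle ("Fibonacci") arrangement it describes can be sketched in plain Python (the function name and spacing parameter are illustrative, not from the production code):

```python
import math

def fibonacci_spiral(count, spacing=1.0):
    """Arrange `count` points in a Vogel (golden-angle) spiral.

    Each point is rotated by the golden angle (~137.5 degrees) from
    the previous one, with the radius growing as sqrt(n) so the
    points stay evenly packed -- the classic Fibonacci phyllotaxis
    pattern seen in sunflower heads and swirling arrangements.
    """
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # about 2.39996 rad
    points = []
    for n in range(count):
        radius = spacing * math.sqrt(n)
        theta = n * golden_angle
        points.append((radius * math.cos(theta), radius * math.sin(theta)))
    return points

# In an actual Python Effector, each clone's position would be set
# from the point matching its MoGraph clone index.
```

The golden angle is derived from the same ratio the Fibonacci numbers converge to, which is why spirals like this read as "Fibonacci" arrangements.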

Can you describe the projection setup?

Tosetto: The projections were split into four separate parts and projected all the way across the harbor. Four projectors were used for the two sails on the left, six projectors for the center sail, three projectors for the right sail and three projectors for the two smaller sails for a total of 16.

What did you think when you saw it actually being projected?

Clapham: Ash really had a crazy vision. He walks through life seeing things most of us take for granted but he stops and looks and sees so much beauty in nature and he sees how some creatures are almost like alien beings. Watching this, I thought the creatures felt really accessible, abstract and beautiful. We went out on a limb with this because the whole subject matter is abstract and bizarre but we kept it fun and light-hearted with creatures that aren’t spiky and aggressive, but kind of fluffy and bouncy.

]]>news-6417Mon, 10 Jul 2017 12:45:54 +0200Ghost In The Shellhttps://maxon.net/en-us/industries/movies-vfx/ghost-in-the-shell/Territory Studio Creates Futuristic Cityscape for Ghost in the Shell Using Cinema 4D news-6411Mon, 10 Jul 2017 11:17:00 +0200VR Worlds for Renaulthttps://maxon.net/en-us/news/case-studies/visualization/article/vr-worlds-for-renault/VR worlds are also primarily made up of images that first have to be created and rendered: something that can be done perfectly using Cinema 4D. And if the deadline’s a little tight you can always count on Google Zync to help you get your rendering done on time.Virtual Reality has been a hot topic over the past few years and artists who create content for 360° presentation are often moving through uncharted territory: the challenges that have to be met range from defining the proper angle of view to technical solutions and scaling for different end devices. This is why it’s important to have a tool you can count on – like Cinema 4D!

Scorch Films, one of the top names for clients who expect perfection in the technical realization of their designs, is located in the heart of London’s Soho district. Scorch’s portfolio includes work for brands such as Pepsi, BBC, Rimmel, Hilton, Unilever, Hyundai, Subway and many more – an illustrious client base that now also includes the French automaker Renault.

What Renault wanted Scorch to do was find a new way of presenting their Megane, Zoe and Clio model lines to the general public and letting people learn more about these products. Scorch developed a unique concept for the campaign: “We created three unique VR environments to communicate the personality of each vehicle,” explains Ramsay. “In these films, participants can look through the windows of each car and at the surrounding scenery – every few seconds the features of the landscape move and shift, delivering a new perspective for the viewer to take in.”

Scorch used Cinema 4D to realize their concept and to create the models, lighting and textures. Early on in the project it became clear that a scalable expansion of the existing render resources would be required to render the project.

“The creation of VR content is a complex process, not to mention the complex and comprehensive renderings that need to be created,” remembers Ramsay. “Even the first tests showed that we would need a Cloud rendering solution in addition to our own resources to make sure that we finished on time.”

The solution they chose was Google’s Zync Render, which is integrated into the studio’s and Cinema 4D’s pipeline. This solution made it possible to master the massive render jobs efficiently and on time.

The real challenge was not the creation of the three environment scenes, demanding as that was, but the sheer amount of data they generated.

“In order to create a 360° view we used six cameras in each scene,” explains Ramsay, “each of which rendered 1024×1024 frames that were subsequently combined into a 360° 4K video. This was a lot of image material that had to be rendered!”
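The article doesn't spell out the stitching step, but the six-camera rig is a standard cubemap: six 90°-FOV square renders along the ±X, ±Y and ±Z axes together cover the full sphere, and each output pixel's view direction selects which face to sample. A minimal sketch of the face-selection logic (names are illustrative, not Scorch's pipeline):

```python
# The six axis-aligned faces of a cubemap rig. One 90-degree-FOV
# square render per face covers the full sphere of view directions.
CUBE_FACES = ("+X", "-X", "+Y", "-Y", "+Z", "-Z")

def face_for_direction(direction):
    """Return which cube face a view direction (dx, dy, dz) lands on:
    the axis with the largest absolute component wins."""
    axis = max(range(3), key=lambda i: abs(direction[i]))
    sign = "+" if direction[axis] >= 0 else "-"
    return sign + "XYZ"[axis]
```

An equirectangular stitcher then walks every pixel of the 4K output frame, converts its longitude and latitude to a direction vector, and samples the face render chosen by this rule.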

Zync Render was easy to integrate into the pipeline: “We used Cinema 4D’s Physical Renderer and took a lot of time for fine-tuning the lighting and the final renderings,” remembers Ramsay.

“When you’re working with such a tight deadline you often have to make compromises with regard to render times and the required render quality,” Ramsay continues. “With Zync Render in the production pipeline we didn’t have to make any compromises and were able to set up the scene exactly as we wanted. The files were uploaded to the Cloud for rendering and the finished results were sent back to our network, ready for final compositing in After Effects. With this solution we were able to stay on budget because we didn’t need any additional hardware or a render farm.”

“We also used the time advantage we gained by using the Cloud to add artistic variations and to fine-tune each scene until we had exactly the look we wanted!”

“With Cinema 4D and Zync, Scorch was able to deliver each video for Renault’s VR presentation in the desired quality and with consistent 4K resolution,” says Ramsay. “You need a lot of render power for 4K in VR. With technology advancing at a rapid pace we also need to be prepared for VR resolutions of 8K, 16K and even 32K in the future.”

“There are also even more technological advances on the horizon: for example, if you want to create VR presentations in stereoscopic 3D, render times will double! A separate film has to be rendered for each eye to achieve the required sense of depth. Thanks to the Cinema 4D pipeline and the easily scalable render power of Google’s Zync Cloud render service, Scorch is more than ready to meet these challenges.”

“Cloud rendering makes it possible for us to grab the render power we need when we need it,” concludes Ramsay. “With Zync we were able to get the most render power out of 50 computers – or more if we had needed them. We’ll definitely use this service in the future whenever required. Setting up such a render farm in our own studio would not be feasible and with tools such as Cinema 4D and Zync we’re always ready for any render job as well as the increasingly complex projects that come our way.”

]]>news-6409Fri, 07 Jul 2017 12:42:09 +0200SXSW Gaming Awards 2017https://maxon.net/en-us/news/case-studies/advertising-design/article/sxsw-gaming-awards-2017/After creating two very abstract clips for SXSW, Jeremy Cox and the team at Imaginary Forces want to forge a new path with their next piece – with Cinema 4D as their secret weapon! South by Southwest (SXSW) is a multi-disciplinary annual conference that includes a film festival, a music festival and various conferences. Awards for interactive media and games are also presented at SXSW. Jeremy Cox’s team at Imaginary Forces has been responsible for creating the opening title sequences for the awards for the third year running.

Jeremy’s search for inspiration for this year’s title sequence started with a retrospective of the history of video games and all the gadgets and devices that are ingrained in the minds of an entire generation. His research also included other sources of inspiration ranging from television shows such as Futurama® to film footage of the Rosetta mission. All of these sources crystallized to form the idea of an archaeological site on an asteroid – including innumerable consoles, joysticks and more buried in cosmic dust.

41251

“The animations for the previous two SXSW gaming events were challenging but were created using conventional methods. This time we wanted to enter uncharted territory. With Cinema 4D as our tool of choice we knew we were equipped to meet any challenge,” remembers team leader Jeremy Cox. “We used mood boards, production sketches and storyboards in an extensive pre-vis phase to perfectly stage the story. We wanted to use more photo-realism for this piece, which made it a much more ambitious project as a whole. Work began with the shaping of the landscapes. The basic shapes were created using the Sculpt feature, to which complex shaders with procedural noise functions in the Displacement and Bump channels were applied to create the asteroid’s surface look.”

“We used Substance Painter to texture the video game hardware and I was amazed at how quickly we were able to achieve the desired realistic look. The fact that Substance Painter didn’t offer connectivity to Cinema 4D didn’t bother us,” explains Jeremy.

41252

“We used Mixamo for almost all of the character rigging for the animations. The model of the astronaut was simply exported as an FBX file and uploaded to the Mixamo site. In no time we were able to select from hundreds of animations. The only animation that was missing was for the astronaut sliding down from the top of the joystick, so we created this one ourselves,” remembers Jeremy.

“We used the Physical Renderer for almost all renderings. We didn't use real Ambient Occlusion because this would have required too much render time. Instead, we faked AO with fill lights and several other compositing tricks to achieve the desired look while maintaining acceptable render times. We considered using external render engines but decided to use the Physical Renderer instead. The lion’s share of compositing was done in Nuke, with which we’re very familiar and have adapted to our needs,” says Jeremy.

“We needed six months to complete the project, two of which were used for the actual production,” remembers Jeremy. “The clip for this year’s event was much more complex than those in the past but it was fun meeting and mastering this challenge – also because Cinema 4D is a tool that practically never lets us down!”

]]>news-6403Thu, 29 Jun 2017 09:36:53 +0200Leviathan Leverages Cinema 4D to Contribute Extraordinary Video Content to New 150 Media Stream Public Art Initiativehttps://maxon.net/en-us/news/case-studies/advertising-design/article/leviathan-leverages-cinema-4d-to-contribute-extraordinary-video-content-to-new-150-media-stream-publ/Leviathan, a specialized creative agency working at the nexus of design, digital media, and interaction, was recently tapped to provide a sophisticated content management system and contribute video content to 150 Media Stream. The public art and “living sculpture” was revealed in conjunction with the recent opening of 150 North Riverside, Chicago’s exclusive, new 54-story commercial skyscraper. 40614

The project was a massive creative and technical undertaking that included Leviathan in collaboration with Riverside Investment & Development and architect Goettsch Partners among many others. The display system was designed by McCann Systems in cooperation with Digital Kitchen along with content curated by Riverside's Creative Director Yuge Zhou.
Now considered Chicago's largest video wall, the 150 Media Stream installation is permanently located in the lobby of 150 North Riverside. It features a “…3,000-plus square foot canvas…comprised of 89 individual vertical LED displays of various heights and widths, as well as integration of negative space between each blade…” and video imagery that is constantly changing and revolving. It will serve as a digital sculpture for tenants and visitors and feature commissioned works from both established and new artists from around the world.

Cinema 4D has been in use at Leviathan for the past seven years to create motion graphics on varied projects including the main titles for SundanceTV’s original series The Red Road, the Chicago Museum of Science and Industry’s Numbers in Nature interactive exhibit, the 2013 John Deere Product Launch, and more.

Leviathan leveraged Cinema 4D for the video installation primarily for the concepting and design phases of the project to efficiently create prototypes for a number of procedurally driven real time visual effects. “Cinema 4D was used in the 150 Media Stream production pipeline to help design integrated components of the sculpture in order to deliver an intelligent content library with elements that would refresh on a daily basis,” outlined Jason White, executive creative director at Leviathan. Senior creative director Bradon Webb added, “We chose Cinema 4D as our design tool because it allows us to rapidly explore the visual effects and at the same time know that we can later export the geometry and textures that could be used and converted into real time code.”

“One of the challenges we faced when creating content for 150 Media Stream was the sheer quantity of media needed to keep the installation fresh and interesting. We turned to Cinema 4D to help meet our demanding schedule and visualize numerous concepts quickly and efficiently,” Webb explained.

For the Natural Forces Snow effect, Leviathan created snowflake patterns using a series of cloner geometry objects resulting in unique radial patterns that were exported and instanced onto a particle system in the real time engine built using Derivative’s TouchDesigner. “We previsualized the ground layers in Cinema 4D using the MoGraph Shader Effectors’ displaced grid geometry, which gave the scenes enhanced depth by adding rolling hills and accumulated ground snow over time,” adds Webb.

In the Pixel Fountain concept the MoGraph module in Cinema 4D was used to create pattern animations that were rendered to sequences of black and white movies and used as triggers to drive the particle simulations in real time. “What Cinema 4D excels at is rapidly creating multiple variations of pattern animations,” said Webb. “These patterns were stored as a library and sequenced in real time, which helped streamline our workflow and save time.” For the Picture Window effect, Leviathan previsualized the motion of wall tiles opening and closing. Using a Fracture object and effectors in Cinema 4D gave viewers the impression of a moving wall, while many layers of geometry stacked together provided a complex layered look.
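Leviathan's real-time engine isn't shown, but the trigger scheme it describes, sampling a black-and-white frame and spawning particles where it is bright, can be sketched in plain Python (the function name and threshold are illustrative, not Leviathan's code):

```python
def emission_points(frame, threshold=0.5):
    """Scan a grayscale frame (a 2D list of 0..1 luminance values)
    and return the (x, y) cells bright enough to trigger particle
    emission on this tick."""
    points = []
    for y, row in enumerate(frame):
        for x, luminance in enumerate(row):
            if luminance >= threshold:
                points.append((x, y))
    return points
```

Sequencing a library of such pattern movies then amounts to swapping which frame sequence feeds the sampler each cycle, which is what made the pre-rendered MoGraph patterns reusable in real time.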

“We also used Cinema 4D to previs the motion of the glass shapes in Color Theory,” said Webb. “Being able to use the simple Cloner and Effector systems in Cinema 4D was a benefit that allowed us to define the motion we wanted and sell the concept to the client before executing the real time code.”

]]>news-6402Tue, 27 Jun 2017 16:02:14 +0200Stopping Abuse Before it Happenshttps://maxon.net/en-us/news/case-studies/movies-vfx/article/stopping-abuse-before-it-happens/Through films, videos and other media, the Hidden Tears Project aims to raise awareness about youth prostitution and trafficking, domestic abuse and more.When Jason Gurvitz and Jordan Marinov founded the Hidden Tears Project in 2015, their goal was to create high-quality content to raise awareness about issues such as gender inequality, human trafficking and domestic abuse. Working in collaboration with non-profits, experts on women’s rights and other issues, law enforcement, child advocates, psychologists, filmmakers, writers and many others, they have created films, videos and virtual reality productions that make clear the harsh reality experienced by too many women and girls.

Recently, Hidden Tears finished work on No Porn for Kids, a short educational video that aims to help children and their parents understand the need to talk openly about pornography, which is easily accessed using cell phones but difficult for young people to make sense of. Made using Cinema 4D, After Effects and motion capture, No Porn for Kids was commissioned by Culture Reframed, an organization founded by scholar/activist Gail Dines to “address pornography as the public health crisis of the digital age.”

In the video, young boys express regret that their dads never talked to them about sex, girls and how girls want to be treated. “Instead, I learned all of that from porn, but it wasn’t the truth,” the narrator says as a group of boys encircles a terrified and sobbing girl, some of them realizing the horror of what’s happening. The message: fathers and sons must talk about how girls are friends, sisters, wives and mothers to be cherished, respected and loved.

I asked Jason, Jordan and Jason Deparis, the freelancer who served as writer, producer, animator and director, to explain the making of No Porn for Kids, as well as how they hope their work can make a difference. Here is what they said:

Jason and Jordan, talk a little bit about your backgrounds and why you founded the Hidden Tears Project.

Jason: I founded Green Dog Films and have been making films for a long time. I directed a short anti-trafficking documentary called Until They All Come Home, starring Mira Sorvino, as well as The Submarine Kid, a horror movie called Avenged and other features. For me, Hidden Tears was really born out of the incredible shock of learning, through the trafficking documentary, that this was such a huge problem in the United States. I decided to put the skills and relationships I’ve developed over the years to use creating high-level content that can move the needle on the issue in a way that engages viewers to actively get involved.

Jordan: I am a dancer, actress, producer and choreographer for film, television, music videos and live concerts. I started my journey to end human trafficking and sexual assault when the girls were taken by Boko Haram. After doing our work individually on the issue, we co-founded the Hidden Tears Project to tell innovative and real stories about human trafficking, gender inequality and sexual assault. We gathered a team of writers and producers from House of Cards, Empire, Narcos, Sherlock and more, and gave them a three-hour presentation at the LAPD anti-trafficking unit downtown.

Talk about some of the other work Hidden Tears has done so far.

Jason: We’ve made two short films. Unseen Dances is about human trafficking; Jordan was co-director and choreographer, and it was part of the Red Sand Project. Tanya is about child trafficking and was written by Sam Forman of House of Cards and directed by Monica Raymund from Chicago Fire. We work with some really talented people, but we’re very grassroots. We are continuing to fundraise project by project as we grow.

Jordan: Right now we’re developing a series with a unit of San Diego’s special forces, detectives and human rights attorneys that focuses on rescuing girls, and some boys too, from gangs and pimps. We’re also doing quite a bit of virtual reality work for different companies. We have a narrative series that we’re doing in VR. It’s bilingual and it’s kind of like True Detective because it’s about two detectives trying to rescue kids. We’re also working closely with an appointee of the Obama administration who is working in the private sector on assault and human trafficking issues.

What was Culture Reframed’s concept for the No Porn for Kids video?

Jordan: Gail Dines speaks a lot about trafficking, which is very connected to the commercial sex industry. A lot of girls in porn are also victims of trafficking, and porn is getting more violent, so what used to be a small group of people’s preference has become normalized. This is what young people are seeing. Culture Reframed wanted to make something that could help stop the cycle of abuse.

40606

Jason Deparis, can you talk about your role as director and animator?

Jason D.: I am a television producer, animator and director, and I’m currently in the process of starting Infiniverse VR, a socially minded virtual reality production company. Not that long ago, I developed and produced a human trafficking documentary for MSNBC, and being on the front lines of that changed my life in ways that I can’t go back from. People think trafficking is not happening in our country, but it is and I want to do something to help. We got the original concept from Culture Reframed and I used that to develop a concept that was eventually accepted after focus-group screenings and revisions. It was a very low budget, but all of us believe in this cause and want to make a difference so we did our best.

How would you describe the look of the video?

Jason D.: They wanted a kind of 2D, hand-drawn look, and I figured C4D would be a great platform for that. What we tried to do visually was have something dark, but not so dark that it would make people feel hopeless. There is hope, and we wanted to show that. It’s sensitive subject matter because it involves pre-teens, and this video will be shown during talks with parents.

How did you handle the motion capture on a tight budget?

Jason D.: We got a script approved and then we had child actors come in and do their parts, but we could only do that one time. I had to perform a lot of the motion capture we needed myself, especially the facial movements for everybody. The animation was done in two passes: first the motion capture and cleanup, then additional animation on top of the motion capture. To do that, I brought the motion capture into C4D and animated what was needed.

How long did this take, and what did you find most difficult?

Jason: We worked on it for almost a year on and off. It was definitely a learning process for us.

Jordan: It’s hard when you’re working with clients who don’t understand why something they want, like having a character take a bite out of an apple, will take three days to animate. They’re not upset, they just don’t understand it, so a lot of explaining needs to happen. Also, we are experienced filmmakers, but animation is a whole other world for us and we’re learning. We set up a rigorous approval process and that helped.

How can people help support your work?

Jordan: People can follow Hidden Tears Project on Facebook, Twitter and Instagram. We have a blog, vlog and podcast that they can check out as we are always adding new content. They can tell their friends and spread the word so that this atrocity ends. We welcome partners and financial contributions so that we can continue to reach more people and support our partner non-profits. Join us in the fight to end trafficking, gender inequality and sexual assault.

We are working on creating a constant flow of innovative content to inspire and educate people around the US about how they can actually effect change in their communities, with local law enforcement, advocates, churches and others. By donating any amount that you can to Hidden Tears, you are directly influencing thousands of people who will be engaged to act across the country.

]]>news-6329Tue, 16 May 2017 12:47:55 +0200Everything Under One Roofhttps://maxon.net/en-us/industries/architecture/fuchs-vogel/Fuchs & Vogel uses Cinema 4D to create more than 100 renderings to visualize all variations for a series of prefab houses for the Layer Group.news-6298Wed, 10 May 2017 17:16:53 +0200The Bear Who Didn’t Need to Carehttps://maxon.net/en-us/news/case-studies/advertising-design/article/the-bear-who-didnt-need-to-care/Aixsponza created a price-conscious bear for an advert for the German energy company Thüringer Energie AG.
In times of rising energy prices, more and more consumers are keeping a close eye on how efficient their household devices are and making sure not to waste electricity.
The happy-go-lucky mascot of Thüringer Energie AG (TEAG) is different: he’s a slacker who doesn’t care how much energy he wastes – he leaves the refrigerator door open too long and dries his fur with not one but two blow dryers. Thanks to TEAG’s low rates he gets his electricity cheap and doesn’t have to worry about high electric bills anymore.
The bear had already been designed in 2D for an earlier print campaign, with a defined look and a single pose, and now had to be turned into a 3D character – just one of the challenges this project posed. The 3D model had to depict the bear in any pose and from any angle, whereas the 2D illustration only had to show the character from the front and the side, independently of each other.
It was also important to the client that the bear didn’t look threatening or intimidating but friendly and a little awkward with fluffy fur and big – but nowhere near dangerous – teeth. The team at Aixsponza used a front view of the print version of the bear as a reference for the 3D character to make sure that a consistent look was maintained throughout all campaign types. Especially the face, eyes and teeth had to be recreated very precisely in 3D.
A particular challenge was creating the bear’s fur. The creative team first had to find the right number of guides: too few guides don’t offer enough control when the fur is combed, while too many cause problems of their own because individual guides inevitably end up lying in the wrong position, which doesn’t look good.
They started out with relatively few guides and added more manually as needed using the Add Guide Tool. This made it possible to simply vary the number of guides to achieve specific effects with the fur. They fine-tuned the fur using the Frizz, Kink and Clumping features. The Cinema 4D Viewport gave very fast feedback, which helped speed up the process immensely.
The Hair tool is fully integrated in Cinema 4D, which meant that the team at Aixsponza didn’t have to spend time importing or exporting files for the bear’s fur. A Wind object was added to the blow dryer meshes to simulate their air flow. This way their airflow always blew in the right direction when they were moved.
To make the workflow even more efficient, Aixsponza used multiple Hair objects on the bear’s mesh. This made it possible for several artists to work on the bear’s fur at once. This also sped up rendering since the 3D artists could hide all Hair objects for the body, arms and legs when they rendered close-ups of the bear’s head.
The Hair tools played a key role for the project, explains Achim August Tietz, partner and creative director at Aixsponza: “We found the Cinema 4D Hair tools to be very powerful, stable and highly expandable thanks to the tight integration with XPresso.”
The Advanced Biped template was used for the character object to rig and animate the bear. Pose Morphs were used to animate the bear’s facial expressions and the soft sofa pillows under the bear were animated manually. The environment through which the bear moves consists in part of live footage (bath, living room, kitchen) and the rest was modeled in 3D.
Cinema 4D’s comprehensive feature set made it Aixsponza’s software of choice for the TEAG advert. “The fast Viewport feedback and great connectivity of the Hair tools to external render engines were also very effective”, says Achim.
Aixsponza Website: http://www.aixsponza.com/ ]]>news-6236Thu, 20 Apr 2017 14:12:06 +0200Adam Ruins Everythinghttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/adam-ruins-everything/Adam Conover debunks myths with help from visual effects.

Investigative shows are usually dramas of one sort or another, but truTV’s Adam Ruins Everything (www.trutv.com/shows/adam-ruins-everything/index.html) is best described as investigative comedy. Hosted by CollegeHumor’s Adam Conover, the half-hour show examines popular misconceptions and trends, debunking some things while offering in-depth explanations of others. The show’s narrative is supported by distinctive visual effects that run the gamut from funny to heartbreaking. Los Angeles-based Patrick Longstreth (www.patricklongstreth.com/) became the show’s VFX supervisor once it was picked up after the pilot. Relying primarily on CINEMA 4D, Maya and After Effects, he and his team of five artists have so far finished 26 episodes, and they are currently working on 16 more.

Here Patrick explains how the team goes about creating custom effects, animations and graphics packages for each show.

MAXON: How did VFX become such a big part of Adam Ruins Everything?

Patrick Longstreth: The pilot had a lot of visual effects, so that sort of set the standard for the show. Our work starts when Adam and the writers finish the scripts, and then everyone involved in production—directors, producers and department heads—sits down together and goes through each episode page by page. I'm very fortunate to be included from the beginning, and it’s an opportunity to offer solutions and put my own creative spin on things. The process of breaking everything down with the directors has led to more opportunities for VFX that complement the story. We talk about things like the best way to make Adam magically emerge from a toilet, or whether it’s better to have a building explode or implode.

MAXON: What’s your pre-visualization process?

Longstreth: For scenes where we do a heavy amount of VFX, I start early by storyboarding and pre-visualizing an animatic. I use Cinema 4D for that and what I create can be pretty awkward and goofy, but it works. For example, in the episode where Adam explains voting (www.youtube.com/watch?v=Zd5rul6EdF0&t=9s), the set is meant to look like CNN election night. We built a rough version in CINEMA 4D, and then the director and an actor read through the script.

MAXON: What are some of the more advanced animations or effects you’ve done for the show?

Longstreth: I really like what we did for Adam Ruins Shopping Malls (www.vimeo.com/208215338). He talks about how malls got started and exposes misconceptions about nutritional supplements, eyeglasses and outlet stores. For the final mall implosion, Adam says “Malls are dying!” and the mall starts to collapse in the background. We filmed Adam and Emily with a green screen behind them, and we projected footage of the actual mall onto fractured geometry in CINEMA 4D. The new Voronoi Fracture tool was used to create random cracks and separations, which was so much better than doing it manually.

MAXON: How did you create the scene with the oversized vitamin supplements for the mall episode?

Longstreth: That’s where Adam is talking about health stores and unregulated supplements (www.vimeo.com/202821695). He holds up a large pill, opens it up, and pulls a whole variety of wacky things out of it. The pill in Adam’s hand was practical, so the CG pills had to match the shape, color and lighting. To make the pills match the movement of Adam and his guest, we used dynamics in CINEMA 4D. The pills have rigid body tags and we animated collider objects to push the pills around.

MAXON: How do you make the show’s infographics? It seems like there are a lot of them.

Longstreth: The show does have a lot of infographics. Adam likes to back up all of his theories and explanations with stats and sources, so we’re often superimposing big numbers and graphs over footage or creating full-screen infographics. We did an infographic for Adam Ruins Football about brain injuries (www.youtube.com/watch?v=EdyLK0ZqFks). The animation represents how a helmet protects the skull, but the brain still moves around inside the skull. We took a stock 3D model of a football player and combined it with a rigged skeleton in CINEMA 4D. When the football players collide, we used the jiggle deformer to make the brain shake and wobble. To create the final X-ray look we used the default lighting in CINEMA 4D and then inverted it in After Effects.

MAXON: Are you exploring any new ways to create VFX for the new season?

Longstreth: Yes. Absolutely. The writers and directors are always pushing us into uncharted territory. So far, we’ve had discussions about a doppelganger Adam, paintings exploding into money and a parody weight-loss competition reality show. I used to get freaked out when they would come to me with a wild idea, but now we’ve grown to a point where I feel like we can create just about anything. Our VFX team has a broad range of skills from 2D animation and design to 3D effects and compositing. Each one of them has a great attitude and loves a challenge. We’re also supported by stellar people in the art department, camera, editing and everywhere else. It’s a really fun place to work and I’m extremely grateful for everyone I’ve had the opportunity to meet and collaborate with. One day we’re making a building explode and the next we’re flying through cyberspace. It’s basically my dream job.

]]>news-6131Fri, 07 Apr 2017 12:13:38 +0200The Garden of Earthly Delightshttps://maxon.net/en-us/news/case-studies/visualization/article/the-garden-of-earthly-delights/Dutch Studio Smack used Cinema 4D to bring the visions of Renaissance painter Hieronymus Bosch to life.
“The first step was to adapt the terrain from the original painting to a digital environment. We needed a lot of room since we had to fill the scene with a lot of various creatures. The original painting is really densely populated and after we began implementing all of our ideas we quickly had more than 200 figures that all had to be animated!”
As a rule, characters are animated with a rig, but that’s a task nobody at Studio Smack really likes doing. This is why the team looked for alternatives to make their lives easier and eventually stumbled across Mixamo, a rigging service provided by Adobe. “Considering the number of characters we were working with, this really saved our bacon! The models were uploaded, a rig was added automatically and mocap data was applied. Then the character was ready to do its thing,” remembers Thom Snels who, together with Ton Meijdam, makes up the Studio Smack team. “Other tasks such as combining a character’s various animations or creating seamless transitions between loops were done in Cinema 4D.”
“A real challenge was matching the perspective of the original painting,” says Thom. Bosch painted his pictures using a perspective common in the Middle Ages in which no real vanishing point exists. The individual elements look like they were simply stacked on top of one another. “To reproduce this perspective effect in our animation we used a camera with a parallel perspective. Each element was rendered individually and scaled accordingly in the compositing phase,” says Thom.
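The difference Thom describes comes down to one line of math: a perspective camera divides by depth, so distant elements shrink toward a vanishing point, while a parallel (orthographic) camera simply drops depth, producing the "stacked" medieval look. A minimal plain-Python sketch, purely illustrative (not the studio's actual pipeline):

```python
def project_perspective(x, y, z, focal=1.0):
    # Pinhole projection: screen position shrinks with depth z,
    # so parallel lines converge toward a vanishing point.
    return (focal * x / z, focal * y / z)

def project_parallel(x, y):
    # Parallel (orthographic) projection: depth is dropped entirely,
    # so a far element renders exactly as large as a near one --
    # the stacked look of Bosch's original painting.
    return (x, y)

# The same 1-unit offset seen at two depths:
print(project_perspective(1.0, 0.0, 2.0))   # (0.5, 0.0)
print(project_perspective(1.0, 0.0, 10.0))  # (0.1, 0.0)
print(project_parallel(1.0, 0.0))           # (1.0, 0.0), regardless of depth
```

This is also why each element could be rendered individually and rescaled in compositing: without a perspective divide, scale carries no depth information, so it can be adjusted freely afterwards.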
“We created these illustrations and the animation exclusively for MOTI (Museum Of The Image) and used just about every trick possible in the Cinema 4D book: Cloner objects to multiply individual elements, Effectors to create specific movements and character tools for the characters’ general movements.”
“Looking back all I see is the incredible amount of work we had to put into this project and that Cinema 4D never let us down,” remembers Thom.
Studio Smack Website: http://www.studiosmack.nl/

Studio Smack on Facebook: https://www.facebook.com/STUDIO-SMACK-287330375863/
]]>news-6110Fri, 31 Mar 2017 11:07:13 +0200Making the Most of 2https://maxon.net/en-us/industries/broadcast-motion-graphics/bbc-2/Discover how Luke Carpenter created 58 shots in just four weeks for a series of BBC 2 intros.news-6107Mon, 27 Mar 2017 11:01:52 +0200When Two Monkeys See the Lighthttps://maxon.net/en-us/industries/animated-films/shine/In their thesis work ‘Shine’, students from the Filmakademie Baden-Württemberg create a charming depiction of the courtship between very special primates.news-6096Mon, 20 Mar 2017 16:04:42 +0100Time Outhttps://maxon.net/en-us/industries/broadcast-motion-graphics/time-out/The Dallas Mavericks’ new hype video turns up the volume when time is called. news-6082Thu, 09 Mar 2017 14:15:27 +0100SPOV Helps Doctor Strange Live up to his Namehttps://maxon.net/en-us/news/case-studies/movies-vfx/article/spov-helps-doctor-strange-live-up-to-his-name/The latest Marvel Studios movie required convincing medical elements – made in Cinema 4D – to sell its lead character's expertise. While the bulk of the movie is packed with kaleidoscopic visual effects, the beginning of the story remains grounded in reality with segments showing the good doctor at work and, after the car crash, being operated on himself. For these sequences, the production required some authentic-looking medical displays, showing Doctor Strange's groundbreaking neurological work plus a montage in which we see other surgeons trying to rebuild his broken hands.

The task was handed to SPOV, a design studio in central London, whose portfolio includes creative development, user interface and motion graphics work for Call of Duty: Advanced Warfare, Titanfall 2 and Tom Clancy's The Division among others.

We spoke with Allen Leitch, founder and creative director, and senior designer Adam Roche. Leitch explains how, after SPOV's success with videogame cinematics, they started chasing movie work and quickly landed a job on Mission Impossible: Rogue Nation. "We made a few friends on Mission Impossible, and one of them, art director Alan Payne, called us and said, 'There's a requirement on Doctor Strange'."

That requirement was a series of medical displays showing images of Strange operating on a patient with brain lesions as well as X-rays and scans of the doctor's own fractured hands. After creating some basic 'block-outs' in Illustrator, the team then began a dialogue with the client to highlight the right hierarchy and pinpoint key elements.

"Because you're in the movies, it's difficult for us to know what's important," says Leitch. "If it was a real-life product for a medical client they would dictate what's important for a doctor to have feedback on. In the blink of an eye you should look at a screen and be able to pull out the information that you need. In the movies, it's 'what does the audience need?'"

The team's initial port of call was various ready-made anatomical models but these proved to be less than ideal. "All the off-the-shelf medical products we were looking at as well as the ones the client was pointing us to were all really clunky when you got close to them – and we needed to get very close," says Leitch.

The models were low-poly and either too smooth or with very low levels of detail, making them not very convincing. The team had little choice but to up-res the meshes, adding detail with Cinema 4D's sculpting toolset. "It took a lot of just very subtle sculpting from Adam and Julio [Dean, SPOV's technical director, who is based in Barcelona]. We've used things like Mudbox and ZBrush in the past but in this case the core Cinema 4D sculpting tools did the job."

"We just worked on top of the mesh," adds Roche. "We didn't bother to retopologize it because we always knew it was going to be in quite a stylized treatment. So for this, it didn't matter too much. Actually, some of the lower fidelity of the mesh kind of worked in our favor to look like an actual scan."

To create the traditional X-ray appearance, the team employed an inverted Fresnel shader in the material's Transparency channel, which instantly gives objects that translucent, ephemeral look. When combined with multiple layers and objects, the effect is thoroughly convincing.
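As a rough sketch of why an inverted Fresnel reads as an X-ray: transparency is driven by how directly the surface faces the camera, so front-facing areas become see-through while silhouette edges stay visible as a glowing rim. The plain-Python toy below is illustrative only (not Cinema 4D's shader API) and assumes unit-length vectors:

```python
def facing_ratio(normal, view):
    # Cosine of the angle between unit surface normal and unit view
    # vector: 1.0 when the surface faces the camera directly,
    # 0.0 at a silhouette edge (clamped to avoid negative back-facing values).
    return max(0.0, sum(n * v for n, v in zip(normal, view)))

def xray_transparency(normal, view):
    # "Inverted" Fresnel in the Transparency channel: camera-facing
    # areas go transparent, grazing-angle edges stay opaque,
    # producing the translucent, rim-lit X-ray look.
    return facing_ratio(normal, view)

print(xray_transparency((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0 -> see-through
print(xray_transparency((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # 0.0 -> opaque rim
```

Layering several such materials over nested meshes (skin, muscle, bone) is what gives the composite its depth.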

Another crucial element of the sequences is the effect of revealing layers of skin and bone, much like a CT scan, which was achieved using a combination of Boolean operations and Cinema 4D's Proximal shader. This lets you apply an effect – such as localized color or an alpha region – controlled by the proximity of another object or null. As the distance between the two is reduced, so the effect becomes more pronounced.

"We rendered black-and-white with the Proximal shader," says Roche, "then used it as a matte in After Effects. It was a combination of Booleans and the Proximal shader. We were able to cut away but also bring back some of the transparency in the other layers."

"Proximal's a lot more subtle than hard Boolean," adds Leitch. "I think that's why we employed it here. It's also how you project it onto the mesh as well, the angle of attack. Part of the trick here was to match where Doctor Strange puts his probe – we had to match the same angle."
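The Proximal behavior described above can be approximated as a simple distance falloff: full effect at the driver object's position, fading to nothing beyond a given radius. A plain-Python sketch; the linear falloff shape and the function name are assumptions for illustration, not Cinema 4D's implementation:

```python
import math

def proximal_weight(point, driver, radius):
    # Proximal-style falloff: the effect is at full strength (1.0) at
    # the driver object's position and fades to 0.0 at and beyond
    # `radius` -- which is why it reads softer than a hard Boolean cut.
    dist = math.dist(point, driver)
    return max(0.0, 1.0 - dist / radius)

# Moving the probe closer to a surface point strengthens the effect:
print(proximal_weight((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), radius=5.0))  # 0.0
print(proximal_weight((0.0, 0.0, 0.0), (2.5, 0.0, 0.0), radius=5.0))   # 0.5
print(proximal_weight((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), radius=5.0))   # 1.0
```

Rendering this weight as a black-and-white pass, as SPOV did, turns it directly into a compositing matte.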

Another option for the cutaway shots was to use the Cinema 4D camera's Near and Far Clipping planes. Accessed from the Camera object's Details tab, clipping simply tells the camera to render only the objects or parts of objects that lie within that region. And by animating the clipping planes you can cut into an object, revealing the elements within, much like a Boolean operation. All these techniques were used to generate various layers, which were then composited and softened in After Effects.
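The clipping-plane trick boils down to a per-object depth test: only geometry between the Near and Far planes survives, so animating the Near plane forward cuts progressively deeper into the model. A small plain-Python illustration with hypothetical layer depths:

```python
def visible(depth, near, far):
    # The camera renders only geometry whose camera-space depth lies
    # between the Near and Far clipping planes.
    return near <= depth <= far

# Pushing the Near plane forward reveals deeper anatomical layers
# (depth values here are made up for illustration):
layers = {"skin": 2.0, "muscle": 4.0, "bone": 6.0}
for near in (0.0, 3.0, 5.0):
    shown = [name for name, d in layers.items() if visible(d, near, far=10.0)]
    print(near, shown)
```

Because the cut is a flat plane rather than a shaped surface, it suits straight cutaways, while the Proximal approach handles localized, probe-driven reveals.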

Other tools used include Cinema 4D's Hair tools – although not in any conventional way, as Roche explains. "We used it on splines as a very efficient way to render them as you would a Sweep, for example. Instead of using a shader on it, we'd normally use a Hair material. But you can also render out a points pass on the geometry, which also gives a nice point cloud texture."

In the latter case it's simply a matter of adding a Hair shader to an object that has a transparent material. With a nice bright shade in the Color channel, and the length set to a very low percentage, you can replicate a cool point cloud effect.

Insydium's X-Particles was also employed, says Leitch, in this instance to help create the arteries. The arteries were drawn as splines, and xpSplineMesher was then used to mesh them into a form that could be rendered.

Another unusual inclusion was MoGraph, which the SPOV team is starting to use to render interface elements. "Whereas previously everything would be done in Illustrator and After Effects, we're now rendering a lot of UI elements straight out of Cinema 4D," says Leitch, saying that it's an idea they picked up from the UI work done by Bradley Munkowitz on Oblivion. "It was a bit of an eye-opener for me," he continues. "It made me think there's more to the MoGraph toolset than I'd previously thought. If you start rendering MoGraph stuff in an orthographic view, it really adds an element to our UI work that we weren't previously exploiting."

SPOV's animations were combined with real X-rays provided by the production and it's a testament to their work that it's almost impossible to tell what's real and what's CG. But the project wasn't without its challenges: "For me, it was the modeling," says Leitch. "Adding the detail. There are a lot of tools out there in the world to cut corners and make things easy. But to get the believability here, you had to do the work. You had to put the time in. I think that is what sets it apart and gives it the believability or authenticity that is worthy of a film like Doctor Strange."

Leitch admits to being a big fan of Cinema 4D. "It goes back to us being a design company," he explains. "We're not a VFX company; we're not really a CG company. We tend to work with talented designers and animators; we don't tend to work with CG specialists. We don't bring in a lighting designer. We don't bring in texture artists. Basically, what usually happens in here is somebody is given a shot, saying, 'That's your shot, and you need to generate pretty much everything in that shot – the UI elements and the CG elements.' For the most part, at SPOV, people have a shot or a sequence – and it's theirs. You complete it from start to finish. Cinema 4D allows people who are not specialist CG artists to generate quality 3D content."

The SPOV team would like to give a shout out to art director Alan Payne, production designer Charlie Wood, and the on-set supervision team at Compuhire.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Marvel Studios 2016.

SPOV Website: www.spov.tv ]]>news-6061Tue, 21 Feb 2017 08:33:59 +0100Data Real - Imagine Physically Experiencing Datahttps://maxon.net/en-us/news/case-studies/visualization/article/data-real-imagine-physically-experiencing-data/Cinema 4D was used to conceptualize the haptic interface of the VR “Data Real” project.

What is Data Real?
When analyzing data, the user will find interesting patterns by physically sensing the correlations of data rather than interpreting them visually. In that sense, “Data Real" is a completely different approach to data visualization. Users access the stream of information through Oculus Rift and manipulate the data with haptic gloves and Leap Motion hand recognition. The user can then dive into a flow of data and analyze trends and tendencies. In this project, aircraft flight data was chosen, displayed in 3D in a virtual representation of Earth with detailed objects such as airplanes, flight traces, geographical coordinates, etc. Users can touch these objects through the haptic glove device, which provides tactile feedback. The current prototype provides feedback to only a few fingers, and Cedric Caremel would like to develop a future version in which more actuators provide an even better haptic experience.
About workflow with Cinema 4D
“Cinema 4D was used from the very beginning to help visualize the final concept of the haptic gloves and the VR box before manufacturing them," explained Cedric. "Basic modeling was done with NURBS in Rhino, and then exported to Cinema 4D where the model was fine-tuned for more detail, better materials and an environment setup for rendering. The primary tools used were Subdivision Surface on the polygonal model, instances, MoGraph and the Bend object to quickly replicate the basic physics of the flexible parts. Cinema 4D is powerful 3D software for everything from modeling to animation, with an intuitive environment, tons of plug-ins, scripts and constant updates. I have been using it for 5 years now.” In the future, Cedric thinks that going through flows of information will change from visualization only to full immersion with haptic experiences.
Takram URL : https://ja.takram.com/
DATA REAL Project URL : http://scenesunseen.takram.com/data-real/]]>news-6058Mon, 20 Feb 2017 11:15:23 +0100Power Toolshttps://maxon.net/en-us/news/case-studies/advertising-design/article/power-tools/Panoply employs a combination of Cinema 4D, Arnold and Houdini to successfully evoke the brand values of Mercedes-Benz. To showcase the production of the latest Mercedes-Benz cars, RAID Films and Atelier Markgraph approached Panoply – a design and motion studio based in London. The brief was to produce the first installment in a series of films representing five brand values: safety, perfection, quality, precision and awakening. The film will be shown daily at three major Mercedes-Benz Visitors Centers for the next two years.

The four-person team at Panoply had three months to generate the film, which combines photoreal renders of elements of the car with abstract imagery to express the themes of the brief. The version seen here is the 'Director's Cut' – a shorter version of the piece that includes abstract imagery that didn't make it into the final edit.

One of the team's biggest challenges was handling the massive data sets of the automotive CAD data. The models are incredibly detailed as they are used by machines for manufacturing the cars. But while this detail is very impressive, the models aren't optimized for visualization purposes in 3D animation applications like Cinema 4D.

"I think the most memorable thing from the production of this project was near the beginning when we received the CAD models for the Mercedes-Benz car," says Mark Lindner, director at Panoply. "We opened the raw triangulated meshes and were amazed at the amount of detail we had to work with but also quite worried when we thought about the amount of cleanup that would be required in order to make use of them. Luckily we didn't have to sort every single mesh we were sent. It was a case of composing our shots how we wanted them after which, once we had the shot signed off, we would go in and retopologize only the mesh that was visible."

Cinema 4D's Polygon Pen tool was vital during the retopology phase, acknowledges Lindner. "It allowed us to quickly and painlessly reduce the super-high-density CAD models down to a fraction of their polygon count without losing any detail."

To facilitate this workflow, the team also relied on Cinema 4D's XRefs, enabling them to animate using low-res proxy models and then swap in denser versions at render time. "Due to the sheer number of polygons in the high-resolution versions we didn't retopologize everything from the beginning. We used placeholder meshes in order to do our animations then once we had a locked shot we would then only retopologize the mesh that would be visible in the frame."

To achieve the render look the studio was after, Panoply turned to the physically-based renderer, Arnold. "We've been using Arnold for over two years now," says Lindner. "We first used it with Houdini before it came to Cinema 4D. However, we've mostly been using it with Cinema 4D for the last 18 months. The level of support has been key in keeping us using the render engine. That, coupled with the incredible number of data-heavy scenes you can throw at it without it even flinching, has been vital to our workflow here."

The Mercedes-Benz film opens with a moody industrial setting, which was built and rendered in 3D. "From this environment we created a high-resolution HDRI render using the spherical camera in Arnold," explains Lindner. "This then served as our HDRI for a lot of the other shots through the sequence. Additional lights in each shot were created using area lights with high-resolution soft box textures to give them an uneven look."

There then follows a montage of abstract sequences, aimed at encapsulating the brand values outlined above. The metallic atom array mesh was achieved using dynamics to create a crumpled version of the structure. The team then blended between the point positions on this version and the uncrumpled original using the Pose Morph tag and a Plain Effector. A slight mesh wobble was added using Cinema 4D's Jiggle Deformer. "Using the Pose Morph tag in combination with the Pose Deformer and MoGraph effectors allows for an amazing level of customization of effects," adds Lindner.
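The point-position blend described above is, at its core, a per-point linear interpolation between two stored poses. A plain-Python sketch of that interpolation (illustrative names, not the Pose Morph API; in the actual setup a Plain Effector would drive the blend factor):

```python
def blend_points(crumpled, original, t):
    # Pose-Morph-style blend: every point interpolates linearly between
    # its crumpled position and its original position.
    # t = 0.0 -> fully crumpled, t = 1.0 -> fully restored.
    return [
        tuple((1.0 - t) * c + t * o for c, o in zip(cp, op))
        for cp, op in zip(crumpled, original)
    ]

crumpled = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
original = [(2.0, 2.0, 2.0), (3.0, 0.0, 0.0)]
print(blend_points(crumpled, original, 0.5))  # [(1.0, 1.0, 1.0), (2.0, 0.0, 0.0)]
```

Driving t per point rather than globally (as a MoGraph effector does) is what lets the structure uncrumple as a travelling wave instead of all at once.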

An array of realistic-looking laser beams are simply spotlights with high intensity and narrow angle of influence. They were then rendered using Arnold's atmospheric scattering to give the team the visual effect they were after.

For the more abstract simulations, Panoply turned to SideFX Houdini. "Houdini was used quite extensively during this project," states Lindner, "while all of the rendering was done within Cinema 4D. This meant that we'd need to transfer everything between the programs using heavy Alembic files. All of the abstract animations, particles, ball bearing sorting and liquids were done in Houdini."

The first such example is the wind tunnel smoke trails that swirl around an invisible sphere. "This was created using a line of smoke emitters with a velocity field pushing the volume in one direction. With all turbulence and displacements turned off for the smoke it was a simple case of just putting our collision sphere in place and then making it invisible to the camera at render time."

After some sumptuous shots of car bodywork being fitted together there's a brief sequence showing material being dissolved away. "This was created using a high-resolution displacement texture," explains Lindner, "which was then blended to an alternative shader using the mix node in Arnold. The additional particles floating upwards were created in Houdini, then brought back into Cinema 4D as an Alembic point cloud for rendering."

A brief segment of rippling fluids was also created in Houdini using its FLIP Solver and VDB skinning. A sequence of fluid meshes was generated, which were then loaded into Cinema 4D as a VDB sequence for rendering. Houdini was also responsible for the collection of metal spheres that coalesce into a neat Fibonacci spiral.

When the car's Start button is pressed, it ignites a TRON-style data network, representing the electronic brains of the vehicle. These were created in Houdini using geometry: "We created a procedural system into which we could input any mesh and then generate the line animations. This was made to be highly controllable so we avoided using any dynamics – it allowed us to accurately move hundreds of lines exactly where we wanted them in synchronization."

The quality of the end result is a testament to the ease with which the differing strengths of Cinema 4D, Arnold and Houdini could be combined into a powerful toolset. "Cinema 4D's openness in regards to working flawlessly with Arnold and Houdini was invaluable to our workflow," declares Lindner.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Panoply.

Panoply Website: www.panoply.co.uk ]]>news-6024Tue, 14 Feb 2017 14:08:23 +0100The Last Flight of Astronaut #909https://maxon.net/en-us/news/case-studies/movies-vfx/article/the-last-flight-of-astronaut-909/With its animated short ‘909 Depart’, Uber Eck let their creativity roam free with Cinema 4D for their most elaborate project yet.

The short film ‘909 Depart’ from Munich, Germany-based Uber Eck studios tells the tragic story of astronaut #909, the last surviving human in an abandoned space station orbiting a dying planet Earth. He eventually decides to accept his bleak fate and pushes into the depths of space to get a final look at the blue planet.

38865

Originally, this project started as a test project for the Uber Eck team (Tobias Alt, Niklaus Hofer and Sebastian Schmidt). Whenever they wanted to test new tools or functions in Cinema 4D, ‘909 Depart’ was used as a testing ground. The Uber Eck team also wanted to push their creative limits with this project and prove that they could do much more than just shade, light and render nice 3D objects. ‘909 Depart’ was a conscious departure from their conventional everyday work, which generally had to be completed within a tight deadline. In the end, the trio worked on ‘909 Depart’ over five months in-between client projects.

After the project’s concept had been finalized, the team first created an animatic with a large space station frame and conceptual camera moves. One of the team members continually fine-tuned the camera animation and the remaining team members modeled the rest of the space station.

Uber Eck modeled the majority of the models in Cinema 4D and took advantage of the speedy workflow the Polygon Pen tool offered, as Niklaus explains: “The Polygon Pen makes polygon modeling much faster and even makes it possible to spontaneously design new objects while you’re working.” Since this tool combines so many important modeling functions, the designers were able to work fluidly and comfortably, without having to constantly switch to a different tool. The team quickly got used to functions such as welding points, edges and polygons and sorely missed these when later working with other 3D applications.

The team used the Take System in Cinema 4D to render the scene from multiple camera angles, each of which showed a different shot of the space station. This made it possible to quickly create preview renderings for fine-tuning the scenes.

This project remained challenging to the end: The film’s premiere and its comic rendition by Martin Hager took place at the Science & Fiction Festival 2016 in Munich, Germany. In order to finish on time, the team made a very precise calculation of the time needed to render the project and were rewarded for their efforts with just four hours to spare before the film was to premiere – which is also when the Uber Eck team saw the final film for the very first time!

3D Tools for 2D Tricks (Fri, 10 Feb 2017)
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/3d-tools-for-2d-tricks/
For the first release of a Chinese version of the Adobe Cloud, the studios Never Sit Still and Luxx used their combined skills to impressively bring an illustrated dragon to life using Cinema 4D.
38860
The animations were based on an illustration created by Adobe’s Brian Yap. The image was very detailed and artists Mike Tosetto and Tim Clapham – who were tasked with the realization of the project – quickly realized that converting this design to vectors would not be a simple matter. Still, the intricate details of the original illustration had to be maintained for the final 20-second animation. First, the illustration was imported into Photoshop and the dragon was broken down into individual elements: “We made separate elements of the dragon’s head, claws, body and especially its body and facial hair, which we were then able to import into Cinema 4D. Our approach was to break the body down into multiple elements, project these onto surfaces in Cinema 4D and fill the gaps with clones. This was in theory a good idea but what we ended up with was a body with a jagged, irregular outline – far from being smooth.”
Mike and Tim quickly scrapped this approach and took a different path. “We created a smooth, flat body for the dragon using the Puppet Warp Tool, which we then imported into Cinema 4D and projected onto a surface. Since we wanted to animate the surface using the Spline Warp function, we made sure to have enough geometry to work with.”
This approach proved to be feasible. An elaborate rig was set up to control this surface and animate the dragon as desired. “We used a clever XPresso solution to connect the splines’ control points with Null objects. We were then able to set keyframes for the Null objects to easily control the splines. Wherever this wasn’t possible we simply used animations at point level.”
Another challenge was presented by the body and facial hairs, which had to have their own dynamic movement to accentuate the dragon’s movement and make it even more impressive. “We worked with the Jiggle Deformer. We added vertex maps to the geometry so the ends of the hairs would move more than the base. The closer to the hair base, the more restricted the movement of each hair was. A spline and a Spline Wrap rig were again used to create the actual animation.”
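The base-to-tip falloff described here can be sketched as a simple vertex-weight ramp. This is an illustrative pure-Python stand-in for painting such a vertex map, not the studio's actual rig:

```python
def hair_weights(num_points, power=2.0):
    """Per-point weights along a hair strand, base to tip.

    Returns 0.0 at the base (fully pinned), rising to 1.0 at the tip, so a
    jiggle-style deformer scaled by these weights moves the tip the most.
    A higher power keeps more of the strand stiff near the base.
    """
    if num_points < 2:
        return [0.0] * num_points
    last = num_points - 1
    return [(i / last) ** power for i in range(num_points)]
```

Multiplying each point's deformation by its weight gives exactly the behavior described: free-swinging tips, nearly rigid roots.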
Finally, the head, including hair, was added to the body. However, this resulted in the head and hairs being affected by the body’s position. We used XRefs to solve this problem. We removed the head and baked the XRef’s Jiggle Deformer for each position.
The clouds and the dragon’s fire were challenges in and of themselves! The clouds had to reflect the style of those in the illustration. They were created using individual splines that were subsequently combined. Some of the clouds ended up being made of up to 100 splines. To create the effect of active brush strokes, the splines were projected onto polygon surfaces as brush strokes and the shape was created using an alpha map. Each stroke generated numerous copies which were deformed and animated along the splines, thus creating the clouds’ unique shapes.
The dragon’s flames also had to reflect those in the original illustration. “Here we used a traditional frame-by-frame animation,” remembers Mike. “The drawings were created in Adobe Illustrator and imported into Cinema 4D where they were projected on to surfaces and animated in relation to the camera position and other scene elements.
Finally, the scene had to be rendered, which was quickly done using the Standard Renderer due to the project’s simple 2D style. A depth-of-field pass was rendered for each setting, which we were able to add in the compositing phase for a nice DOF effect.”

38861

3D Animation and VFX Workflows for “Doctor Strange” (Tue, 17 Jan 2017)
https://maxon.net/en-us/industries/movies-vfx/doctor-strange/
Leading design studios Sarofsky, Perception and SPOV credit Cinema 4D for inspiring the creation of main-on-end titles, previz for main assets, medical animations and a new Marvel Studios logo.

Picture Perfect Soccer (Fri, 13 Jan 2017)
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/picture-perfect-soccer-1/
France is rich with famous artists both past and present. Imaginary Forces used this rich artistic history as inspiration for the ident clips they created for the Euro Cup 2016 using Cinema 4D.
38002
“Our digital artwork was based on real-world surfaces that we created by generously applying acrylic paint with a palette knife. These colored stripes were then digitized and a special rig was created in Cinema 4D which in turn controlled a series of Deformers using a specially created slider. This way we were able to transform the color swatches into strokes that moved along a spline. We developed this setup at the very beginning of the project, which meant that we could apply it to many different motifs such as the Eiffel Tower, which was made up of a series of brush strokes.”
“We used Cinema 4D’s Sketch and Toon feature for the teams’ coats of arms,” explains Jeremy. “Sketch and Toon produces a certain restlessness in the animations it generates, which lends them a slight flickering effect. To counter this, we developed a special technique that involved pre-rendering the scenes and using a camera projection to calm things down a bit. We were still able to maintain the impression that they were being drawn live,” remembers Jeremy.
“Other applications were also involved in this project but the lion’s share was done in Cinema 4D. In the opening sequence, for example, we created dynamic shots of player statues and rendered them using V-Ray whenever necessary.”
“The Cinema 4D Take System was extremely helpful and we used it extensively. I still get the shivers when I think about having to break down the scenes into individual elements without the Take System … It was no cakewalk even with the Take System but it was a great deal easier,” says Jeremy and adds: “I love Cinema 4D! Combining parametric tools with its endless range of options lets me achieve results that no-one else has achieved before and let my creativity go wild!”

38003

Immerse Yourself in Galactic Beauty (Thu, 12 Jan 2017)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/immerse-yourself-in-galactic-beauty/
Motion Designer Takayuki Sato used Cinema 4D to create wonderful worlds filled with animals, plants, waterfalls, shooting stars and numerous other beautiful elements!
38882
Even though it was not planned as a sequel, the first sketches for Beyond the Moment of Beauty showed numerous parallels to The Moment of Beauty. Takayuki Sato remembers that so much of what was being sketched seemed so familiar that he decided to revive the original concept from 2014 with a new view of the world.
It wasn’t easy creating a filigree world that was constantly changing, growing and transforming. “In this work, I wanted to apply the new skills I had attained over the past few years. The challenging part was putting the images in my mind’s eye on film”, says Sato.
In 2015, Takayuki Sato had already tried to realize his vision as a film and although he was able to create a style frame of his vision, the underlying story distracted far too much from the imagery – which was more important for Takayuki Sato. He made a second attempt but was again not able to create the animations exactly as he wanted and ended up not completing this venture either.
“After the project had been restarted twice from the ground up I was finally able to create the imagery and animations that I had imagined and to the standards that I wanted”, explains Takayuki.
“Cinema 4D played the key role in the project’s realization. The floors, stones, plants, birds, the waterfall, the planets and the stellar fog – and much more: with the exception of the characters and several effects, everything was created in and rendered with Cinema 4D. The compositing was done in After Effects”, remembers Takayuki Sato. “I worked with Release 18 and I was particularly impressed with the new Parallax shader, which I used for the floor. I also applied displacement to several materials in the scene and got great results!”
“I used the new Thinfilm shader, which was added to Release 18, to create the crystals’ material”, remembers Takayuki, “and I am very happy with the result!”
“I created many of the effects using XParticles and Turbulence FD. XParticles itself is a very powerful tool for creating large numbers of images and impressions, which is why I used it for many of the scenes”, says Takayuki. “The connectivity between Cinema 4D and XParticles, Turbulence FD and Krakatoa made it possible for me to finally realize my visions. This project was also an excellent opportunity to get to know these tools.”
Takayuki Sato’s website: http://otas.tv/
Beyond the Moment of Beauty - Into the Galaxy: http://otas.tv/work/btmob/

Lightshow in York Minster (Wed, 21 Dec 2016)
https://maxon.net/en-us/news/case-studies/advertising-design/article/lightshow-in-york-minster/
Jason Bruges Studio uses Cinema 4D to put England’s largest Medieval church in a new light.
37890
To be able to control the installation’s 48 spotlights, a production pipeline had to be created: the team started with a streamlined pre-viz system that displayed the spotlights’ behavior in the Viewport in Cinema 4D. Next, a real-time control system for the real-world spotlights had to be implemented so they could be controlled directly in the Viewport. For the final presentation of the light show, the data for controlling the spotlights was exported to a proprietary format. This data was then played using a player software that was specially designed for this purpose.
“Controlling lighting fixtures such as the ones described directly from Cinema 4D is something I'd prototyped a couple of times before successfully. It was the leap from prototype to production which would introduce a significant number of challenges,” explains senior visualizer Adam Heslop.
At the beginning of the project it was not clear which type of spotlight would be used. Each model requires different control data because the various types all have different functions. A Python plugin generator was developed for use in Cinema 4D specifically for the purpose of testing various spotlight models. This plugin used parameters and control methods for the individual spotlights to create models in Cinema 4D with corresponding properties. This made it possible to control all of a given spotlight’s parameters live in Cinema 4D.
Each spotlight had a defined position in space and a specific target. As soon as the target was moved, the corresponding rotation for the spotlight was activated. MoGraph matrices, which could be easily affected by MoGraph Effectors, were used as targets for the spotlights. MoGraph made it possible to not only affect the spotlights’ parameters but also let all spotlights be controlled as a group. This greatly simplified his workflow, as Adam explains: “Being able to modulate groups of objects spatially (in this case the light target Matrix clones) through effector based setups allows for really fast generation of architectural looks and forms.”
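The core of such a target setup is a look-at calculation: given a fixture position and a target position, derive the pan and tilt the fixture needs. Here is a minimal pure-Python sketch (hypothetical names; the studio's plugin handled this per spotlight model, inside Cinema 4D):

```python
import math

def aim_angles(light_pos, target_pos):
    """Pan/tilt in degrees for a fixture at light_pos to point at target_pos.

    Pan rotates around the vertical (Y) axis; tilt is the elevation
    above the horizontal plane. Positions are (x, y, z) tuples with Y up.
    """
    dx = target_pos[0] - light_pos[0]
    dy = target_pos[1] - light_pos[1]
    dz = target_pos[2] - light_pos[2]
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt
```

Re-running a calculation like this per fixture whenever its target moves is, in essence, what a target-driven rig does every frame.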
37891
A further proprietary plugin, which controlled the animation on a node basis, was used to animate the light show. The advantage of this was that complex, undulating animations for the spotlights could be activated by the falloff of a single animated MoGraph Effector. This eliminated the need to create keyframes for each spotlight.
“When using C4D for a project like this, the benefits really come from the solid core features,” says Adam, and also praises the software’s flexibility. A stable Timeline for the keyframing workflow and F-Curves makes working with Cinema 4D extremely intuitive.
The designer is convinced that the final result of this highly complex project was well worth the effort: “Hearing the organ with the lights in the space for the first time set as loud as it could go was pretty unforgettable.”

Jason Bruges Studio website:http://www.jasonbruges.com/
Images and minster footage by James Medcraft.

Animated Morgellons and Nanobots (Mon, 19 Dec 2016)
https://maxon.net/en-us/news/case-studies/movies-vfx/article/animated-morgellons-and-nanobots/
Visualizations about DNA can often appear quite chaotic to the untrained eye. This controlled chaos is often made more spectacular using special effects created in Cinema 4D.
37874

First, the filmmaker gives the audience insights into Morgan’s creation, including shots of cell manipulation at a microscopic level. In order to meet the expectations of both the director and the production designer while maintaining the required realistic look, MadMicrobe worked with an animation studio that specializes in creating precisely these types of effects. Co-founder Joe Dubin explains how the shots were created using Cinema 4D, X-Particles and After Effects:

“The first scene that we wanted to create involved embryonic cells into which modified genetic material is injected using a micro pipette. This key sequence in the creation process basically shows Morgan’s inception. She can be seen on a computer monitor, which had to have the typical look of electron microscopes. We were asked to create a sequence that shows the modified material flowing into the cells, as if they were morgellons. We created the part where the pipette presses through the cell wall, which then returns to its original shape after being penetrated, using Soft Body dynamics with an FFD deformer in Cinema 4D. The injected DNA had to resemble morgellons, which we realized using X-Particles that we applied like a turbulence effect. We rendered the scene using the Physical Renderer in Cinema 4D, which in my opinion delivers much better results much faster than many people may think!”

In another scene, MadMicrobe had to show how a nanobot infiltrates Morgan’s neural network. The scene shows the nanobot hovering in a weave of neurons and then settling onto a knot, which it then penetrates. “It was not immediately clear how this scene should look. There were only rough concepts for the bot design and how it should penetrate the knot, which made it possible for us to use our own ideas and imagination. Since we only had about two weeks to create the shots, including style frames and test renders, we brought on Jon Bosley, a Cinema 4D artist based in the UK who helped us with lighting, texturing and particle effects.”

“We used a simple polygon model to create the nanobots and distributed the antennae across their surface using MoGraph. Even though the final scene had to have the monochrome look of an electron microscope, the director insisted that the nanobots be colored orange. After the bot concept had been approved we created a high-res model and added abrasions and wear to the models using Cinema 4D’s Sculpt tools,” Joe explains.

After completing this rather spectacular film project, MadMicrobe returned to its core business of creating medical and scientific animations. “Of course this is what we specialize in but we’d love to use our skills and expertise for other types of projects in the future!” concludes Joe.

A League of their Own (Thu, 08 Dec 2016)
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/a-league-of-their-own/
Ultra-fast drone racing with Cinema 4D and Octane Render.
37802
Creative director Alex Donne-Johnson describes how the DRL promotional video required a somewhat deft approach: "On one hand we wanted to educate the viewers about the technicalities of drone racing, and on the other we wanted to add a narrative that amplified the excitement of a race. You have highly skilled pilots, intense racing, crashes and amazingly designed courses. Our role was to look after the channel branding for the forthcoming broadcast series, and the most important part of this was the show opener."
Given that the majority of viewers don't yet understand what drone racing is, the Dazzle Ship team had to tread a fine line to hit the brief. "It's a totally new concept that sits between real world and sci-fi," says Alex. "It's like a computer game with real-world consequences. The problem DRL faces is that people thought it was just a computer game and didn't understand the real-world aspects of players, FPV goggles and highly skilled pilots. The aim of the intro was to educate them but at the same time transport them into a futuristic science-fiction world that represents the experience of drone racing. Too informative and it could be considered mediocre, too sci-fi and people would struggle to understand the concept."
To set the right tone, the sequence begins with a blend of live action footage and convincingly photorealistic CG drones, which grounds the sport in reality. But from there, the viewer is transported into a more conceptual, stylized environment, with sweeping camera moves through neon-festooned racecourses. "The real-world aspects were incredibly important to the narrative," says Alex, "but then we also needed to enhance this to find an abstract way to embody the vibe of a drone race."
He admits that the biggest creative problem was incorporating so many aspects of drone racing within their short schedule. The four-man team had just eight weeks to fulfill the brief. "We actually had quite a short lead time for the level of quality we wanted to achieve. The most important aspect for the client was hitting the broadcaster’s deadline."
To help deliver the project on time, Dazzle Ship employed Octane Render, the unbiased render engine from Otoy. Octane shifts the imaging pipeline from the CPU to the GPU while providing a fast, interactive preview right within Cinema 4D. "The reason we chose Octane was because of the short deadline," says Alex. "We knew that we needed to create proofs-of-concept pretty fast and in some cases do minor tweaks on-the-fly to send back to the client. Octane is very versatile in this respect as it enables you to make changes and render good-quality stills very fast. GPU render farms are becoming more common. However, we did everything locally and got as many GPUs in as possible."
The video begins with shots of the human pilots donning their FPV goggles, complete with the obligatory high-tech overlays and UI elements, which were added in Adobe After Effects. It then cuts seamlessly to the full-CG drones, revving up their rotors and taking off from their launch platforms, complete with a cool slo-mo shot. The drone model was supplied as a 3ds Max mesh but had to be 90% rebuilt to allow for proper texturing and animation in Cinema 4D.
Now airborne, our perspective switches to a drone's-eye view of the course, a futuristic construct of gleaming metal and glowing lights with a distinctly Tron 2.0 feel. "All of the race courses were built in Cinema 4D following strict art direction," explains Alex, "and everything was rendered in Octane for speed, realism and its ability to provide a very high-quality preview in real-time."
The tunnels were created using meshes cloned along a spline, and the tracks were made using a Sweep object with detailed splines to generate the complex forms. To get the nice metallic reflections, an HDRI environment was used with a gradient to provide a sense of variation. "The idea was to keep the lighting as realistic as possible so much of it comes from using emissive textures on things that actually emit light."
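Cloning meshes along a spline boils down to sampling positions at even arc-length intervals. Here is a small pure-Python sketch of that sampling, using a polyline approximation of the spline (the names are illustrative, not part of any Cinema 4D API):

```python
import math

def sample_polyline(points, count):
    """Evenly spaced (by arc length) positions along a 3D polyline.

    `points` approximates the spline as a list of (x, y, z) tuples; the
    result is `count` positions, the basic math behind cloning meshes
    along a spline.
    """
    seg_len = [math.dist(a, b) for a, b in zip(points, points[1:])]
    total = sum(seg_len)
    samples = []
    for k in range(count):
        t = total * k / (count - 1) if count > 1 else 0.0
        acc = 0.0
        for i, d in enumerate(seg_len):
            # walk segments until we find the one containing arc length t
            if acc + d >= t or i == len(seg_len) - 1:
                u = 0.0 if d == 0 else (t - acc) / d
                a, b = points[i], points[i + 1]
                samples.append(tuple(a[j] + u * (b[j] - a[j]) for j in range(3)))
                break
            acc += d
    return samples
```

Each sampled position (plus the local tangent, omitted here) gives a clone its placement and orientation along the track.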
The neon effect is a mixture of lights and emissive materials. "Actual lights were used sparingly," says Alex, "mainly to highlight areas that became too dark, but always as naturally as possible. In some ways you can't use pure black because you risk getting too dark. You need to use a grey and then grade it down in post, if need be. We knew we wanted to keep it dark to really make the light trails pop."
As the drones speed through the sci-fi environment they leave glowing trails in their wake – an effect produced using a fairly unorthodox method. "The light trails were created using Hair objects in Cinema 4D for render speed," explains Alex. "Hair seemed to work very well with Octane and was quite versatile in terms of texturing and lighting. We used emissive materials to create the realistic glow on them. The shots were rendered out with many passes and the scenes were heavily composited in After Effects."
As the drones exit the course they head toward a cityscape, again all modeled in Cinema 4D. The layout was based on a grid by drawing lines that originated from the DRL logo, which is revealed in a pullback at the end of the sequence.
The team rendered out a variety of passes, using an Emission pass to enhance the trails and lights and then Ambient Occlusion, Specular and Lighting passes, depending on the scene. Motion Vector and Depth passes were also used for motion blur and other effects when compositing. "We didn't use Octane's motion blur," comments Alex, "but we did use its depth-of-field on the drone shots at the start of the sequence. It's very quick and powerful. For the rest we decided to keep the control in post-production."
"Octane is an emerging render system with a whole range of settings," he adds, "so the settings for each shot had to undergo a lot of tweaking to get the best result. Rendering locally across different machines, it was important for us to assign the various scenes to each one accordingly. It was labor-intensive to do this but saved us precious render times with the tight deadline. All of the shots were either Direct Lighting or Path Tracing, depending on the kinds of materials we'd used in the scene."
With the rendering complete, work moved to post-production to enhance the overall look and ambience of the shot. "We exported cameras and nulls from Cinema 4D into After Effects and then used [Video Copilot's] Optical Flares. We wanted to keep it very subtle. In terms of the glow it turns out Octane does this very well inside the camera tag. However, we enhanced it slightly in After Effects using render layers."
"Using Cinema 4D in conjunction with Octane gave us a very powerful setup that enabled us to iterate quickly based on our collaborations with an overseas client. Also, because it's fairly new, there is a strong GPU community online that you can get feedback from and troubleshoot issues."
Despite Octane's speedy output, the project still demanded some late nights and long weeks, which had an impact on Dazzle Ship's junk food intake. "We ate quite a lot of burgers during the making of this project," admits Alex, "including one called 'The Devastator,' which has been named Britain's biggest burger and is eight inches high. If timed well you can be in a food coma while waiting for a render."
Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.
All images courtesy of Dazzle Ship.
Dazzle Ship Website: www.dazzleship.com

Spotlight on Raoul Marks (Tue, 29 Nov 2016)
https://maxon.net/en-us/industries/movies-vfx/raoul-marks/
Interview with the Emmy-awarded motion designer and Cinema 4D artist

Furry Fun with the UBS Bank (Mon, 21 Nov 2016)
https://maxon.net/en-us/news/case-studies/advertising-design/article/furry-fun-with-the-ubs-bank/
Topsy the fox is Switzerland-based UBS bank’s mascot. After a long and successful 2D existence they decided to bring him into the third dimension with Cinema 4D.
36635
“Using Cinema 4D to create a 3D character based on 2D drawings was easy with regard to modeling, in part because the client allowed a certain amount of creative freedom for the design process: the new 3D Topsy had to be recognizable as Topsy but we were allowed to design the character to reflect a contemporary look,” explains Michael Scherrer, project director at Pixcube Animation Studio. Not only was Topsy given a virtual reconditioning – all characters were carefully revised and modernized.

“Based on various feature-length animation movies, the client already had an impression of what could be done with a 3D Topsy & Co. but was nevertheless surprised at the amount of work that went into animating an entire ensemble of figures,” remembers Michael.

“The fact that Topsy and other characters had a fur coat made things all the more interesting. Topsy wasn’t the only furry creature in the scenes, which meant that there would be fur-to-fur or fur-to-feather interaction among the characters, which in turn presented quite a number of collision challenges for our team,” said Michael.
36636
In addition to all the characters, the surrounding scenery had to be created in a characteristic “Swiss” look. This meant that mountains, slopes, forests and the characteristic houses and structures also had to be modeled in Cinema 4D. “We made quite liberal use of the MoGraph tools for the creation of the landscapes. Cloner objects, Effectors and render instances were essential elements for this phase. We used the Laubwerk plugin to create most of the vegetation and the Hair feature in Cinema 4D when it came to creating grasslands and pastures.”

“Cinema 4D proved to be a comprehensive program that made it possible for us to meet – and conquer – any challenge we encountered during this project. For example, we were able to save a lot of time by using CMotion to create part of the character animations.”

“We used Cinema 4D’s Standard Renderer in conjunction with the Ambient Occlusion function. We didn’t use the Global Illumination function due to the render times. The renderer delivered the quality we were looking for. We split the rendering processes into three sections: characters, scenery and effects. These were then combined in the compositing phase.”

“We have since created more than 30 spots using this method, which shows that our client is very satisfied with the results – and we are more than satisfied with the results that Cinema 4D and its range of powerful tools delivers,” concludes Michael.

Creative Nuts Uses Cinema 4D for Strictly Come Dancing Titles (Thu, 17 Nov 2016)
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/creative-nuts-uses-cinema-4d-for-strictly-come-dancing-titles/
Piers Helm of Creative Nuts explains the process of creating the glamorous intro for the BBC's annual primetime show.
36593
Now into its 14th year, Strictly Come Dancing is one of BBC One's most successful primetime Saturday night shows, partnering 15 celebrities with professional dancers over a 14-week competition to find the best ballroom and Latin dancer.

The on-screen graphics need to reflect the glitz and glamour of the show, and each year this is down to Piers Helm of Creative Nuts, creative director and co-owner of a boutique motion graphics studio based in Soho, London that specializes in title sequences, channel branding and promos. Helm won the pitch for the very first show in 2004 and has been creating and developing them ever since, refining the sequence and graphics for each series.

While the overall structure of the intro sequence remains familiar, the technology has changed drastically since its first outing in 2004. "What's interesting about the Strictly titles is that they've spanned many technological improvements," says Helm. "From the original titles filmed in SD on Betacam, using real objects and curtains and a real mirror ball, to the move to a 100% HD workflow. During the early days a few assets were created in Cinema 4D while I was learning the program. Now everything is created in Cinema 4D and After Effects apart from the dancers! Who knows, in ten years' time even those could be done in Cinema 4D, too!"

Because of the short timescale involved – he has just three weeks to put the title sequence together – Helm builds on assets from previous years, refining them with each iteration. "I always try to improve the lighting and textures in the scenes, add new elements and create several entirely new scenes," he says. "Last year, for example, I added the patterns on the floors, this year the curtains open at the beginning, and I've tried to make the edits and transitions between dancers more fluid. For me it's always a work in progress; I'm already thinking about what I can do for next year's sequence."

Work begins with a green-screen shoot, that Helm directs, as soon as all the celebrities have been announced. "I film all 15 couples over two days. The time I have with each couple is limited as they are on a tight schedule, fitting in stills photography, filming of idents as well as their dance moves for incorporation into the title sequence. I get about half an hour with each couple but as I haven't met them before, I'm unable to prepare and pretty much have to make it up on-the-fly. Bear in mind that this is the start of the competition, so the celebrities haven't had much dance training and whatever shots I include in the titles have to last the entire series."

With the footage safely on disc, Helm moves to Cinema 4D. A key element is the signature mirror ball, which is part of the show's logo and acts as the entry and exit points for the intro. "The mirror ball was modeled several years ago using the Cloner object," explains Helm. "It has 6,544 mirrored squares, and like a real glitter ball each square is a transparent glass square with a mirrored plane behind it to give it depth. It took a while to create the image that's reflected in it – every year I try to change or improve it but I always go back to the same one. It's a trade secret what it is!"

As the mirror ball spins it generates a series of light beams, which are created using a volumetric spotlight duplicated with a MoGraph Cloner object and varied using Effectors. The initial ‘Strictly Come Dancing' font required tidying up and thickening in Adobe Illustrator so it could be extruded and deformed in Cinema 4D. Glitter particles are generated using standard Cinema 4D emitters, and then enhanced in Adobe After Effects using Trapcode Particular plus shots of real glitter.

The circular backdrop is festooned with flashing lights, which are actually small spheres duplicated using a MoGraph cloner. "I use the Multi-shader in the Luminance channel and a Random Effector to get them to flash," explains Helm. "When rendering I have a separate object buffer for them and use GenArts' Sapphire Glow and Trapcode Starglow to make them illuminate. I tried using point lights but as there are so many it made the scene too heavy."

As the camera tracks back, a set of red velvet curtains is swept away to reveal a sunburst backdrop – all of which was created in Cinema 4D and inspired by Busby Berkeley musical extravaganzas. "Some of the curtains are modeled and a few have been purchased from Turbosquid," admits Helm. "The curtains that open at the beginning are animated using Cloth in Cinema 4D."

At this point the dancers are revealed one-by-one against a rotating circular backdrop, sometimes with elements specific to or relating to each couple and populated with clips of the dancers performing different moves. "The backing dancers are placed in the scene using image planes duplicated in a MoGraph Cloner object. Once I add the dancers into the backgrounds I can then change the colors of the lights to work with the costumes." Helm reveals that the soft smoke in the background is produced using a series of colored Parallel spotlights pointing upwards and set to Volumetric with noise to make them look smoky. The simpler background elements are hand-modeled, while more complex objects such as statues and buildings are purchased online to save time.

With the lighting setup and animation complete, Helm renders the backgrounds out at a resolution of 3104x1746. This gives him some latitude to add camera moves when combining them with the high-res dancer footage.

Alongside the main credit sequence, Helm produces three ‘stings' with variations for the weekday spin-off, Strictly – It Takes Two on BBC Two as well as intros for all 15 couples and on-screen graphics for the Results show on Sunday evenings. "I need to keep things as simple as possible in After Effects," he remarks. "Separate passes are created for Atmosphere (so I can boost the volumetric light smoke), the small lights so I can make them shine and twinkle, and Depth so I can use Frischluft Lenscare for depth of field."

Helm has access to a small, six-node render farm plus two additional Mac workstations, which can be called into action overnight, and uses Thinkbox Software's Deadline to handle the render queue. The whole sequence is output using Cinema 4D's Standard renderer, with anti-aliasing often set to Geometry in order to keep render times manageable. "There's always a compromise," he adds, "and this year the render times varied from ten minutes a frame for simpler scenes to 40 minutes per frame for the mirror ball."

The job relies heavily on Cinema 4D's MoGraph toolset, with a dash of Insydium's X-Particles. But in terms of the challenge, Helm concedes that it's really a matter of timing: "In the past, the titles have been due for delivery at exactly the same time as the birth of both my sons and a house move! It makes for a very busy time!"

"Having been in the industry for many years, starting on the Quantel Classic Paintbox in 1989, I never imagined that a 3D program would end up being my primary tool," says Helm, who only started using Cinema 4D in 2009. "I love using it as it enables me to create things as simple or as complicated as I want. As a designer first and foremost, I find that Cinema 4D allows me to design things fast without technical issues interrupting the flow."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

]]>news-5712Wed, 09 Nov 2016 13:07:12 +0100Sweet-Tooth Typographyhttps://maxon.net/en-us/news/case-studies/advertising-design/article/sweet-tooth-typography/Text is not only about writing but can also be used as an expressive medium – which is something that can be easily realized in Cinema 4D. Vijay had always excelled at painting, drawing and design. A deciding factor for his artistic orientation, and one that helped him develop his unique style, was the fantasy film ‘At the Earth’s Core’, which had a visual style close to what we today call Steampunk. In it, Vijay saw how function can follow design. Together with his affinity for typography, Vijay was able to develop his own visual style.

In addition to his personal works such as ‘Oh Shit’ or ‘Vijay’, in which design elements are used to create fonts, Vijay’s playful approach to font design has also garnered the attention of the advertising community. Vijay has been creating advertising visuals for 15 years for clients such as Lucozade, Miele, Tuborg, Harley Davidson and Sky Bingo.

When Cadbury wanted to create a new billboard campaign that depicted a machine and flowing chocolate as letters, Vijay was given the job. While working at the agency bigdog, he was tasked to create a flagship image for Cadbury World. After a few sketches with pen and paper the design was approved by Cadbury and Vijay got to work on it in Cinema 4D. The look that he designed – in close cooperation with Cadbury – had a touch of Steampunk, with a machine that even Willy Wonka would have liked to have in his factory. A design in true Vijay manner.

Vijay used a variety of tools to create all elements in Cinema 4D. The flowing chocolate letters were created using the Sculpt tools; the remaining letters were created as Text objects and were modified using Cinema 4D’s wide range of modeling tools. After all elements had been modeled, Vijay moved them into position and rendered them. Even though all elements could have easily been rendered in one go, Vijay split the renderings to make color correction much easier. He rendered each part using the Standard Renderer in Cinema 4D with Global Illumination activated. All elements were then combined in Photoshop.

This Cadbury campaign is one of the most impressive examples of a project created in Cinema 4D from start to finish, with the artist having a maximum amount of creative freedom. This project not only reflects Vijay’s penchant for Steampunk but also his vision of font design and how it can be realized from A to Z using Cinema 4D.

Vijay at Behance:https://www.behance.net/veej75 ]]>news-5690Tue, 01 Nov 2016 13:08:55 +0100Making 3D Films – a One-Man ShowCreate a short film by yourself for your thesis in just eleven months? No problem for Shawn Wang with Cinema 4D as his tool of choice! The fact that Shawn was – and still is – an enthusiastic Cinema 4D user and wanted to use this opportunity to expand his skills with the software helped shape his concept: he wanted to portray two robots that were looking for life on a distant planet. Friendship was the motivating factor for the characters and an asteroid storm was going to put this friendship to the test.

Shawn designed both robots as vehicles with added features to give them each a unique personality. “I chose this solution because I was planning on completing the entire film by myself. The vehicles are driven by caterpillar tracks and wheels, which were each given a special rig. These are in fact two separate systems that control the rig. The first system is based on the Constraint tag’s Clamp function, which lets it react to the underlying terrain. The tracks and wheels always had contact with the ground and rolled when the character was moved. The second system was created using XPresso, which I used to execute a series of mathematical calculations to create restrictions for the tracks and wheels. This ensured that they only moved when they touched the ground. Both systems were used to make up the vehicles’ control system and a series of additional XPresso Expressions and several Python scripts were used to create the actual rig.”
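The math behind such a rolling rig boils down to the classic rolling-without-slipping relation: a wheel's rotation equals the distance traveled divided by its radius. The sketch below is a generic Python illustration of that formula, not Shawn's actual XPresso or Python setup; the function name is hypothetical.

```python
import math

def roll_angle_deg(distance, radius):
    """Rotation (in degrees) a wheel of the given radius accumulates
    while rolling `distance` units without slipping."""
    return math.degrees(distance / radius)

# A wheel of radius 1 travelling one full circumference
# turns exactly 360 degrees.
full_turn = roll_angle_deg(2 * math.pi * 1.0, 1.0)
print(round(full_turn, 3))  # 360.0
```

In a rig, an expression like this drives the wheel's rotation channel from the vehicle's accumulated travel distance, so the wheels turn only when the character actually moves.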

Another challenge for Shawn was the high number of assets that he needed to populate his scenes. He created a custom library in the Content Browser for his assets, which made it possible for him to quickly and easily access all the rocks, asteroids, materials and other elements he needed. All he had to do was drag and drop them from the Content Browser into his scene.

Shawn used the Octane renderer, which offered render times of 4 to 15 minutes per frame. He primarily rendered overnight and had finished sequences waiting for him the next morning that were ready for compositing. Rendering took a total of about four months to complete.

Shawn worked on the project for eleven months and it was well worth the effort: his film has won more than half a dozen prizes at various international short film festivals and was chosen as a Staff Pick at Vimeo, where it’s received more than 140k views. Meeting and mastering the challenges that this project offered paid off for Shawn: his Cinema 4D skill level increased dramatically. “In the end,” he explains, “it’s the part of the iceberg that you don’t see right away that really pays off: the tools and usability concept that never let you down when you’re working on a highly complex project!”
Rig demonstration: https://vimeo.com/147696468 https://vimeo.com/137055688]]>news-5634Mon, 24 Oct 2016 12:52:42 +0200An Enchanted Keyhttps://maxon.net/en-us/news/case-studies/visualization/article/an-enchanted-key/In his experimental short film ‘Keys’, Raphael Rau gives keys a life of their own.The works of Raphael “Silverwing” Rau never fail to impress with their virtuoso reproduction of physically correct behavior of materials and light. It was also his intention to create a highly realistic look for his personal experimental project ‘Keys’ while also optimizing his workflow for more complex short film productions.

The keys for which the film is named and their enchanted ways are presented in a medieval setting. Together with the ambient soundtrack by sound designer Simon Damborg, ‘Keys’ unfurls a mysterious atmosphere, which raises the viewer’s curiosity about what secret these keys bear.


Raphael’s penchant for realism is apparent from the very start with the spider webs waving in the wind. These are made up of simple splines that are connected to the geometry of the bowl, dish and floor. To simulate their movement in the wind, the splines were subdivided and a Spline Dynamics tag was added to them. Raphael simulated the extremely light webs by defining a very low gravitational value for the tag. To make the spider webs visible, Raphael assigned them to a Sweep object. “Since real spider webs do not have a uniform thickness, are translucent and are slightly sticky, I tried to simulate these characteristics as well,” explains Raphael. This is why he varied the splines’ radii and used the MoGraph Cloner object to position a variety of dust particles and tiny hairs along the web. A Wind object with turbulence is used to set the webs in motion.

The keys and the entire environment were modeled in Cinema 4D. The new Knife tools added in Release 18 were very useful, as Raphael explains: “Being able to switch between the Knife tool’s modes (Single Cut / Loop Cut) so quickly combined with the snap function when cutting loops dramatically sped up my workflow compared to working with the previous Knife tool.”

Raphael used Octane Render for texturing and rendering. He created several simple shaders to mark the keys’ convex and concave surfaces using black-to-white gradients. Raphael applied a metallic texture to the protruding convex regions and the concave regions were given a darker, rusty texture.

Raphael used the new Voronoi Fracture feature added in Cinema 4D Release 18 to create the animation of the exploding key halfway through the film. Initially, the edges of each fragment were smooth, which doesn’t reflect the realistic behavior of metallic materials. This problem was solved by subdividing the fragments’ segments and then grouping them in order to create realistic-looking edges. Dynamics was used to simulate the explosion of the key by having the fragments of the subdivided key model collide with an invisible cube.

In order to create a very dense atmosphere and spatial depth, Raphael used volumetric lighting for the first time. He positioned the Volume object from the Cinema 4D Octane plugin in the scene and defined a fog with a fitting density. In the scene with the columns, for example, a Spot light is positioned above the key and creates an impressive backlighting that causes the key to cast a visible volumetric shadow through the fog. A second Volume object is used to create the visible clouds at the top of the scene that cast additional shadows and create the ‘God rays’ through the fog.

The final rendering took about five to six days to complete on a local render farm with 14 GPUs. This was quite good considering the numerous effects that were rendered with almost realistic precision, without anything having to be “faked” manually: “All behavior is physically correct and you don’t have to contemplate long at all to know how real fog would behave in this situation.”

]]>news-5556Fri, 14 Oct 2016 10:32:26 +0200Inverted Gravitationhttps://maxon.net/en-us/news/case-studies/visualization/article/inverted-gravitation/Title sequences for television series have to hint at the upcoming content but shouldn’t reveal too much about it. Dan Braga tried his hand at this in the course of his digital media design studies. The show’s story line – which isn’t meant to be logical but rather serves as a basis for spectacular imagery – revolved around a weakening gravitational force on Earth. This story line made it possible for Dan to use features in Cinema 4D for his animations that he had only used sparingly in the past. “There was no getting around using Dynamics to create a convincing effect of a weakening gravitational force,” explains Dan. “Previously I had only used these tools every now and then but for this project they were an essential part of at least half of the animations I created. I had to figure out how to create these animations, how to cache them and how to bake them in the end."

Since this project closely resembled a real-world project, Dan integrated other programs and plugins in his pipeline in order to gain additional useful skills. He used the World Machine tool to create the landscapes, which he imported into Cinema 4D. The trees and boulders strewn throughout the scene were created using the Forrester plugin for Cinema 4D. The Forrester plugin includes a scattering module with which the trees and rocks can be dispersed across surfaces using an algorithm.

What makes this short film so magical is Dynamics, which Dan used to simulate weightlessness. The final scenes were rendered using Octane and fitting music by Bryan Barcinas was added.

The title sequence ‘Negative Mass’ thrives from the surreal imagery that Dan Braga created using Cinema 4D and a lot of imagination. “Cinema 4D’s intuitive interface is like a 3D sandbox that invites you to play around with the ideas you have. With Cinema 4D you don’t have to conquer a mountain of technological challenges before achieving spectacular results,” says Dan about his experience with Cinema 4D!

Forrester plugin web site:http://www.3dquakers.com/ ]]>news-5525Wed, 05 Oct 2016 14:12:05 +0200Sportschau Rebrandinghttps://maxon.net/en-us/industries/broadcast-motion-graphics/sportschau-rebranding/The Sportschau is Germany’s premier sport reporting show that has for decades been a staple of the German sports diet for millions of sports enthusiasts. Designing a new look for Germany’s most beloved sports show is understandably no trivial matter.news-5519Thu, 29 Sep 2016 11:22:08 +0200Scandinavian Minimalism in 3Dhttps://maxon.net/en-us/news/case-studies/advertising-design/article/scandinavian-minimalism-in-3d/Sometimes less is more! This is proven quite impressively by Toke Blicher Møller’s animations that he created using Cinema 4D.Toke’s portfolio is filled with interesting projects but those that really stand out are his own. They emanate the unique laconic melancholy typical of Scandinavian films and books and are portrayed by Toke with maximum realism and a minimum of animation. His piece ‘One Road Two Worlds’ is an impressive example. Created as a title sequence for a documentation about the social and economic inequality in Denmark, its tonality is defined in the very first seconds of the show.

Toke works with textures that are projected onto 3D shapes. These are arranged onto multiple depth layers and animated to create an intense spatial effect that is underscored using subtle effects such as smoke, depth of field and dynamics. The clip ‘The Invisible Man’ is similar in its subtlety and message. The first impression viewers have is that this is a film about super heroes but quickly realize that the character is an invisible homeless person who is not perceived by most people.

“One of the reasons I keep creating such projects is surely their unique tonality, which suits me better than many of the commercial projects I create,” explains Toke. One look at his work such as the film ‘Lines’ or ‘Square’ shows how loyal he remains to his concept with regard to mood and minimalism. The sparseness of his work is due mostly to the fact that he works alone. Even though he shares an office with a production company whose sound designers and producers he can count on, he’s a lone wolf at the computer when it comes to putting everything together.

This is another reason Toke works with Cinema 4D: “Cinema 4D is not only very powerful but is able to handle everything I throw at it. If Cinema 4D reaches its limits I use plugins like X-Particles, which can be seamlessly integrated into my workflow. I often use After Effects, which offers excellent integration and file exchange with Cinema 4D.”

Toke Blicher Møller is a seasoned 3D professional and an artist with formidable talent. With his non-commercial projects he is able to use slick computer graphics to successfully convey various aspects of humanity.

Toke Blicher Møller at Vimeo:vimeo.com/tbmstudio ]]>news-5458Mon, 19 Sep 2016 10:48:26 +0200Kaspersky Lab Advertising Imageshttps://maxon.net/en-us/news/case-studies/advertising-design/article/kaspersky-lab-advertising-images/Kaspersky Labs hired visual effects studio Resight to create illustrations for its IT security products and Cinema 4D was the perfect tool for the job! Security in and of itself is an abstract concept and IT security even more so. Not many people can actually picture what IT security is, which is why Andrey Voytisin found himself confronted with a unique challenge: visualizing the concept of IT security. Kaspersky Lab’s basic idea consisted of using images to symbolize the environment of modern communication, production technology, Cloud and data exchange. As different as these topics may sound, they each represent different aspects of modern technology and also illustrate that everything really is ‘connected with everything else’. The goal of the illustrations was to show customers how these connected situations require comprehensive security measures to protect them.

Stylistically, the illustrations had to resemble science fiction or cyberpunk and be reminiscent of films such as ‘Matrix’ and the color palette had to give a clear impression of computer graphics. Resight worked closely with the client to refine the look of the images. Resight had sent Kaspersky Labs concept illustrations from very early on in the project. As soon as the arrangement in the images was defined, Resight started with modeling and texturing. “We modeled everything in Cinema 4D because we had all the tools we needed in a single application,” explains Andrey and adds, “The only feature we didn’t need for these models was the Sculpt feature!”

Each illustration took about two weeks to complete, which not only included modeling, texturing and rendering but also fine-tuning all details together with the client. “The work on the images often started with pencil sketches on paper, which gave substance to the respective idea. The 3D elements were then created based on this illustration,” continues Andrey.

“Many customers cling to a particular concept, which for them becomes written in stone. Very often, however, new ideas crystallize during the completion process. Fortunately, Kaspersky Labs understands this process and was willing to accept new ideas if they liked them. It was a pleasure working with Kaspersky Labs,” says Andrey, laughing.

The final step in the 3D production process was the rendering, which was done using Otoy’s Octane Renderer. The images were rendered in several passes, which were then each loaded into Photoshop and combined. This made it possible to fine-tune details or to adjust or change colors where necessary.

The illustrations that Resight created for Kaspersky Labs show that Cinema 4D is an excellent tool for creating a wide range of illustration styles. Artists with Cinema 4D in their toolset are definitely well-equipped to meet any creative challenge they encounter!

Studio Resight web site:www.resight.ru ]]>news-5384Thu, 08 Sep 2016 09:59:00 +0200Ugly: A Single Word Sums it Uphttps://maxon.net/en-us/industries/animated-films/ugly/A short film about dynamic simulations with characters so ugly that they inspired this project’s name. This is a project that explores new horizons – and uses Cinema 4D to do it! news-5322Tue, 23 Aug 2016 15:15:00 +0200Robin Hood, the True Herohttps://maxon.net/en-us/news/case-studies/visualization/article/robin-hood-the-true-hero/Artist Carsten Mell used Cinema 4D to create a wonderful comic album about Robin Hood. Carsten wants to use traditional 2D techniques as well as 3D elements to illustrate the 44-page story. The aim is to ease his workflow and also achieve the special look he imagines. Carsten has already modeled all important characters in the story in Cinema 4D, which saved him a great deal of time as opposed to using traditional illustration methods. Parts of the characters such as hands, whose movements are constantly repeated, don’t have to be redrawn each time and only have to be copied and adapted to the character using the Sculpt feature in Cinema 4D. Only Maid Marian’s hands were so different that they had to be modeled separately.

As soon as a new character is modeled, a Character object is assigned to it in Cinema 4D, which lets it be positioned quickly and easily after it’s been weighted. Carsten decided not to save poses for later use. “Surely this would have been a great help for a comic with numerous repetitive poses but I didn’t want to risk the characters looking like they had been mass produced.”

Not all characters will be created in 3D. Characters that only make cameo appearances will be illustrated using traditional techniques. It takes Carsten about two days to complete one page of his comic – as long as he’s not interrupted by other work. He uses the WACOM Cintiq Tablet to draw the 2D elements.

In addition to the characters, Cinema 4D was also used to create accessories and buildings. Carsten paid close attention to historic architectural details in particular and referenced historic blueprints and modern satellite imagery to recreate the settings. Since the story of Robin Hood is not complete without touching on the part that Richard the Lionhearted played in the crusades, historic cities such as Jerusalem and Akkon as well as London and Nottingham were included, which were created completely in Cinema 4D.

Carsten used Cinema 4D’s Physical Renderer, which let him include depth of field and motion blur in his images. These effects played an important role and Photoshop didn’t offer the level of quality or the look that Carsten wanted.

Carsten has completed about 1/3 of the 44-page story and he plans to publish the work himself on Amazon. As soon as Carsten completes this ambitious project, we’ll have another look!

]]>news-5306Mon, 22 Aug 2016 16:14:32 +0200The Art of Mathematicshttps://maxon.net/en-us/news/case-studies/visualization/article/the-art-of-mathematics/Vicarage Studio's poignant homage to Messiaen's Quartet for the End of Time is built from the same numerical rules as its inspiration. To celebrate its 75th anniversary, Sinfini Music commissioned freelance animation director Simon Russell of Vicarage Studio to produce a video for the piece. The final 3m 46s animation is accompanied by The Crystal Liturgy, one of the eight movements in Quartet for the End of Time.

The stunning animation features a single camera move, rendered in one unbroken take, as the viewer is drawn through the mechanical contraptions and polygonal life forms inhabiting an island floating in a monochromatic void. But amid the seeming chaos, the form and movement are all driven by the same numerical laws that Messiaen employed to conceive his music.

"It's based on a talk by mathematician Marcus du Sautoy," explains Russell. "He talked about how Messiaen used mathematics in his music. It's really about the mathematics of nature and how that brings beauty."

Russell suggests that the challenge was in "trying to understand then translate some quite complex mathematical ideas into animation and then form them into a very watchable, interesting format. Once the concept was done it was a matter of dealing with a polygon-heavy scene and making it feel light and synchronised to the music."

The sequence begins with the camera dollying back through a forest of trees and bushes, many of which are generated using the Lindenmayer system or L-System, which employs a set of instructions to describe the growth behavior of plants and trees.

"You tell it to 'Go forward two, turn right one'," says Russell, "and you can also have branching instructions as well. So you can have it like: 'Go forward two steps, and then branch one left and one right.' Then you can say, 'Repeat that'. Or you can do a replacement; you can say, 'Every time I go forward, do the same structure again,' so it branches within itself."

The end result is a complex system that replicates the fractal forms in nature, and in Cinema 4D is built into the MoSpline object's Turtle mode. Having created the spline, it's then a case of using a Sweep object to give it solidity. The system can also insert Clone groups into the structure, which was used to generate the various forms and rotating shapes at the tips of the branches. But while Cinema's L-System was useful, Russell found that he needed more sophistication, which is when he turned to Houdini.
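The string-rewriting core of an L-System is easy to sketch. The following Python is a generic illustration of the rewriting idea Russell describes, not MoSpline's or Houdini's implementation: each generation, every symbol is replaced by its production rule, and a turtle interpreter reads the result ('F' steps forward, '+'/'-' turn, '[' pushes state for a branch, ']' pops it).

```python
def l_system(axiom, rules, generations):
    """Iteratively rewrite each symbol via its production rule;
    symbols without a rule are copied through unchanged."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# A single bracketed rule already produces a branching,
# plant-like path when fed to a turtle interpreter.
rules = {"F": "F[+F]F[-F]F"}
print(l_system("F", rules, 1))  # F[+F]F[-F]F
```

Because every 'F' spawns five more each generation, the geometry grows exponentially, which is why these systems produce convincingly fractal plants from a handful of rules.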

Cinema 4D has offered integration with Side Effects Software's app since R16, enabling you to build assets in Houdini Engine and then load them into Cinema. "The setup was straightforward," says Russell. "I created a fairly simple L-System node tree in Houdini then wrapped it up as an asset with options to allow me to clone elements onto the system in Cinema."

As the camera moves on we see more geodesic shapes that resemble the growth pattern of sunflower seeds or Romanesco broccoli, both of which follow a spiral based on the Fibonacci series of numbers. To build these, Russell employed the Golden Ratio: "If you put a seed or a point at a certain number of degrees [it's 137.5° – the 'Golden Angle'] and repeat that enough times and keep pushing the pattern out, you start to get these Fibonacci spirals or these really organic structures." He tried various methods to achieve these shapes, including particles and cloners, but mainly used an XPresso setup to draw a spline and then clone onto it or sweep or extrude it.
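The Golden Angle construction Russell quotes is commonly known as Vogel's model of phyllotaxis: seed i sits at angle i × 137.5° with a radius proportional to √i. The following Python is a minimal generic sketch of that model, not Russell's actual XPresso setup.

```python
import math

GOLDEN_ANGLE = math.radians(137.5)  # the 'Golden Angle'

def phyllotaxis(n):
    """Vogel's model: place point i at angle i * 137.5 degrees,
    radius sqrt(i) -- yielding the interlocking Fibonacci spirals
    seen in sunflower heads."""
    points = []
    for i in range(n):
        theta = i * GOLDEN_ANGLE
        r = math.sqrt(i)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = phyllotaxis(500)
```

Plotting `pts` (or cloning geometry onto the points) reproduces the sunflower-seed pattern; because 137.5° is derived from the irrational Golden Ratio, no two seeds ever line up along a common ray, which is what packs them so evenly.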

"MoGraph allowed me to go so far," admits Russell, "but when it came to really heavy geometry I built a Houdini asset that allowed me to create even more complex structures. The system took a base piece of geometry – a cube, tetrahedron etc. – and then 'cloned' these shapes onto its own surfaces. Each generation cloned more of the same shapes but smaller in scale. So it created a kind of fractal crystal growing asset. In Cinema 4D you could simply swap in your base piece of geometry and set the number of generations it should grow by. It was interesting to see how the overall shape changed by simply changing the base geometry. Once I was happy with the crystal shape I then baked it out and put it in a Fracture object to animate it."
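The generational cloning Russell describes is naturally recursive. This toy Python version (all names and parameters hypothetical, not the actual Houdini asset) places a half-scale copy of the shape on each face of a cube, generation by generation, and shows how quickly the clone count grows:

```python
def grow(center, size, generation, points=None):
    """Toy 'fractal crystal': record (center, size) for this shape,
    then recursively clone a half-scale copy onto each of the
    six face directions of a cube."""
    if points is None:
        points = []
    points.append((center, size))
    if generation == 0:
        return points
    offset = size * 0.75  # child sits on the parent's face
    for dx, dy, dz in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
        child = (center[0] + dx * offset,
                 center[1] + dy * offset,
                 center[2] + dz * offset)
        grow(child, size * 0.5, generation - 1, points)
    return points

# One cube plus 6 children plus 36 grandchildren = 43 shapes.
print(len(grow((0.0, 0.0, 0.0), 1.0, 2)))  # 43
```

Swapping in a tetrahedron (four face directions instead of six) changes the overall silhouette entirely, which matches Russell's observation about how sensitive the result is to the base geometry.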

The overall poly count of the scene (which tops out at around four million) was kept under control by baking each model out to an Alembic file. "With the crystals and the crystal animation, I would have a really, really heavy MoGraph setup, but just in another document," explains Russell. "You export that as an Alembic file, render it to disk, and then just pull it back in. You can have a huge amount of geometry. It's the equivalent of doing an edit with loads of really good HD footage, that's all on your drive."

As the reveal continues, we see a geometrically patterned landscape, based on the mathematics of tiled structures. Wikipedia provided a rich source of SVG (Scalable Vector Graphics) files that Russell plundered, and then imported using the free CV-ArtSmart plugin from Cineversity. He then deformed the extruded tiles using the sculpting tools to create the curved overhangs at the edge of the landscape.

A key element of the sequence is the mathematical machine with two rotating cogwheels that appear around the 1:25 mark. The upper wheel has 29 teeth, while the lower one has 17, representing the piano, which plays a 29-note sequence on a 17-note rhythm. "Because they're prime numbers, they don't really ever intersect. So that kind of creates a sense of timelessness, which is what the composer wanted."'
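The "timelessness" of the 29- and 17-tooth wheels follows from the two numbers being coprime: the same pair of teeth only meet again after lcm(29, 17) = 493 engagements. A quick check in Python (a generic illustration, not part of the production setup):

```python
from math import gcd

def meshes_before_repeat(teeth_a, teeth_b):
    """Tooth engagements before the same pair of teeth meet again:
    lcm(a, b) = a * b // gcd(a, b)."""
    return teeth_a * teeth_b // gcd(teeth_a, teeth_b)

# 29 and 17 are coprime, so every tooth of one wheel meets every
# tooth of the other before the pattern repeats.
print(meshes_before_repeat(29, 17))  # 493
```

Compare a non-coprime pairing such as 12 and 8, which repeats after only 24 engagements; coprime tooth counts maximise the cycle, exactly the effect Messiaen exploited with his 29-note sequence on a 17-note rhythm.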

Russell describes how he went through the music, keyframing the piano chords and then using those keys to drive the various objects and mechanisms via XPresso. "It's all based off one hand-keyframed element," he says, "which drives the machine in the middle. Each time it moves on one of the setups, it changes a random seed of something. So the cloner expresses in a different way each time; sometimes it's used as a trigger, sometimes it drives it directly, sometimes it's used as a position or whatever. There's a number of fun and fiddly little setups which just push everything."

The sudden release of a swarm of cicadas – achieved via X-Particles – references the fact that this use of prime numbers also appears in nature. The North American periodical cicada only appears in its adult form after either 13 or 17 years, living 99% of its life as a juvenile underground. The adults emerge en masse on a specific day in vast numbers, mate, lay their eggs and then die a few weeks later. It's been suggested that the timing of this appearance – both 13 and 17 are prime numbers – is to prevent synchronisation with any predatory species.

The camera continues its languid journey, revealing a pair of watchtowers and a large wall – referencing the POW camp in which the composer was confined. Again, both structures have their bases in mathematics. The tower structure is defined using the Mongean Shuffle – which Messiaen used to arrange notes in Ile de Feu II – in which you take a card from the top, one from the bottom, one from the top, one from the bottom… and so on. Russell used this shuffling pattern for the intertwining lattice of the tower. The wall, which appears to be made up of randomly shaded blocks, is actually based on the number of different rhythms you can make from eight beats (which is 34) repeated around its perimeter. Indian musicians studied the number of rhythms you can make from sets of beats and as the size goes up, the result is always a Fibonacci number. Indeed, the sequence was suggested by Indian scholar Acharya Hemachandra in 1150, some 50 years before Fibonacci presented his work in the mathematical treatise, Liber Abaci.
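Both of these numerical games are easy to verify. The sketch below (generic Python, not Russell's setup) implements the top/bottom dealing pattern described above and Hemachandra's count of rhythms built from one- and two-beat notes, which comes to 34 for eight beats:

```python
from collections import deque

def shuffle_top_bottom(deck):
    """Deal alternately from the top and the bottom of the deck
    into a new sequence -- the shuffling pattern behind the
    tower's intertwining lattice."""
    d = deque(deck)
    out = []
    while d:
        out.append(d.popleft())   # one from the top
        if d:
            out.append(d.pop())   # one from the bottom
    return out

def rhythm_count(beats):
    """Hemachandra's count of rhythms made of short (1-beat) and
    long (2-beat) notes totalling `beats`: the Fibonacci
    recurrence r(n) = r(n-1) + r(n-2)."""
    a, b = 1, 1  # r(0), r(1)
    for _ in range(beats - 1):
        a, b = b, a + b
    return b

print(shuffle_top_bottom([1, 2, 3, 4, 5]))  # [1, 5, 2, 4, 3]
print(rhythm_count(8))                      # 34
```

The recurrence arises because any rhythm either starts with a one-beat note (leaving n-1 beats) or a two-beat note (leaving n-2), which is exactly why the counts land on Fibonacci numbers as the length grows.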

Eventually the full scale of the floating island is revealed, and again, while it may just appear like a large geometric mass, its origins lie within the music. "I used Houdini's audio analysis nodes in CHOPs [Houdini's Channel Operators] to create the inverted mountain shape the island sits on. The mountain is actually a visual representation of the various frequencies of the piece of music baked out as geometry."
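The underlying idea – baking a magnitude spectrum into displacement heights – can be sketched generically. This naive DFT is not a reproduction of the Houdini CHOPs pipeline; the function and buffer are illustrative. It maps an audio buffer's frequency content to a row of terrain heights:

```python
import cmath
import math

def spectrum_heights(samples, bins):
    """Magnitude of the first few DFT bins of an audio buffer, scaled for
    use as displacement heights on a terrain mesh."""
    n = len(samples)
    heights = []
    for k in range(bins):
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        heights.append(abs(coeff) / n)
    return heights

# 32 samples of a pure tone at bin 4: all the energy lands in that bin
tone = [math.sin(2 * math.pi * 4 * t / 32) for t in range(32)]
print([round(h, 2) for h in spectrum_heights(tone, 8)])
# [0.0, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0]
```

Run per time window, rows of such heights stacked over the duration of the piece give exactly the kind of frequency "mountain" described.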

As the island fades into the distance, it becomes clear that the amazing thing about the animation is that it's all done in a single take, with just a handful of passes composited in Adobe After Effects. But despite its seeming complexity, Russell says it was quick to render, taking just eight to ten seconds per frame on his latest-model 12-core Mac Pro. "It was fairly quick to render because there's really just one light, so there's no reflections going on, there's no subdivided polygons. It's quite a graphic, illustrational piece."

The whole scene also features very simple materials, with around six shades of grey – just to differentiate the items on screen – and a degree of luminance. The final greyscale image was then tone mapped in Adobe After Effects using the built-in Colorama filter.

A separate cel-shading pass was rendered and composited over the tone-mapped image. However, because of the size of the elements, Russell needed to limit the effect: "Obviously you've got tons of polygons in the background, and the cel-shader will do a one-pixel line on everything so it gets really heavy. So I used a depth pass to show the cel effect closer to the camera. Then as you go further back, it loses that quality, so you can only see the cel shader on things which are closer to you."
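The depth-weighted blend Russell describes can be sketched per pixel: the cel pass replaces the base image near the camera and fades to nothing with distance. A minimal grayscale example (the function name and linear falloff curve are assumptions, not the production setup):

```python
def composite_cel(base, cel, depth, max_depth):
    """Blend a cel-line pass over the base image per pixel, with line
    strength falling off by the depth pass: full at the camera, zero at
    max_depth and beyond."""
    result = []
    for b, c, d in zip(base, cel, depth):
        weight = max(0.0, 1.0 - d / max_depth)  # depth-pass falloff
        result.append(b + (c - b) * weight)
    return result

# Near pixel (depth 0) takes the cel line; far pixel keeps the base shade
print(composite_cel([0.8, 0.8], [0.0, 0.0], [0.0, 100.0], 50.0))  # [0.0, 0.8]
```

In a real compositing package the same thing is achieved by using the depth pass as a matte on the cel layer.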

Incredibly, all the design, modelling, animation and rendering took Russell just two months to accomplish, a feat he modestly puts down to being organised and working in a structured way. "You build it as a very light wireframe and just keep everything very simple," he says. "Just keeping organised. So everything in layers so you can turn stuff on and off easily; just being very disciplined in that sense." He also suggests that the sequence is simpler than it looks: "The complexity is in the synchronisation and the modelling and the ideas. I think because of the kind of thing it is, it looks more complex than it is."

In terms of Cinema 4D, Russell cites the combination of MoGraph and XPresso as its major strength, enabling him to get up and running quickly. "Cinema 4D was great in the fact that it allowed me to quickly realise complex ideas. You can just throw stuff together and play with complex systems. Just the ability to go from sketch to idea very quickly."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Vicarage Studio.

Vicarage Studio website: http://vicarage.studio

Science Center Visitors Explore New Dimensions
Mon, 15 Aug 2016 12:12:28 +0200
https://maxon.net/en-us/news/case-studies/visualization/article/science-center-visitors-explore-new-dimensions/

“What is real and what is not?” A new exhibit created with Cinema 4D – and an invitation to enter a digital dream world. “We break the rules by breaking down the barriers with digital magic,” says CEO and founder, Donald Lim.

E-mmersive Experiential Environment (E3)

The E3 installation invites visitors to explore new worlds using interactive and immersive technologies such as Virtual Reality and 3D Projection Mapping on a “Cube Mountain” and an interactive digital video wall environment, complete with track ball and an RFID rotary table. The exhibition was opened early last year and has already attracted many thousands of visitors. The basic idea is to reach out to local people and tourists, making science fun and bringing it to life using the latest high-technology tools and innovative educational methods.

The exciting new space features six giant screens with a freestanding 7m x 7m x 7m projection-mapped Cube on one side. When visitors enter they are immediately surrounded by the images projected on the walls – created using Cinema 4D and powered by 29 Panasonic Laser Projectors.

“We created multiple video layers, including live-captured chromakeyed interactive layers that integrate seamlessly with other real-time video on the high-resolution wrap-around screens,” says Donald Lim of DigiMagic. “Visitors can also interact freely and influence the content – for example, when you roll the drum, the video content changes from 3D wireframe to low-polygon to high-resolution raytrace rendering, and this adds another dimension of interest that makes learning fun.”

The DigiMagic team also used MAXON Cinema 4D to synchronise the 3D mapping and create content – including fish, a coral reef and a kaleidoscope. “MAXON's software made the project easier and faster,” says Adrian Chan, visual effects art director at DigiMagic, “and enabled seamless transition between the 3D mapping and Dataton Watchout.”

Interactive and immersive

The adventurous project was a collaboration between Science Centre Singapore and DigiMagic, in partnership with exhibit designer SPACElogic and Ars Electronica.

The audience response has been fantastic so far, with visitors of all ages keen to get involved in the experience and interact with exhibits, including “What is real and what is not?” – the spectacular show on Cube Mountain. Younger visitors also enjoy being able to play with exhibits, and see the instant effects on the walls. Two of the most popular exhibits are the “giant track ball”, which is used to roll a boulder on the floor of the ocean, and Leap Motion, where visitors “steer” fish through a virtual aquarium.

“We have introduced new forms of sensory interaction in the virtual environment,” says Donald Lim, “so visitors can experience another dimension and enter a digital dream world.”

Digimagic website: http://www.digimagic.com.sg

The Great Projection Mapping Bake-Off
Thu, 11 Aug 2016 14:43:03 +0200
https://maxon.net/en-us/news/case-studies/advertising-design/article/the-great-projection-mapping-bake-off/

Learn how a new start-up business used Cinema 4D to make spectacular cake presentations. Candy And Grim Cakes got the chance to try this out by entering a charity bake-off hosted at Hoults Yard in Newcastle-Upon-Tyne. Although not very high-profile, it gave the start-up a chance to field-test its projection-mapped cake sculptures, as well as gauge the impact and audience reaction – which, as it turned out, was huge, bagging first prize for design and second prize for taste.

Simon explained what was involved in the design process and workflow after they had been given the brief and had an idea for implementing it: "The 3D design is done in Cinema 4D. Not only is Sketch and Toon a very useful tool for creating concepts and making a series of quick modifications, but using Cinema 4D I can also create an accurate 3D CAD drawing of the cake and layers, allowing me to calculate exact dimensions, quantities and which cake baking tins to use. It also allows me to engineer the cake structure should I, for example, want to create something like a gravity-defying cake and need to work out forces."
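As a rough illustration of the kind of tin calculation Simon mentions – the tin sizes and fill depth below are hypothetical, not from the studio's workflow – batter volume for a round tin follows directly from cylinder geometry:

```python
import math

def tin_batter_volume(diameter_cm, fill_depth_cm):
    """Batter volume in litres for a round tin filled to a given depth."""
    radius = diameter_cm / 2.0
    return math.pi * radius ** 2 * fill_depth_cm / 1000.0

def pick_tin(target_litres, tin_diameters, fill_depth_cm=5.0):
    """Smallest tin diameter (cm) whose capacity at the given fill depth
    covers the target batter volume; None if no tin is big enough."""
    for diameter in sorted(tin_diameters):
        if tin_batter_volume(diameter, fill_depth_cm) >= target_litres:
            return diameter
    return None

print(pick_tin(2.0, [15, 20, 25, 30]))  # 25
```

The same CAD dimensions drive the quantity calculation: multiply each layer's volume by the recipe's batter density to get ingredient amounts.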

The next stage is a practical one. Simon and Fiona actually bake the cake and decorate it. They are exploring using 3D printing for this part of the process, so that at some point in the future they can print directly from a design in Cinema 4D. An Ultimaker is currently used for bespoke cutters and molds. However, Simon plans to acquire the ChefJet when it is launched in late 2017 because it can accurately print 3D chocolate and sugar structures. At the moment, though, it is not classified as a food-safe machine in the UK.

Once the actual cake has been made, it has to be scanned because there is some artistic license and variation in the baked version compared to the designed model. Simon detailed the process used to scan it back into the computer: "Originally we were using the Structure IR 3D scanner for the iPad with the Pro version of Skanect. However, we soon discovered that photo-based scans were more accurate than the IR scans with regard to fine detail. At the moment I am mostly using 123Dcatch but I would much sooner use a MAXON product to make it easier to deal with the data. Failing that we may upgrade to PhotoScan by Agisoft, or wait and see if anything comes from Apple's acquisition of Metaio and Primesense. For the photo capture we have a large photo tent studio and use a motorized turntable to uniformly rotate the cake during capture."

Cinema 4D was then used to create a bespoke UVW map from the 3D scan to take directly into MadMapper. Cinema 4D was also used to distort the 3D scan to compensate for projector lens distortion, as this cannot be done as yet in MadMapper.

The aim of the animation is usually dependent on the client’s brand and brief as well as the story Candy And Grim Cakes is trying to convey. However, typically the projection-mapped cake sculptures are not created as a single start-to-finish performance but are expected to play as a loop. Therefore, the challenge is always to create a flowing and highly engaging experience regardless of the stage at which the audience starts watching. For the animation, most Cinema 4D tools got a workout, as Simon explained: "To be honest, there is very little we do not use in Cinema 4D Studio. Naturally, MoGraph, dynamics and deformers play a large part in the animation process. On occasions I have also used XPresso as well as third-party plugins such as Voxygen, Transform and Signal."

The idea is that the projection mapping follows the 3D contours of the baked cake, which makes it different from other types of projection mapping onto cakes, as these typically use flat surfaces. The best experience for the watching public comes when the animation interacts with the cake, and this is what Candy And Grim Cakes was aiming for with the Newcastle event. Simon and Fiona took their experience of large-scale, projection-mapped events – such as outdoor building projections, or projecting onto stage sets and cars – into the world of sugarcraft to great effect.

In the case of the Newcastle bake-off, Simon was up against the clock trying to produce the 1080p animation, remarking: "Render time was a little intense. It took just 12 hours to create two minutes' worth of content, including render time."

Having produced the animation, the next challenge was to successfully project it onto the actual cake. This requires knowing what kind of projector is going to be used, and again Cinema 4D is an integral part of the process. Candy And Grim Cakes has used a range of projectors with different lens throw ratios, depending on budget and venue restrictions. Generally they need to be at least 4000 lumens and full HD for the effect to work properly. Fine-tuning can be achieved in post-production and in real time using MadMapper, although MadMapper can currently only import baked OBJ files from Cinema 4D, in addition to stills and clips.
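The throw-ratio arithmetic behind projector placement is straightforward: the required distance is the throw ratio multiplied by the desired image width. A small sketch using the minimum spec quoted above (the function names are illustrative):

```python
def throw_distance(throw_ratio, image_width):
    """Projector distance needed for a given image width:
    distance = throw ratio x width (same length units)."""
    return throw_ratio * image_width

def meets_spec(lumens, horizontal_px, vertical_px):
    """Check the minimum spec quoted above: at least 4000 lumens, full HD."""
    return lumens >= 4000 and horizontal_px >= 1920 and vertical_px >= 1080

print(throw_distance(1.5, 2.0))      # 3.0 metres for a 2 m wide projection
print(meets_spec(4500, 1920, 1080))  # True
```

A short-throw lens (ratio well below 1.0) lets the projector sit much closer to the cake, which is why lens choice matters so much in tight venues.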

Simon added: "Again, Cinema 4D was used to calculate the projector position in relation to the cake, allowing for the projector lens properties. In some instances, Cinema 4D's cameras were used to capture the characteristics of a specific projector lens, making mapping a less painful experience. This was particularly useful with short and ultra-short throw lenses. On occasions we have even used reflection surface mirrors in conjunction with projectors, creating bespoke rigs to redirect the projection or reduce projection distance."

Having triumphed with the cake in Newcastle, Candy And Grim Cakes is currently working on a Mondoshawan cake (a robot alien from the '90s movie 'The Fifth Element') as well as exploring a self-contained modular system, using Cinema 4D with MadMapper/Millumin and some custom-built hardware, for other cake decorators to use. As Simon concluded: "In theory, our unit could also be used in other industries such as retail and exhibition, but that is way into the future."

Duncan Evans is the author of Digital Mayhem: 3D Machines, from Focal Press.

See more at www.candyandgrimcakes.com

Mapping a New Face with CONNECTED COLORS
Wed, 10 Aug 2016 16:07:00 +0200
https://maxon.net/en-us/industries/advertising-design/connected-colors/

Tokyo-based WOW studios uses Cinema 4D to combine 3D and live performance with stunning results. While Asai took the role of creative director and technical director, WOW worked as the production company for the live action and CG animations. “Face mapping is a new technique of projecting moving images onto a living face, incorporating and using the natural expressions and expressiveness of the face as part of the final result, similar to the art of makeup, which has millennia of history,” he said.

“Each project in Intel’s promotion was to have a different theme. The theme of ‘Connected Colors’ was inspired by the connected world and Intel's concept of the ‘Internet of Things’, or IoT. The connected world has two sides – a positive side that brings efficiency, optimization, harmony and symbiosis to people’s lives, and another side that recognizes the need for operational rules and administration. I wanted the message for this project to question the use of technology, and decided to create an artwork expressing the positive side of connections.

“When we focus technology on life, we see that animals, plants and insects are all based on the same structures – on DNA – and all have different colors. I used the motif of the colors that all forms of life have in nature, including humans, and connected those colors.”

Previous to the Intel project, Asai had worked with projection mapping and virtual reality on his own. His first venture into face mapping was a private project titled ‘OMOTE’, and later he did some face mapping for Japanese TV programming and music videos.

What was different about this project was the level of precision it demanded from the tracking and projection. Lack of precision would affect latency, alignment and color reproduction, causing the audience to feel disconnected. Therefore Asai, WOW’s CG artist Shingo Abe and his team made substantial improvements to the latency, alignment and color systems within their production and projection set-ups, giving them much better results compared to Asai’s previous projects.

Using computers with 6th generation Intel Core i7 processors and a light projector, Asai has successfully turned the human face into a new kind of canvas. The processors are tuned with Intel’s overclocking toolkit to push their performance and achieve the level of precision Asai wanted.

Asai and Abe selected their hardware carefully. Asai said, “For tracking we used five OptiTrack motion capture cameras, which capture at 240fps, and Intel PCs. For projection we had a Panasonic projector, looking for the lowest latency, highest resolution and widest dynamic range we could achieve.”

Abe said, “For the animations and elements of the animated textures I used Cinema 4D, compositing in After Effects, which is a very fast and natural workflow for motion graphics. I could produce many iterations very quickly, giving me time to get closer to the result and quality I was looking for. MoGraph also has great tools for this kind of project. I use After Effects and Cinema 4D for most of my work, combined with Photoshop and Illustrator.”

First, the artists took a 3D scan of the performer’s face, unwrapped the UVW and captured the texture of the face, and then imported the geometry and the texture into Cinema 4D. After building a 3D model of the face, they created the animations to fit the UV of this model, and then applied the animation as a texture. This was done for each animation, and these 3D models then became the projection source, or generator.

At performance time, the position and rotation of the performer’s face were calculated from the tracking data in real time, and the 3D model was synchronized with them. Thus, an accurate perspective view of the 3D model’s animated textures was projected onto the real face.
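Conceptually, the per-frame synchronization amounts to applying the tracked rotation and translation to the scanned model before projecting. A simplified sketch with a single yaw axis and hypothetical values – the real system tracked the full pose of the head:

```python
import math

def transform_face_model(vertices, yaw_degrees, offset):
    """Apply a tracked head yaw and position offset to the scanned face
    model for one frame, so the projected texture stays locked to the
    performer. Vertices are (x, y, z) tuples."""
    yaw = math.radians(yaw_degrees)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    ox, oy, oz = offset
    moved = []
    for x, y, z in vertices:
        # rotate about the vertical (y) axis, then translate
        rx = cos_y * x + sin_y * z
        rz = -sin_y * x + cos_y * z
        moved.append((rx + ox, y + oy, rz + oz))
    return moved

# A vertex on the nose swings round as the performer turns 90 degrees
nose_tip = [(0.0, 0.0, 1.0)]
result = transform_face_model(nose_tip, 90.0, (0.0, 0.0, 0.0))[0]
print(tuple(round(v, 6) for v in result))  # (1.0, 0.0, 0.0)
```

The projector then renders the transformed model from its own calibrated viewpoint, which is what keeps the texture perspective-correct on the moving face.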

Abe said, “Cinema 4D’s speed was important because this was the first face mapping project we have done at WOW. The whole project had to be completed in six months, but because it took us some time to get our ideas and approaches into shape we only had one month to spend on production work. The speed of Cinema 4D really helped us.

“One of the first and most important aspects of face mapping we discovered was how sensitive viewers are to even slight changes to the face that are incongruous or out of harmony with the performance. Compared to small alterations made to landscapes or product shots, for instance, audience reactions to faces are much stronger. So we had to do a lot of testing and work rapidly through many trials.”

Cinema 4D’s MoGraph tools were especially useful in three of the animations. The first is the scanning sequence at the start of the video (at 0:12). Later on we see a transition from the model’s skin to a black-and-yellow lizard’s skin (0:50), and a few seconds later a black-and-red frog’s skin forms and slides off the face (1:03).

MoGraph’s Inheritance Effector was used in the scanning sequence to drive the transition of a composited wireframe as a 3D effect that moves over the face and then spreads out from the nose. When this was rendered with the Cel Renderer as a post effect, it created a complex pattern.

For the lizard’s skin, the Shader Effector forced a gradual change across the texture of the face’s surface. The Random Effector was used to produce the randomized, fractured black-and-red pieces of the frog skin, while the slide-off effect was created using Dynamics, which have parameters that can be set through the Effectors.

Once all of these animations, looks and effects were completed as WOW wanted them to be, precise projection data was generated using original software the team developed for this purpose. Interestingly, although this production gives you an impression of interactivity and spontaneity, the actual projected movie is fixed. Only the projection, driven by the motion capture, was interactive. This made it possible to storyboard the piece in a deliberate way.

At times, the model appears to blink and open her eyes to reveal an animal’s eyes, or to suddenly smile. In those cases, Abe explained, she did not actually blink in real time. Viewers are watching only a projected animation. “We have done a lot of research and created numerous simulations for eye expressions, adjusting them over and over,” he said. “People can immediately sense even the smallest incongruity around the eyes. In daily life, they watch other people’s eyes and, from there, read slight changes in feelings. To test those projections, we used a mockup of the scanned face model, Cinema 4D and other software until we could finally create very natural blinks and smiles.”

Asai commented, “Modifying details of the face is very sensitive work. It is fairly easy to express humor or fear, or achieve grotesque results. Conversely, beautiful expressions on faces are very difficult.”
Watch Video

Nobumichi Asai website: http://www.nobumichiasai.com/

Discovering Your Future Self
Wed, 10 Aug 2016 11:34:13 +0200
https://maxon.net/en-us/news/case-studies/advertising-design/article/discovering-your-future-self/

The annual Pause festival is the tech show for creative spirits. Visitors are invited via a video clip, and this year’s invitation was created by seasoned Cinema 4D artist Brett Morris. The team Brett assembled to tackle the project included Patrick Goski, a digital sculptor, and Stephen Panicara, designer and animator. Brett and his team gathered visual elements for inspiration, which consisted of three main items: the Marble Canyon, Arizona, cliffs known as ‘The Wave’, a bust and a fluids effect. “During the design phase, a lot of what we had in our mind’s eye was doable but also presented major technical challenges,” remembers Brett. “But we knew from the start that our visual concept had great potential. It quickly became clear that the ‘wave’ formation would have to be constructed in 3D. We wanted to track a shot through the cliffs, which we wanted to depict with a highly stylized structure. We built the assets in Cinema 4D and shaped them using the Sculpt feature.”

The next hurdle the team had to clear was somewhat higher: a wax bust on whose back melting wax had to flow horizontally to form stalagmites. “Here we used a special setup for X-Particles. The first attempt resulted in a monotone look that was much too smooth. Fortunately, X-Particles is a fully integrated plugin and any Cinema 4D tag can be used with it. Simply adding a Jiggle tag gave us the irregular look we wanted for the particles as well as the natural look we had imagined!”

In the next sequence, the bust – which is used as a visual element throughout the motion response clip – had to partially melt and disintegrate into two halves that then rotated to face each other. “We already knew exactly how the melted bust should look but we had no idea how we wanted to create this look,” said Brett. “Fortunately, Patrick – who is also part of the MAXON family – decided to take on the job. He sculpted the melting bust. Since the Sculpt tag saves all sculpting subdivisions, we also had a simplified low-poly version of the model that we could animate. Without this model, the polygon count would have been immense and we would have had to do without the animation,” explains Brett. “Cinema 4D is like a dependable colleague who has your back when things get dicey!”

Looking back, Brett sees this project as something special: “This was a very ambitious project and at the beginning we thought we wouldn’t have enough time to complete the project as imagined. But somehow we were able to realize all the ideas we initially had for the project. In fact, it never really felt like “work” but rather like we were kids playing with our favorite toys – only that our favorite toy now is Cinema 4D!”

Brett Morris website: www.brett-morris.com/

Everybody Walk the Dinosaur
Wed, 10 Aug 2016 11:19:06 +0200
https://maxon.net/en-us/news/case-studies/visualization/article/everybody-walk-the-dinosaur/

Creating anatomically accurate dinosaur skeletons for a Microsoft Kinect-based exhibit has proved something of a dream job for Stuart Pond Design. The Animal Simulation Laboratory of Manchester University pitched the idea for a dinosaur simulation that would enable members of the public to control a 3D dinosaur skeleton. The system uses an Xbox One Kinect that reads the person's movements and drives the muscles in the simulator, revealing how a creature of that scale and weight might move.

The project is the brainchild of Dr. Bill Sellers – a computational primatologist – who enlisted the help of graphic designer Stuart Pond. Pond has been working as an artist and animator for more than 25 years, specialising in scientific visualisation, but also has another, somewhat unexpected, string to his bow: "My other passion is palaeontology," he says. "Especially dinosaurs. I've been fortunate to travel to digs in the badlands of Montana and been on field trips into the deserts of Utah and Arizona, but my favourite area for finding dinosaurs is the Isle of Wight! It's here we find the dinosaurs and their footprints that I've been working on for the past few years."

He explains that he was a Research Associate at the University of Southampton for a couple of years before starting his doctorate. He planned his PhD (on Early Cretaceous Wealden ankylosaurs, in case you were wondering) so he could apply the skills he learned doing scientific animations to his research. "Working on this project allowed me to work closely with the palaeontology team at Manchester, which was a real pleasure as well as a learning experience, and use my passion for the 3D work I do on a daily basis as a working artist."

"I've been using Cinema 4D since it was ported to the Mac," he adds. "It's my choice for both commercial work and research. It's important that my PhD also feeds back into my day job, as being a scientist as well as an artist enables me to communicate effectively with the medical scientists I work with and has helped develop my research skills. Learning new skills is essential for both artists and scientists, and I'm lucky enough to be able to do both. Science and art are a powerful combination!"

The dinosaur simulation is driven by a modified version of the GaitSym multibody dynamics system, developed by Dr. Sellers. It's a forward dynamics modelling program that lets the user apply forces to a skeleton, and which then uses a physics engine to calculate the motion. In the case of GaitSymKinect, it uses the input from the user's movements applied to 3D dinosaur models as a means of showcasing the real-time physics and to help understand how fossil animals might have moved.

Once Dr. Sellers had the go-ahead from NERC, the first thing was to create some high-quality 3D models of a variety of dinosaur skeletons. Stuart Pond was contacted by Dr. Charlotte Brassey, who works with Dr. Sellers and who explained the concept behind the project. After a meeting to discuss the details, Pond got to work on building the skeletons.

"Making the models as accurate as possible was key," he explains, "as they would need to be rigged with virtual muscles. We also needed to keep the polygon count as low as possible so the software could animate in real time, and the models needed to be suitable for 3D printing. All the bones were modelled from scratch and to life size within Cinema 4D."

Before starting work, Pond sourced data from LIDAR and photogrammetry scans, as well as referring to scientific literature and first-hand observation of fossils and casts. "A skeleton has a lot of bones and each needs to be looked at individually to ensure it's correct when modelled," he says. "Modelling organic structures brings its own challenges and this encouraged me to think about the shape of each bone before I started work on it."

He began by blocking out the shape of each bone in Cinema 4D, using as few polygons as possible. The basic form was then taken into ZBrush, where Pond sculpted the finer details before using zRemesher to retopologise the mesh. This was then transferred back into Cinema 4D using the GoZ bridge, where he could use the native sculpt tools, and then assemble each skeleton before exporting to FBX files for use in GaitSym.

"I did this because ZBrush's sculpting tools are quick and easy to use," he explains, "and I find the sculpting workflow and symmetrical modelling much more intuitive in ZBrush, which is vital when creating a dinosaur skull, for instance. I did use Cinema's sculpt tools for adjusting many of the models though, especially the vertebrae where taking each one individually into ZBrush would have proved too time consuming. I like the fact I can move freely between the two apps as this is very important when setting up a workflow for any project that involves a lot of modelling."

Most of the bones were sculpted separately, although Pond admits that some of the vertebrae were simply adapted from ‘master' models: "These bones are repeated many times in the skeleton and whilst each bone is unique there are quite a few common features and you can take advantage of that. In these cases I would model a cervical, a dorsal and perhaps two or three types of caudal vertebrae and then adjust those in between with Cinema 4D's sculpting tools."

Cinema 4D's MoGraph toolset also came into play for replicating and positioning similar bones such as vertebrae, saving Pond hours of work. "I would replicate and scale the various parts of the vertebral column (for instance the vertebrae of the tail) using MoGraph, then convert to meshes and adjust the sculpt and reposition as required. In the case of Edmontonia I used MoGraph for much of the armour as many of the osteoderms [bony plates or scales] are pretty much the same shape, with some repositioning by hand."

Overall, Pond created around 210 bones for each of the six dinosaurs, taking about three days for each model. The armoured Edmontonia took longer because it features an additional 178 spikes and plates. "However it was worth the extra effort because it looks really cool," he adds.

GaitSymKinect and all the skeleton meshes are available for download at http://www.animalsimulation.org, so you can try them for yourselves in Cinema 4D. And despite their use in real-time visualisation, they're surprisingly high-res. "GaitSym is more than capable of handling models with this sort of poly count and greater," explains Pond. "We wanted to make these models as useful as possible to other researchers and so the mesh density was kept as low as possible whilst retaining scientific accuracy."

Clearly the combination of art and science has been the ideal project for Stuart Pond Design, made all the more special as his creations come to life within the GaitSym software. "It's always a thrill seeing your work being used in any situation and in this case it is very satisfying to know we can get actual data using these meshes. GaitSym is a remarkable piece of software and I'm very pleased the models work as planned, both in the Kinect version that allows users to control the dinosaurs and in the version Bill uses for biomechanical work. The fact they look good 3D printed too is great – I love the idea a person could print out a full-sized Tyrannosaurus rex if they wanted to."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Cinema 4D Makes Game Characters Part of Virtual Reality
Sun, 31 Jul 2016 16:16:00 +0200
https://maxon.net/en-us/news/case-studies/games/article/cinema-4d-makes-game-characters-part-of-virtual-reality/

SEHSUCHT uses Cinema 4D and the Unreal engine to create one of the first-ever interactive VR music videos. In addition to affecting the technical realization, the VR production also influenced the project’s design and conception, as Mate describes: “In short, VR means that you have to work without cuts and the viewers themselves determine how the scene is viewed. You can’t frame a scene as you usually would. The viewer’s attention has to be drawn by properly setting up the scene and giving the viewer more time to explore their environment.”
After SEHSUCHT had completed the video’s conceptual phase, they started designing the characters. The characters were modeled in Cinema 4D and loaded into the online auto-rigger Mixamo in whose animation library SEHSUCHT found the movement sequences they were looking for. The project required a video game feel and the Mixamo tool offered 80% of what the team was looking for as presets.
Cinema 4D’s character tools were used for fine-tuning. The animations were exported from Mixamo and converted to Motion Clips in Cinema 4D. These Motion Clips were then combined and mixed to create seamless transitions between the sequences. SEHSUCHT used the Motion Layers to animate details such as special hand gestures, which weren’t included in the motion capture clips. “The Motion Layers were a great help for the character animations because they made it possible to modify and relocate the motion capture clips like you would in a video editing program,” explained Mate. SEHSUCHT used Microsoft Kinect in conjunction with iPi Soft Motion Capture to create additional motion capture sequences.
Before the 3D assets could be exported from Cinema 4D to the Unreal engine, the Bake Object feature had to be applied so the animations would run smoothly in the game engine.
With their interactive VR music video, SEHSUCHT impressively demonstrates just how fascinating VR can be, which is why this project was selected to participate in the Kaleidoscope VR World Tour, which showcases the best VR experiences in cities around the world. This nomination shows that Cinema 4D is an ideal tool for creating 3D VR content!
Note: The interactive VR version for Oculus Rift is available for free at the wearvr website. Visit YouTube to see the non-interactive video version.

Credits
MODERAT „Reminder“
Produced by SEHSUCHT Berlin
Screenplay and direction: Mate Steinforth
Concept, co-art direction and graphics by PFADFINDEREI
]]>
news-5106 | Wed, 13 Jul 2016 21:43:25 +0200
Virtual Escape From the Daily Routine
https://maxon.net/en-us/industries/games/virtual-escape/
A lot of stressed out urbanites long for a break in unspoiled nature. This longing was the inspiration for the 360° VR video ‘Longing for Wilderness’.

news-5105 | Wed, 13 Jul 2016 13:47:02 +0200
BBC TWO: Branding with Character
https://maxon.net/en-us/industries/broadcast-motion-graphics/bbc-two/
How Cinema 4D, Cineware and Adobe After Effects enabled Vincent London to create its humorous BBC Two idents quickly and efficiently.

news-5070 | Fri, 15 Apr 2016 10:41:00 +0200
Get Lucky with Cinema 4D
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/get-lucky-with-cinema-4d/
Sport betting is on the rise everywhere and China is no exception. Betting offices are creating adverts to attract an increasing number of players – and Cinema 4D is at the creative forefront.

When China Sports Lottery approached Cinema 4D artist Yan Ge to create an advert for sports betting, several parameters had already been defined: They wanted to combine 3D elements with live footage. In the confines of a warmly furnished workroom, a narrator was to tell the story of how a lucky guy had won the lottery. This had to be illustrated in a cartoon style and take place across the length of the desk. The scene begins with an actor who explains that he won the lottery and then lays the ticket on the desk.

The ticket then unfolds and transforms into a betting office, which is where the winner bought his ticket. This and all other scenes that unfold on the desk were created in Cinema 4D and composited into the live footage. In the course of the film, streets and tunnels emerge, busses and taxis drive by, a thunderstorm erupts, restaurants pop up and speech bubbles constantly appear that comment on what is happening.

Combining 3D elements with live footage in this manner is something that has only rarely been seen on Chinese television – which is exactly why China Sports Lottery decided to use this method. The tool used to achieve this was Cinema 4D’s Motion Tracker feature, which has been available since Release 16. With the Motion Tracker, markers can be added to existing footage and are then tracked by the software. This movement information can then be passed on to a camera in Cinema 4D, which in turn lets the virtual scene be mixed with the live footage. If the lighting mood of the live footage needs to be applied to the rendered elements in the virtual scene, HDR images can be used in conjunction with image-based lighting to do so.
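Conceptually, once the tracker has recovered the camera's position for each frame, any virtual object can be projected into the footage from the matching viewpoint. A minimal pinhole-projection sketch, ignoring camera rotation and lens distortion; all numbers are illustrative, not from the actual production:

```python
# Hedged sketch: once tracking recovers a camera position per frame, a
# virtual point can be projected into the footage with a pinhole model.
# Rotation and lens distortion are ignored; all numbers are illustrative.

def project(point, cam_pos, focal_px):
    """Project a world-space point for a camera at cam_pos looking down +Z."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal_px * x / z, focal_px * y / z)

# The same desk-level prop seen from two positions of a moving camera:
print(project((0.0, 0.0, 4.0), (0.0, 0.0, 0.0), 1000.0))  # (0.0, 0.0)
print(project((0.0, 0.0, 4.0), (0.5, 0.0, 0.0), 1000.0))  # (-125.0, 0.0)
```

If the tracked camera path is even slightly off, this projection drifts against the footage, which is why the artists had to re-track until the markers were set properly.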

That sums up the advert’s basic concept in a nutshell. However, a wide range of additional tools was needed to complete the intricate story. Artists Rong and Ma handled everything related to Cinema 4D in this project. In the scene in which a bridge appears out of nowhere, they worked extensively with the Polygon Warp to Spline function, which made it possible to design the bridge elements before creating the animation. This made it easier to make changes without affecting existing animations.

The final sequences are the advert’s crowning element. The lottery-winning hero floats in a balloon together with his sweetheart over a picturesque virtual landscape at the end of the desk. Rong and Ma made extensive use of Cinema 4D’s Hair feature to create and animate the vegetation on the cliffs and hills as well as the grass from which the balloon took off. The artists remarked: “We also used XFrog vegetation but the lion’s share of the work was definitely done in Cinema 4D!” The clouds in these sequences and the storm clouds earlier in the animation were created with Thinking Particles using a semi-transparent shader. After the elements had been rendered, they were composited for the final film in After Effects.

Looking back at the project, Rong and Ma say, “Working with a feature like the Motion Tracker is not as easy as it sounds. Because the scene had to be done in a single take, we had to track the footage several times until we had all markers properly set so the camera movement could be accurately reproduced and in turn carried over to the virtual camera in Cinema 4D.”

Rong and Ma are students of Yan Ge, who is a certified Cinema 4D instructor: http://www.maxon.net/education/service-menu-partner/partner-locator/certified-instructor-ci.html
]]>
news-5064 | Mon, 04 Apr 2016 12:19:00 +0200
Cinema 4D Jumpstarts Chuck the Chameleon
https://maxon.net/en-us/news/case-studies/games/article/cinema-4d-jumpstarts-chuck-the-chameleon/
Most games that run on multiple platforms are developed using Unity 3D, and creating content for these games can easily be done with Cinema 4D!

What do seasoned gamers with years of 3D experience do when their ideal game is not available in the app store? They create the game themselves, like the three-person team behind the adventures of Chuck the chameleon, whose mission is to explore 3D levels and gather soap bubbles along the way.

After the initial planning phase, the right tools had to be selected with which to realize the project. The team decided to use Unity 3D as a development tool since it could also be used to develop for Android systems in addition to iOS. Graphic artists Alessandro Maniscalco and Elisa Salgarelli created the 3D elements using Cinema 4D.

Cinema 4D had everything the team needed to create all in-game elements, but programmer Matteo Porchedda explained that the devil lies in the details: “The exported elements as such worked well but they were not memory efficient! If multiple instances of a given object were in a scene, they would be converted to geometry on export, which in turn dramatically increased the file size. Therefore, I created a tool using Python that converts instances into Unity Prefabs. In Unity 3D we used a second tool to quickly and easily position the Prefabs, if necessary.”
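Matteo's actual tool is not public, but the idea behind it can be sketched: objects that share identical mesh data are collapsed into one prefab, and each placed object becomes a lightweight instance carrying only a transform. The names and data below are hypothetical:

```python
# Sketch of the idea behind the instance-to-prefab tool described above
# (the actual tool is not public; names and data here are hypothetical).
# Objects sharing identical mesh data collapse into one prefab, and each
# placed object becomes a lightweight instance carrying only a transform.

def collapse_instances(objects):
    """objects: (name, mesh_key, transform) triples -> (prefabs, instances)."""
    prefabs = {}    # mesh_key -> name of the object that becomes the prefab
    instances = []  # (object name, prefab name, transform)
    for name, mesh_key, transform in objects:
        if mesh_key not in prefabs:
            prefabs[mesh_key] = name
        instances.append((name, prefabs[mesh_key], transform))
    return prefabs, instances

scene = [
    ("fence_01", "fence_mesh", (0, 0, 0)),
    ("fence_02", "fence_mesh", (2, 0, 0)),
    ("rock_01",  "rock_mesh",  (5, 0, 1)),
]
prefabs, instances = collapse_instances(scene)
print(len(prefabs))   # 2 -> only two unique meshes are stored
```

Memory use then scales with the number of unique meshes rather than the number of placed copies, which is the saving the team was after.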

Cinema 4D gave the artists a wide range of freedom for designing levels, which also posed challenges. As Matteo stated, “Since we were developing for tablet devices with limited performance, many of the levels created by Alessandro and Elisa proved to be too detailed and had to be optimized. Fences (Ital.: steccato) were very memory intensive and many could be removed without affecting the character of the levels too much. After removing many of the fences throughout the game I was eventually given the nickname ‘Mr. Steccato’!”

After the objects had been modeled, texturing and lighting were next on the list, all the while making sure to get the best out of each individual object. This was done using MAXON’s mesh painting tool Bodypaint 3D, which allowed UVs to be unwrapped very quickly. “The UV function Stretch to Fit Borders proved to be an invaluable feature,” remarked Matteo, “that made it possible to create seamless transitions between various textures!” Lighting was done using Global Illumination and shadows were added in Cinema 4D. Light Mapping and Texture Baking were used to generate new texture maps from these scenes, which could then be used in Unity 3D. In all, over 2,000 textures were created for Bubble Jungle, many of which, such as those for Chuck and the monsters, were hand-painted in Bodypaint 3D.
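At its core, texture baking of this kind multiplies each texel's base color by the lighting computed for it and writes the result into a new map that the game engine can display without runtime lighting. A hypothetical single-channel sketch, not the actual Cinema 4D baking code:

```python
# Hypothetical single-channel sketch of texture baking: each texel's base
# color is multiplied by the lighting computed for it, and the result is
# written into a new map that a game engine can use unlit.

def bake_lightmap(albedo, irradiance):
    """Combine per-texel base color with baked lighting into a new texture."""
    return [
        [round(a * i, 3) for a, i in zip(row_a, row_i)]
        for row_a, row_i in zip(albedo, irradiance)
    ]

albedo     = [[0.8, 0.8], [0.5, 0.5]]   # flat base colors
irradiance = [[1.0, 0.4], [0.9, 0.2]]   # per-texel GI result
print(bake_lightmap(albedo, irradiance))   # [[0.8, 0.32], [0.45, 0.1]]
```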

It took about two years to conceive, design, texture, export, implement and test all 59 scenes for the game. Bubble Jungle has now been completed and Chuck the chameleon can be experienced on iOS or Android devices. The makers are also planning on making Bubble Jungle available on other platforms.

]]>
news-5102 | Wed, 24 Feb 2016 15:18:00 +0100
Monitoring Spectre
https://maxon.net/en-us/industries/movies-vfx/spectre/
How Vincent London used Cinema 4D to deliver a host of displays and infographics for the latest Bond movie.

news-5016 | Wed, 24 Feb 2016 12:45:00 +0100
DBLG Designs a Flight of Fantasy for Decca Classics
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/dblg-designs-a-flight-of-fantasy-for-decca-classics/
“The brief was refreshingly open and we were given a real chance to experiment.”

"Working to the incredibly moving Arvo Pärt track 'Spiegel im Spiegel' we started to explore the emotional journey we're taken on by the sound of the strings," explains Jason G. Wiley, designer with London agency DBLG. "The brief was refreshingly open and we were given a real chance to experiment and take the visuals somewhere we felt was quite special."
The core idea, he explains, was to take the viewer on a voyage through various stylized abstract landscapes, with the intention of experimenting with animated textures as a feature of the final sequence.
The two-minute animation is actually one long continuous shot, which was inspired by a similar camera move in the 1979 Russian film 'Stalker.' However, this posed its own problem in managing the large file size. "We split the scene into five parts to be worked on individually," says Wiley, "then compiled them into a single scene for rendering. It was challenging to find out what the whole animation would feel like as one shot."
The piece begins with a violin made of glass, which Decca wanted to look like a classic Stradivarius. "We began with a basic model of a violin we had bought and then used Cinema 4D's modeling tools to adapt it accurately using references sent by the client. After we had finished accurately modifying the violin model we then cut it into two pieces!"
For the lighting, Wiley says the idea was to keep the scene bright and clean, mimicking an "other-worldly art gallery." This was achieved using just two area lights, without any high dynamic range images. "The first area light is designed to illuminate the entire scene equally," he explains, "and is directed downwards above the scene and casts soft shadows. The second area light is off to the right and its job is to catch specular highlights and enhance the geometry and bump maps."

"We thought the glass might be tricky," Wiley adds, "but were fortunate in the way the glass reflected and refracted the environment around it, which really helps to add realism to the texture."

The violin strings, which extend throughout the scene, were created using a Circle object swept along a spline. The End Growth value was animated in the Sweep object to make the strings appear as if they're growing. For those sections where the strings needed to sit on the landscape, a Spline Dynamics Hair tag was applied to the splines so they would fall due to gravity and conform to the geometry beneath.
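Animating a Sweep object's End Growth effectively sweeps the profile along only the first fraction of the spline's arc length. A rough stand-alone sketch of that truncation, using a polyline approximation of the spline (not Cinema 4D's implementation):

```python
# Hedged sketch of what animating a Sweep object's End Growth amounts to:
# the profile is swept only along the first `growth` fraction of the
# spline's arc length, so increasing it over time makes a string "grow".
import math

def grow_spline(points, growth):
    """Return the polyline truncated to `growth` (0..1) of its arc length."""
    lengths = [math.dist(a, b) for a, b in zip(points, points[1:])]
    target = growth * sum(lengths)
    out, run = [points[0]], 0.0
    for (a, b), length in zip(zip(points, points[1:]), lengths):
        if run + length < target:
            out.append(b)
            run += length
        else:
            # Cut the final segment at the exact arc-length target.
            t = (target - run) / length
            out.append(tuple(pa + t * (pb - pa) for pa, pb in zip(a, b)))
            break
    return out

string = [(0, 0), (10, 0), (10, 10)]
print(grow_spline(string, 0.5))   # [(0, 0), (10.0, 0.0)] -- halfway along 20 units
```

Keyframing `growth` from 0 to 1 over time is the polyline analogue of keyframing End Growth in the Sweep object.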

As the move progresses we encounter a colorful swirling landscape with the appearance of flowing liquids. However, on closer inspection, the geometry doesn't actually move. Instead, it's a simple yet surprisingly powerful feature buried in Cinema 4D's Material Layer system.

"We'd been experimenting with animated textures within Cinema 4D and found some really interesting stuff within the Layer options of the textures themselves. We settled on using the Distort option within the texture effects, which enables you to distort textures driven by noise. It's very simple to use and gave us the flexibility to animate the textures in lots of weird and wonderful ways. We used this same animated texture for our bump map, which gives the feeling of a kind of fluid moving over the geometry."
After the smoothly flowing colors we arrive at a more angular area, with jagged strings and chunks of glass. The landscape is punctuated by animated scrawls that were captured using a graphics tablet in Photoshop. "We really liked the idea of drawing these live and recording the screen," says Wiley, admitting that it was only during the color grading process that they noticed the cursor pointer had been left in one scene! "After converting the video to a JPG sequence it was possible to use this animation as a shader. The scribbles animation has an alpha channel so we camera-mapped this onto the floor to avoid any UV mapping issues."

Our journey continues over a gently billowing cloth with an animated texture. Unsurprisingly, Cinema 4D was used to animate it. "We love how quickly you can get results from Cinema's cloth simulations," comments Wiley. "It was a lot of fun playing with all the wind settings - in the end a gentle breeze did the trick. The textures on the cloth are animated in the same way the landscape is using the Distort effect within the texture. However, we didn't animate the bump map for this one."

The end of the piece sees the strings winding their way onto the violin's tuning pegs, an effect achieved by using the same Circle object swept along a Helix spline. The Sweep object's Growth values were then keyframed to match the rotation of the tuning peg.

V-Ray was the render engine of choice, combined with Cinema 4D's Standard Renderer, which provided the hard shadows. Global Illumination was used throughout, which Wiley says was crucial for adding the realism they were looking for. To generate the hundreds of HD 1080 TIFF images, Wiley turned to RebusFarm, the external rendering service.
Post work for the sequence was quite minimal: "All the textures are animated within the Cinema 4D scene, and we chose a single camera move for the video, so no editing was needed. We aimed for the best render we could out of Cinema 4D to avoid extensive post work, which might compromise the clean look of the piece."

Render passes included specular, area shadows, depth and a matte for the strings. Additional work was carried out in Adobe After Effects and included compositing in the hard shadows from Cinema 4D as well as adding depth of field along with motion blur. "We also directed a color grade at Envy Post Production to really enhance the bright, clean look and wild, vivid colors," Wiley adds.
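The compositing described here is essentially per-pixel arithmetic on the passes: the shadow pass multiplies the beauty, the specular pass is added on top, and the strings matte gates a separate grade. A simplified single-channel sketch with illustrative values (not DBLG's actual After Effects setup):

```python
# Hedged single-channel sketch of multi-pass compositing: shadowed beauty
# plus specular, with a matte gating a separate grade on one element.
# Pixel values are illustrative floats, not production data.

def comp(beauty, shadow, specular, matte, grade=1.0):
    """Composite per-pixel: beauty * shadow + specular, graded inside the matte."""
    out = []
    for b, sh, sp, m in zip(beauty, shadow, specular, matte):
        px = b * sh + sp
        px = px * (1.0 - m) + px * grade * m   # grade only where the matte covers
        out.append(round(px, 3))
    return out

beauty   = [0.8, 0.6, 0.4]
shadow   = [1.0, 0.5, 1.0]   # 0 = fully shadowed
specular = [0.1, 0.0, 0.2]
matte    = [0.0, 1.0, 0.0]   # middle pixel belongs to the strings
print(comp(beauty, shadow, specular, matte, grade=1.2))  # [0.9, 0.36, 0.6]
```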

Cinema 4D provided pretty much all of the tools DBLG needed to complete the project in just two months using a single iMac. "It lets us express our imagination. It's like capturing a dream," claims Wiley enthusiastically.
Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.
All images courtesy of DBLG.
]]>
news-4980 | Tue, 19 Jan 2016 10:33:00 +0100
The Body as an Abstract Animation
https://maxon.net/en-us/news/case-studies/visualization/article/the-body-as-an-abstract-animation/
As part of a school project, students learned how to use Cinema 4D while creating a metaphoric fantasy voyage through the human body.

Mohole is a renowned media school in Milan, Italy, that teaches artistic forms of expression. Students learn all aspects of creation from film to photography, graphic design, comic illustration and 3D animation. Among the programs used to teach animation is Cinema 4D. The primary focus lies on architectural visualization, motion graphics and 3D character design. The Forbhidden project was created for the Milan Design Week, at which Mohole wanted to present its student works.

Students at Their Prime

The basic idea was to use visual metaphors to showcase the wonders of the human body. A team of four students was responsible for all phases of production, which lasted four weeks and was realized with Cinema 4D: modeling, texturing, animation, lighting and rendering. The students were part of a two-year 3D masters program, of which they had already completed the first semester. The skills they had attained in the course of their studies enabled them to handle all project requirements with ease and also master the complex challenges the project posed. One of the many challenges was optimizing the scenes and materials in order to shorten render times.

Seamless Workflow

With Cinema 4D it was easy for the team to separate the scene into layers using complex multi-pass renderings. The passes contained different channels, e.g., alpha and depth of field, which made it possible to add more effects in the compositing phase while keeping render times short. This made it possible to add depth of field effects to backgrounds in After Effects. Cineware was used to fine-tune the scene in After Effects. Finally, Adobe Premiere was used for color correction, cutting and adding audio tracks.

Student tutor Enrica Paltrinieri characterized the work with Cinema 4D as follows: “Cinema 4D is so easy to understand and learn that you can quickly realize a project and experiment with various approaches to solutions and workflows without spreading yourself too thin. The seamless integration with After Effects lets students concentrate fully on their creative work without having to deal with bothersome technical issues!”

Creating on-screen graphics for actors to interact with and relate to is a specialty of London-based Territory Studio. However, when those screens have to be scientifically accurate or coherent in terms of existing NASA technology and practices, it becomes a whole different ballgame. That was the challenge facing David Sheldon-Hicks, founder and creative director of Territory Studio. Fortunately, having already worked on massive sci-fi projects like Jupiter Ascending and with Ridley Scott on Prometheus, David had a good idea of what was going to be required and how Cinema 4D could help make it happen.

In the seven months it took to create The Martian, five artists at Territory created around 400 screens across eight different sets, and Mission Control alone featured almost 100 screens. The displays and animations were all played back live on set for the actors to engage with, and most of them were interactive as well.

The film was based on Andy Weir's best-selling novel about stranded astronaut Mark Watney. It's set 20 years in the future and tells the story of NASA's third manned mission to Mars. The key concept was that the plot, and thus the on-set screen displays, was rooted in real science rather than fanciful popcorn sci-fi. This meant that Territory had to research the science being used before getting started with any of the designs.

David explains what was involved: "It was an incredible amount of research, from poring over reference photos of Mission Control and JPL at NASA provided by Dave Lavery, Program Executive for Solar System Exploration at NASA, to vehicle schematics, and the Pathfinder transmission specs. Plus, how NASA received and decoded messages. And, knowing that NASA is always one step ahead of current technology, we had to imagine ways to represent things, thinking about technologies they are testing now or have not even started developing. We also needed to understand what the on-screen data feeds meant and why they were necessary and when they came into play. Sometimes it felt like a crash course in rocket science and space exploration – it was definitely interesting but it also made the work more challenging to try and get our heads around."

Art Director Marti Romances pointed out how everyone was on a learning curve: "I have to say I learned a lot about things and how they work that I hadn’t known before this project. For instance, knowing what material the camera is pointing to by shooting a laser at it and analyzing the light reaction from it. Learning how all those things work to be as authentic as we could was a big challenge."

Peter Eszenyi, head of 3D, explained what he had to do: "My role was to design and deliver 3D plates, renders and animations to be used in the on-set screens. Initially, the brief was to develop a look and create animations that show the location of the HAB on Mars in different lighting conditions and states. One of the crucial moments in the film is when NASA figures out that Watney is alive by looking at satellite images of the area and realizing that the solar panels have been cleaned. This initial brief was extended to show the giant storm around the HAB, the rover driving through various regions on Mars, creating the pictures that the Pathfinder beamed back from the surface of Mars, creating various 3D elements for the screens such as launch simulations, potatoes, soil samples and so on."

The team used Sketch and Toon plus the standard cel renderer for some of the wireframe meshes, and they used X-Particles with the standard renderer to render the more particle-based assets like the potato displays, soil sample screens and the dust storm.

One of the more challenging areas was the actual NASA Mission Control set because of the need for realism but in a slightly futuristic setting. David Sheldon-Hicks explained how Territory approached it: "With a brief to remain authentic to current data conventions, we researched all the screens that Dave Lavery had forwarded. We studied what data was prioritized and when, how that was organized and depicted on screen and in the Mission Control space, and how the crew interacted with it. It included what commands were given and how that changed the data display. We also talked to NASA about how they think that will evolve over the next 20 years. Once our research had been finished, we created a visual language that was very true to the data requirements and the spirit of NASA's current Mission Control. The backgrounds we chose were black and dark blue with white fonts and light blue indicators. Red was used to highlight mission critical data and indicate warning status. The overall look of the interface is serious and authoritarian but the hierarchy of information is clearly readable to tie in with story points."

On the screens where the terrain is being displayed, along with the dust storm, the look steps away from the technical, scientific style to show actual landscapes. To create these, Territory obtained a very low-resolution model of the area in Jordan, which doubled for the Mars surface. The team used the plugin DemEarth for that. However, since the available dataset was not detailed enough, they remodeled the area based on satellite and location photographs. Sub-polygon displacement was used for the mountains. Proxy models were used for the HAB, the solar panels and the Rovers, and these were swapped with the high-resolution models supplied by the production. Since the exact coordinates of the actual location as well as the shoot dates were known, it was possible to create accurate lighting setups for the relevant hours. X-Particles was used to simulate the base of the dust storm and a smoke simulation on top of that was created with TurbulenceFD.

As the screens were live on set for the actors to interact with, last-minute changes were sometimes requested. One example was to change the size of a crater, which was initially around a kilometer or so in size, to be 100 kilometers wide. Cinema 4D made it easy to remodel the crater and when the size was approved, details could be quickly added to the base mesh. The standard renderer was used, so it was very fast to create the final frames with the changes applied.

For the interactive screens, most requests from the production were for image sequences for Territory's on-set engineering partners at Compuhire to program. They had to be programmed to display images that refreshed within a realistic time-frame, and the animations and simulations were usually reduced to one image every three seconds. Other screens were programmed to simulate typing scenes from the crew's laptops or Mission Control computers, so no matter what the actors were typing, the right message appeared on screen.

David explains the workflow required to create the interface graphics for the MAV simulator that astronaut Martinez uses from the main ship (the Hermes): "It was all designed in Adobe Illustrator and featured some interactive buttons that had to be pressed on-set following a sequence. We isolated all the interactive buttons to be programmed for the on-set performance. In the middle of that screen we had to render a visualization for each stage of that remote-controlled probe. So we animated and rendered each different stage of the lift off of the MAV in a way that could be triggered on-set, connected with those buttons. The MAV 3D renders were done in Cinema 4D using a wireframe pass to achieve a more realistic look."

With all the design work being done during the day, most of the rendering work was done overnight on individual 3.5GHz, six-core Intel Macs. After seven months of intense science, Peter Eszenyi, head of 3D, summed it up with, "As always, Cinema 4D was a reliable tool that gave us all the options we needed to create the huge number of screens for Ridley Scott's The Martian."

Territory was naturally delighted when The Martian won 'Best Motion Picture Musical Or Comedy' at the 2016 Golden Globe Awards, held at the Beverly Hilton in Los Angeles, California. It was the second win of the night for the film. The Martian actor Matt Damon took home 'Best Performance by an Actor in a Motion Picture Musical or Comedy.' Ridley Scott was also nominated in the 'Best Director - Motion Picture' category. See more at: www.goldenglobes.com.

Duncan Evans is the author of Digital Mayhem: 3D Machines, from Focal Press

You can see more of the on-set screen graphics from The Martian here: www.territorystudio.com
]]>
news-4837 | Tue, 05 Jan 2016 15:52:00 +0100
Getting a Slice of the Action
https://maxon.net/en-us/news/case-studies/advertising-design/article/getting-a-slice-of-the-action/
Creating a visualization for a product that hasn’t even been finished requires a flexible and powerful toolset. Here’s how Cinema 4D made it happen.

By Duncan Evans

Fundraising websites like Kickstarter are a great way of getting projects off the ground that would otherwise never see the light of day. One such idea was Slice, a media player powered by a Raspberry Pi core. The problem was getting people to back a hardware device that hadn’t been completed yet, especially when the look of it was going to be a key selling point. That’s when the team behind Slice, Five Ninjas, turned to Toby Pitman to create a 30-second animation in HD resolution that showcased the features as well as the physical form it would take.

For Toby, that meant building an animated visualization of a media player that, at the time, only existed on paper. The product had been designed but was in the process of being prototyped. Toby had to show prospective backers on Kickstarter how the unit was put together and explain the basic feature set in a compelling way to get them to fund it.
Toby explained the main objectives: "Getting everything to scale was probably the most important goal. There were many separate pieces like the hard drive, ports and Pi Compute Module that were all modeled individually and had to fit together on the main circuit board. As the product didn't physically exist yet, all the pieces were modeled to their correct real-world measurements so they all fit together properly."

In practice, that meant a combination of methods to get the measurements for the various parts. Toby received a CAD file of the casing that underwent re-topology so the dimensions of the real-life version were brought in as a guide. To back this up, he was also sent a real circuit board, which was bare at the time, so it could be accurately measured. The same process was repeated for the Pi Compute Module.

Toby explained how he got the rest of the measurements: "Pretty much everything else was extracted from manufacturer’s data sheets. I was sent part numbers they intended to use and simply looked them up and got the measurements. Many parts have a technical blueprint that you can use as a guide for modeling."

Then it was over to the Cinema 4D modeling tools themselves. Toby photographed the circuit board and traced over it in Adobe Illustrator to create the layout. The result was imported into Cinema 4D as splines and then extruded. He revealed that the hardest part of the process was, "probably the hard drive, which, ironically enough, you don't really see. Most of the parts are really simple in nature but some of them, like the ports, look complex as they have a lot of extra detail bolted on. It's mostly so small that you can't see just how basic it is. The most time-consuming part was manually positioning the resistors and chips on the main circuit board. I'm glad I wasn't the guy who actually had to weld them for real on the prototype."
The real challenge was deciding which tool was going to give the best result in the shortest time and this is where a good working knowledge of hard surface modeling paid dividends. Toby explained, "Sometimes you can just use basic primitives for parts, like the chips and resistors, and sometimes you just have to get stuck in with the Knife tool and build up the detail. Learning about edge flow and topology are probably the best tools in your arsenal. Just a simple thing like modeling with symmetry will save you a bunch of time."

Once the circuit board was UV unwrapped, Toby then built the Color, Bump and Specular texture maps in Photoshop by lifting the details off the hi-res photos he’d taken. The circuit board alone was made up of nearly 300 separate components.

The other feature that really saved time was MoGraph, which was mainly used for the duplication of parts like the LEDs. Toby built just one then used Object mode to clone it onto some simple polygon strips. The lighting effects themselves, a key part of the selling pitch for the project, were done using Adobe After Effects. The polygons at the center of the LED were split off and assigned to an Object Buffer. This Buffer pass was then masked using a simple animated matte layer to reveal it fading on and off around the strip in After Effects. Toby then used the VC Optical Flares set to react to the Luminance value on the Object Buffer animation to produce the simple light effect.

In the animation, the camera pans through the scene as the various elements are revealed to show off what’s inside the Slice box. Toby admitted that camera work isn’t his strongest skill but here Cinema 4D made the job relatively easy as he explained: "I used a Camera Morph in conjunction with Target tags to move between the various angles I wanted to catch. That's a great tool, especially for me. The focus distance was set for each camera and then rendered out as a Depth Pass. This was used with Lenscare in After Effects to create the blurred depth-of-field effect."
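A depth pass drives post blur by mapping each pixel's distance from the focus plane to a blur radius. The formula below is a simplified stand-in for what a plugin like Lenscare does, not its actual optical model; all depths are hypothetical:

```python
# Simplified sketch of depth-pass-driven depth of field: blur radius
# grows with distance from the focus plane, clamped to a maximum.
# This is a stand-in, not Lenscare's actual model; depths are hypothetical.

def blur_radius(depth, focus_dist, max_blur=10.0, falloff=0.5):
    """Blur radius in pixels for a depth sample; zero at the focus distance."""
    return min(max_blur, abs(depth - focus_dist) * falloff)

depth_pass = [2.0, 4.0, 4.0, 9.0, 40.0]   # meters, one sample per pixel
radii = [blur_radius(d, focus_dist=4.0) for d in depth_pass]
print(radii)   # [1.0, 0.0, 0.0, 2.5, 10.0]
```

Setting the focus distance per camera, as Toby did, simply changes `focus_dist` for the frames rendered from that camera.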

The project also gave Toby the chance to try out a feature he wasn’t familiar with. He cobbled together an assortment of different Macs to go with his main iMac 3.4GHz i7 unit and used Team Render to network-render the animation. It saved a huge amount of time and worked flawlessly. Even so, after spending 35 hours rendering he noticed that he’d overlooked an intersection in the geometry. Fortunately, he didn’t have to do it all again, explaining, "Luckily I render to PNG sequences so I could just tweak the offending frames and re-render them while still working in After Effects on the final comp."

Toby concluded, "The more I use Cinema 4D the more I realize what a killer bit of software it is."

Duncan Evans is the author of Digital Mayhem: 3D Machines, published by Focal Press. All images courtesy of Toby Pitman.

Mockingjay Part 2 is the fourth and final installment in the epic dystopian science fiction film series The Hunger Games, and the second of two films based on the novel Mockingjay, the final book in The Hunger Games trilogy by Suzanne Collins. Jennifer Lawrence reprises her role as Katniss Everdeen, the reluctant leader of the rebellion against the Capitol in the post-apocalyptic nation of Panem, who must bring together an army while all she holds dear hangs in the balance. The film was released on November 20, 2015 in the United States.

Hristova explains that the studio’s primary focus on Mockingjay Part 1 was to generate interactive on-set monitor graphics that would stream live during shooting and that for Mockingjay Part 2 the team was given a more extensive range of post-heavy challenges that included the exclusive use of Cinema 4D to create two main hologram effects and to composite hero monitor screens that required story specific information and animations to advance key plot sequences.

“Our aesthetic reach in this film was much broader and required us to deliver a more advanced design style, as we were asked to create detailed 3D hologram content for the Capitol – marking the first time that Hunger Games film audiences could get a close-up look at the capital city of Panem and the Capitol, its chief legislative building, as well as a collection of other Panem districts,” says Grunfeld. “We were also tasked with generating the hologram for the ‘Nut’, a strategic military compound located in an underground mountain bunker, which houses all the military equipment for the Capitol.”

“Although Lawrence and the Lionsgate team had very clear ideas about the aesthetic needs of each corresponding environment in Mockingjay Part 2, the collaboration gave us plenty of liberty to experiment with style and feel,” said Grunfeld. “Meeting the technical complexities of a movie of this scope demanded that we rely on the various toolsets in our Cinema 4D pipeline, which gave us the creative flexibility to explore and tweak story notes throughout the project – critical in delivering a number of beautiful storytelling graphics on some very challenging design sequences. In combination with After Effects, we were able to generate external compositing tags and compositing project files throughout all of the hologram sequences, which further resulted in an efficient workflow.”

The “Nut” hologram, in particular, challenged the Cantina Creative team to deliver a fully detailed, photoreal and precise 3D mountain range with only a single 2D matte painting as reference. 3D artist Jayse Hansen, who worked with Cantina on Mockingjay Part 2 in preproduction and on location, designed the initial hologram concept, including the entire interior of the Nut (crew quarters, armory, hovercraft hangars, etc.). “Leveraging the landscape generator in Cinema 4D to block off shapes, together with the software’s expansive sculpting tools, gave me great creative latitude, so I could hand off a number of very precise hologram looks and models of the mountainous terrain surrounding the ‘Nut’ and Capitol from different perspectives,” said Hansen. These concepts were then further refined by Stephen Morton, who created a final workable version that helped the team work out positions for important story elements across several hologram sequences.

“In addition to the sculpting tools, which proved critical in allowing artists to further customize and organize the project in layers and work nondestructively to churn out dozens of iterations of mountain topography, we used Cinema 4D to render a number of passes, namely a few variations of cel renders, layered procedural shaders, and a depth map. We also exported light and null data to use for additional elements in comp,” adds Morton.

“The time effector tool was also extremely useful on the mountain hologram to rotate topography designs uniformly across nearly 30 shots in the sequence and made it easy to explore rotation speeds and successfully arrive at something that both the client and we were pleased with.”

Cantina Creative website: www.cantinacreative.com/

Tue, 01 Dec 2015 15:28:00 +0100
Man vs Machine: Fusing creativity with technology with the help of Cinema 4D
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/man-vs-machine-fusing-creativity-with-technology-with-the-help-of-cinema-4d-1/
For the Cannes Gold Lion-awarded design studio, the creative process is all about the synergy that occurs between imaginative concepts and the tools used to implement them.

“Technology and ideas just go hand in hand,” begins ManvsMachine managing director Tim Swift. “We ended up selecting the name because it sums up our approach – individuals coming up with unique concepts, and then driving them forwards using the latest technology.”

One example of this philosophy in action is the recent Air Max Day campaign for Nike, which features CG versions of the footwear being morphed, pulled apart, and explored in the kind of detail only attainable in the digital realm.

In order to create the complex visuals on show, ManvsMachine turned to Cinema 4D, thanks to its balanced offering of power and simplicity. “It’s just a really well-designed tool for what we do,” explains Matthias Winckelmann, head of 3D at ManvsMachine. “The initial burst of ease of use is great, and then as you explore there are the more in-depth functions too – it can do the design work, but can also scale to the heavier VFX content.”

In order to accurately replicate the trainers for the Nike commercial, ManvsMachine turned to 3D scanning technology. “We use it for almost every Nike project, because it’s really important that the shoe looks exactly like the real shoe,” explains Winckelmann. “The Nike design team has spent a lot of time designing the product, so we want to rebuild it in a 100 per cent accurate way: there are details in every joint, seam and component of the shoe that we need to represent correctly. You can achieve that by modelling, but 3D scanning produces a one-to-one replication in a much shorter time frame.”

In order to morph the shoe for the commercial, ManvsMachine turned to Cinema 4D’s surface and mesh deformers. “To do that, we would build a low-poly representation of the 3D scan, and then bend and morph that. The surface and mesh deformers in Cinema 4D are really useful, as they help us to work with assets like that which have a really high poly count. However, when it comes to deforming shoes, in many cases we use different tools for every deformation – sometimes that’s the surface deformer, sometimes the mesh deformer, sometimes a PoseMorph combined with sculpting, and so on.”
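A deformer of this kind drives each high-poly vertex by the motion of nearby low-poly cage vertices, so the artist only ever animates the cage. This is not Cinema 4D's actual algorithm; an inverse-distance-weighted sketch conveys the idea:

```python
import math

def idw_weights(point, cage_pts, power=2.0, eps=1e-9):
    # Weight each cage vertex by inverse distance to the high-res vertex.
    dists = [max(math.dist(point, c), eps) for c in cage_pts]
    inv = [1.0 / d ** power for d in dists]
    total = sum(inv)
    return [w / total for w in inv]

def deform(point, weights, cage_before, cage_after):
    # Move the vertex by the weighted average of the cage displacements.
    offset = [sum(w * (a[i] - b[i])
                  for w, a, b in zip(weights, cage_after, cage_before))
              for i in range(3)]
    return tuple(p + o for p, o in zip(point, offset))
```

The weights are computed once against the rest pose, so per frame only the cheap weighted sum runs, which is why a low-poly cage can drive a scan with millions of polygons interactively.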

It’s this flexibility that makes Cinema 4D the main pipeline tool at ManvsMachine. “During the creative process we usually utilize almost every feature Cinema 4D has to offer, so it’s difficult to pick specific features that really stand out!” says Winckelmann. “The main strength of Cinema 4D for us is that we can combine all of those different features – in most situations, when we come up with an idea or a design, we are able to find a way to make it happen using Cinema 4D. Its versatility is the main reason that we turn to it so often.”

Nevertheless, Cinema 4D is but one tool in the ManvsMachine arsenal – it’s the human creativity that ultimately powers its output. “It’s all about that balancing act of technology and ideas – fusing raw design and branding skills with high-end visual effects output,” concludes Swift. “It’s not just about simply making images – it’s about understanding the client, coming up with the right strategy, approach and visual execution, and then delivering that using the tools best suited to the task at hand. Cinema 4D is truly a key tool in bringing that process to fruition.”

Watch this video to find out how Cinema 4D helped Man vs Machine to create several highly acclaimed spots for Nike:


ManvsMachine’s technical director Simon Holmedal and creative director Mike Alderson discuss how a fusion of technology and creativity allows the studio to flourish, and how Cinema 4D’s flexible toolset enables the company to deliver high-end work for clients such as Nike.

Man vs Machine Website: www.manvsmachine.co.uk

Mon, 23 Nov 2015 14:09:00 +0100
Making Science Come Alive
https://maxon.net/en-us/news/case-studies/visualization/article/making-science-come-alive/
Duncan Evans discovers how Cinema 4D helped animate and tell stories for Discovery Science in a series of five-second sketches.

The concept sounded straightforward enough: use the Discovery Science logo, a pebble, as an animated character that plays through a brief sketch, then turns back into the logo again. The project came about because Discovery Science wanted to launch a new YouTube channel for a wider and younger audience. The series of characters was meant to reflect the different areas of programming within the channel. The challenge for Pryce Duncalf of Munk Studios, who was given the assignment, was the short length of each sketch: a completed clip was just five seconds, but with transition animations at the start and end, that was whittled down to just 2.5 seconds of animation to tell the story.

There were six scenarios, but the problem right from the start was working out how to transition from the logo, which was an irregular and asymmetrical shape, to a well-formed and animated character to perform the gag for a couple of seconds, and then transition back again.

Pryce explains how the transitioning was achieved, "My first step was to rig the pebble in such a way that the shape could be animated into each different character, so the legs and arms were also adjustable in length, thickness and positioning on the body. It took a few days to get this right, but once this main rig was created, the rest of the characters took no time at all to create. All I had to do was model the props and costumes, some of which were purchased from TurboSquid."

Although Discovery Science had a rough idea of how some of the characters should be dressed, such as one wearing a sweatband and one sporting lots of gadgets, others were still open to interpretation. Pryce had a meeting with Discovery to discuss exactly how they should all look, then went back to the studio to write short scenarios for each character. Once these were agreed, rough sketches of shapes and proportions were produced to act as guides for the 3D modeling.

Various tools from Cinema 4D were put to good use in the project for animation, light effects, object placement and destruction. There’s one scene where a character is welding a car. Pryce explains what tools were used, "The light effects with the welding torch were done using MoGraph. I simply used a thin plane object in the cloner and changed the axis, then randomised its scale and spread with a MoGraph random deformer. I set the deformer to noise with a high animation speed setting then rendered this out with an object buffer and added the glows in After Effects."

MoGraph was also used in the scene where a spaceman appears on a lunar landscape. It features lots of rocks scattered about. Here it was used to place the rocks randomly, with the benefit of being able to transition them on and off using a plain effector with a falloff.
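The plain-effector trick amounts to scaling each scattered clone by a falloff value as the effector's field sweeps past. A rough stand-in for that setup (the random seed and linear falloff shape are assumptions, not MoGraph's exact behavior):

```python
import random

def scatter_rocks(count, half_size, seed=7):
    # Random 2D positions on the lunar ground plane, like clones
    # distributed over a surface by a Cloner with a random seed.
    rng = random.Random(seed)
    return [(rng.uniform(-half_size, half_size),
             rng.uniform(-half_size, half_size)) for _ in range(count)]

def falloff_scale(pos, center, radius):
    # 1.0 at the effector's center, fading linearly to 0.0 at `radius`;
    # animating `center` across the field transitions rocks on or off.
    d = ((pos[0] - center[0]) ** 2 + (pos[1] - center[1]) ** 2) ** 0.5
    return max(0.0, 1.0 - d / radius)
```

Keyframing only the falloff center gives the "rocks grow in as the camera arrives" effect without touching any individual clone.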

One of the more spectacular shorts featured exploding barrels that required a combination of methods, as Pryce revealed, "The explosion itself is composited in After Effects, and I rendered out a depth pass as a matte to help integrate the explosion into the animation. I then used glows and optical flares in After Effects to give an extra bit of dazzle to the explosion. A destruction plug-in was used to break up the barrels and then a dynamics tag was added. The explosion was created using a sphere with a collider tag on it to trigger the detonation."

In one of the simpler effects, one of the characters is integrated with some text, but as a transition starts the text comes away. Pryce applied the text as a texture onto the original logo, but as it transitioned into the character, the original logo animated off.

By the end of the three-week project there were six five-second clips, rendered out at 1080p resolution using a pair of Xeon Mac Pros. Pryce summed it up, "It seemed crazy to try and tell a story in 2.5 seconds, but after lots of curve refinements and meticulous tweaking of keyframes we managed to do it. I find Cinema 4D to have pretty much everything you need for narrative animation and rarely find myself leaving the application until the compositing stage. There is a tool for every job and usually more than one solution to a problem."

Duncan Evans is the author of Digital Mayhem: 3D Machines, recently published by Focal Press.

All images courtesy of Munk Studios.

Munk Studio Website:

Mon, 09 Nov 2015 10:55:00 +0100
Merging Traditional and Modern Animation Techniques
The team at Tiny Inventions combines traditional puppeteer techniques with modern 3D animation to create a highly unique look.

Artists in Residence

After successfully completing several projects, including highly acclaimed music videos for the band They Might Be Giants, Ru and Max applied to the Netherlands Institute for Animated Films to participate in their Artists in Residence program. They were accepted, moved from Brooklyn to Holland and started working on their short film ‘Between Times’. The film’s story is about time in general and each individual’s different subjective perception of it. They had been interested in incorporating 3D animation into their process for some years, and they thought the visual style of ‘Between Times’ would work really well as a combination of stop motion and 3D animation.

Several years ago Ru had taken a 3D animation course during her studies, which she describes as “frightening” at the time. However, when she got to know Cinema 4D she was surprised at how intuitive the application was to use. “The user interface and icons made it easy for me to quickly and easily learn how to get things done. I didn’t have to learn hundreds of commands because the icons were basically self-explanatory!”

From Puppet to 3D Model

As with Ru and Max’s other projects, work on ‘Between Times’ began in the analog world: scenery, characters, accessories – in short, all elements required for the film – were created using clay, paint, plywood, paper and other materials. While the sceneries were in fact used later in the film, the characters and character animation were created using different methods. After Max had photographed all the characters from all sides, these images were used to create authentic-looking, animatable 3D models of the characters in Cinema 4D. The photos were also used in conjunction with Projection Painting in Mudbox.

The scenery was set up and lit in the studio and Max created the camera movements frame-by-frame while Ru animated the digital characters in Cinema 4D. In order to combine the elements, the scenes’ basic geometry was re-created in Cinema 4D so Ru was able to reference the entire scene when animating. Only the characters and shadows were rendered in the end.

Real Lighting Digitized

Max simulated real lighting as accurately as possible in Cinema 4D in order to integrate the animated characters into the filmed sequences as seamlessly as possible. Instead of using GI, a set of colored surfaces was arranged in the 3D scenes to simulate indirect lighting effects. “We determined that we could achieve believable-looking results even with colors that were only 90% accurate. A very crucial element was the direction and color of the shadows, which had to be correct,” remembers Max. “Especially since the shadows were what linked the analog world with the digital world.” Cinema 4D’s Standard Renderer was used to render the animations, and compositing was done in After Effects.

No HDRI lighting and no special render engine were used – yet nowhere in the film can you see that this is a combination of analog and digital elements. Ru and Max’s attention to detail and the effort they put into this project have been rewarded with half a dozen prizes from various animation film festivals.

Tiny Inventions Website: http://www.tinyinventions.com

Tue, 29 Sep 2015 09:22:00 +0200
BBC News - China's PM2.5 Crisis
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/bbc-news-chinas-pm25-crisis/
To explain the problems with pollution in major Chinese cities, broadcast designer Sophia Kyriacou turned to Cinema 4D's varied toolset.
By Steve Jarratt

A BBC correspondent stands on a busy street in Beijing, explaining the issues of particulate matter in the atmosphere – specifically PM2.5, a collection of carbon deposits, organic compounds and specks of metal just 2.5 microns in diameter. The high levels of PM2.5 found in Chinese cities are a major concern, and the piece was designed to highlight the dangers. Of course, these particles cannot be seen, so it's difficult for the viewer to grasp their physical nature, and thus the hazards of these tiny pollutants.

The task of visualizing the sequence was given to Sophia Kyriacou, a broadcast designer who has worked at the BBC in London since 1998. "It was essential to create a sequence that made identifiable and understandable comparisons to help make an explanation easier to its audience," she says. "By using the example of an already small particle such as 'a grain of sand' and then informing the viewer that PM2.5 is 20 times smaller made the viewer think that this small pollutant should be taken seriously."

This explanatory sequence was commissioned by the BBC Science Unit and, like all of Sophia's work for the BBC, had to fit in with the channel's branding in terms of its informative tone, style and typography.

Sophia wasn't present at the shoot, so some preparation was done in advance to make sure the footage arrived as planned. "We ran some tests on the roof of Broadcasting House to get a feel of the 'outdoors', and as something to use in R&D to light the scene and play with different camera shots. Once the footage was fed back to London it was pretty much crack on and deliver," explains Sophia.

The first part of the sequence was created using images of a single sphere representing a particle of PM2.5, and a single grain of sand blown up to enormous proportions. To create the reflection needed to tie the sphere in with the hand, Sophia added a floor object under the sphere and mapped the video onto it using frontal projection; the sphere then reflected the hand and its surroundings perfectly. The over-sized grain of sand is just a basic object displaced using the Noise function in Cinema 4D.
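Displacing a basic object with noise, as with the over-sized sand grain, just pushes each vertex along its normal by a pseudo-random amount. A toy version, using a hash-based stand-in for Cinema 4D's smooth Noise function:

```python
import random

def noise3(x, y, z, seed=0):
    # Deterministic hash-based pseudo-noise in [-1, 1]; a crude stand-in
    # for the spatially smooth Noise shader used in a displacement channel.
    rng = random.Random(hash((round(x, 3), round(y, 3), round(z, 3), seed)))
    return rng.uniform(-1.0, 1.0)

def displace(vertex, strength, seed=0):
    # For a unit sphere the normal equals the vertex itself,
    # so displacement is a radial scale per vertex.
    r = 1.0 + strength * noise3(*vertex, seed=seed)
    return tuple(c * r for c in vertex)
```

Running this over every vertex of a sphere roughens it into a pebble- or grain-like lump; the `strength` parameter plays the role of the displacement height.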

The two elements were rendered out as static images with an alpha matte, plus a separate ambient occlusion pass used to add a shadow on the correspondent's hand. Contrast levels were tweaked in Photoshop, and then the images were imported into Adobe After Effects and tracked into the live-action shot. However, this was made more difficult by the lack of tracking markers, plus a degree of compression when the HD footage was sent down the line. The task was accomplished by using After Effects' built-in Tracker and then manually tweaking the result.

The second part of the sequence, involving a glass box and multiple particles, proved much more challenging. "Glass is always a tricky material to work with as you can't hide anything," admits Sophia. "The glass was the crucial technical element to get right in the entire sequence, and if it wasn’t done correctly it would look wrong. Obviously the spheres had to be clearly seen as they were integral to the actual explanation, so I decided to make the back of the cube slightly frosted to make them stand out more. I achieved this by applying a separate back glass with added blurriness to the transparency."

Sophia used Cinema 4D's Sun object to light the scene, with a separate key light to highlight parts of the glass and spheres. "The shot footage had strong shadows," she explains, "so this gave me a rough indication of the time of day. I played around with the time and date settings until I had a scene that pretty much matched the supplied footage."
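Dialing in the Sun object's time and date is effectively solving for the solar elevation that matches the shadows in the plate. A simplified version of the underlying astronomy (this ignores longitude and equation-of-time corrections, and is not Cinema 4D's exact model):

```python
import math

def sun_elevation(day_of_year, solar_hour, latitude_deg):
    """Approximate solar elevation in degrees above the horizon."""
    # Solar declination: +/-23.44 deg over the year, zero at the equinoxes.
    decl = 23.44 * math.sin(math.radians(360.0 / 365.0 * (day_of_year - 81)))
    # Hour angle: 15 degrees per hour away from solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(dec) +
        math.cos(lat) * math.cos(dec) * math.cos(ha)))
```

Matching by eye, as Sophia did, amounts to sweeping `day_of_year` and `solar_hour` until the computed sun angle reproduces the shadow lengths and directions in the footage.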

To complete the lighting she took a frame of the video footage and wrapped it around a Sky object to emulate global illumination. "Global Illumination was essential in making this scene work," says Sophia. "I needed my 3D objects to mimic reality and sit as naturally as possible in the real environment in which they were placed. Without GI it simply wouldn't work. I used the shot footage as a 'fake HDRI' to light the scene and accompany the sun light and key light I had in the scene."

Cinema 4D's MoGraph toolset was also brought to bear in generating the PM2.5 'particles'. The Cloner object was used to replicate the sphere, using the Radial settings and both a Random and Delay effector. "I like the Delay effector," comments Sophia. "It gives objects a feel of 'character' and I used it extensively when I created my Paper Town animation for the BBC Business Unit."

To generate the glass reflections, the video footage was added to a separate plane and made invisible to the camera using a Compositing tag. "Seeing the reflections appear comfortably and naturally around the cube was an achievement. It blended well and didn't look out of place – a simple trick with very rewarding results."

"The entire end section of the sequence was filmed as a locked off shot," she adds. "I rendered a static matte of the box so I could cut out the animation of both the reflecting glass box and the animated spheres. I could then composite it back onto the clean background plate for further treatment, such as adding shadows and pulling out extra highlights."

The final step was a color grade in After Effects with a dark vignette to give it a slightly polluted feel. The resulting sequence was then nominated for a PromaxBDA Global Excellence Award for 'Best Art Direction & Design: News Program Informational Graphics,' where Sophia was awarded the Bronze medal.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of BBC News.

Thu, 20 Aug 2015 15:52:00 +0200
Title Design: Challenging Art
https://maxon.net/en-us/news/case-studies/movies-vfx/article/title-design-challenging-art/
Opening sequences spark curiosity and give a glimpse of what to expect. Raoul Marks does this masterfully and his primary tool is Cinema 4D.

First, the appetizer: the opening sequence. Opening sequences are not only used by movies and television series to entice viewers – design conferences worldwide also use opening sequences created by well-known motion graphics artists. Artists also see this as an opportunity to explore many of their own creative and more experimental ideas in these projects. When Semi-Permanent contacted Raoul about creating an opener for the Semi-Permanent Conference 2015, Raoul was instantly onboard. “I was very excited to take a month off my usual work to tackle something new. It was a great opportunity to explore some new techniques and images I’d had floating around in my head,” remembers Raoul.


A photographic series from a 1968 issue of LIFE magazine gave Raoul the inspiration for the opening sequence: an astronaut falling through the darkness of outer space. This, together with Raoul’s interest in the movie ‘2001: A Space Odyssey’, was one of the key influences for the titles. “2001 has a stunning ability to hint at something surreal, something undefinable. I’ve always responded to that unnerving feeling in films and loved the awe it was able to evoke. I’ve found myself revisiting 2001 every few years and it always seems to offer up some new form of inspiration,” says Raoul.

“I’ve worked with Cinema 4D for a number of years now and find it a very malleable and forgiving piece of software. I had a tight timeframe of four weeks for concept work and production. Cinema 4D's versatility allowed me to work intelligently and focus on key elements of the titles.” The main spacesuit was modeled using a combination of Cinema 4D and Marvelous Designer. The high polygon count of the model meant a different approach was needed for animation: Raoul used low-res versions of the models to create the animations and transferred the movements to higher-resolution models using the Mesh deformer. “For close-up shots I would use the simulations coming out of Marvelous Designer, and for mid-range to distant shots I would use the Mesh deformer process,” explains Raoul.

The astronaut’s free fall and the various elements drifting around him in space posed a series of problems for which Raoul had to find solutions. “I relied heavily on source photography from the Apollo era to reproduce the quality of light and grading reminiscent of that period. I sourced some very high-resolution height maps of Mars to help create the rocky surfaces in many of the shots,” says Raoul. Steam effects for gases and clouds of dust can be seen throughout Raoul’s film. These effects were created using Cinema 4D and the TurbulenceFD plugin, which is specially designed for such effects. Raoul also focused on finer details such as the uneven structures on asteroids: “In particular I had a lot of light and shadows shifting over very complex sub-polygon displaced surfaces. As the sun shifted over the cratered surfaces I wanted realistic GI light to bounce off all the small details and create a believable lunar surface.”

For the final shots in the sequence, camera projection is used to create 3D shots from still photography. Raoul used this same technique in his work on the opening titles for ‘True Detective’, where Cinema 4D was used to create impressive visuals using photos by Richard Misrach, amongst others. “We really wanted to use these images in the titles but we wanted to push them beyond being purely stills. I did some research into the camera projection technique. This allowed us to build some low-resolution geometry and then project Misrach’s shots onto it in 3D space. We could then move a camera through that environment, recreating an equivalent to a crane-style shot. There’s a little work involved with painting in the occluded surfaces so the camera doesn’t see double images, but it’s an extremely effective way to add life to a still image.”
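At its core, camera projection maps each point of the stand-in geometry back through the original camera to find which pixel of the still to sample. A bare-bones pinhole version (camera at the origin of its own space, looking down -Z; names are illustrative):

```python
def project(point, focal=1.0):
    """Map a camera-space 3D point to normalized image-plane coordinates.

    The returned (u, v) is where the projected photo is sampled. Points
    behind the camera receive no texture and, like occluded surfaces,
    must be painted in by hand.
    """
    x, y, z = point
    if z >= 0.0:
        return None  # behind the camera
    return (focal * x / -z, focal * y / -z)
```

Once every vertex has its (u, v), the render camera can move freely: the parallax comes from the low-resolution geometry while the texture stays pinned to the projecting camera, which is exactly what turns a still into a crane-style shot.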

The images in the final sequence of the Semi-Permanent titles were supplied by a colleague of Raoul's who had just returned from a vacation in Iceland. Cinema 4D's Camera Calibrator feature was used to help create the shots. “I used to just do the projections by eye, which took a lot of patience and guesswork, but since the introduction of the Camera Calibrator it’s super simple to set up this type of shot.”

After the modeling and animation had been completed, the opener had to be rendered. “The usual trade-off with render engines has been between speed and lighting accuracy. Unbiased render engines would give you beautiful photorealistic results but would take hours to render, while fast renders could be achieved with biased renderers but often couldn’t quite capture light in the way I wanted. Octane for Cinema has found a way through that conundrum and given us the best of both worlds. By tapping into the relatively unused power of modern GPUs we can now render beautiful unbiased imagery at speeds much faster than the purely CPU-based solutions,” explains Raoul. “The apparently limited quality settings on an unbiased renderer are actually a complete blessing. In the past I found working with GI problematic. You’d spend a lot of time dialing in quality parameters. You’d hit render on a sequence, come back a few hours later and find a level of flickering throughout the sequence that you hadn’t noticed in the tests. You would then need to up the quality and start again, hoping you had pushed it up enough. Because Octane is unbiased there is no flickering and there are no AA settings; it’s actually a very pared-back set of controls for the render.”

When asked how Raoul finds the right settings when using the Octane renderer, he replies: “There’s really only one setting to be concerned with: samples. The image starts very grainy and resolves over time to remove that grain. So you would set one image to render to a high level of samples and after it’s reached a level of noise you’re happy with you make a note of the sample count and set the entire sequence off to render to that number.“
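The "set samples, wait for the grain to resolve" workflow reflects basic Monte Carlo behavior: an unbiased estimate's noise falls off roughly as one over the square root of the sample count. A toy demonstration, where the "radiance" is just uniform random numbers:

```python
import random

def render_pixel(samples, seed):
    # Unbiased estimate: the average of `samples` random radiance values.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(samples)) / samples

def noise_level(samples, trials=300):
    # Standard deviation of the estimate across independent renders;
    # it shrinks roughly as 1/sqrt(samples), which is why halving the
    # grain requires four times the samples.
    vals = [render_pixel(samples, seed=t) for t in range(trials)]
    mean = sum(vals) / trials
    return (sum((v - mean) ** 2 for v in vals) / trials) ** 0.5
```

This is also why a single test frame suffices to pick a sample count for the whole sequence: the noise floor depends on samples, not on which frame is rendered, so there is no frame-to-frame flicker to tune away.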

The response to the Semi-Permanent opener has been overwhelming. “The titles seem to have been doing pretty well online and getting a fair bit of exposure, which is nice. Interestingly, a lot of people have been in touch about the prospects of VR in relation to the titles. There really seems to be a buzz out there for all things VR. A number of studios and software developers seem to be racing to create tools and experiences for the new medium. That excitement and enthusiasm is creating a really great atmosphere around the new possibilities in storytelling, visuals and a whole gamut of applications we haven’t even thought of yet. So I’m quite excited to start exploring what can be done out of Cinema 4D for VR.”

Fri, 31 Jul 2015 12:08:00 +0200
Virtual Upholstery with Cinema 4D and Octane
https://maxon.net/en-us/news/case-studies/advertising-design/article/virtual-upholstery-with-cinema-4d-and-octane/
Artist, visualization specialist and trainer Christoph Schindelar uses customized workflows in Cinema 4D to create challenging visualizations of interior furnishings.

When a client tasked him with creating visualizations for 360 sofas in 60 different virtual showrooms, Christoph immediately started developing a fitting workflow with Cinema 4D that would let him deliver the desired results on time and within budget. The basic showroom scenes were created in Cinema 4D and filled with objects obtained from 3D model suppliers. Other elements that weren’t available for purchase were modeled according to reference images using a photo modeler.

The focus of the project was the furniture that had to be presented in the showrooms. Christoph used Cinema 4D’s Cloth Simulation feature to cover a standard framework with the respective upholstery. The Cloth Simulation works so precisely that the upholstery was automatically fitted correctly, including folds in the cloth. Cinema 4D’s sculpting tools were then used to fine-tune the sofas’ stitching and other details.

After the furnishings had been modeled, sculpted and textured, and the scene arranged, Christoph began rendering using the Octane render engine. Octane is a GPU renderer that uses a computer’s graphics cards for very fast rendering. Since Octane is an unbiased renderer that renders light and surface properties as accurately as possible, the rendered results were really impressive.

Christoph’s decision to use the Octane Renderer for this project was based not only on the fact that it delivers such stunning results but also on the Live Viewer feature, which lets a scene be rendered dynamically in a special preview window while it’s being edited in the viewport.

The variety of projects Christoph works on has shown that there is no universal renderer for all jobs. “Each renderer has its own strengths and weaknesses when creating automotive renderings, packshots or shots with image-based lighting. For VFX, volumetric effects and interior renderings, on the other hand, I would recommend the Arnold Renderer. If you take that extra step with specific applications, render quality can be driven to unprecedented heights. If, for example, Cinema 4D is combined with the HDRI Light Studio plugin and the Octane Renderer, you get the best workflow for product renderings, jewelry visualizations and packshots that I know of!” explains Christoph.

Websites that feature Christoph Schindelar’s work:
www.artstation.com/artist/chris-3d
www.carbonscatter.com/tutorials/shindelar/index.php
www.3dvisual.at/

Mon, 13 Jul 2015 11:37:00 +0200
The Secrets of the Sydney Opera House Revealed in 3D
https://maxon.net/en-us/news/case-studies/games/article/the-secrets-of-the-sydney-opera-house-revealed-in-3d/
A fascinating structure with an eventful history, explored across two distinct projects in which Cinema 4D was used to create an in-depth look into this landmark’s architecture.

The story behind this landmark structure is just as fascinating as its design. And as modern as the design still is, so had to be the documentary about the opera house, which was commissioned by Sydney Opera House and produced by the Australian Broadcasting Corporation in 2012. The project documents many of the steps of the opera house’s creation, from the very beginning through to opening day. The complexity the project encountered had to be explained and depicted in an understandable way.

3D graphics were used to illustrate these complex architectural and engineering principles. Despite the large number of historic pictures that were available, 3D graphics were needed to clearly illustrate numerous phases of construction. An animation team of three at ABC, including Reuben Hill, was tasked with the creation of these images and animations.

Reuben started his career as a Scenic Artist for films such as Star Wars Episodes II and III, Moulin Rouge and The Matrix Sequels, for which he used traditional methods to texture and paint sets, models and props. He became fascinated with the possibilities 3D graphics offered and got started in the world of 3D with Cinema 4D R11. He worked with Sam Doust, then creative director of the Innovation Division of the ABC on two digital projects including the Opera House Project. Both have since begun working together under the studio name Latchkey, based in Sydney.

Alongside the previously published Opera House Project (2012), Latchkey has recently completed a film for the general tour of the building.

For both projects, the construction process of the opera house had to be illustrated and animated using schematic views. Individual phases of construction had to be shown as isolated elements in order to better explain special features. Reuben rendered the structure’s geometry in simple black-and-white with global illumination and ambient occlusion to add volume and depth to the white objects. A Physical Sky was used for illumination, and Reuben used both the Standard and Physical Renderers, depending on his needs.

In the course of the 20 years it took to complete, the opera house’s appearance went from looking like a clamshell to resembling a cathedral, which is the shape we know today. This transformation was illustrated using an animation rendered in black-and-white. To make it clear which versions were not the final versions, Cinema 4D’s Sketch and Toon features were used to create an illustrated look for the animation.

A key moment in the transformation was the realization of architect Jorn Utzon that all roof elements could be derived from circular or spherical shapes. This might be difficult to recognize at first but the sequence Latchkey devised makes it easy to understand (Link).
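Utzon’s Spherical Solution is easy to state numerically: every roof shell is a segment cut from one sphere of a single, fixed radius (commonly cited as roughly 75 m for the real building). A toy Python sketch, with an illustrative radius and sampling scheme that are not taken from the documentary, shows why segments cut this way are guaranteed to share curvature:

```python
import math

def shell_rib(radius, lat_deg_range, lon_deg, steps=10):
    """Sample one 'rib' of a roof shell as points lying on a sphere
    of a single fixed radius (the essence of the Spherical Solution)."""
    pts = []
    lat0, lat1 = lat_deg_range
    for i in range(steps + 1):
        lat = math.radians(lat0 + (lat1 - lat0) * i / steps)
        lon = math.radians(lon_deg)
        x = radius * math.cos(lat) * math.cos(lon)
        y = radius * math.cos(lat) * math.sin(lon)
        z = radius * math.sin(lat)
        pts.append((x, y, z))
    return pts

# Two different shells cut from the SAME sphere: every sampled point
# sits exactly `radius` away from the common center.
shell_a = shell_rib(75.0, (10, 60), lon_deg=0)
shell_b = shell_rib(75.0, (25, 80), lon_deg=30)
for p in shell_a + shell_b:
    assert abs(math.dist((0, 0, 0), p) - 75.0) < 1e-9
```

Because every point sits one radius from a common center, any two shells cut this way can be built from identical curved components, which is what made the roof buildable.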

Another animation explains how Jorn Utzon was inspired by classical architecture in the creation of his design. The animation uses an empty foundation and illustrates how Utzon was inspired by the megalithic structures of Central American peoples like the Aztecs. This animation was also rendered in distinctive black-and-white using global illumination and ambient occlusion.

The style of the animations harmonizes perfectly with the documentary, conveying the message clearly and concisely while staying focused – without losing sight of the structure as a whole.

For Reuben, being given the chance to learn about the amazing history, architecture and engineering of the Sydney Opera House whilst creating the graphics and animations for the projects was a great opportunity. “The fact that the simplicity of Jorn Utzon’s Spherical Solution filtered through to our efforts to recreate and animate the concept in 3D was a very nice and unexpected surprise. Cinema 4D and the MoGraph toolset freed us from technical issues and allowed us to put all our creative energy into the projects.”

By Steve Jarratt

Eviivo offers a suite of apps and services designed for owners of hotels, bed and breakfast accommodations, pubs and resorts to help them manage bookings and promote their business. And to promote its business, Eviivo turned to Mill+, an integrated team of directors and designers within London VFX house The Mill. Tasked with offering complete projects from conception to delivery, the team at Mill+ really did do it all – from character design to script writing and even penning the song that accompanies the animation.

"The core idea was to tell a story of everyday life," explains Nils Kloth, head of motion design. "It's a story of someone caught in a rut, truly unhappy, desperate for a change; dreaming of a better life without the stress and responsibility. This someone eventually embarks on a quest to set up a rural B&B going through all the peaks and valleys until Eviivo comes to help."

The concept began by using Eviivo's ‘start' buttons, the red and yellow dots over the ‘ii's in the logo. These dots became circular characters, which inhabit a colorful minimalist landscape. The lead character, the yellow button, is a city worker tired of the continual grind of the office commute, who dreams of opening his own hotel, then calls on the services of Eviivo to make his dream a successful reality.

"After writing the song and lyrics we embarked on the journey of creating the 3D animation," says Kloth. "For this to work we had to create three minutes of full CG animation with loads of character animation, ample scenes and plenty of particle simulation, which is a challenge in itself."

About 95% of the sequence was built and set up entirely in Cinema 4D, with rendering handled by V-Ray. "All of the designs, models and cameras came directly out of Cinema 4D," notes Kloth. "It was a very tight turnaround and the process was best kept in one application. We used V-Ray for lighting, texturing and rendering to give it that little extra." The three-minute piece was made by a team of three people using 12-core Mac Pros, and output using 64 render clients.

The flat, minimalist approach meant there wasn't much need for post work with the exception of clean-up, color grading and a few graphical elements. "All renders came out of V-Ray 1.8. In addition we rendered out some passes and mattes for post. There was no need for too much control in post due to the flat graphical appearance of the piece."

Kloth explains that the bulk of the animation takes place inside and outside the barn, which called for two main lighting setups. Depending on the scene, custom lights were added and the core setup tweaked to suit.

One of the first sequences has a reveal of the tube train in which our yellow hero is travelling to work. As the camera swings around to the side, the scene cleverly cuts away to reveal the inside of the carriage. "The bulk of this effect was a big Boolean," says Kloth. "The cube subtracting the rest was anchored to the plane while the tube carriage rotated around. We added the wallpaper texture in post and had to paint out various artefacts created by the Boolean."

As the train grinds to a halt, the rotund passengers all tumble to one end, an effect created with Cinema 4D's physics engine. Kloth explains that they used some scripting to blend between the hand-animated elements and the simulated movement.

Later on there are some impressive scenes in which the lead character is literally drowning in paperwork. For these the team turned to Cinema 4D's trusty MoGraph tools. "The sea was animated in a vaguely sea-like way using deformers, by animating a base object, then cloning the paper to that object. We then added dynamics as the final layer of the effect to get the collisions."
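The setup described here – animate a base surface with a deformer, clone the paper onto it, then layer dynamics on top – can be mimicked outside Cinema 4D. The sketch below is a minimal stand-in, not the actual MoGraph scene: the traveling-wave formula, grid size and parameters are invented for illustration.

```python
import math

def wave_height(x, z, t, amp=1.0, wavelength=8.0, speed=2.0):
    """Height of a simple traveling sine wave -- roughly what a
    deformer does to the base surface the paper is cloned onto."""
    k = 2 * math.pi / wavelength
    return amp * math.sin(k * (x + z) - speed * t)

def clone_positions(grid=5, spacing=2.0, t=0.0):
    """Place one paper 'clone' per grid cell, lifted to the wave height
    at its own (x, z) -- mimicking cloning onto a deformed object."""
    return [(x * spacing, wave_height(x * spacing, z * spacing, t), z * spacing)
            for x in range(grid) for z in range(grid)]

frame0 = clone_positions(t=0.0)
frame1 = clone_positions(t=0.5)
# The sea moves: clones keep their (x, z) cell but change height over time.
assert [(p[0], p[2]) for p in frame0] == [(p[0], p[2]) for p in frame1]
assert any(abs(a[1] - b[1]) > 1e-6 for a, b in zip(frame0, frame1))
```

Dynamics would then be applied on top of these animated positions to get the collisions, as the final layer of the effect.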

A more traditional river is represented by layers of extruded splines. A deformer is used to animate the splines to create the undulating waves.

The little circular characters – and some dancing sheep – were rigged using Cinema 4D's bones. "It was a very easy setup, which made animating them very straightforward," says Kloth. "The circle characters were rigged with a custom spline-based rig, animated with Curious Animal's Spline Auto-rigging script and powered by Cinema 4D's Spline deformer." He also adds that the lead animator, Dan Fitzgerald, used a variety of Curious Animal's other deformer plugins.

Overall, the Mill+ team seems pretty happy with the end result. "It was an unforgettable project at all levels," declares Kloth. "We'd never written a song, lyrics or had to deal with hotel management software before. The client was great, the project intense and we had great fun animating the sheep."

And with regards to Cinema 4D he suggests that the application was "essential," adding, "we couldn't have done it in any other package."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

All images courtesy of Mill+.

The Mill's website:
www.themill.com

Creating the Dark Side of Marvel
https://maxon.net/en-us/news/case-studies/movies-vfx/article/creating-the-dark-side-of-marvel/
Thu, 11 Jun 2015 11:25:00 +0200

Discover how Territory Studio used Cinema 4D to help meet director Joss Whedon's desire for a gritty and darker story in 'Avengers: Age of Ultron'.

By Duncan Evans

No sooner had Territory Studio, helmed by founder and Creative Director David Sheldon-Hicks, finished using Cinema 4D for the user interface graphics in the smash-hit 'Guardians of the Galaxy' than the call came from Production Designer Charles Wood and Art Director Alan Payne to get involved with Marvel's big-budget 'Avengers: Age of Ultron' project.

The task this time was to help support Joss Whedon's artistic and creative vision for the film to be darker and more challenging, exploring the humanity of each of the superheroes. To do this, David was asked to create a visual brand for each character that accurately represented their personality and role within the film.

On top of this, there was an extra requirement in that Charles Wood wanted the screen user interfaces to have a more realistic look and feel, to be futuristic, but with a modern plausibility to them. It sent the Territory team researching the realms of such topics as advanced clinical technology. Of course, it's all well and good creating something from a blank canvas, but the Iron Man persona and Tony Stark within the suit had already laid down a template for the kind of graphics the audience would expect. David explained that Territory wasn't as constrained as you might imagine: "Our task was to approach the UI from a fresh perspective that supported Joss' vision for this film. So, while we worked with established color palettes for Stark and Banner, we were free to create a new look for the UI. For Tony Stark we researched state-of-the-art architectural engineering, avionics and military technology, and crafted a series of screens for Stark Lab that worked together to reflect a more rounded character. Similarly, we researched cellular plant biology and biotechnology and fed that into Banner's UI. Again, the idea was to add a visually rich layer of imagery and animation to the background of Banner's lab that supports his role as biologist. Finally, for Dr. Cho, a newly introduced character with a strong Marvel history, we crafted a UI that reflected her interests in biotechnology innovations and incorporated near-future technology references that we extrapolated from current advances in clinical applications for 3D printed organic matter."

The process of creation flowed both ways because the initial conversations about the designs took place before the script had been finalized. That meant Territory was able to advise on how best to use the UI to illustrate certain narrative points. From there the team refined the creative vision for the visual language of each character and environment with research and references from the art, costume and props department. It was important to both Joss and Charlie that Territory's screens not only gave visual depth to the sets but anchored the Avengers technology in real-world references that the audience could relate to. As David elaborated, "This level of realism was a new approach to Marvel Universe's highly stylized future-tech aesthetic and it was fun for the team to bring the two together."

Needless to say there were a number of technical challenges that had to be mastered. David revealed what was required for one of the new characters to the franchise: "For one scene with Dr. Cho, this involved extrapolating potential clinical applications from current reconstructive biotechnology and 3D printing to create the UI for a clinical technology that 3D prints skin onto a patient. Having 3D screens that looked holographic on set was a real challenge to create from the depth perspective. The temptation is that if you hear that it's going to be a background screen you don't put the quality into it but ever since we worked with Ridley Scott we knew to deliver the best quality and to give the director the option to use it as part of the narrative with a close-up."

Marti Romances, Art Director for Territory, worked on the Iron Man screens and made full use of the speed of Cinema 4D: "With the Iron Man screens I didn't have much time to spend on renders and for that reason I rendered many simple and quick passes from Cinema 4D. This gave me the option of being able to change and play with these passes in After Effects until the directors were happy with the result. Things like Cinema 4D’s External Compositing tag and the HAIR and cel renderer were very handy for these quick turnarounds."

One of the other technical challenges was receiving hi-res models from other VFX vendors who were working on the film and integrating them into the workflow. The Poly Reduce function was used to take a highly detailed Iron Man model from ILM and repurpose it for Territory's needs. Peter Eszenyi, Head of 3D for Territory, had a similar challenge when creating the Leviathan screens. Peter described the process: "We used a very dense and detailed CGI model we received from one of the main vendors. Before we were able to do anything with it, the asset needed a bit of love, stripping out the overly detailed parts, and generally organizing the mesh into manageable chunks. After this, we started to set up some crazy X-Particles magic. We used the mesh as an emitter as well as placing several other emitters around it. For other passes we used the mesh as a collider object, and of course we did use quite a number of different setups, turbulences, wind, follow surface modifiers, the whole works."

David elaborated on how his team of artists kept pushing the envelope creatively: "Often we're finding and exploring new ways of displaying 3D holograms. Thinking Particles, MoGraph and Sketch and Toon are all critical to our ability to be creative in how we approach this challenge. The team here gets tired of just showing wireframes so it's important for us to innovate and test new ways of doing things - MoGraph in particular gives us a toolset to do this."

Peter also used Sketch and Toon on the Leviathan screens he was working on, "We started the Sketch and Toon explorations, experimented with different contour settings, played with lights and shadows influencing line weights – which is an excellent and not so well known feature of Sketch and Toon – generally just trying out as many different ideas as time permitted. Once we rendered all these passes we layered some more traditional passes on top such as shadow passes and ambient occlusion. Then we tried out different tessellation methods, projection textures and so on. We ended up using 10-15 different passes, which were tweaked further in After Effects. This R&D time let us set up a structure that could be deployed on different parts of the project, so once someone got familiar with how things were working we could start producing the different screens."

In previous films like Guardians and Ridley Scott's Prometheus, Territory used UI screens live on set. Only when details or filming restrictions made live screens impractical were the effects added afterwards. That experience came to the fore in 'Avengers: Age of Ultron,' where over 200 screens across 11 sets were implemented as mainly live on-set props with the help of Compuhire, while only a few screens were delivered as VFX shots. It meant the actors were working with live screens that gave them a focus when appropriate and supported their line of sight. As is often the case on projects like this there are last-minute changes, sometimes minutes before they are required on set to be shot. David explained how it was possible: "Speed is key to our projects. Cinema 4D allows fast rendering and pulling things back into After Effects while MoGraph offers an artist-friendly, procedural way of animating. We can create complex structures and make last minute changes very quickly."

All told, Territory committed a core team of six artists for 11 months to the project, often scaling up to 10 artists, rendering out the 2k-resolution screens and creating over 80 minutes of unique animations for Marvel's blockbuster franchise movie.

Duncan Evans is the author of Digital Mayhem: 3D Machines, recently published by Focal Press.

All images courtesy of Territory Studio and Marvel Studios.

You can see more of the on-set screen graphics from Avengers: Age of Ultron here: www.territorystudio.com

Jellyfish From Hell
https://maxon.net/en-us/news/case-studies/movies-vfx/article/jellyfish-from-hell/
Tue, 09 Jun 2015 11:14:00 +0200

Mutant jellyfish terrorize beachgoers in Patrick Longstreth’s sci-fi short, Hellyfish.

On February 5, 1958, a B-47 bomber dropped a 7,000-pound nuclear bomb into the waters off Tybee Island, Georgia, after an F-86 fighter jet accidentally crashed into it while on an Air Force training mission. That bomb, which contains a much-debated quantity of radioactive material, has never been found, and the Air Force wishes that people would stop looking for it. Left undisturbed, they say, the bomb is harmless. If it is disturbed, well, things could get ugly.

And that’s exactly what happens in Patrick Longstreth’s short sci-fi horror spoof, Hellyfish, in which the lost bomb is leaking radioactive material and mutant, bloodthirsty jellyfish terrorize and devour beachgoers. “I saw a documentary about the lost bomb and thought it was so fascinating that I considered making my own documentary,” recalls Longstreth, who used MAXON’s Cinema 4D to create his film. “But then I thought about how there really was an actual jellyfish problem in Savannah and how much I love beach monster and horror movies, and the idea just came to me.”

Hellyfish is Longstreth’s first independent film. After earning a business degree, he quickly changed directions and worked for NBC Network News as a motion graphics artist for three years. Next, he got his graduate degree in visual effects from the Savannah College of Art and Design (SCAD) and moved to Los Angeles where he has worked for well-known production studios, Psyop and Imaginary Forces, while also freelancing as a director and VFX supervisor for commercials, corporate videos and independent films.

Longstreth and co-director Robert McLean, who brought his experience working with actors on film sets to the project, shot Hellyfish in Savannah, Georgia. The decision cost them much less than the same shoot would have in Los Angeles. In addition to being able to use some of SCAD’s equipment for free, they were also able to readily find local actors and friends to fill over 20 different roles in the film. Rehearsals were done in Longstreth’s backyard, and the shoot spanned 16 days over the course of six months.

Animating the Creatures

Longstreth modeled the killer jellyfish in Cinema 4D, and additional sculpting was done in ZBrush. To create a fully digital environment with 3D camera movement, photos and video of sky, sand and ocean were projected onto geometry in Cinema 4D. But when it came time to create and rig a full run cycle for a creature with five tentacles, fur, long stringy hairs and soft body dynamics, the team brought on experienced character animator Pryce Duncalf.

For the final shot, in which the giant Hellyfish destroys the Tybee Pier, they created an exact model of the pier in Cinema 4D and then projected an image of the actual pier onto the 3D model. Because the interaction with the character proved to be too intricate for a full dynamics simulation, each individual piece of the pier was keyframed. “The Cinema 4D deformers were essential to this animation process,” Longstreth explains.

For dramatic effect, some shots were slowed down to 33 percent. This meant it was especially important to eliminate any imperfections in the animation because they would be easier to spot in slow motion.

Creating Realistic Water Scenes

The opening nighttime scene was shot on a green screen to allow for more control over the lighting and camera movement. Shots were taken from every angle, including wrap-around dolly shots that were tracked with Pixel Farm’s PFTrack. The night sky was a 360-degree matte painting augmented by moving clouds, the shoreline on the horizon and a blinking lighthouse.

Creating realistic-looking water was one of the biggest challenges, says Longstreth, who laughs when, in all seriousness, he says “we used every trick in the book.” For shots that required CG water, Cinema 4D and RealFlow were combined to get the best results.

To create the feeling of being underwater, floating CG algae, dirt and bubbles were added with the Trapcode Particular plug-in for After Effects, which allowed full control over their density, size and animation. Video Copilot's Optical Flares and Action Essentials were used to help tie shots together. All told, over 20 artists ended up helping with the post-production.

Funny Scary

When the film was finished, they had an advance screening in Savannah for cast, crew and friends. "Even the kids were cracking up laughing and that was really rewarding," Longstreth recalls, "because if they get the humor, that's exactly what we were going for. We didn't want something gruesome. We wanted it to be Halloween material in the category of Gremlins or Ghostbusters."

An Illustration of Digital Dementia
https://maxon.net/en-us/news/case-studies/visualization/article/an-illustration-of-digital-dementia/
Thu, 21 May 2015 14:48:00 +0200

Computers and smartphones make everyday life easier – but they also lead to reduced mental arithmetic and memory performance: digital dementia.

Computers and new media are responsible for significant changes in human behavior: an entire generation of students is increasingly plagued by shortened attention spans and concentration disorders. An ominous threat, referred to as ‘digital dementia’, is spreading throughout all classes of society.

Media design student Felix used digital dementia as the topic for his bachelor thesis. He created an explanatory film about digital dementia, its causes and what can be done against it. He created the project using Cinema 4D – a program he knows very well from his work on a previous project. The digital dementia project would differ from his previous project in that he would have to complete it independently, without the benefit of having a project partner.

Felix wanted to create a mix of 3D elements and motion graphics, and after he completed a script using the various resources he had researched, he designed visuals that could be realized within the framework of a one-man project. Felix decided to use a low-poly look, which could be quickly modeled and was ideal for use with motion graphics animations.

Felix added life to his elements by using dynamic objects that jiggled when they were enlarged. In addition, a range of various deformers was used to put the animated objects in motion.

What really helped the workflow were the Timeline and the Curve Editor, which made it easy to modify animation curves. “There’s constantly movement in the film. Each shot contains objects that rotate, pop into view, tumble in front of the camera, slide in from the side and so on. Each object has its own movement giving it its unique animation. Coordinating these movements as a whole was one of the major challenges of this project,” remembers Felix.

The next step was to determine which colors and materials should be applied to the objects and backgrounds. Felix worked extensively with the Cel shader, which he placed in the textures’ Luminance channel. “If a material only uses the Luminance channel it appears very flat and lights have a very limited effect on them. Almost no color gradations or shadows are generated. The Cel shader let me win back these color gradations and shadows as well as highlights while being able to maintain the desired illustrated look,” explains Felix. This method made it possible for him to carry the look of the low-poly models over to the textures and set this forth in color and material design.
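The core of a cel/toon shader like the one Felix placed in the Luminance channel is quantization: a continuous diffuse term is snapped to a handful of flat bands, so gradations, shadows and highlights come back as hard-edged steps instead of smooth ramps. The sketch below illustrates the idea only; the band count and mapping are invented, not Cinema 4D's actual Cel shader math.

```python
def cel_shade(n_dot_l, bands=3):
    """Quantize a diffuse term (0..1) into flat bands -- the core idea
    behind a cel/toon shader: gradations survive, but as hard steps."""
    n_dot_l = max(0.0, min(1.0, n_dot_l))
    # Map the continuous value to one of `bands` discrete levels.
    level = min(int(n_dot_l * bands), bands - 1)
    return level / (bands - 1)

assert cel_shade(0.05) == 0.0   # shadow band
assert cel_shade(0.5) == 0.5    # mid band
assert cel_shade(0.95) == 1.0   # highlight band
```

This is why the technique preserves an illustrated look: lighting still influences the image, but only through a few discrete tones that match the flat low-poly style.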

Rendering was done in several phases because each scene was rendered separately. The resulting TIFF sequences were loaded into After Effects for editing. Each sequence included an alpha channel, an AO pass and a depth pass; in After Effects the passes were combined and color corrected, audio tracks and sound design were added, and everything was brought together to create the final film.

Bridging the Gap Between Imagination and 3D
https://maxon.net/en-us/news/case-studies/advertising-design/article/bridging-the-gap-between-imagination-and-3d/
Wed, 29 Apr 2015 11:16:00 +0200

Thomas Dubois is actually an architect, but he also likes to create stories with his friends that he then uses as inspiration for his illustrations. This is exactly how The Ark was created!

The common theme of these stories is planet Earth, whose rising sea levels have transformed the living spaces of its inhabitants. The concept that, in such an event, living space must be used in new and more effective ways gave ‘The Ark’ its initial spark. Thomas’ idea of hanging structures on a natural arc made of rock was an attempt to fit as many structures for living and working into as small a space as possible. Thomas started by modeling a low-poly version of the rock arc in Cinema 4D, to which he then added UV coordinates.

He then created a displacement map in Cinema 4D with noise and distortion properties and rendered the arc using V-Ray to create an impressive stone structure. Next, he created the buildings that hang like swallows’ nests along the arc, also in Cinema 4D; its MoGraph feature was used to duplicate and distribute these objects. “As a whole, ‘The Ark’ is actually not a very complex scene,” says Thomas. “The only issue I encountered was when I rendered the scene. Since the image had to be rendered at a very high resolution, the render times proved to be exorbitant in my first tests. The arc’s high-res displacement map in particular played a major role in bloating the render time! This is why I rendered the arc and the structures separately and later composited and fine-tuned both elements in Photoshop.” The result speaks for itself and is an excellent example of how Thomas creates spectacular images of his science fiction visions using Cinema 4D.

Once you’ve seen Thomas Dubois’ work, you quickly realize why the lectures and symposiums at which he speaks quickly sell out.

Thomas Dubois’ website:
www.thomas-dubois.com/

Into Orbit With Jupiter Ascending
https://maxon.net/en-us/news/case-studies/movies-vfx/article/into-orbit-with-jupiter-ascending/
Fri, 17 Apr 2015 13:03:00 +0200

Duncan Evans talked to David Sheldon-Hicks of Territory Studio about working on the set of the Wachowskis’ sci-fi epic.

By Duncan Evans

When legendary directors the Wachowskis wanted screen graphics and displays for the spaceships in Jupiter Ascending, their first port of call was London-based Territory Studio. It was a selection process made easier for everyone because Creative Director David Sheldon-Hicks had already worked on the iconic futuristic sci-fi screens for Ridley Scott on Prometheus. Territory was duly hired to create user interfaces for screens that would feature as part of the navigation systems in a number of spacecraft scenes. Invisible forces such as gravity, wormholes and cloaking devices needed illustrating, with Jupiter Ascending Production Designer Hugh Bateup suggesting that 3D weather maps would be a good starting point. Then there was the concept art: an entire room of beautiful art with lifts, spacesuits, environments and spaceships. It had already been created and was to serve as inspiration for the visual look and feel of the film, even the bespoke typeface that Territory created for the film.

David and his team of five artists got to work investigating how best to use isometric lines, which are normally used to describe weather fronts, to represent 3D energy fields as animated organic forms. Most of the effects were generated in Cinema 4D using the Thinking Particles plug-in, with XPresso being used to control them. The basic idea was to generate anything between 100 and 1,000 particles and then use effectors to move them around. This way, force fields like weather maps could be created, with morphing, to describe specific story points. Usually this kind of animation creates chaotic elements, but here the team deliberately incorporated such effects. The real problems started with getting certain animations to loop. Senior Motion Designer Nik Hill explained how they solved this issue: "Cinema 4D's MoGraph tracer and Hair shader settings were key to helping us figure out the looping issues with the swirly wormhole graphics."
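Territory's specific fix used MoGraph tracer and Hair shader settings, but the generic trick behind looping any non-periodic animated channel is to crossfade it with a copy of itself shifted back one loop length, so the first and last frames match by construction. A minimal sketch, where the signal and the period are placeholders rather than anything from the production setup:

```python
import math

def raw_signal(t):
    """Some non-periodic animated value (a stand-in for a noise channel)."""
    return math.sin(1.3 * t) + 0.5 * math.sin(4.7 * t + 1.0)

def looped(t, period=10.0):
    """Crossfade the signal with a copy shifted back one period, so the
    result repeats seamlessly every `period` units of time."""
    u = (t % period) / period            # 0..1 phase within the loop
    a = raw_signal(t % period)           # signal over [0, period)
    b = raw_signal((t % period) - period)  # same signal, one period earlier
    return (1 - u) * a + u * b

# Seamless: any two times one period apart give identical values.
assert abs(looped(0.0) - looped(10.0)) < 1e-9
assert abs(looped(2.5) - looped(12.5)) < 1e-9
```

As the phase approaches the end of the loop the blend weight shifts entirely to the time-shifted copy, whose value there equals the signal's value at the start, so there is no visible pop at the seam.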

The other problem was how to compensate for the irregularities of the physical screens in the spacecraft bridge environments. Unlike most VFX projects, the majority of these weren't added in post-production; they were projected in real time onto glass screens. This process was developed in partnership with Compuhire, the engineers behind getting the graphics on set.

The projectors were mounted either in the floor or overhead. This is where Territory's experience with creating the same kind of effects for Prometheus paid off. On the bridge of the spaceship there were five main consoles with glass sheets hung at slight angles. The graphics themselves had to be opaque enough to convey information yet have enough transparent areas so that the actors could be seen through them. David explained how it all worked out, "When you project onto glass it is specialized acetate with imperfections and it creates tiny refractive beams and bounces light back so you get light spill. Ridley Scott liked the light spill and used it in Prometheus."

On Jupiter Ascending they wanted the animated graphics to be placed perfectly, running along angled edges, but the screens were tilted so that they weren't perpendicular to the lens of each projector, essentially warping the projected image on the screen. Through trial and error they figured out a distortion within After Effects to compensate. David clarified the process: "We were inverting the distortion that was physically taking place on set and it worked really well. Some of the panels had geometric designs etched onto them as well so that our kinetic projections mingled with physical glass etchings. It turned out to be a clever merge of 3D set design and animated projections."
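The idea behind the compensation – apply the inverse of the physical distortion to the artwork before projecting, so the two cancel on the glass – can be shown with a simplified 1D keystone model. The formula and coefficient below are illustrative only, not the distortion Territory actually derived on set:

```python
def keystone(y, k=0.3):
    """Vertical keystone: a tilted screen stretches the image more the
    farther a point is from the projector axis (simplified 1D model)."""
    return y / (1 - k * y)

def prewarp(y, k=0.3):
    """Inverse distortion applied to the artwork before projection."""
    return y / (1 + k * y)

# Pre-warping then projecting lands each point back where it was designed:
# keystone(prewarp(y)) == y / (1 + k*y - k*y) == y.
for y in [0.0, 0.2, 0.5, 0.8]:
    assert abs(keystone(prewarp(y)) - y) < 1e-9
```

In production the same principle applies in 2D (a full corner-pin or homography rather than this one-axis formula), which is what a distortion effect in After Effects can approximate through trial and error.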

The advantage for the actors and directors was that they could physically see the screens, rather than having to imagine everything against a green screen. You also got reflections, light bleed and spill into the environment, making it all appear more real.

Before starting, the Wachowskis had assumed all graphics would be green screen and inserted in post, which is how they worked before. When they saw the tests they wanted the projected graphics on everything, which meant making changes and designing by the seat of their pants. David revealed, "The Wachowskis were a delight to work with and got really energized by working with us on-set. Motion designers Nik Hill and Ryan Hays would be perched with laptops designing, animating and rendering as Mila Kunis and Channing Tatum were being shot. Lana and Andy would take a look and then get us to make changes on-the-fly for the next take. It was a demanding way of working but a lot of fun and very satisfying to see it come together for the actors and directors."

A lot of this was only possible because Cinema 4D is so designer-friendly. It meant that Territory's team, which largely came from a design background, didn't have to get into the technicalities all the time. Cinema 4D tools are accessible so they could create the basic look quickly, get it approved by the director and then have a day or so to put in enough details to make the effects look polished and beautiful. David detailed how Cinema 4D made it all work, "The software is perfect for creating broad brushstrokes and then adding fine details. We do an awful lot of UI interface creation and there needs to be good handover processes between it and Adobe products like Illustrator and After Effects. We were swapping between Cinema 4D and After Effects with cameras going back and forth. With this constant overlap of software processes we can generate up to 30 - 40 screens for the next day of shooting. Normally you get into that position when you have spent a couple of weeks creating one or two hero screens. The director approves the look of those and then we roll out 40 screens based on them. It's not fun but you make quick decisions and find out what your limits are."

In the end, Territory spent four months working on Jupiter Ascending, rendering an estimated 20 minutes of visuals at 2K resolution with a system of three 3.5GHz six-core Mac workstations. Nik concluded, "Cinema 4D is a great tool for getting good results quickly. By using the right blend of tools we managed to keep up with the high pace environment of film production. When the actors turn up you have to be ready to go and, thankfully, Cinema 4D is really robust, so you can do it."

Duncan Evans is the author of 'Digital Mayhem: 3D Machines,' recently published by Focal Press.

You can see more of the graphics from Jupiter Ascending here: www.territorystudio.com

Tripping the Light Fantastic
Fri, 10 Apr 2015 14:02:00 +0200
https://maxon.net/en-us/news/case-studies/advertising-design/article/tripping-the-light-fantastic/
How Jason Bruges Studio used Cinema 4D to control the New Year's Eve lighting display at the top of The Shard
By Steve Jarratt

To celebrate the advent of 2015, the developers of The Shard – the tallest building in Europe – wanted to turn the building's glittering spire into a work of art. To accomplish this they turned to Jason Bruges Studio, an award-winning art collective based in London, renowned for creating distinctive interactive installations.

Every evening from Friday 19th December until New Year's Eve, the spire of The Shard would come alive with a light show visible across the city, creating "a dynamic piece of public art designed to reflect and evoke the spirit and energy of London," says Adam Heslop, designer and visualizer at JBS.

The pixelated countdown to 2015 was back-projected through The Shard's apartment windows and timed to coincide with London's famous fireworks display. The celebration lighting comprised layers of searchlights, sparkling strobes, a dynamic color wash and a huge 2015 numeric graphic.

To achieve the effect, the 'Shard Lights' project occupied the top 40 stories of the building and employed the very latest technologies, including volumetric projection into a mist contained within the spire, and use of the world's first IP65-rated moving head LED lamp. The 850W device, provided by entertainment company SGM, is fully weatherized, making it resistant to water and dust, and its 17,000 lumen rating makes it the brightest of any IP-rated light source. The actual installation was handled by Production Resource Group, which specializes in large-scale theatrical and stadium events.

A project of this scale provided the team at JBS with a number of unique hurdles, the first of which was actually visualizing it. "The main challenge was designing an installation for a skyscraper-sized canvas while sitting at a desk in the studio," explains Heslop. "It's difficult to realistically imagine how effects will look at that scale. Designing a lighting installation for a building made mainly from glass is not without its challenges either – so much transparency means lighting effects won't reflect off the surface and won't be seen. Therefore, designing around this and finding ways to create high-impact effects were a big part of the design process. In addition, we had to ensure that the effects could be seen from the foot of the building and right across the city."

It also didn't help that the team was given just three weeks to get the installation up and running. "Logistically, this was a gargantuan operation," Heslop adds, "getting a substantial amount of heavy lighting and networking equipment to the top of a busy mixed-use skyscraper, plugged in and working across 40 stories, being the main contributing factors."

As Heslop explains, one of the key problems was that it's difficult to assess a complex lighting rig in secret when it's perched on the top of an extremely tall building like The Shard. "In order to get a reasonable idea of how particular effects were going to look on the building, some physical prototyping was necessary," he says. "The most challenging environment for this was the spire of the tower. There was a very limited amount of time to test lots of different lighting effects with a range of different products. The logistics of getting the heavy hardware to the top of a busy skyscraper is just part of that challenge. Knowing this and needing to work in the hours of darkness meant a very small testing window."

So far, so fascinating, but where does Cinema 4D fit into all this? Well, as with many of its installations (such as the Coca-Cola 'Beatbox', which the studio helped create for the 2012 London Olympics), JBS uses Cinema 4D to assist in visualizing, and then operating, the various complex mechanisms using XPresso or Python rigs.

Pre-visualization proved a vital aspect of the project in order to provide the client with an idea of what JBS could achieve and how it would look on the day. Once the scope of the project had been determined, the lighting effects had to be designed and accurately simulated since JBS didn't have the time to test every element before it went live. "Cinema 4D's camera calibration tag on photo backplates of the building proved useful," states Heslop. "As potential effects came in and out of the scope, render layers of the effects could be directly composited and layered onto the building visual."

The physical lighting rig used a number of moving head stage lights, offering automated targeting plus a wide range of colors, gobos and various other lighting effects. "This is where I saw the opportunity to use and develop further a Cinema 4D Python plugin that I'd been working on at the studio," says Heslop. "The plugin is a prototype that has the ability to control these moving head light fixtures and all of their features in real-time over a network. It uses Cinema 4D targets and constraints to tell the lights where they should be facing in the real world, while MoGraph matrix arrays were used for driving color and the rest of the features. Making the lights fully compatible with the MoGraph module was great as additional Python Effectors could be written and tiled over the top as in a traditional MoGraph setup."

Heslop's plugin calculates the information needed for every object on the network and sends the data using DMX, a lighting control protocol used in the entertainment industry. As objects are moved in Cinema 4D the lighting fixtures respond accordingly.
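
Heslop's plugin itself isn't public, but the core calculation he describes — turning a target position into fixture aim angles and then into DMX channel data — can be sketched in plain Python. Everything here (the fixture and target positions, the 540° pan range, the 16-bit coarse/fine channel layout) is an illustrative assumption, not the studio's actual code:

```python
import math

# Sketch: derive pan/tilt for a moving-head fixture from a target position,
# then pack each angle into the coarse/fine 8-bit channel pair typical of
# 16-bit DMX fixture modes. Positions and ranges are hypothetical.

def pan_tilt_to_target(fixture, target):
    """Aim angles in degrees from fixture position to target position."""
    dx = target[0] - fixture[0]
    dy = target[1] - fixture[1]
    dz = target[2] - fixture[2]
    pan = math.degrees(math.atan2(dx, dz))                   # rotation about the vertical axis
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # elevation above/below horizontal
    return pan, tilt

def angle_to_dmx16(angle, lo, hi):
    """Map an angle in [lo, hi] to a (coarse, fine) pair of DMX bytes."""
    t = max(0.0, min(1.0, (angle - lo) / (hi - lo)))
    value = round(t * 65535)
    return value >> 8, value & 0xFF

# Fixture 10 units up, aiming at a point out on the floor:
pan, tilt = pan_tilt_to_target((0.0, 10.0, 0.0), (5.0, 0.0, 5.0))
coarse, fine = angle_to_dmx16(pan, -270.0, 270.0)  # hypothetical 540-degree pan range
```

In the real rig these two bytes would be written into the fixture's pan slots of the outgoing DMX universe each frame, so moving a target null in Cinema 4D swings the physical light.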

"The animations that would eventually be driving the lights were created in a separate Cinema 4D model. This was a simplified setup where the effects could be clearly viewed as a set of MoGraph matrices. The matrices were arranged on a minimal 3D model of The Shard, and the color value of each represented a light source. Controlling the light matrices involved a set of custom Python Effectors, which were designed specifically for this project."

Once the lighting choreography was worked out, the final sequences were exported as a series of low-resolution animations. These were then loaded into an external control system and acted as a pixel map, regulating the brightness and color of the lights. "As the effects were being designed three-dimensionally, a workaround had to be generated for converting MoGraph data into the 2D pixel map. For this a special effector was written to port values from one MoGraph Generator to another. The second Generator was a cloned square plane that, when rendered in an orthogonal viewport, exactly matched up with the pixel-mapped values. This, in addition to a Python helper for changing cameras and render settings, speeded up workflow substantially."
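
The MoGraph-to-pixel-map conversion described above boils down to an orthographic flattening: each light's position in a front view selects the pixel whose color will drive it. A minimal sketch of that idea in plain Python — the grid size, facade dimensions and light positions are invented for illustration, and the studio did this with cloned planes rendered in an orthogonal viewport:

```python
# Sketch: flatten per-light color data into a 2D pixel map by projecting
# each light's (x, y) position orthographically onto a pixel grid.

def build_pixel_map(lights, width, height, x_range, y_range):
    """lights: list of ((x, y), (r, g, b)) with positions in world units."""
    pixmap = [[(0, 0, 0)] * width for _ in range(height)]
    x0, x1 = x_range
    y0, y1 = y_range
    for (x, y), colour in lights:
        col = min(width - 1, int((x - x0) / (x1 - x0) * width))
        row = min(height - 1, int((y1 - y) / (y1 - y0) * height))  # top row = highest light
        pixmap[row][col] = colour
    return pixmap

# Four hypothetical lights on a 2x2 map spanning a 10x40-unit facade:
lights = [((2.5, 30.0), (255, 0, 0)), ((7.5, 30.0), (0, 255, 0)),
          ((2.5, 10.0), (0, 0, 255)), ((7.5, 10.0), (255, 255, 255))]
pm = build_pixel_map(lights, 2, 2, (0.0, 10.0), (0.0, 40.0))
```

Once the animation is baked out this way, the external control system only ever sees a small image sequence, with each pixel's brightness and color regulating one fixture.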

Despite the short deadline and the difficulties in testing a lighting system in secrecy, nearly 1,000 feet in the air, JBS successfully completed the challenge and 2014's New Year's Eve celebration went off without a hitch. "Seeing the searchlights being turned on for the first time and seeing the city sky completely changed was certainly very memorable," declares Heslop.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Jason Bruges Studio Website: www.jasonbruges.com

The Mothership
Tue, 31 Mar 2015 11:43:00 +0200
To celebrate their 5th anniversary, Studio FutureDeLuxe wanted to create an animation that stood out from the usual studio look – and Cinema 4D helped them do exactly that.

Studio FutureDeLuxe has made a name for itself over the past five years with a rather unorthodox approach: experimental use of graphics programs is used to generate know-how that is in turn used to create spectacular results for clients. Cinema 4D is one of the team’s favorite programs for experimenting, not least because of the endless and diverse possibilities its features have to offer. After the decision had been made to create a fitting animation to celebrate their 5-year anniversary, the FutureDeLuxe team had a treasure trove of experimental material and techniques that they could use for their project. The film they created – Mothership – is colorful and trippy and full of flying shapes, swarms, strange creatures and fantastic formations that fly through a magical world using dynamics.

The team at FutureDeLuxe, together with artist Nejc Polovsak, used instances in Cinema 4D to make it possible to handle this vast number of objects in a single scene. “In particular for the part where the camera flies through the tunnel it was especially important to determine the total number of instanced objects that we could use without exceeding our limit. We resorted to using low-poly objects that used less memory, which let us add even more objects to our scene”, remembers Nejc. “When viewed from a distance you can’t tell the difference between high-res and low-res proxies.” The most elaborate scenes had about 1.5 million polygons and ate up 650 MB of memory.

“Numerous Cinema 4D tools and features were used, ranging from modeling to BodyPaint 3D, character tools and the Motion Camera for the soft camera moves – without which it would have been impossible to create this film. But what was really indispensable were Dynamics and the simulation features, which were used to create large parts of the animations.”

Most of the FutureDeLuxe team was involved in the initial creation phase of Mothership. As the animation phase began, fewer team members were needed and Nejc Polovsak was essentially the one who animated the short film. “I could still count on the entire team at FutureDeLuxe who also supplied me with plenty of great feedback and outstanding art direction,” remembers Nejc. “It took ten weeks until the scenes were completed and rendered. Rendering was done using VRay for Cinema 4D on a single PC (6/12-core i7 3930, 32 GB RAM). After all the work had been completed on the scenes I was really ready for a vacation. I started rendering before I left and when I got back after 5 days the rendering was done and Mothership was ready to be edited and finalized in After Effects.”

“This project definitely provided a lot of valuable experience that made it evident why I love working with Cinema 4D”, explains Nejc. “It’s the ideal software for generalists like me who want to single-handedly complete entire projects with a single software package. Artists want to realize and experience their visions without having to worry about technical issues. This is exactly what Cinema 4D does,” concludes Nejc.

Process video: www.vimeo.com/109013667

Marvels of Engineering
Mon, 23 Mar 2015 10:07:00 +0100
https://maxon.net/en-us/news/case-studies/visualization/article/marvels-of-engineering/
British comic artist Kevin Hopgood uses Cinema 4D to reveal the inner secrets of Marvel's mechanoid monsters.

As well as calling on decades of comic art, the publishers also commissioned a series of cut-away illustrations, exposing hidden details of vehicles, buildings, robots and exoskeletons. These images were produced by British comic artist Kevin Hopgood, famous for his work on 2000 A.D. as well as stints on the Doctor Who and Iron Man comics. "One of the first cutaways I did was War Machine," he says, "which was really nostalgic for me as I'd co-created the character during my run as penciler on Iron Man back in the Nineties."

To better help him realize the style of artwork, Hopgood turned to Cinema 4D. "I initially did the illustrations in Adobe Illustrator and treated them as a technical drawing exercise," he explains. "The jobs seemed to be demanding more 'shiny chrome' effects, which can be a pain to draw but a really easy effect to get in 3D. I started off introducing more 3D elements into the artwork but I wasn't confident that I could pull the whole thing off in 3D. I finally bit the bullet with a cutaway of the Iron Man villain The Titanium Man. That one turned out how I wanted, so I stuck with a 3D workflow for the rest of the illustrations."

Each image begins life as a pencil drawing to block out the character's pose. "Once the guys at Eaglemoss have okayed it I use the drawing as a basis for the modeling work," says Hopgood. The characters are then modeled with a combination of Cinema 4D and ZBrush using the GoZ plugin to jump between the two. Work starts by building up a base mesh in Cinema 4D, where Hopgood says he's really enjoying the new Polygon Pen tool and has found the Bevel Deformer to be quite useful. For hard surface elements with complex curves he's turned to Splurf from Blackstar Solutions (www.blackstar-solutions.de). The €35 plugin enables him to define shapes using splines and then generate smooth polygonal patches.

The mesh is moved into ZBrush for fine detailing before being returned to Cinema 4D for materials, lights and final render. "I've also been using the sculpting tools more than I thought I would," he adds. "Quite often I want to do a quick tweak that I don't want to go over to ZBrush to make."

For rendering, Hopgood employs a series of studio lighting setups from C4Depot (www.c4depot.com). "My favorite set up has got three digital 'soft boxes' and a fairly abstract HDR image on a Sky object. Most of the metal material presets seem to give good results with this setup. I've been using Global Illumination with the Physical renderer set to 'Object Visualization – High'."

The poster-size images fold out onto three pages of A4, so the illustrations need to be pretty large. "I've been doing everything in small chunks to keep file sizes down," he comments. "I don't render it all out in one go but render out individual body components and composite it all in Photoshop. If the silhouette doesn't quite match the drawing I'll tweak it using the Liquify plugin; I think that helps to establish and maintain a dynamic pose and avoid the woodenness you sometimes get with 3D figures. I've been drawing the 'wall' between the cutaway bits and the outer surface in Adobe Illustrator. Apart from that it's all 3D."

Since moving into the 3D realm, Hopgood has really embraced the medium, and now has his eye on 3D prints to complement his 2D artwork. "Eaglemoss also do a range of Marvel & DC Comics figurines," he says. "In the past I've done concept sketches for the modelers to work from, but I've been aware that with 3D printing it's possible to sculpt digitally and output a model to make a cast from. I've just had my first figurine model approved by Marvel, so it looks like I'll be doing a lot more 3D work in future."

However, Hopgood notes that not all 3D software is created equal: "Other 3D programs I've used are a tad unstable and prone to crashing," he declares. "When deadlines are approaching it's good to be working with a rock-solid tool like Cinema 4D that can take everything you can throw at it."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Kevin Hopgood Website: www.kevhopgood.com

The Fantastic Journey of Wool
Thu, 05 Mar 2015 15:41:00 +0100
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/the-fantastic-journey-of-wool/
London studio Neon mixes live action and CG in this finely spun yarn.

The brief for 'Lost and Found' was simple but challenging, says Tom Bridges, managing director of Neon. "Woolmark wanted us to describe the process of wool production in an entertaining way. How we did that was up to us and, after some thought, we came up with the idea of showing the process from the perspective of a piece of wool. We won the pitch in January, and then spent a couple of months refining the script. Woolmark gave us an entirely free hand with the look and feel."

The first step was to undertake research, gathering relevant reference material and planning the spot. "Storyboarding and script development took two months, off and on," comments Tom. "We had a very clear idea of the style from an early point: we wanted to create shots that were photo-realistic – yet visually and physically impossible. Our film 'Macro' was a starting point for a lot of what we were trying to do."

Macro was a winter R&D project in which the Neon team set out to make a short film featuring a skateboarder, with the challenge of mixing live action flawlessly with photo-realistic CG. The short features impossible camera angles with lots of super slow motion and extreme depth of field, and allowed Neon to "develop a process that would allow us to create these shots with a small team of artists – instead of the usual small army." It's clear that the Macro project laid the foundations for the Woolmark spot, with both films sharing the same design DNA. Watch the R&D film 'Macro' here.

‘Lost and Found’ begins with a live-action shot of a sheep being sheared, at which point a tuft of wool floats off and begins its journey. The strands were created using Maya's nHair system and then animated using a combination of simulation and keyframing.

As the strand floats through the manufacturing process, it's wafted through a mill, riding on the warm air from a coffee cup. This shot is typical of the project, incorporating a CG cup with simulated steam set against a live-action background. "We used as many real elements as we could," explains Tom. "All the shots were tracked, and we used a lot of projection mapping."

For the live action, the crew shot footage in Biella in northern Italy, Australia and Huddersfield in the UK. "Getting all of this material to seamlessly intercut, and with our fully CGI set pieces, took a lot of finessing."

To get the shots looking right, the crew relied on HDRIs taken on location in conjunction with Cinema 4D's native lights. "All the depth of field was in-camera," adds Tom. "Cinema 4D's Physical Renderer worked really well here to get the look we were after."

While most of the shots incorporate real-world elements, there are a few sequences that are entirely CG. One such segment features a close-up of a tailor's hand as he chalks the woven material ready to be made into a suit. Although it looks photo-realistic, the hand, chalk and material are all CG: "We created the mesh using photogrammetry, using a lot of stills. Cinema 4D was used for animation and texturing, and we used its particle system to create the lumps of chalk as they came off."

Final output was handled using the Deadline Render Management System from Thinkbox Software, which Tom says integrates really well with Cinema 4D. "We tried to keep the passes quite simple," he says. "A main beauty pass with some technical passes: depth, object IDs and so on."

With the sequence rendered, final compositing took place in The Foundry's Nuke, with Adobe After Effects employed for a few elements. The post work added "lots of lens flares and general beauty work," states Tom. "We added a lot of dust and particles to try and give the CG more of a natural look."

The finished spot runs to 2 minutes, 43 seconds, and was output at 2K resolution. Neon began the project in earnest in April of 2014, and delivered at the end of August. "Different numbers of people at different times worked on the project," Tom comments, "but the team ranged from two to five people. We use mostly iMacs and Mac Pros for the workstations, and have a 144-core render farm on site."

The quality of the final film is a testament both to Neon's ingenuity in seamlessly combining CG and live action elements, and to Cinema 4D's Physical Renderer, which is clearly capable of producing wonderfully photo-realistic imagery.

Tom's final comment is aimed at anyone wanting to improve their Cinema 4D work: "Make bold strokes," he advises, "and be ruthless about throwing away what doesn't work. Far better to fail early, than to spend ages finessing something that's never going to fit."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Watch the Making Of movie here: www.vimeo.com/109022399

Neon's Website: www.neon.tv

Cinema 4D Goes Swimming with Monsters
Mon, 02 Mar 2015 14:52:00 +0100
https://maxon.net/en-us/news/case-studies/visualization/article/cinema-4d-goes-swimming-with-monsters/
Discover how Munk Studios brought some of the world's most dangerous aquatic animals to life

When the Discovery Channel wanted to create a documentary about a man swimming alongside some of the most dangerous aquatic creatures in the world, it was a dream come true for Pryce Duncalf. Pryce is the creative director and production lead of Munk Studios Ltd. and was tasked with helping the audience understand some of the more scientific facts relating to the monsters using a combination of 3D animation, 2D texture mapping and graphic design. With a three-month time period and unprecedented creative freedom, Pryce turned to Cinema 4D to create the 20 minutes of animation and graphics that would be required.

The goal of the TV program was to explore these aquatic killing machines up close and in a way that had never been presented to a viewing audience before. Pryce came up with the idea of putting the creatures into a kind of underwater observatory, being filmed by a sci-fi super camera that could slow down time and reveal details hidden from the naked eye.

The research required was intense. While some 3D meshes were sourced, others were modeled from scratch. Pryce also needed to sculpt the internal bone structures and organs, which required trips to the Natural History Museum, the ZCL library and a visit to the Grant Museum to examine skulls and bones. More intriguingly, because the facts had to be accurate, the research revealed that there was still some scientific debate over issues such as the total number of teeth in the mouth of an anaconda. Some of the creatures were so new to science that documentation on them was limited, which meant contacting experts in those fields.

Once the research had been completed it was time to start modeling everything except for the outer meshes for the shark and hippo, which were purchased and subsequently edited and re-textured with V-Ray. Pryce revealed that some of the modeling was exaggerated, "The creatures were originally modeled to look quite realistic in proportions, but after client feedback they were to be made to look more like monsters. This kind of feedback can often come after the creatures have been animated, but it's relatively straightforward to adjust the models by adjusting a duplicate mesh with the sculpting tools and then using a Morph tag to reference the new mesh. This is completely non-destructive and doesn't involve any re-rigging or re-weighting, which is a massive benefit."
He then tackled the tricky process of creating the base models for the bone structures, using the Cinema 4D sculpting tools to make them more organic.

To convey the feeling of being underwater, the creatures were given movement characteristics and the lighting was dimmed accordingly to suggest a deep, cold environment. Pryce listed some of the processes used: "Everything had to be lit with a limited falloff so the creatures seemed like they were lurking around in the darkness. I also used depth passes and visible lights to create a volumetric environment. These elements were rendered separately and composited in After Effects. A setup that was also really useful was a great MoGraph/Particle rig by Joel Dubin called Microfloaties, which adds random floating particles to your scenes. But like most setups I use, I took apart the rig and adjusted it to suit the scenes."

One of the first creatures to require a complex approach to animation was the snake. Pryce explained how different techniques were required: "After studying the way snakes move, I found they constantly transfer their weight and propel themselves using their coils, which are constantly changing shape. When moving through water, there is a mixture of gliding and rippling their body to propel themselves forward. Then there is the head, which generally leads the way but also moves independently. Taking all this into account, I realized there was no simple way to recreate these general principles in 3D, so the idea was to try to mimic them as best as I could depending on what the scenario was. For the initial shot of the snake moving through the water, I mainly used spline IK with plenty of control points that I had to meticulously keyframe. I kept the IK deformation type set to Equal so that the snake would appear to be constantly compressing and expanding depending on how far apart the control points were."
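
Pryce's observations about snake locomotion — a sideways ripple whose phase travels backwards along the body — can be illustrated with a few lines of maths. This sketch simply generates offset control points for one instant in time; on the production these positions were keyframed by hand onto Spline IK controllers, and every parameter value here is invented for illustration:

```python
import math

# Sketch: lay n control points along the direction of travel and offset
# them sideways by a sine wave whose phase shifts over time, so each coil
# appears to ripple toward the tail from one frame to the next.

def snake_control_points(n, length, amplitude, wavelength, time, speed):
    """Return n (x, z) control points for one animation instant."""
    points = []
    for i in range(n):
        x = i * length / (n - 1)                           # distance along the body
        phase = 2 * math.pi * (x / wavelength) - speed * time
        z = amplitude * math.sin(phase)                    # sideways ripple
        points.append((x, z))
    return points

pts_t0 = snake_control_points(12, 10.0, 0.5, 4.0, time=0.0, speed=3.0)
pts_t1 = snake_control_points(12, 10.0, 0.5, 4.0, time=0.5, speed=3.0)
# Between the two instants, the same wave shape has shifted along the body.
```

With the IK deformation type set to Equal, as Pryce describes, spacing the control points closer together or further apart is what makes the mesh appear to compress and expand.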

When the snake goes into action coiling around a deer, it required more far-reaching knowledge of how to use the Cinema 4D toolset. Pryce used a Spline Wrap deformer. The snake started straight initially and was then brought into position to bite the deer. A spline was created that started from the original head position and wrapped around the deer for the Spline Wrap deformer to do the rest of the work. To give the impression that the snake was compressing, the To value in the deformer was keyframed so it shortened the mesh.

One of the more unusual problems came when trying to rig the squid for animation. Cinema 4D rigging is based on a bone system but squids don't actually have any bones. This led to some head scratching and another two weeks of research to solve the problem. Pryce detailed how it was eventually solved: "The squid had multiple rigs and controllers so the creature could transfer its tentacle movement in a convincing way. MoGraph is great for solving complex problems such as the tentacle suction cups, which involved using the Matrix object along with the Surface deformer, which was then cloned with MoGraph and random effectors to give each suction cup its own individual movement."

One of the more startling aspects of the TV program is the way the camera switches from watching the creature move around in its own environment to showing an X-ray type effect as it reveals the internal bone structure. To achieve this, Pryce created at least two projects with each animation. One was a beauty render to show the external features, while the second contained the outer mesh with a Fresnel alpha channel so the camera could see inside. All the texturing turned out to be a simple process for the X-ray passes as they mainly used high luminance with Fresnel alpha channels. Pryce also added a number of multi-pass layers so he could jump between the different ones in After Effects.
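
The X-ray pass relies on a Fresnel-style alpha: the more directly a polygon faces the camera, the more transparent it becomes, leaving an opaque glow at grazing edges so the outer mesh reads as a ghosted shell around the bones. A hedged sketch of that falloff — the vectors and exponent are illustrative, and in Cinema 4D this lives in the material's alpha channel as a Fresnel shader rather than in code:

```python
# Sketch: alpha from the angle between the surface normal and the view
# direction. Facing ratio |N.V| is 1 when the surface faces the camera
# (fully transparent) and 0 at grazing angles (fully opaque edge glow).

def fresnel_alpha(normal, view, power=2.0):
    """Both vectors are assumed normalized; power shapes the falloff."""
    facing = abs(sum(n * v for n, v in zip(normal, view)))
    return (1.0 - facing) ** power

# A polygon facing the camera head-on: fully see-through.
a_front = fresnel_alpha((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
# A polygon seen edge-on: fully opaque, forming the glowing silhouette.
a_edge = fresnel_alpha((1.0, 0.0, 0.0), (0.0, 0.0, -1.0))
```

Paired with high luminance, as in Pryce's passes, this is what gives the characteristic medical-scanner look when composited over the bone render.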

After outputting the 20 minutes of completed animation using two quad-processor Xeon Mac Pros, the sequence was nominated by the Royal Television Society for a graphic design award. Pryce summed up his experience of using Cinema 4D to create such ground-breaking animation: "The tools available in Cinema 4D are incredibly flexible and there are so many options to solve pretty much any problem I come across. I feel pretty safe when I am approaching the realms of the unknown as I always seem to find a way using Cinema 4D's tools, particularly XPresso, Thinking Particles or MoGraph."

Duncan Evans is a freelance journalist, photographer and author.

Munk Studio Website: www.munk.co.uk

Recreating the Ancient World of Nike
Wed, 11 Feb 2015 14:12:00 +0100
https://maxon.net/en-us/news/case-studies/visualization/article/recreating-the-ancient-world-of-nike/
Cinema 4D comes out the winner in this long-term project to reconstruct the Temple of the Goddess of Victory
By Steve Jarratt / Final text edit: John Goodinson & Peter Schultz

In Greek mythology, Nike is the winged goddess who bestows victory on the worthy and proclaims the skill of the triumphant to Gods and men alike. Her spheres of influence are those of war, sport, poetry, and art - wherever there is struggle, competition and victory. On the battlefield, in the arena or on the stage, Nike was present, rewarding victors with glory and fame. As the embodiment of strength, speed, power and victory, Nike was hugely popular in Greek and Roman culture. She adorned many items such as vases, armor and coins, and appeared on friezes and sculptures atop temples, columns and podiums.

Although her cult died many years ago, Nike has had a profound and lasting influence on modern culture, her attributes giving rise to angels and fairies. The common winged figures found on sporting trophies and civic architecture, war memorials and hood ornaments all owe their inspiration to the Goddess of Victory. Perhaps most famously, Nike was the inspiration for the Rolls-Royce hood ornament.

To tell Nike's story, British artist John Goodinson and American archaeologist Peter Schultz have produced an international documentary program that sets the standard for aesthetic and archaeological accuracy. Goodinson is a graphic designer whose previous work in film post-production includes Legend, Aliens and Leaving Las Vegas. Schultz is an archaeologist and art historian whose previous work includes studies of the major architectural monuments throughout Greece. Goodinson took on the challenging job of reconstructing the Temple of Athena Nike, which is situated on the Acropolis of Athens. Built around 425 BC, it is companion to some of the most important buildings of ancient Greece, including the Parthenon, the Propylaia, and the Erechtheion.

The cutting edge project, entitled 'Nike Is Now,' features a website (http://www.nikeisnow.co.uk), which tells the story of Nike, a trailer for the upcoming international documentary, and a showcase of the ongoing work in the recreation of the Temple of Athena Nike. A model of the Acropolis with its world famous monuments has also been reconstructed to show where the temple was situated.

"The team can analyze architectural propositions in 3D, refine the approach, ‘adjust' the model and then consider the reconstruction for credibility," explains Goodinson. "In particular, the model presented a unique opportunity to test theories about the temple's lost beautiful parapet, which ran continuously around three sides with sculptures of Nike in various poses. The current temple features conjectural reconstructions for the west temple pediment, parapet and Nike central acroterion (roof sculptures). The west frieze is modeled on the only surviving actual frieze sculptures at the British Museum."

Goodinson's main technical challenge was to create a model fluid enough to accept continual adjustments as the project progressed. Cinema 4D's XRefs proved invaluable here; without them, the placement of individual architecture on the Acropolis model would have been difficult to manage.

"Early on it was decided that a low-polygon model with enough detail would be acceptable for the flat surfaces," he says. "Items such as the ionic capitals would be much more detailed. In particular, the frieze sculptures had to be accurate enough to be convincing. Poses had to be correct to the existing archaeological source material. However, as we would always be viewing from a distance, optimum polygon modelling was essential."

Working with Cinema 4D Visualize, Goodinson crafted the figurines using just basic modeling tools. "Cinema 4D's Create Polygon tool together with the Extrude, Knife and Brush tools, for polygon smoothing, were used exclusively for all the sculpture work - which was the most time-consuming."

Building these intricate sculptures one polygon at a time was an arduous process: "For the parapet, the Nike head took about 20 hours," he comments. "The Nike with Bull pose took about 40 hours. Then the front temple frieze took a month - about 160 hours. However, tweaking and revising the frieze model was an ongoing process so it's difficult to give a complete production schedule for each model (I wasn't timing myself). Also, a lot of the time was spent analyzing the existing frieze from my own photography to get the poses right."

You might think a sculpting application would be more appropriate for this task, but Goodinson disagrees: "I have used other programs for sculpting," he says, "however, the retopo tools do not produce an optimum mesh - which is why I use the polygon tool for the best geometry. You need a slow and steady approach to classical Greek sculpture. In short, the Polygon tool is king."

Work began with a reference image of the existing Nike poses in the front viewport. Goodinson would then create a single polygon and trace the outlines of the sculpture, aligning the mesh to the contours of the original. Switching to the perspective view, he then used additional reference material and adjusted the mesh for depth. "I was tutored in life drawing by a fine artist for three years at art school," he says. "This discipline in 'spatial awareness' was paramount to understanding three-dimensional forms, especially human."

Not only did Goodinson populate the enormous frieze running around the top of the Temple of Athena Nike and reconstruct the ornate parapet around the base, he also crafted the huge statues that adorn the site. In addition to the intricate acroteria - or roof sculptures - that sit atop the Temple of Athena Nike, he also built the huge golden statues of Athena and Zeus.

"Zeus was built entirely by hand as I've described," he explains. "Athena was partially based on the only reference mesh I had - an online, very low-res scan based on the British Museum Caryatid, which I completely remodeled. Athena's head, arms and upper body and pose are my original mesh work."

For the color and texturing, Goodinson used extensive polygon selection tags, enabling him to easily change colors to suit. This was absolutely necessary, he says, "as colors have to be changed quickly and effectively. Greek sculpture was painted in a polychrome technique using a palette of flat colors, especially red and blue."

The model of the entire Acropolis site was based on Goodinson's own photography taken in Athens, plus side elevation drawings and existing models referenced from the past 100 years. "The best topology reference was the plaster model of the entire Acropolis at the Ashmolean museum at Oxford," he adds. "No data sets were used but the model has gone through several revisions."

The surrounding scenery was populated with simple polygonal shapes for the buildings and trees, although the online tree generator Snappy Tree (http://snappytree.com/) was used for added detail. It was finished with procedural marble textures, with bump mapping and sub-polygon displacement used to add realism to the landscape and to age the stonework on the top base. For most of his renders Goodinson used Global Illumination with secondary light mapping, with ambient occlusion applied to certain textures and for particular renders.

After four years of on-and-off work on 'Nike Is Now', the CG recreation has reached a point where it's effectively finished, but open to revisions. "It's in a state of 'incremental completion', to be adjusted to consider alternate viewpoints on the archaeology and architecture."

For Goodinson, the hardest part of this obvious labor of love is patience and preparation. "It's all in the prep!" he says, adding, "The closer you get, the more detail is needed - think like (Albrecht) Dürer. The further away you get, think like Turner!"

Goodinson first got into CG with Cinema 4D GO in 2000, and began working professionally with Cinema 4D R9. He now uses it every week, and says "the alchemy of software in the imagination of the artist produces something wonderful."

For the 'Nike Is Now' project, Goodinson used Releases 12 through 15 of Cinema 4D. Now using Release 16, he explains that two of the new tools are particularly useful for recreating such ancient worlds.

"The new polygon tool is impressive. The ability to 'freeform' and paint a line of polygons, add to it, extend it and move in a single tool is certainly appreciated by me when I am sculpting, especially when the subject is 'hair and clothing' - difficult aspects of classical Greek sculpture. As to workflow, this is a real time saver, a veritable 'Swiss Army Knife' of functionality in this new Cinema 4D polygon tool."

"However, the 'game changer' for me is the reflection channel. The ability to create reflections on multiple levels is a major step forward in recreating real world materials. Especially the look of marble."

"For this project especially ('Nike Is Now'), Cinema 4D Release 16 now has the ability to show our international team more photo-realistic representations of the multiple marble and limestone building materials used on the Acropolis, most especially Pentelic Marble."

"With new projects on the horizon you can be sure that these two new additions will have a dramatic effect - for the better - in future reconstruction and representations of classical architecture."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

The project website: www.nikeisnow.co.uk

Utrecht Goes Hollywood: Oscar Nomination for "A Single Life" (10 Feb 2015)

Animation studio Job, Joris & Marieke centered their workflow around Cinema 4D.

A young woman, a slice of pizza and a mysterious vinyl single: these are the only ingredients needed to be nominated for an Oscar® - at least for the small Utrecht, Holland-based creative studio Job, Joris & Marieke! The small team relied on Cinema 4D as the primary application for the creation of "A Single Life" and used its toolset to handle the complexities of 3D content creation, including modeling, rigging, character animation, lighting, rendering and more.

Many of Job, Joris & Marieke's past works are characterized by their charming, imaginative stories, their unusual style, interesting music and lovingly designed characters (Mute, The Daily Drumbeat). Job, Joris & Marieke were guests at last year's MAXON user meetings in Düsseldorf and Munich, and their work was praised enthusiastically by the entire Cinema 4D community.

In the typical Job, Joris and Marieke style, “A Single Life” tells the story of Pia, a young woman who finds a mysterious vinyl single. As she spins the record she is suddenly able to travel back and forth through the years of her life. The three-member filmmaking team of Job Roggeveen, Joris Oprins and Marieke Blaauw managed all aspects of “A Single Life”, including the original idea, writing, directing, design, animation, as well as creating the sound effects and composing the music.

The film was originally created as a part of Ultrakort, a special project of the Dutch Film Fund and the Pathe cinema group in which select short animations are screened preceding blockbuster movies. In the Netherlands, nearly a million people already had the chance to view “A Single Life” on the big screen.

Job, Joris & Marieke was established in 2007, initially as a 2D and stop-motion studio creating short films, videos and commissioned work, and gradually began to push its artistic capabilities into the 3D animation realm to handle projects with rich layers of detail. "One of our main storytelling challenges in 'A Single Life' was to capture Pia throughout the five stages of her life – from a little girl through old age – in five different settings within a very abbreviated timeframe," says Oprins, one of the film's directors. "When we launched the studio we had no prior experience in 3D and decided to bring Cinema 4D into the production pipeline because we found it user-friendly, well integrated with other professional applications, and powerful in its creative capabilities."


Cinema 4D afforded Job, Joris & Marieke the creative flexibility throughout the filmmaking process to realize the unique animation style in “A Single Life”, which was shaped by the studio’s background in stop motion. Cinema 4D’s Jiggle Deformer was applied to add bounce and follow-through effects to the animations, while XRefs enabled a flexible and streamlined workflow to organize elements, work with placeholder props and animate characters in simultaneous collaboration.

“Pia ages a few times in the film, so her clothes, size and hair style change but her shape doesn't so we were able to use the same character rig,” says Oprins. “This saved us significant time because we could start animating a scene in Cinema 4D before the character was finished.”

The powerful Displacement Map feature in Cinema 4D was also an essential workflow feature for adding detail. “To achieve a dramatic and stable lighting that worked well with Pia's hair we chose not to use Global Illumination and instead built the lighting with a dome and cluster of very soft spots,” adds Oprins. “We rendered everything in 32-bit and were able to stylistically light aspects of the scenes to great success without unreasonable render times."

“A Single Life” has been selected to screen at film festivals worldwide, including the Toronto International Film Festival, Dutch Film Festival, Bay Area International Children’s Film Festival, New York International Children’s Film Festival and numerous others. The Academy Awards for outstanding film achievements 2014 will be presented on Oscar Sunday, February 22, 2015, at the Dolby Theatre at Hollywood & Highland Center, and televised live on the ABC Television Network.

The 2015 Oscars award ceremony took place on the 22nd of February and the Oscar for best animated short film went to Disney Animation Studios' "Feast" by Patrick Osborne and Kristina Reed.

For motion graphics artists who make concert visuals, creative direction often comes down to “Oh, make some kind of blinky thing.” So Trevor Kerr always appreciates when he gets to collaborate with clients on visuals that are more interesting and challenging, like a recent project he did for the international visual group, Comix. With “destruction” as the theme, Kerr was asked to create several clips for Comix’s library of stock footage for festivals.

With just five days to complete the project, Kerr used Cinema 4D, Turbulence FD, X-Particles and After Effects to create black-and-white, high-contrast clips with an epic landscape feel. For inspiration, Comix provided mood boards that included a few stills from Woodkid’s “Iron” video. “They also gave me stills showing columns of things crashing down, a frame with a rocky texture and smoke,” Kerr recalls.

The goal was to create visuals that had momentum and were a bit ominous. "A lot of the music in EDM (electronic dance music) culture has kind of a dark element to it, and it's clear that's what Comix was going for with this," Kerr says. "My natural tendency is to take things in a darker direction, so I understood what they were looking for." (See a looping video of the clips he created, with music composed by Starcadian, here.)

Music and Visuals

Kerr is a full-time freelancer these days. Before that, he worked quite a bit for Good Theory Studios, formerly known as Media Evolutions. Kerr met the company's founders when he was just 16 and playing in a band. They gave him a job doing audio and before long he started using After Effects to create concert graphics. By 2010, Good Theory was looking to incorporate more 3D into their projects and Kerr noticed a colleague teaching himself to use Cinema 4D.

"What pulled me in was the look of the interface," he recalls. "I really liked the hierarchy of the layers in the Object Browser, and right off the bat I thought: 'Now, this is something I can understand.'" Over the years with Good Theory, Kerr helped create visuals for artists such as The Black-Eyed Peas, Nicki Minaj, Jennifer Lopez, Maroon 5 and Avicii. He has also collaborated on graphics packages for network broadcasts, including the American Country Awards.

In addition to being a motion graphics and 3D artist, Kerr also freelances as a cinematographer. Eventually, he would like to be doing video game cinematics. "To me, that would be the perfect blend of cinematography elements and 3D," he says, explaining that lately his heart is in creating cinematic elements in the 3D world. "I've always been a fantasy video game enthusiast, and being able to combine my two biggest passions (cinematic arts and fantasy) would be something of a dream for me."

Dark World

Though the Comix project's deadline was tight, it helped that Kerr had worked with them in the past and was familiar with the company's creative process. After getting approval on the ideas he came up with based on the mood boards, Kerr was already off to a good start because he took the time to get the camera animations right for the pre-visualization. "In the music industry, you don't have a lot of time and there often isn't a ton of budget, so you have to do as much as you can in a couple of days of work," he says.

Kerr generated the dystopian landscape using Cinema 4D's landscape object, and he used a volumetric lighting trick that he likes to efficiently create fog and smoke: by specifying an "Include" list but leaving it empty, he let the volumetric light render quickly because it was not interacting with any other scene elements. The sky was composited in After Effects via the compositing project file (AEC file) exported from Cinema 4D, and small particles in the sky were composited in afterwards using stock footage from Motion VFX.

While he was asked to create certain elements and looks for the Comix project, he also had a lot of freedom to experiment with tools and techniques. So when he was tasked with creating scenes that included a huge meteor-like rock hurtling through a dark void, he figured out ways to make it interesting. He made the meteor by layering several Displacer deformers on the surface of a sphere primitive.
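The idea behind stacking several displacers can be sketched in a few lines. The sketch below is plain, generic Python (the noise function, layer values and helper names are illustrative, not Cinema 4D's actual Displacer API): each layer is a noise function with its own frequency and amplitude, and summing the layers before pushing each vertex out along its normal turns a smooth sphere into a craggy, meteor-like rock.

```python
import math

def value_noise(x, y, z, seed):
    """Cheap deterministic pseudo-noise in [-1, 1) (GLSL-style sine hash)."""
    n = math.sin(x * 12.9898 + y * 78.233 + z * 37.719 + seed) * 43758.5453
    return 2.0 * (n - math.floor(n)) - 1.0

def displace(vertex, layers):
    """Offset a unit-sphere vertex along its normal by summed noise layers.

    On a unit sphere the normal equals the position, so scaling the
    position by (1 + total noise) displaces along the normal."""
    x, y, z = vertex
    total = sum(amp * value_noise(x * freq, y * freq, z * freq, seed)
                for freq, amp, seed in layers)
    return tuple(c * (1.0 + total) for c in (x, y, z))

# Three stacked layers: broad lumps, medium detail, fine surface grain.
layers = [(1.0, 0.30, 1.0), (4.0, 0.10, 2.0), (16.0, 0.03, 3.0)]

vx, vy, vz = displace((0.0, 0.0, 1.0), layers)
radius = math.sqrt(vx * vx + vy * vy + vz * vz)
# The displaced radius stays within 1 +/- 0.43 (the summed amplitudes).
assert abs(radius - 1.0) <= 0.43
```

The low-frequency layer shapes the silhouette while the high-frequency layers only roughen the surface, which is why adjusting each layer independently gives such direct art control.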

Next, he used polygon selections to mark areas on the surface of the object that would become glowing pockets of light. "I used X-Particles to emit just from those surface selections, and I made a very low-polygon version of the rock to serve as a collider object so the X-Particles weren't passing through the rock," he says. After using volumetric lighting to break up some of the geometry, Kerr made a second X-Particles system to create the small pieces coming off the larger meteor. "After I lit everything, I hit render and used a depth pass to make the orientation points that draw your eye," he continues.

Explosions, Fire and Smoke

Creating the visuals in which columns fall as explosions are happening all around was somewhat tricky because of the looping Dynamics, Kerr says. Everything needed to be the same in the first frame as the last (watch his breakdown of the looping mechanisms he used here).

The fracturing plug-in NitroBlast was used to create three separate explosions. “I like the look of multiple explosions,” he says. “It brings your brain into it a little more, especially if you’re doing slow motion because your brain has more time to break things down.”

He made a simulation and used SteadyBAKE to turn it into a point-level animation so he could have a key frame for every point, allowing him to offset the points in time. Explosions happen in all directions, so there are particles in front of the camera and also in the reflections, which have to match the first frame in order to loop.
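The seamless-loop requirement can be illustrated with a small generic sketch (plain Python; the crossfade scheme, `make_loop` and its parameters are illustrative assumptions, not Kerr's actual setup): bake slightly more animation than the loop needs, then crossfade the extra tail over the opening frames so the last frame flows back into the first.

```python
def lerp(a, b, t):
    return a + (b - a) * t

def make_loop(samples, n, k):
    """Turn n + k baked frames into n frames that loop seamlessly.

    samples: per-frame values (e.g. one coordinate of a baked point),
    n: loop length in frames, k: crossfade length in frames."""
    assert len(samples) >= n + k
    out = []
    for i in range(n):
        if i < k:
            # Fade from the extra tail (frames n..n+k) into the head (0..k).
            out.append(lerp(samples[n + i], samples[i], i / k))
        else:
            out.append(samples[i])
    return out

# Example: a steadily drifting value baked over n + k frames.
n, k = 8, 4
baked = [0.5 * f for f in range(n + k)]
looped = make_loop(baked, n, k)

assert len(looped) == n
assert looped[0] == baked[n]   # the loop restarts where frame n left off
assert looped[k] == baked[k]   # past the crossfade, frames are untouched
```

The wrap from the last frame back to the first then continues the baked motion at its original speed, which is exactly the "first frame matches last" condition the reflections and foreground particles had to satisfy.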

X-Particles did most of the heavy lifting to create the shape of the clouds of smoke as well as the smoke trails. X-Particles were emitted from the polygons of the point-level animation simulation, offering viewport feedback in real time. Hence, adjusting things like the way the wind carried smoke volumes became a relatively easy task (check out Kerr’s smoke tests here.)

"Turbulence FD can be set up to inherit forces from the X-Particles, and from there it's mostly just simulate and tweak your smoke shader," he says, adding that between the powerful plug-ins and Cinema 4D he is able to get results often expected of a whole pipeline of artists. "I'm just one guy taking on what a studio normally does, and that's just because the tools allow us to do these kinds of things now," he says.

More works by Trevor Kerr: www.kmd.cgsociety.org/

Tracking success at the Commonwealth Games (4 Feb 2015)
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/tracking-success-at-the-commonwealth-games/

When the BBC's own in-house CG specialist wasn't available, the Corporation turned to Xrayvision to help complete a preview for the Games.

By Duncan Evans

With 25 years of broadcast experience at the BBC and a couple of Gold Awards at the PromaxBDA Global Excellence Awards in New York, Rob Shergold is used to having to produce quality work with demanding deadlines. So, it was no surprise when Nick Davey, Senior Designer at BBC News, called with a bit of a problem: the deadline was in a scant seven days.

Rob had set up a small independent design company called Xrayvision in order to produce work for a wider range of clients than what he covered at the BBC. There were other driving reasons, too: "I also wanted to be free of a purchasing bureaucracy where hardware and software upgrades can get lost for months or years in endless meetings and paperwork. I wanted to work with the best kit available. I signed up to Adobe Creative Cloud and upgraded the Lite version of Cinema 4D that came with After Effects to a full Studio license at a click of a button. And I sensibly bought MAXON's Service Agreement, which meant that when R16 came out the upgrade was automatic."

Rob explained why he chose Cinema 4D, "I'd used other 3D packages before but I knew Cinema 4D could produce beautiful results quickly. And with Cineware it had an easy-to-use workflow with After Effects, which itself had a much improved camera tracker. Without this workflow this job wouldn't have been possible in the time."

The job in question was this. As creative director of the piece, Nick Davey had shot a preview of the Commonwealth Games for the BBC, a big shoot involving steadicam, a camera drone, Sophie Raworth as presenter and access to the main stadiums in Glasgow. The footage needed camera tracking and various CG elements added – some simply decorative, some more technically illustrative - for example to show how the pitch at Hampden Park was raised two metres to accommodate the athletics track.

In total there were a dozen shots that needed compositing, of which five were farmed out to Xrayvision; the rest were completed in-house at BBC News. Rob explained the task he was facing: "To meet the deadline meant that every shot had to track perfectly first time. They didn't. Three of the shots were pans, but without any parallax these are notoriously difficult to track - and one was a pan over moving water. The other two were shot in large stadiums, where all the possible tracking points were in the stands, with very few near the camera down on the pitch to give the much-needed foreground reference."

There were other problems, too. There were two traffic cones to the right of the running track. The client wanted them painted out but there was no way to make a good job of that in the time allowed, so Rob created a blank advertising stand in Cinema 4D and placed it over the cones. At the client's suggestion he created another to give it some context. To cover the cones properly, the fake advertising stand had to straddle the point at which the pitch had split in Rob's original design, so he had to move this split forward so that it looked like the stand was on the grass. In turn this meant moving the grillwork, the pillars and almost everything in shot to make it work. Being able to view these changes live in After Effects via Cineware was a huge help.

Rob explained how the overall process worked, "I used a combination of After Effects' built-in tracker and Pixel Farm's Matchit to get a basic solution on each shot. Even then the camera keyframes required a lot of editing and smoothing out. Once I had a usable camera solution for each shot, the ability to work with Cinema 4D directly in the After Effects timeline, to add compositing tags in Cinema 4D and bring separate layers for shadows, reflections and ambient occlusion into After Effects made the rest of the job very straightforward. I was able to adjust textures and lighting in Cinema 4D to match the original footage and see the results straight away in After Effects. Compare that to the process I used on the UK Debt graphic last year, which picked up a Gold PromaxBDA in June, where I would take still frames from the footage as reference back plates in Cinema 4D, render multi-pass test frames and import those back into After Effects before I could see if the final composition worked."

To get a good idea of when the sequences had been shot, Rob talked through the shots with Nick and then used Cinema 4D's Physical Sky to correctly position the sun and shadows and get the right color of light for Glasgow in June. The best part was that the pipeline between Cinema 4D and After Effects was seamless. Rob was grateful: "I had no problems whatsoever - which was good because there were plenty of problems for me to cope with already."

The only CG challenges then were editorial ones because between shooting the footage in late June and the Games starting in July, a couple of cyclists dropped out due to injury. This meant the velodrome sequence that had been created, which was of course the only sequence that tracked perfectly first time, had to be severely edited. The final piece was color graded in After Effects by Nick Davey at BBC News.

Originally Rob was going to render out the project on an 8-core Xeon PC built with 32GB RAM and a Titan Black card. However, from the first day the PC simply didn't run as fast as it should. Later, it went back and Rob now has a 12-core Mac Pro. To actually get the job done, though, he used Team Render to split the job between the PC and his MacBook Pro laptop and that worked superbly.

The job was completed in time, as Rob concluded, "I used to work at the BBC so fast turnaround is second nature to me. Nick knew he could trust me to deliver a high-quality product to tight BBC guidelines and that Xrayvision had the latest versions of Cinema 4D and After Effects necessary to get the job done."

You can see the treated shots – including the unseen original velodrome sequence – here: http://vimeo.com/xrayvision, and the full broadcast version of the completed preview on Nick Davey's Vimeo site: www.vimeo.com/101328562. It's also worth noting that the project has been shortlisted for the Information is Beautiful awards.

Duncan Evans is a freelance journalist, photographer and author.

More of Xrayvision's work can be seen at: http://xrayvision.tv

3D Worlds in the Shadow of Isolation (29 Jan 2015)

Originally planned as a short film to serve as a reference, this project ended up taking three years to produce and, thanks to a great deal of creativity and Cinema 4D, became much more than was planned.

Neither Annegret nor Till had previous experience with 3D animation films before they started work on this short film. Among the tools needed for the realization of 'Rue de Fleurs' were those for character animation, particle effects, hair, technical animation, modeling, dynamics and cloth simulation and more. Annegret and Till had seen colleagues use Cinema 4D, and with them they made their first attempts at using the software themselves. After they saw how easy Cinema 4D was to use, it was quickly decided that this would be their software of choice for the project.

A lot of research was done on the topic of social isolation amongst the elderly. An entire year was spent preparing for production, during which the topics of loneliness and isolation were worked into the script. After the script had been completed, work began on the actual film early in the summer of 2013. Annegret first created a detailed storyboard, which was used as the base for animatics, including sound, which were in turn used to assess camera movement, cut sequences and general visuals. Next, Annegret created a clay model of the film's main character, Gustave, in order to get a better feel for his facial expressions.

The clay bust was then photographed from three sides and these images were used as templates for 3D modeling. As soon as the first figures had been built, the team started to animate them. XRefs were used from the beginning. "This made it possible to make changes to character models even after they had been animated. Thanks to XRefs, making subsequent changes didn't hold up other workflow tasks," explains Till Giermann. This also made it possible to conduct exhaustive tests with shaders for Gustave's skin and with his hair, and still have the newest version of the character model for rendering.

In addition to Gustave, another 300 objects had to be modeled and textured, which was all done in BodyPaint 3D. The textures themselves were photographed by Annegret and Till and edited for use. MoGraph and Dynamics also played important roles: MoGraph was used whenever leaves flew through a scene, trees had to be added, water drops ran down window panes or debris had to be strewn across the ground. When Gustave's world literally breaks apart, Cinema 4D's Dynamics was used to create the effect.

Inexperienced as they were, Annegret and Till first filmed themselves moving like the main character in order to get a better feel for his movement. Annegret then used this footage as a rotoscoping background to create the animations. Since the film revolves around a single character, the digital version had to convey the necessary emotion through its facial expressions and gestures. This led to Gustave's character being modified and improved throughout the first year of production.

Of course everything has to be rendered in the end, which was another aspect of 3D production the team had to learn. The first attempts showed the quality that can be achieved with the Physical Renderer – and its realistic-looking depth of field really impressed Annegret and Till. However, the team only had three mid-range computers available for their render farm, which meant they had to do without the Physical Renderer; the depth of field was instead added in After Effects during post-production.

The new Team Render on the other hand really helped in the production phase. “Cinema 4D R15 was made available in the middle of the production phase. We quickly got to know the new Release and everything worked great. Team Render was the most effective solution for our small network,” declared Till.

Schmalbreit Film website: www.schmalbreit-film.de/

Galactic Encounter of a Different Kind (18 Dec 2014)

Los Angeles-based Imaginary Forces is always on the lookout for tools that meet or exceed the standards set forth by their demanding team of artists. This creates an ideal environment for testing Cinema 4D Release 16.

Imaginary Forces is a well-known name among LA studios. A look at the IF website reveals a wide variety of notable projects on which the team has worked. One of the tools that plays a major role in their everyday work is Cinema 4D. As soon as the beta version of Cinema 4D R16 had been made available, the team at IF was very curious about its new features. This was particularly true for Creative Director Ryan Summers, who was very interested in the variety of new possibilities that Cinema 4D R16 had to offer: tools like the Polygon Pen, improved sculpting and the impressive materials created using the new Reflectance channel made a lasting impression on him.
Even before the final version of R16 had been made available, Ryan and his team decided to create a short film to put the new version through its paces under nearly realistic conditions. The new R16 features in particular had to be put to the test. Ryan also decided to let the tools themselves star in the film. Functions such as Polygon Pen, Reflectance channel and Solo button were given starring roles.

The film is a science-fiction story in which a small, somewhat rickety spaceship is attacked by a large battle cruiser. Despite the cruiser's apparent superiority, it is repeatedly outwitted by the rickety underdog, which first conjures up a more powerful booster engine using the Polygon Pen tool to escape the first attack, and then uses its own reflectivity, created with the Reflectance channel, to stave off a laser attack. As a last – and finally successful – resort, the small ship uses the Solo button to make the cruiser disappear.

Presented as an instructional film from the Hyperspace Travel Security Authority, the film is somewhat reminiscent of classics such as ‘Duck and Cover’ or ‘Hitchhiker’s Guide to the Galaxy’ and brings smiles to the faces of those who watch it.
With this Cinema 4D pilot project, Ryan Summers not only wanted to put new features to the test, he also wanted to find out just how quickly newcomers can learn how to use Cinema 4D. For example, the ZBrush and Maya expert Amir Karim was asked to do the modeling and sculpting. This was Amir’s first project using Cinema 4D exclusively. Amir said, “I thought that it would definitely be an interesting experience using Cinema 4D instead of my usual software for this project. Soon after getting started I was very surprised to find all the tools that I also know from Maya, and I was able to get rolling without any major interruptions. Especially the fact that I was able to apply the Sculpt tools to normal geometry that hadn’t been subdivided in any special way turned out to be very helpful.”

Character artist Richard Deforno also took part in the project and was fascinated by Cinema 4D's comprehensive set of functions and how seamlessly it could be integrated into his existing workflow. "Thanks to the wide range of available functions you almost never have to leave Cinema 4D, which means that you can complete almost every phase of work directly in Cinema 4D!" While working on the project, Richard constantly discovered new advantages to Cinema 4D – and also encountered stereotypes: "Many times, colleagues would watch me work and tell me that they thought Cinema 4D was only a motion graphics tool. They associated it with animated fonts and flying letters, and were pleasantly surprised and excited when they saw me animate the character – especially when they saw how easy it was to work with Cinema 4D."

Glen Snyder, who is responsible for pipeline development at Imaginary Forces, found the Python interface particularly interesting: “The Python interface is clearly documented and makes it possible for me to program tools for everything that Cinema 4D does not offer as a standard tool. This means that you don’t have to leave Cinema 4D when performing various tasks in your everyday work.”

Everyone who worked with Cinema 4D on this project had a positive reaction: the new features and functions in Cinema 4D R16, its stability and its expansive range of uses mean that it can prove itself in just about any production situation. Ryan Summers was very satisfied after completing this “first run”: “We can hardly wait to use Cinema 4D R16 for our commercial production. I think that the way Cinema 4D is perceived will change in the near future. We are encountering consistently more artists who previously swore by another software package and are now achieving spectacular results in just a short time with Cinema 4D. The entire 3D production landscape will change significantly.”

]]>news-4336 Fri, 05 Dec 2014 09:17:00 +0100
Architectural Visions Printed in 3D
https://maxon.net/en-us/news/case-studies/architecture/article/architectural-visions-printed-in-3d/
Real-life models of 3D objects remain an important part of architectural visualizations. Objects that were laboriously modeled in the recent past are now printed in 3D – also using Cinema 4D.

Ulrich Schneidt recognized this trend early on and offers complete 3D printing services through his studio MindModel Services. His studio focuses on the needs of architects, engineers and construction planners. As a result of his extensive experience with Allplan, both as a user and in sales, Ulrich has an in-depth understanding of the field of architecture and knows what makes it tick. “Currently, architectural visualizations are primarily created using images but it’s very apparent that architects need other ways of presenting their work. Physical models are a detailed representation of the final design. They are excellent for promoting the communication between planners and clients,” says Ulrich.

Depending on the project in question, planning offices are constantly looking for new ways of presenting their work. “Clients are generally skeptical of new methods such as 3D printing, which is compounded by the fact that they are not yet familiar with any professional 3D printing processes,” remembers Ulrich.

The fact that project files for the pilot project could be used from the planning phase to animation and 3D printing convinced Ulrich’s client Hauser Massivbau GmbH to give 3D printing a try. The MindModel team started by creating an animation in Cinema 4D for the project and then adapted this model for use in 3D printing.

The client wanted to use the 3D model in a marketing campaign for its product range. Hauser Massivbau GmbH very rarely uses actual display homes, in particular because each house can be easily customized by the customer, which results in a wide variety of possible designs. The flexible 3D printing process in turn is ideal for reproducing a wide range of designs for marketing purposes. For example, the shape of a roof can be easily changed from one with a dormer to one without.

The files needed for 3D printing were created by the MindModel team using Allplan source files, which had to be imported into Cinema 4D for the design and animation phases anyway. “Importing Allplan files was not a problem at all. The work that had to be done in Cinema 4D for the visualization animation was in turn a good basis for subsequently preparing the data for 3D printing. In short: all elements could be reused and nothing went to waste,” explains Ulrich.

In the end, the project was printed on a ProJet 660 from 3D Systems, a powder-based 3D printer that made it possible for MindModel to print colored surfaces. Even small-scale textures (1:1000) could be reproduced in detail.

“A certain amount of persuasion is always needed to convince a client how useful a printed 3D model can be in the sales and presentation process. Once the model is standing on the table, all doubts are quickly put aside. The model for Hauser Massivbau was successfully presented at an event in front of more than 1,000 potential customers. This pilot project has since been followed up by two additional projects from our client, for which Allplan models will also be modified in Cinema 4D. Why? The answer is simple: stability, reliability and the nearly seamless connectivity to Allplan for optimal workflow and first-rate results,” concludes Ulrich.

Web site Hauser Massivbau: www.hausermassivbau.de/

]]>news-4267 Fri, 28 Nov 2014 11:22:00 +0100
Gtech Vacuum Cleaners Wipe the Floor with Cinema 4D
https://maxon.net/en-us/news/case-studies/advertising-design/article/gtech-vacuum-cleaners-wipe-the-floor-with-cinema-4d/
To show off its new Gtech Multi hand-held vacuum cleaner, Grey Technology turned to Bomper Studios for this polished promo video.

The humble vacuum cleaner has undergone something of a renaissance since James Dyson invented the bagless vacuum cleaner. Now British inventor Nick Grey has taken the idea a step further with his Gtech range of battery-powered vacuum cleaners that are light, powerful, and can even outperform mains-powered systems.

To help promote its new portable Gtech Multi, Grey Technology approached Bomper Studios, based in Caerphilly, Wales, with a brief to produce an infographic-style animation showing how the device works and highlighting its advantages. The resulting 1080p HD sequence required a mixture of illustrative and photo-realistic rendering, showing the internal workings of the device, as well as its components and attachments.

"The animation had to highlight the benefits of this new vacuum cleaner and compare it to current generic upright cleaners on the market," explains Emlyn Davies, Creative Director at Bomper. "We were also supplied with a full storyboard and script that really helped define each shot and look of the 90-second animation."

The team's first task was to clean up the vacuum model, which Grey Technology supplied as a CAD file in STL format from Autodesk Inventor. Although Cinema 4D can import STL files, Bomper used a dedicated modeling app called Moment of Inspiration – or MoI – from Triple Squid Software Design to gain more control over the export process.

"It was time-consuming," says Emlyn, "as most of the sections had to be imported separately to keep the number of facets in the models high enough to help retain the curvature of the parts."

Fortunately, MoI performed admirably. "The resulting meshes were exported as OBJ files, which required minimum clean up in Cinema 4D. Finally, a few sections were deleted, as they wouldn't be seen in the animation."

With the model prepped, it was then a question of setting up the various animations, revealing the vacuum's different parts and showing it in operation. One of the trickier tasks was representing the concertina-style hose as it cleans a sofa – an effect achieved using hierarchies, XPresso and a Spline Wrap modifier. "We created a spline equal to the overall length of the hose with evenly-spaced points along its length," explains Emlyn. "Then we added several nulls that are pinned on each section of the spline at each point – these control the movement and help us animate the overall hose."

"One end null was placed inside the main body and the other end is placed in the attachment at the other end of the hose. The inner section of the hose is modeled as a low-res polygon object and matches the outer coil, which is created by a helix and circle in a Sweep Object. Beneath the helix and the inner hose geometry is a Spline Wrap modifier that uses the overall spline as a reference and binds both sections of the hose to it."

"The inner hose geometry then has a Cloth object as a parent that smooths out the hose and gives better results compared to the Subdivision Surface object. This means the helix coil doesn't distort along its length when stretched and keeps the coil uniform. Hand animating any jiggle and dynamic effects is simple, and no spline dynamics were needed."
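The evenly spaced spline points driving the hose nulls come down to arc-length resampling. The sketch below is plain Python, not Cinema 4D's API – `resample_polyline` is a hypothetical helper that shows the idea of redistributing points at equal distances along a path:

```python
import math

def resample_polyline(points, n):
    """Resample a polyline into n evenly spaced points along its length."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Cumulative arc length at each input point.
    cum = [0.0]
    for a, b in zip(points, points[1:]):
        cum.append(cum[-1] + dist(a, b))
    total = cum[-1]
    out = []
    seg = 0
    for i in range(n):
        target = total * i / (n - 1)      # desired arc length for point i
        while seg < len(points) - 2 and cum[seg + 1] < target:
            seg += 1
        span = cum[seg + 1] - cum[seg]
        t = 0.0 if span == 0 else (target - cum[seg]) / span
        a, b = points[seg], points[seg + 1]
        out.append(tuple(x + (y - x) * t for x, y in zip(a, b)))
    return out

# A right-angle "hose" path made of two 1-unit segments (total length 2).
path = [(0, 0), (1, 0), (1, 1)]
print(resample_polyline(path, 5))  # a point every 0.5 units of arc length
```

In the rig itself, each returned point would carry one of the control nulls the team pinned along the spline to drive the hose.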

The animation was rendered using Chaos Group's V-Ray renderer via the VRAYforC4D bridge between Cinema 4D and V-Ray.

Emlyn notes that one of V-Ray's strengths is that it produces good-looking renders with relatively modest lighting. "We have several setups for the lighting and rendering in this animation," he says. "The white scenes were a simple infinity curve with a floor plane above, lit with several softbox lights. The interior scenes used similar softbox lights to keep the look of the product consistent throughout the animation. There are only four lights in each scene, except for the X-ray scenes, which had no lights and only luminous materials."

"The render times were high for some of the sections," he adds, "which were roughly 25 minutes per frame. We wished that multi-pass was as well supported as in Cinema 4D’s Standard Renderer as it just works so well – luckily the beauty passes we got didn't require much adjustment in post."

The Gtech Multi was still in development during the project so material specification and samples were given to the Bomper team, which they then matched in V-Ray. "The client also wanted to enhance some of the reflectance of the grey materials for the main body to improve the look of the product," says Emlyn.

The ‘X-ray' vacuum cleaner was made from parts of different models from previous projects with some scratch-built sections, and made to represent a generic competitor without actually looking like any specific model. The X-ray effect was achieved with various V-Ray materials consisting of a refraction layer and Fresnel luminosity in the color channel.

To recreate the effect of the X-ray vacuum sucking up debris, Bomper turned to a MoGraph Cloner with a selection of differently colored polygons. At specific locations in the vacuum they then placed planes with Thinking Particles attached, using the same polygons as in the Cloner. "Using the TP Suction and TP Follow Spline [presets] the particles get sucked up the attachment and along the hose spline," explains Emlyn. "At the end was a TP Destructor, and inside the cylinder a rotating MoGraph object with a number of polygons that grows as the animation runs fakes the build-up of particles."
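The suction setup can be modeled as particles whose parameter runs from 0 to 1 along the hose spline, with anything past the end removed and counted toward the growing pile. This is a toy stand-in for the TP Suction / Follow Spline / Destructor chain, not actual Thinking Particles code, and `suction_step` is an invented helper:

```python
def suction_step(particles, speed, dt, collected):
    """Advance particle positions (0..1) along a unit-length spline.

    Particles that pass the end of the spline are removed -- the role of
    the TP Destructor -- and counted toward the debris piling up in the
    cylinder. Returns (surviving_particles, new_collected_count).
    """
    moved = [t + speed * dt for t in particles]
    survivors = [t for t in moved if t < 1.0]
    collected += len(moved) - len(survivors)
    return survivors, collected

# Debris spread along the hose, sucked toward the bin over a few frames.
particles, collected = [0.1, 0.5, 0.9], 0
for frame in range(3):
    particles, collected = suction_step(particles, 0.25, 1.0, collected)
    print(frame, [round(t, 2) for t in particles], collected)
```

The growing `collected` count plays the part of the rotating polygon object that fakes the build-up inside the cylinder.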

The sequence of emptying the Gtech into a pedal bin employed a hair object combined with a number of small polygons. Again, a MoGraph Cloner was used to represent small bits of debris, and a Random Modifier was used to affect the scale, position and rotation of the pieces – a result the team was pretty pleased with.

With another project successfully completed, Bomper's creative director has special praise for Cinema 4D, which he declares is "easy to use with spectacular results – the most intuitive software on the market and a joy to use daily."

Emlyn also name-checks RebusFarm's render manager – the ‘Farminizer' – which is integrated into Cinema 4D’s interface and was "a dream to use and aided in the rendering workflow and file handling."

With the Gtech Multi animation successfully completed, Gtech UK called upon Bomper to create further animations for other models in the range. As well as the Gtech Multi, Bomper Studio has also visualized the massively popular Gtech AirRam and AirRam K9 Pet vacuum, the animations for which you can see here:

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Bomper Studio website: www.bomperstudio.com

]]>news-4279 Wed, 05 Nov 2014 10:44:00 +0100
The Social Networks’ Net
https://maxon.net/en-us/news/case-studies/advertising-design/article/the-social-networks-net/
Clément Morin uses Cinema 4D to create a film that shows how a tool that connects various social media channels works.

Facebook, Twitter, LinkedIn and other social media channels help us manage contacts all around the world. However, it’s still quite a challenge to bring all of these channels together. This is exactly what Hootsuite set out to do: give users access to all of their channels in one place. To spread the word about this new service, motion artist and animator Clément Morin was given the job of creating a film to show how the system works.

At a time when enthusiasm for social media networks was steadily increasing, the television series ‘Game of Thrones’ became one of the most popular series amongst mainstream viewers and its excellent title sequence became known to millions. The concept that Clément’s client wanted him to realize was a variation of the GoT title sequence in which social media channels appeared as stylized houses on a map. As with the original, the social media houses had to emerge like clockwork mechanisms.

“The biggest challenge was the deadline. The client wanted the film to be released in parallel with the start of the next GoT season. This meant that I had to be finished in 5 weeks, which was when the first GoT episode of the following season was to be broadcast,” remembers Clément. “First, I created the stylized map using grayscale height textures. The maps themselves were made up of simple primitives and planes, which were converted to splines using the Vectorizer object and subsequently extruded. The resulting shapes could then be easily animated or otherwise manipulated.”
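The grayscale-to-spline step works because a height texture implicitly defines a region boundary: everything above a threshold counts as landmass, and the outline of that region is what gets traced into a spline. The toy sketch below (plain Python on a tiny grid, with a made-up `outline_cells` helper – nothing like the actual Vectorizer implementation) illustrates the thresholding idea:

```python
def outline_cells(heightmap, threshold):
    """Find the boundary cells of the region above a height threshold.

    Cells at or above the threshold count as "land"; a land cell with at
    least one non-land (or out-of-bounds) neighbor lies on the outline.
    """
    rows, cols = len(heightmap), len(heightmap[0])
    land = [[v >= threshold for v in row] for row in heightmap]
    outline = []
    for r in range(rows):
        for c in range(cols):
            if not land[r][c]:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= nr < rows and 0 <= nc < cols and land[nr][nc])
                   for nr, nc in neighbors):
                outline.append((r, c))
    return outline

heights = [
    [0.1, 0.2, 0.1],
    [0.2, 0.9, 0.2],
    [0.1, 0.2, 0.1],
]
print(outline_cells(heights, 0.5))  # only the single tall center cell
```

In the real workflow the traced outline becomes a spline that is then extruded into the stylized terrain.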

About modeling the other elements, Clément says: “I also used the GoT sequence as a reference for the other models and I worked with very simple geometry. For rendering I used the Octane render module, which renders on the GPU. I was able to work a lot with render instances, which reduced the overall amount of geometry.”

Asked which of the many Cinema 4D features was the most important for this project, Clément clearly states that it was MoGraph: “Aside from the fact that MoGraph is one of my favorite Cinema 4D tools, it was indispensable for this project when it came to animating the numerous small elements. The Random, Shader and Delay effectors were combined to generate a slight randomness, lag and special shading,” explains Clément.
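The lag a Delay effector produces is commonly modeled as per-frame exponential smoothing: each frame a clone keeps a fraction of its previous value and blends in the rest of the target. The sketch below is a plain-Python illustration of that behavior, not MoGraph's actual implementation:

```python
def delayed(values, strength=0.7):
    """Exponentially smooth a per-frame animation channel.

    strength is the fraction of the previous (lagged) value kept each
    frame -- 0.0 means no lag, values near 1.0 mean a long trailing lag.
    This mimics how a blend-mode Delay effector drags clone parameters
    behind their keyframed targets.
    """
    out = []
    current = values[0]
    for target in values:
        current = strength * current + (1.0 - strength) * target
        out.append(current)
    return out

# A step from 0 to 1 arrives gradually instead of instantly.
step = [0.0] * 3 + [1.0] * 5
print([round(v, 3) for v in delayed(step, 0.5)])
```

Layering a Random effector on top of this per-clone, with per-clone strengths, gives exactly the staggered, trailing motion seen on the map's many small elements.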

Summing up, Clément explains: “Without Cinema 4D, 3D graphics would most likely not have become part of my professional repertoire. It would be great if the other programs I work with were as user-friendly and as easy to adapt to one’s needs as Cinema 4D!”

Clément’s video turned out to be a great boost for Hootsuite and has been viewed over 800,000 times!

]]>news-4300 Wed, 29 Oct 2014 11:23:00 +0100
A New Look for Bertie Bassett
https://maxon.net/en-us/news/case-studies/advertising-design/article/a-new-look-for-bertie-bassett/
Bassetts Licorice Allsorts uses Cinema 4D to revamp their mascot – lickety-split!

The job of visualizing the new-look Bertie fell to Emlyn Davies, now the Creative Director at Bomper Studio in Caerphilly, South Wales. When he was given the project he was working for Cadbury Design Studio – now Mondelēz International – and the two-person team had just a week to give Bertie a 3D makeover.

The character's individual parts were modeled in Cinema 4D, and then textured using high-resolution photos of the actual sweets. "Some materials were really simple," admits Emlyn, "Just using the high-res images worked really well. Time is a major factor when working with a big studio – they have quick turnarounds, so finding shortcuts is key."

However, some materials did prove problematic. "The best example was the yellow sweets," says Emlyn. "We've found that yellow always seems to be the hardest color to get right – making sure it doesn't look dirty. Lots of time was spent in Photoshop color balancing the textures and adjusting the hues and saturation."

To replicate Bertie's feet and top hat, which are covered in aniseed balls, MoGraph was used to bind small spheres to a hidden cylinder object. This was a tricky balancing act to ensure the balls looked realistic and random. The finished sweet models were then arranged to create Bertie in a ‘T' pose ready for rigging.

Although the brief only required a static image of Bertie for packaging and print ads, the team decided to rig the model, enabling them to quickly create new poses and with the aim of future-proofing the project, should the client request further work.

Cinema 4D's character tools allowed bones to be placed within the model that bind to the mesh and distort it when moved. This helped the artists to quickly manipulate Bertie into his stride pose. "We used the character tools to quickly rig him using the advanced biped setup," remarks Emlyn. "I think it took a day to model and a day to rig. In the end we were separating out sections as the designers wanted to subtly change the pose and overlay objects that weren't possible, like the yellow sweets on the legs."

"The benefit of the rig was having the hand controls quickly set up with minimum fuss," he adds. "Weighting took a lot longer than anticipated as the sweets would intersect each other, so time was taken to make sure we had controls to stretch and squash elements if needed. Elements like the hands had additional controls to allow for bending the fingers, posing his hands and holding objects."

With Bertie rigged and posed it was time to light and render him. "We used area lights with area shadows, as we wanted the best possible results using Global Illumination rendering. We used HDR images within some of the material environment channels to aid with additional highlights on some of the licorice and aniseed sweets. These were then adjusted in post in Photoshop."

The shadows on the final Bertie are very subtle – almost non-existent in places. Emlyn explains how this look was art directed by him and Andy Baker, the packaging designer: "Bertie needed to pop off the dark background as much as possible," he says. "We used subtle shadows adjusted in multi-pass renders."

"The shadow and highlight on the floor don’t match," he continues, "but it works because your eye just doesn't realize it. If we'd tried to utilize a spotlight scene there would be hard shadows projected onto lots of areas and we wanted it to look as clean as possible. We coined the term ‘hyper-realistic', which tends to get used on lots of retouched imagery and CG work, especially in packaging. It's a fine line as you want the image to look as good as possible, but need to make sure that the image is still true to the product."

Bertie was output as 16-bit images and with multi-pass layers to help with color correcting and for the addition or removal of shadows and highlights. Emlyn describes how elements such as the legs, head, body and arms were all rendered out separately, which gave the packaging designer freedom to make subtle positional changes.

"There was a lot of post work," he comments. "It was mainly color correction from RGB to CMYK and making sure they referenced color swatches associated with the brand. The mouth and eyes were added in Photoshop, but apart from that everything is from Cinema 4D from tweaks to highlights, etc., to boost the image."

When asked which part of Cinema 4D proved most useful, Emlyn replies: "Tough one. It's a close thing between the character tools for the hands and MoGraph for the aniseed sweets. Probably the character tools as they're so easy to use and take out a lot of work that's never seen by clients who don't understand the process."

If Emlyn ever gets another chance at re-creating Bertie Bassett, he says that Cinema 4D's sculpting tools would definitely be used for the face and he'd love to give him a facial rig for expressions.

So does he have any suggestions for budding Cinema 4D artists? "The best advice is to use it every day and experiment," says Emlyn. "There are thousands of videos and tutorial sites and extremely helpful forums if you get stuck. I always remember I'd just experiment and try and copy styles or things I'd seen – this is the basis of how I learned to use Cinema 4D as I came from a background in 3ds Max in university. The learning curve for Cinema 4D is very good compared to other software and the interface is just so intuitive."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Bomper Studio website: www.bomperstudio.com

]]>news-4277 Mon, 27 Oct 2014 13:34:00 +0100
Planetary Encounters
Normally, Andy Lefton works in the field of advertising but with ‘Two Worlds’ he ventures into new territory – with the help of Cinema 4D.

Andy’s online portfolio includes a wide range of extremely high-end fantasy works, which put Andy in the top league of professional 3D artists. As such, he doesn’t need to create advertising for himself. Nevertheless, he’s been working on a short film for the past ten years that’s slowly beginning to take shape. He’s completed a trailer for the short film that offers a first glimpse into the project.

A pile of junk, which obviously used to be a space ship, lies in the desert. Elements added after the crash, such as simple mobiles, a canopy and an antenna built from spare parts, indicate that someone lives here. The stranded person can only be seen in the reflection on his communication computer monitor, which constantly displays the message that no connection can be established. But suddenly the communication systems receive a signal from an aircraft that is landing on the planet …

With this trailer, Andy creates a dramaturgical effect making the viewer want to see more. And the quality of rendering for this trailer is so realistic that you have to look twice to see that it was created using computer graphics. “You could say that just about every Cinema 4D function was used for this project. In particular dynamics, hair and cloth made it possible for me to create exactly the look I wanted. I was able to concentrate completely on the film and didn’t have to worry about how I was going to realize specific parts. For me, Cinema 4D is the most important tool for all aspects of 3D, VFX and motion design!” says Andy.

Andy did a lot of research for his ‘Two Worlds’ project and created numerous production sketches. “Drawing is not my greatest strength but, together with the images I researched, it helped a lot in creating the various elements, characters and scenes as I envisioned them. After the first sketches had been made I created a complete storyboard, which was used to define the speed of the camera movement and the action.” Andy then began creating all elements and scenes in Cinema 4D. “Cinema 4D’s and BodyPaint 3D’s seamless connectivity to Photoshop was invaluable for the creation phase,” exclaims Andy.

The character was not very complex, which made it possible for Andy to animate it using a few keyframes, Morph tags and Expressions for facial expressions. Animation will become more relevant for the project at a later point, when fully rigged characters will be active in the film – of course all done in Cinema 4D. In the meantime, the ‘Two Worlds’ teaser will continue to whet everyone’s appetite for the final film. Andy is planning on completing the film by the end of 2014. We can’t wait to see it!

Andy Lefton’s website and "Two Worlds" blog: www.andylefton.com/blog/

]]>news-4272 Tue, 21 Oct 2014 14:49:00 +0200
Stellar Visuals for Guardians of the Galaxy
Made with Cinema 4D: Intergalactic interface design for an old spaceship, a crew of cosmic criminals and a virtual Walkman ...

By Duncan Evans

The jump from acclaimed graphic novel to successful movie tie-in is a tricky one. For every successful adaptation there’s always one that leaves a sour taste in your mouth. For a project as ambitious as Marvel’s 'Guardians of the Galaxy,' it required the efforts of half a dozen VFX companies to do the culture-spanning tale justice. Fortunately, it did just that and was one of the biggest draws at the summer box office.

Playing a vital part was Territory Studio, which was asked to create holograms, screens and interfaces for the Marvel Studios film. Territory was set up in 2010 when Creative Director David Sheldon-Hicks got together with Lee Fasciani and Nick Glover to found the company. Having already created a lot of the screen and graphic interfaces for Ridley Scott’s Prometheus, Territory was the natural choice for the Guardians project.

The brief called for hundreds of designs and animations, with each set requiring its own look and feel to fit the vast number of sets, environments and cultures. There were screens for spaceships, street scenes, gambling dens, communication hubs and a prison. Each one had to reflect the specifics of a particular culture, whether it was human or alien. They also needed to reflect a sense of the wear and tear of function, history and backstory.

Territory took advantage of the massive Cinema 4D toolset using diverse tools such as those for character animation, motion graphics, particle effects and cel rendering. Cinema 4D’s MoGraph tools and workflow with Adobe After Effects were key in simplifying the creation of the complex screen interfaces. "The complexity and variables of on-set screen graphics work requires us to be fast, flexible and creative, and Cinema 4D allows us to do this in bucket loads," states David. "The tight integration with After Effects was essential."

David explained how they approached the task: "We had conversations with Director James Gunn who was very script-focused, with a clear vision of what he wanted. He was also very supportive of our work and gave us a lot of freedom. We also worked very closely with Production Designer Charles Wood and Art Director Alan Payne and the art team. We had bi-weekly meetings with the art team in which they would show us their concepts and visuals for scenes and environments that were further down the line than we were. This really helped us understand the look and feel of the many locations, be they planets, prisons, spaceships, street scenes, gambling dens and so on, which we would reference and support."

The first challenge was receiving high-poly-count assets from the art department, typically around five million polys, and creating usable assets for animation purposes, especially the Nova prison and Milano ship screens. These had to be reduced to 500,000 polys or less. To do this, some of the assets were remodeled while others used the Polygon Reduction tool within Cinema 4D to achieve the same result. The streamlined workflow between After Effects and Cinema 4D certainly helped the process along.

There were a lot of typefaces on show for each area and culture. David went through the design process: "We referenced artwork from the original Marvel comics and style guides and concept art from the art and costume departments. Then Lee Fasciani, my co-founder here at Territory and a specialist in crafting fonts and icons, created each typographic and icon style to reflect the character and personality of a particular culture or location, drawing on colors and shapes to provide a visual language to convey this. Each one was subject to approval and, because of our collaborative relationship with the art department, we didn’t run into any problems."

One of the main areas that saw action throughout the film was the spaceship Milano that the main characters used. The Milano’s backstory was that it’s seen some action and is clearly a bit dated, but still beautiful with many clever modifications. David described how their designs fit in: "Our UI needed to reflect both this engineering sophistication and main character, Peter Quill's can-do attitude to hacking the system to get the extra performance he wanted. He was more interested in effective modifications than in perfect code, so our screen graphics for the ship's navigation, weapons and entertainment were a bit rough and ready to reflect this. We were also able to have a lot of fun with details such as the music interface that Quill hacked to simulate a 1980s tape deck."

In the film, the '80s mix cassette tape was a key personal memento for Quill and he took it, and a Sony Walkman, wherever he went. This connection to the '80s ran through the whole film, in terms of the soundtrack, in some of the styling and in the look and feel of Territory’s graphics. On the Milano, the team wanted to create an interface that was based on an authentic deck that visually converted the media into a cassette tape as it was inserted and ejected, with tape that could be seen to roll as the music was playing.

David concluded: "In terms of the user interface, we referenced the classic reds and oranges against a black background of 1980s UI, we looked at the paintwork styling of '80s sports cars and we explored the effect of degradation on screens by looking at how airplane windows show the effect of stress and age – the clouding and scratches that their screens show with time. Our designers then took on board this research and created a look that felt true to the spirit of the film."

Territory also worked directly with a company called Compuhire, who were the wizards behind programming on-set playbacks and getting the graphics in front of actors and directors. Territory would often have to turn around several screens in a day, and with Cinema 4D working so well with After Effects they were able to do this. Sharing nulls and camera data between applications and rendering object buffers were key workflows that the team used on a daily basis. XPresso was also called into play on a sequence on the Dark Aster ship, to make the animation process easier for actions such as finger rolls where David only wanted to animate one slider.

With a large number of VFX teams working on the project, producing work from Cinema 4D that everyone else could use was paramount. David recalled how Production VFX Supervisor Stephane Ceretti played a key role here: "He really understood our way of working and what we were able to do, and because he oversaw all the VFX vendors, he was able to ensure the consistency of our concepts, from screen to post."

In the end, Territory had between two and seven people on the project over a ten-month period, putting up to 18 Mac OS X 3.5GHz six-core machines to work rendering out the various elements.

Duncan Evans is a freelance journalist, photographer and author.

You can see more of the graphics from Guardians of the Galaxy here: www.territorystudio.com

]]>news-4268 Wed, 15 Oct 2014 14:58:00 +0200
Creating a Luxury Space
https://maxon.net/en-us/news/case-studies/architecture/article/creating-a-luxury-space/
When Scardigno Design was asked to design a retail concession in Harrods, Cinema 4D was called upon to create the visuals that balanced upmarket looks with available floor space.

By Duncan Evans

Greggio is primarily an Italian wholesale business dedicated to creating upmarket silverware and jewelry, so when the company had the opportunity to move into a retail concession in the Harrods store in London, it turned to Nick Scardigno of Scardigno Design for visualizations first. Nick explained, "The design brief was quite an open one. They wanted an environment that would complement their product, not compete with it, and that conveyed a sense of luxury and authority."

Easier said than done, though, because the actual floor space in Harrods was quite small, making it a challenge to balance room for the products and space for people to actually walk around and see them. Nick described how he approached the problem: "The floor space was very limited, especially when working with Harrods' strict guidelines. We wanted to obviously get as much of the Greggio product out on display as possible, but also to make shopping the space an easy experience. All of the site dimensions were supplied by Harrods in advance, and the space was carefully planned to make sure that everyone's needs were met."

The first stage was to produce a series of sketched visuals, using Cinema 4D's Sketch and Toon, to give an overall view of the design concept. Once this was approved Nick started work on creating the photo-quality versions, starting with products to put on the shelves. The plates were done using plane objects with pictures from the Greggio catalog applied to them. Up close, this type of object tended to look a little flat, but from a distance they worked out exactly as Nick had hoped they would.

Nick modeled the picture frames himself, and then added a little personal touch, "I even sneaked in a photo of my daughter into the picture frames – I always try to get a picture of her into my renders!"

Populating the shelves with products could have been a time-consuming, not to say tiresome, job but this is where Cinema 4D's Cloner tool came into play. Nick explained how it was used, "I loaded up a selection of the product into a Cloner object and used the randomize setting to make sure that different products ended up on different fixtures. By adding a random effector to slightly adjust the position and rotation of the elements, I managed to get a very natural spread of product on the fixtures that did not repeat or look obvious."
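
The Cloner-plus-Random-Effector workflow Nick describes is a Cinema 4D feature rather than code, but the underlying idea is easy to sketch in plain Python. The fixture names, slot counts and jitter ranges below are illustrative, not taken from the project:

```python
import random

def populate_fixtures(products, fixtures, jitter=0.02, seed=None):
    """Mimic a randomized cloner: fill every slot on every fixture with a
    randomly chosen product, plus a small random offset and rotation so the
    spread doesn't repeat or look obvious."""
    rng = random.Random(seed)
    placements = []
    for fixture in fixtures:
        for slot in range(fixture["slots"]):
            placements.append({
                "fixture": fixture["name"],
                "slot": slot,
                "product": rng.choice(products),          # random product per slot
                "offset": rng.uniform(-jitter, jitter),   # slight position variation
                "rotation": rng.uniform(-5.0, 5.0),       # degrees of random twist
            })
    return placements

# Hypothetical fixtures standing in for the Harrods shelving.
shelves = [{"name": "wall unit", "slots": 6}, {"name": "table", "slots": 4}]
layout = populate_fixtures(["plate", "frame", "vase"], shelves, seed=7)
```

Seeding the generator makes a given "random" layout reproducible between renders, which is the same benefit the Cloner's seed parameter provides.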

One part of the visualization has a cloth laid over a table. In order to make it look more realistic, the cloth simulator in Cinema 4D was used to create some deformation where it hung down. It took a little bit of experimentation, but the end result was more natural.

One interesting comparison between the Scardigno visualizations and photographs of the actual retail space once built is that the black bricks on the columns and product supports, and the floor tiles, look exactly like the real-life concession. Nick detailed the process he went through to achieve this look, "I photographed some marble I found on the outside of a building, and tidied up the picture in Photoshop to create the flooring. The black lacquer material was done using a simple color, and adding a Fresnel shader into the Reflection channel. The smallest amount of bump, and a bevel to all of the corners, gives a very slight imperfection to the finish, and really goes a long way to making the finish look as it will in real life."

The actual store is festooned with spotlights, so here the design needed a similar kind of result without going overboard and overloading the render engine with raytracing for huge numbers of lights. In the end, IES lighting was used for the highlights, but the environmental reflections were done using one of the stock HDRI files that ship with Cinema 4D. Even so, when Nick started rendering the 1680x1260 resolution files with 103k polygons, his MacBook started to struggle, and he only had a week to work on the project. Fortunately, he was able to send the files off to a render farm for a quick turnaround.

The end result was a success, though, as the client thought he was showing them actual photographs of a previous project when he presented the final visuals. As Nick concluded, "nothing comes close to Cinema 4D for its ease of use and awesome end results." The Greggio retail space is now up and running in Harrods, so you can see for yourself how it compares.

Greggio website: www.greggio.it]]>news-4240Fri, 10 Oct 2014 10:42:00 +0200Dauntless with the Cinema 4D Protective Shieldhttps://maxon.net/en-us/news/case-studies/games/article/dauntless-with-the-cinema-4d-protective-shield/With its outstanding animations, Aixsponza has made a name for itself in many industries – with the exception of game trailers. Their new creation for Dreadnought successfully catapults them into yet another genre.Dreadnought is a game project that Berlin, Germany-based YAGER Development GmbH plans to release in 2015. As is the case with most action games, Dreadnought will offer an intro in cinematic quality that gets players excited about the game. These intros are also teasers that are designed to attract customers to the upcoming product.

In this game, futuristic fighter planes will battle it out in the Earth’s atmosphere. Players have a choice of fast fighter planes, super battle ships or fearless Dreadnoughts. The intro had to give future customers an impression of the game’s storyline.

The team of artists under supervisor Manuel Casasola had only four weeks to finish the project. Since the game takes place on a future Earth, a complex landscape had to be created with numerous seas and cliffs. Daylight and a clear atmosphere combined to create immense viewing distances. The landscape was generated using World Machine and subsequently exported to Cinema 4D in various resolutions.

“The landscape geometry and textures over which the fighters would fly took up seven terabytes of memory. This amount of data could barely be handled despite the high-end computers being used, which is why we developed a special tool: a Python tool with which the LOD (level of detail) geographical files, which were separated into different parts, could be combined. This made it possible to display detailed, high-resolution geometry in the foreground, 4k textures at mid-range and low-resolution textures in the distance,” explains Fuat Yüksel. “Displacement maps were then applied in Cinema 4D to geometry in the foreground to add the finishing touch.”
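
The team's actual Python tool isn't public, so the following is only a minimal sketch of the distance-based selection it describes. The tier names and distance thresholds are invented for illustration:

```python
def pick_lod(distance, tiers):
    """Return the first LOD tier whose maximum distance covers the given
    camera distance. Tiers must be sorted from nearest/most detailed to
    farthest/coarsest."""
    for max_dist, name in tiers:
        if distance <= max_dist:
            return name
    return tiers[-1][1]  # beyond every threshold: fall back to the coarsest tier

# Hypothetical tiers echoing the article: displaced hi-res geometry up close,
# 4K-textured mid-range, low-resolution assets in the distance.
TIERS = [
    (500.0, "hires_displaced"),
    (5000.0, "midres_4k"),
    (float("inf"), "lowres"),
]
```

A combiner tool like the one described would run a check of this kind per terrain tile per frame, loading only the resolution each tile actually needs.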

Modeling the fighters was also a challenge. Regular models were needed for the flight scenes and special versions were needed for the scenes in which the fighters exploded after being rammed by the Dreadnoughts. The basic models were supplied by the client in the form of high-res ZBrush models, which had to be re-topologized for use in Cinema 4D. “For this we used Cinema 4D R16’s new PolyPen tool, which is ideal for this type of work. Using the low-poly model we created we were then able to create a Voronoi diagram: the model was disassembled (fractured) with the help of Thinking Particles and Geotools and the fractured parts were saved as a Voronoi diagram. These parts were then simulated using Cinema 4D’s Rigid Body Dynamics system, which is based on the Bullet Engine,” adds Fuat.

While working on this part of the movie, Fuat also developed a special process which makes ‘directable fracturing’ possible in the Viewport. “In order to have complete control over the fragments and the dispersion of fractured parts, I used the Hair feature and Thinking Particles to develop a technique that, with the help of XPresso, outputs the global position of Hair guides that can easily be used as position data for Thinking Particles. This makes it possible to position and orient them precisely and add various attributes. They can also be deleted at any time using the Hair selection tools – which can only be done in a limited fashion in the Viewport using Thinking Particles (TPDraw) but not dynamically or non-destructively. This basically gave us a ‘procedural’ node-based process directly in the Viewport and in real-time.”
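
The XPresso network itself isn't shown in the article, but the core trick — reading each Hair guide's global control-point positions and handing them to Thinking Particles as spawn data — can be sketched generically. The data layout below is assumed for illustration, not taken from the production setup:

```python
def guides_to_particles(guides):
    """For each guide (a list of 3D points in global space), emit one particle
    per control point, carrying its position plus a direction vector toward the
    next point on the guide (so fragments can be oriented along the guide)."""
    particles = []
    for guide in guides:
        for i, pos in enumerate(guide):
            nxt = guide[min(i + 1, len(guide) - 1)]          # last point: no successor
            direction = tuple(b - a for a, b in zip(pos, nxt))
            particles.append({"position": pos, "direction": direction})
    return particles

# One hypothetical guide with three control points.
guide = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 2.0, 1.0)]
pts = guides_to_particles([guide])
```

Because the guides remain editable with the normal Hair tools, regenerating the particle list from them at any time gives the non-destructive, "directable" control the quote describes.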

The explosions themselves are only one of a series of fluid simulation effects Fuat created using Turbulence FD and Krakatoa. Thinking Particles was also used to create the particles, which were then used by Turbulence FD as emitters for the actual explosion effects and subsequently rendered using the TFD renderer. “The hyper-space vortex that precedes the Dreadnoughts’ appearance was created by Simon using TFD and Krakatoa.”

Despite the overall complexity of the project and the tight deadline, everything went smoothly and no major problems were encountered. “Considering the project’s scope and the tight deadline we expected to encounter problems, but surprisingly everything went well. The team was made up of experienced professionals who were well prepared and knew how to handle their tools. The tight deadline meant that overtime had to be put in – especially towards the end. But in the end, and thanks also to Cinema 4D and the very professional and constructive client, we were able to complete the project to our client’s complete satisfaction,” Fuat sums up.]]>news-4214Mon, 06 Oct 2014 13:39:00 +0200Cinema 4D Creates 3D Content for Oculus Rifthttps://maxon.net/en-us/news/case-studies/advertising-design/article/cinema-4d-creates-3d-content-for-oculus-rift/Oculus Rift is one of the most important innovations for the media sector. Students Nguyen (Kenji) Duong and Felix Droessler are tapping this market and are using Cinema 4D to create interactive content.Infinite Travel is the title of the short film they created using Cinema 4D, in which the motion data of a dance performance is uniquely transformed into an animation. This unorthodox film is made special not only by the unique way in which it uses motion data but also by the way it’s presented: it was created specially for viewing on Oculus Rift displays. Oculus Rift is preparing the next revolution in interactive media: the head-mounted display promises to let users dive into interactive 3D worlds, with the interactivity stemming from the freedom users have to move their heads within a 3D environment.

As part of their course studies at Hof University / Campus Münchberg, Kenji and Felix were asked by their professor to create an animation based on the Motion Bank project. The Motion Bank project, which was also initiated by Professor Zöllner, is an abstract depiction of digitized choreography, with the aim of producing a new view of movement. Kenji and Felix had to use the score from Motion Bank’s ‘No Time to Fly’ by Deborah Hay – and the resulting motion data – for their project. The team decided to portray the music’s mood in particular in the animation. In addition to the adaptation for the Oculus Rift, the animation also had to be set up for stereoscopic rendering. This meant that the viewer would not only experience the freedom of movement that Oculus Rift offers but would also view the animation in 3D.

Converting the Motion Project’s tracking data was done using a plugin that was written specially for this project, which made it possible to transfer this data to objects and particle systems. In addition, XPresso and Thinking Particles were used to convert Motion Project data for the animation, which featured no human performers. With Infinite Travel, the user is taken on a journey while remaining in place. As the animation continues, the journey’s destinations emerge from the environment, which itself is constantly transforming and evolves from a medieval town to a nebulous world and ends in a steppe landscape with blue skies.

At 2 minutes, Kenji and Felix’s animation is quite short, but the images’ 360° perspective and their stereoscopic setup with a position offset of 8 cm meant that a lot of rendering was involved: the images were not only large but also had to be rendered twice. Precise planning and maximum optimization of objects and materials were required to produce render times that could realistically be achieved with the resources at hand. Rendering was done with V-Ray. After Effects and special editing tools from the Oculus Rift developer pack were used to complete the film.
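
The 8 cm stereoscopic offset mentioned above amounts to placing two cameras on either side of a central viewpoint, perpendicular to the view direction. A minimal sketch, assuming a simple yaw-only camera; the coordinate conventions are illustrative:

```python
import math

def stereo_eyes(center, yaw_deg, separation=0.08):
    """Offset left/right eye cameras from a central viewpoint, perpendicular
    to the viewing direction given by yaw (rotation around the vertical axis).
    `separation` is the inter-camera distance in meters (8 cm by default)."""
    yaw = math.radians(yaw_deg)
    # Right vector for a camera looking along +Z, rotated by yaw around Y.
    right = (math.cos(yaw), 0.0, -math.sin(yaw))
    half = separation / 2.0
    left_eye = tuple(c - half * r for c, r in zip(center, right))
    right_eye = tuple(c + half * r for c, r in zip(center, right))
    return left_eye, right_eye

# Eye-height viewpoint looking straight ahead.
l, r = stereo_eyes((0.0, 1.6, 0.0), yaw_deg=0.0)
```

Every frame then has to be rendered once from each eye position, which is why the stereoscopic setup doubled the render load on top of the already large 360° images.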

Infinite Travel is not only an interesting approach for letting viewers experience motion data in an unorthodox manner but also an interesting experiment in creating new content for the Oculus Rift head-mounted display.

]]>news-4041Mon, 25 Aug 2014 12:14:00 +02003D, Constellations and Flying FishArtist Murat Sayginer uses Cinema 4D to create a mysterious flood of images with a deserted island as a backdrop.Volans is the name of a constellation in the southern sky, whose name means ‘flying fish’. This fish, a deserted island and the starry sky inspired artist Murat Sayginer to create a spectacular short film, which he created entirely by himself. Murat began his career as an illustrator and photographer. Looking at the work featured on his website, the evolution to animation was, as Murat himself says, the next logical step. Many of his photos told stories and were often shown as photographic series. Murat was introduced to Cinema 4D about five years ago. He wanted to tell stories and photography alone was too restrictive for what he wanted to do.

As an internationally known artist who has won numerous awards, Murat has released four films on the online platform Vimeo – of which Volans is the most impressive. In a style of fantastic realism it tells the story of a flying fish’s voyage of discovery as it frees itself from a stone sphere and explores the world outside – a trip that quickly turns into a cosmic search for meaning.

The fully saturated colors of the scenes are often reminiscent of tantric Hindu meditative illustrations. In order to realize these visions, Murat took advantage of Cinema 4D’s full range of tools. He used the Hair feature to create the carpet of grass that covers the island, and the water that seems to flow weightlessly from the island’s pond towards the sky was created using Real Flow and rendered in Cinema 4D. Splines and Dynamics were used to create the numerous floating light particles.

The music that accompanies this unique film was also created by Murat and is timed perfectly to match the animation. Murat claims that the truth lies somewhere in-between: sometimes the music was made to match the animation and at times the animation was made to match the music.

Due to its popularity on Vimeo, Murat’s ‘Volans’ animation was chosen as a ‘staff pick’ even though it wasn’t submitted for any animation festivals. Murat plans to continue along this path. Although he has a strong desire to create traditional film projects, he will continue to realize his visions and ideas using Cinema 4D. After all, according to Murat Sayginer, Cinema 4D is “the most user-friendly animation software that was ever developed!”

Murat Sayginer’s web site:www.muratsayginer.com/]]>news-4130Tue, 19 Aug 2014 15:04:00 +0200Bear Printshttps://maxon.net/en-us/news/case-studies/advertising-design/article/bear-prints/For their latest in-house project, creative agency DBLG turned to 3D printing as a way of making their ideas solid.Every now and then, London creative agency DBLG gives its artists free rein to explore new ideas and techniques: "It's a platform to experiment and above all have fun," says the company's founding director, Grant Gilbert.

Fascinated by the concept of 3D printing, the studio embarked on a project to explore the use of a stop-frame animation by physically producing every model in the sequence. "As a rule, everything we do is made and stays in a computer," explains Gilbert. "We were looking for a new studio project and liked the idea of making something physical out of the computer. We've always liked the rawness of [Hungarian-born animator] George Pal and wanted to recreate it using modern techniques."

As the project was meant as a proof-of-concept, the decision was made to keep it relatively simple: the animation shows a bear climbing a downwards-moving escalator, effectively making the creature walk on the spot. "We aimed for a two-second, 50-frame animation loop," says Gilbert. "We started with the escalator animation and fitted the bear around it."

The tricky part was making the stairs loop, using a Boolean object to cut the Cloner into the right form. For each frame, the stairs were made into a single mesh and tidied up. "We had to make sure each model of the stairs had a clean closed mesh and the normals aligned in order to be properly printed," comments Gilbert.

The job of animating the bear's movement fell to Blue Zoo Productions, a BAFTA award-winning animation studio based in London. You may well have seen the lead character before: the polygonal grizzly starred in the rebrand spots for US TV channel, Animal Planet, seen trying to catch a fish and scratching his back on a tree.

The animated bear was then imported into Cinema 4D, where it was paired with the escalator. Each individual frame was then exported as an OBJ and printed using a Makerbot Replicator 2 – a desktop system currently available for just $2,000. "Each model takes approximately three hours", explains Gilbert, "and we only had one or two printing errors amongst the whole 50. To print the complete set took us nearly four weeks. It's a long time, but we're used to that when we render big files!"

Once all 50 models had been printed, they were then tidied up to remove the excess plastic that supports the bear figure during the printing process. "We had to tweak the mesh here and there, but overall the printer dealt with it very well," says Gilbert. However, the process wasn't without its risks: "We kept the First Aid kit nearby at all times – taking the physical 3D printed bears off of the resin platform was hard work because they stick to the surface. Every time we cut a finger or hurt ourselves, we used to say something like ‘I feel alive'. We're so used to computer animation, doing something tangible for a change was very refreshing."

Successive models were positioned on a small lighting stage and photographed using a Canon EOS 5D SLR connected to a laptop running Dragonframe – a dedicated stop-frame application, used by top studios such as Disney and Aardman. The resulting animation was edited and set to music, and then released to the world. The charming sequence quickly went viral, garnering coverage on loads of tech and creative sites. "It's been incredible PR for us," Gilbert admits. "We've been completely overwhelmed by the response. We've had articles written about the project in Time Magazine, Gizmodo, Wired, Creative Review, The Creators Project and The Verge to name but a few…"

3D printed objects have been used in stop-frame animated shorts for several years, and the Laika movie ParaNorman employed the process to create Norman's varied facial expressions. So while the concept isn't entirely new, DBLG's looping animation is wonderfully hypnotic. "We had no idea that the technique would work as well as it did," comments Gilbert.

The studio had "amazing feedback" from the 3D printing community, though, bizarrely, some animators wanted to know why the studio hadn't just produced the sequence as CG. "That wasn't the point," says Gilbert bluntly. "Anyone can do that, we wanted to get our hands dirty and make something real." He cites the project's physicality as the best part of the process: "People can't help but pick it up and touch it."

DBLG and the crew have been thrilled with the success of Bears on Stairs, and it's certainly given them inspiration for future assignments. Indeed, there's the possibility of an extended version of the animation in the works. "We learnt a lot about improving the models from Cinema 4D to get the best 3D print," says Gilbert. "We're really keen to develop the technique and explore ways of taking the process to the next level." By Steve Jarratt, a long-time CG enthusiast and technology journalist based in the UK.

]]>news-4091Thu, 31 Jul 2014 14:50:00 +020030 Hours of CINEMA 4Dhttps://maxon.net/en-us/news/case-studies/visualization/article/30-hours-of-cinema-4d/Teamwork and Team Render save the day.CINEMA 4D’s impact on the Chinese market is increasing at an impressive rate. One factor that makes this very evident is the steady growth in the training sector. Shanghai-based Imagehost Digital Technology (IHDT) is one of the few selected certified CINEMA 4D training centers worldwide – and the first in mainland China. Manager and Certified Instructor Yan Ge, a.k.a. Hey Joe, offers training for a wide spectrum of participants ranging from film directors to beginning 3D artists with varying amounts of 3D experience, if any. Participants learn how to use CINEMA 4D effectively to produce great-looking results.

At the end of each course, participants are tasked with creating a short film and are given only 30 hours to complete their project. The teams choose their subjects themselves, which have to be approved by the trainers before they start. Successful cooperation is challenged not only by the limited time in which the project must be completed but also by the task of selecting the best team members. Yan Ge explains: “70% of course participants learn CINEMA 4D from the ground up. About one-third have absolutely no prior experience with 3D software, and our course participants range from film directors to 2D artists and just about any other profession you can imagine. We’ve even had a few construction workers and auto mechanics! Everyone who has completed a course has performed well enough to receive a certificate. This is also in part a result of CINEMA 4D’s intuitive user interface, which is easy to learn, even for someone who’s not a savvy computer user."

The best short films are selected at the end of each year, and last year’s winner was the amazing animation ‘Post-industrial Advertising Clip’. The group responsible for creating this film was inspired by an animation from a Norwegian company about the extraction of tin in the country’s arctic North. As usual, this team also had the allotted 30 hours to complete their project. The team didn’t waste a minute and got to work modeling, animating, creating MoGraph setups and determining which render settings should be used. Since the project also had to be rendered within the allotted 30 hours, models had to be optimized accordingly so the desired render quality could be achieved in Multi-Pass rendering.

Even after everything had been optimized, it looked like the five PC workstations would not be able to render everything in time. That’s when the team members spontaneously took out their MacBooks and hooked them up to the network via WiFi. In an instant they were able to use CINEMA 4D’s Team Render feature to help render their project and their problem was solved! “We’re really happy with the cross-platform render system. Team Render makes it possible to quickly and easily use available computers across the network as render resources,” stated Yan Ge.

IHDT Website:http://bbs.ihdt.tv/forum.php?mod=viewthread&tid=5528, http://c4datc.com]]>news-4065Mon, 28 Jul 2014 10:50:00 +0200Austrian Student Racing Teamhttps://maxon.net/en-us/news/case-studies/visualization/article/austrian-student-racing-team/Every year the University of Graz’s TANKIA team constructs a race car as part of a practical learning project – and this year CINEMA 4D was part of the team as well.Practical experience is the best way to help learn just about anything. If you’re learning a trade such as baking, gardening or carpentry it’s easy to find ways to gain practical experience to help in your learning process. But what if you’re studying vehicle construction or production logistics? Gaining practical experience can be quite difficult in these and other fields. In order to give practical experience to those studying in fields in which it is hard to come by, the Technical University of Graz created the TANKIA (There Are No Kangaroos In Austria) project. The project’s proper name is TU Graz Racing Team, which encompasses no less than an entire racing team that does everything from constructing its own racing car to creating marketing campaigns for the team – projects that are part of a real-world racing team’s mission. The racing team was initiated by TU Graz in 2002 and is financed exclusively by sponsors who are involved in the project in different ways. Sponsoring ranges from financial support to donations in kind. For the 2014 season, the race car – upon completion – was to be included in additional graphics material to accompany the existing PR material. These visuals had to showcase the car’s technical highlights, illustrate its complex inner workings and show the public things they would otherwise not see. The animation starts with an exploded illustration of all components, which then start to assemble themselves to form the new race car. This is something the TANKIA team wanted to do using CINEMA 4D.
The first thing that had to be done was to export the models from CATIA, which turned out to be more of a challenge than expected. 3DVIA Composer was used to convert the .3Dxml files exported from CATIA to .OBJ files, which CINEMA 4D was then able to import. The team then had to learn how to use CINEMA 4D – with which no-one in the team had ever worked! However, this hurdle was overcome surprisingly quickly and after only a few days and slight initial difficulties the team was up and running in full production mode. The team also faced other unpredictable challenges, such as a dishwasher that blew a fuse and cost half a day’s work. The team switched to using laptops and was able to complete the sequence a mere 60 minutes before the deadline. Rendering was done on a single computer because nobody had had time to learn how to set up network rendering. Despite all the difficulties (which most often were simply a result of using the wrong settings) encountered by the team, the result is exceptional and looks quite professional. Even the scene in which the TANKIA racing car slides to a screeching halt next to the previous year’s car doesn’t show a hint of the fact that it was created by a team of CINEMA 4D newbies!

]]>news-4064Thu, 17 Jul 2014 10:36:00 +0200Impressive 3D Caveworldshttps://maxon.net/en-us/news/case-studies/visualization/article/impressive-3d-caveworlds/National Geographic is known for its high standards when it comes to photo reporting. And where real-world photography reaches its limits, Cinema 4D is there to help get the job done.The National Geographic Society was founded with the aim of exploring the globe and is best known for its periodicals, which have been published continuously since 1888. The National Geographic Magazine has long since established itself as the epitome of photographic reporting for all topics related to geography. Often, complex topics cannot be illustrated well enough simply using photography, which means that informational graphics have to be added whose quality is held to the same high standards as the original photographs. A very unique case was the report about the Gebihe caves in China. One of the cave’s chambers, the Miao chamber, was scanned with a laser and Cinema 4D was used to create spectacular informational graphics for the magazine and the National Geographic website. The Miao chamber is approximately 852 meters long and reaches heights of 190 meters, which makes it the second largest known chamber worldwide. This enormous natural phenomenon was scanned using a laser and about 15 million measuring points were generated, which were used to create a virtual cluster of points that basically reproduces the Miao chamber virtually. Using this data and the photos shot by National Geographic photographer Carsten Peter, the Berlin, Germany-based studio for professional visualization and informational graphics, ixtract, was given the job of visualizing the chamber for the National Geographic Magazine.

The first challenge that this project presented was to create a precise model of the chamber’s spatial composition using the data supplied. When asked about the challenges faced by the ixtract team during this project, Stefan Fichtel said: “We first had to deal with the gaps in the data caused by projecting rock formations, boulders and other obstacles, which prevented the chamber from being measured in its entirety. We used Cinema 4D’s XPresso feature to create a customized Displacement object, which was placed around the object like a flexible, opaque outer skin. This object was then adapted to fit the geometry, which helped solve lighting issues later in the project.”

The team also had to color the cave walls. Using textures was not an option because the size required for the desired resolution would have made them too large. “This is why we simulated all textures using Cinema 4D’s own shaders. Modifying the shaders to make them look like real cave walls required a lot of creativity! More than 100 layers in different channels were mixed while making sure that the antialiasing was flicker-free, which was a challenge in and of itself,” recalls Stefan.

The project initially called for a printed informational graphic only but as it turned out, a complex animation ended up being created for the National Geographic website. This animation was designed to be a type of virtual walkthrough of the Miao chamber. At certain points, the walkthrough was to include photos taken by Carsten Peter. “This animation had two sections in particular that posed problems”, remembers Stefan. “First, the points from which the photos had been taken had to be located. The expedition members weren’t able to tell us where these were because it was simply too dark. The second issue that we had was that we were working within a very tight deadline and we had only 2 weeks to render about 2,400 images for 90 seconds of animation!”

The Motion Camera was used to create the animated camera so that the locations of the photos were combined seamlessly with the camera’s movement. Then things suddenly got interesting during rendering: In the middle of the project it was announced that the interactive version, which had a resolution for use on iPads, had to be rendered in full HD after all – and the team had no way of knowing if the rendering could be completed on time.

Stefan Fichtel about the situation: “Unfortunately we had to do without features such as Global Illumination, etc. and had to find additional ways of keeping render times as low as possible. Render time for only one version of the 90-second animation (2,400 images) was 4.5 days (60 GHz, 50 i7 cores with 140 GB RAM). In order to meet the deadline, render times had to be reduced to 12-15 minutes per frame, without compromising on realism, which we were fortunately able to do.” The results achieved by the team at ixtract are outstanding, and a proud Stefan Fichtel states: “National Geographic works with only two external studios worldwide – and we’re one of them!” Learn from the pros: ixtract has established itself as a studio, and Stefan Fichtel has built a reputation as a Cinema 4D specialist whose abilities are reflected in the work he produces. Stefan also works as a consultant and trainer, e.g., for companies who want to integrate Cinema 4D into their existing production pipelines. Stefan offers his services either on-site or at his own facilities. The clients decide what they want to learn and Stefan will also create a customized course on an individualized or group basis. Courses are offered for beginners as well as advanced users, and specialized topics such as XPresso are also covered.
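
Render-time budgets like the one in the quote come down to simple arithmetic: total frames times minutes per frame, divided by how many frames render in parallel. A quick sketch; the node counts below are illustrative, not the studio's actual farm:

```python
def farm_days(frames, minutes_per_frame, nodes):
    """Wall-clock days to render `frames` if each frame takes
    `minutes_per_frame` on one node and `nodes` frames render in parallel."""
    total_minutes = frames * minutes_per_frame / nodes
    return total_minutes / (60 * 24)  # minutes -> days

# 2,400 frames at 15 min each is 25 days on a single machine - far beyond a
# two-week deadline - but spread across e.g. 10 nodes it drops to 2.5 days.
single = farm_days(2400, 15, 1)
spread = farm_days(2400, 15, 10)
```

Working the calculation backwards (deadline and node count fixed, solve for minutes per frame) is exactly how a per-frame time target like "12-15 minutes" gets set.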

Miao cave link:www.behance.net/gallery/17171609/ixtract-Miao-Cave-Animation]]>news-4040Thu, 10 Jul 2014 12:39:00 +0200Cinema 4D and After Effects: The Perfect Matchhttps://maxon.net/en-us/news/case-studies/advertising-design/article/cinema-4d-and-after-effects-the-perfect-match/TOKYO DESIGNERS WEEK features an animation created with Cinema 4D and After Effects that perfectly illustrates how the Adobe Cloud works!The TOKYO DESIGNERS WEEK is held in Japan and is renowned worldwide. In the 28 years since its inception it has grown into an event that showcases architectural and engineering design, product design, graphics and art from around the world. Adobe has been a regular exhibitor for several years and used this year’s event to focus on its Adobe Cloud solution. Among the projects showcased were numerous student projects that used the Adobe Cloud for their realization.

Adobe hired the visual design experts at WOW studios to create a film that clearly and concisely shows how the Adobe Cloud can be used. The film that director Shingo Abe and his team created using Cinema 4D and Adobe After Effects utilizes a clear pictorial language to show what can be done using the Adobe Cloud.

Director Shingo Abe says about his work and that of his team: “The response to the film at the trade show was immense. The animation’s distinctive music made visitors stop and take a closer look at what was being offered at the booth and in turn gave them the opportunity to watch the film, which uses an entertaining approach to demonstrate how software and production pipeline work together.”

During the show, Shingo Abe also conducted a motion graphics workshop with After Effects and Flash with the participation of university art students who had experience using Illustrator but had not yet worked with After Effects or Flash. Shingo Abe demonstrated his workflow and showed how to create impressive motion graphics, which gave the students valuable insights for their own projects.

“I try to create films that affect the viewer emotionally. Visuals can convey humor, sadness, suspense and a wealth of other emotions to the audience. A film can be used to influence – or even change – a person’s view of certain things,” says Shingo Abe.

Asked why he uses Cinema 4D for his work, Shingo Abe responded: “I’ve been a Mac user for a very long time and Cinema 4D has always been available for the Mac. This is the primary reason why I work with Cinema 4D. Also, being someone who is not an expert at math or geometry, Cinema 4D’s ease of use and intuitive operation lets me quickly and intuitively bring my ideas to life.”

About WOW: WOW is a visual design studio based in Tokyo, Sendai and London. The studio is involved in a wide field of design, from advertising and commercial work to installations for exhibition spaces, and also invents new user interface designs for prominent brands. WOW is also very passionate about creating original artwork, holding exhibitions not only in Japan but internationally as well, continuously exploring the possibilities of visual design and bringing out the best talents of each artist and designer.

2013 Making of Story: www.tv.adobe.com/jp/watch/adobe-students/21168/
TOKYO DESIGNERS WEEK: http://www.tdwa.com/

Fri, 04 Jul 2014 08:30:00 +0200
Happy Drum Beats

A slightly macabre but thoroughly enjoyable music video created by Job, Joris and Marieke using Cinema 4D. A bunch of cute, musically inclined creatures with zebra stripes run a parkour course from rooftop to rooftop, accompanied by the new song from the Dutch band Happy Camper, The Daily Drumbeat. Soon the race evolves into a macabre version of musical chairs – with drum sets instead of chairs. Whoever’s not sitting behind the drums when the refrain starts is pushed off the building by an invisible hand – and the game continues with one less player …

Job, Joris and Marieke is a team of Dutch filmmakers with a love of cartoons and short films. Their projects are created using Cinema 4D, which offers them the flexibility and power they need to successfully realize their complex visions. The trio has also established a reputation with past work, including the well-known short film ‘Mute’, which recently garnered a lot of attention worldwide. That film likewise used highly stylized characters and a very dark humor, which also found its way into ‘The Daily Drumbeat’.

“The music is very upbeat but the lyrics have a dark touch: ‘The drum beat will get you!’” explains Job. “We wanted to express this somewhat foreboding message in our film. This is how we came up with the idea of the invisible hand that eliminates one character after the next from the game.” The visuals had to have a crude, rough and grainy style and look like they were filmed using a handheld camera, which is why the film was made in black-and-white with a lot of motion blur and depth of field. “These were all effects that we could easily create using Cinema 4D’s standard renderer without generating exorbitant render times, even though the scene was made up of a comprehensive cityscape. We also used a few tricks,” admits Job. “We created Sky objects to which not only the sky was applied but parts of the city as well. This helped convey the impression of a big city without inflating render times.”

The zebra suit-clad characters were modeled, textured, rigged and animated in Cinema 4D. Since the characters’ physiology was very compact, the Jiggle deformer was used to let their noses whip about nicely and make their movements more dynamic, which in turn produced a very organic look.

After their success with ‘Mute’, ‘The Daily Drumbeat’ also earned the title Staff Pick at Vimeo. Not bad for Job, Joris and Marieke, who actually come from the field of graphic design. “Despite our professional backgrounds, we all had a strong desire to switch from design to animation. Cinema 4D is perfect for everyone wanting to get the best possible start in 3D animation. Not only does everything work great, it’s also very intuitive to use,” says Job.

Fri, 27 Jun 2014 11:34:00 +0200
Floating Metal Key

Music and images are often intertwined in new media – and CINEMA 4D often plays an important role in adding impressive imagery to music. Ahead of the release of his newest album, musician Mathew Wilcock wanted to use an EP single to raise awareness for it – and attention for the EP would in turn be generated by a video. Standing out in the crowd is not an easy task these days, considering the extent to which spectacular visuals are added to videos! The video had to hold its own against the rest, which is why Mathew hired Tony Zagoraios as art director/concept designer and Dan Kokotalijo as director for the project.

All three had often worked together in the past so convincing Tony to take over the role of art director was the easy part. Mathew explained his concept of what visuals he wanted for his music: visuals that are inspired by the underlying music and underscore its ambience and mood – and combine to create a pictorial story. Based on this information, Tony began to create a storyboard.

Tony used CINEMA 4D to create most of the visuals and had to use several elements taken from external applications for the animation – which “… was not a problem thanks to CINEMA 4D’s comprehensive import options,” as Tony states. “Especially the file exchange with After Effects and its seamless integration with CINEMA 4D sped up the workflow enormously!” Otherwise, many aspects of the work on ‘Floating Metal Key’ were interdisciplinary and included numerous applications – with CINEMA 4D at the core of the project.

Much of the work in CINEMA 4D was done using the MoGraph toolset, which was used to animate the numerous object fragments floating in the scene. Hair and spline dynamics as well as Thinking Particles, and almost all deformer objects, were used to really put things in motion. Features not offered by CINEMA 4D were simply added via plugins such as Thrausi.

The animation was rendered primarily using CINEMA 4D’s Standard Renderer. The Physical Renderer was used to render a few scenes that required render settings not included with the Standard Renderer. By strategically splitting the render process between both renderers and with the right settings, the team was able to accurately plan the required render time, which was split up across three render clients.

The team members who worked on this project were spread out across the globe and worked over a network to complete it. Their hard work quickly paid off, as ‘Floating Metal Key’ was selected as a Vimeo ‘Staff Pick’ only a few weeks after release. Mathew and his team are also planning to enter the animation, which already has over 110,000 views, in various festivals. All team members benefitted from working on this project: “The coordination between the various team members and the fine-tuning of aspects such as environment, scene setup and timing were challenging and resulted in a vast exchange of know-how amongst everyone involved. Everyone was able to gain a great deal of technical and production experience, and learn a lot about CINEMA 4D in particular,” concludes Tony Zagoraios.

Wed, 11 Jun 2014 10:13:00 +0200
Dawn of the 3D Drones
https://maxon.net/en-us/news/case-studies/advertising-design/article/dawn-of-the-3d-drones/

Dark clouds loom over the closely monitored city. Drum beats defy the complete surveillance and create magical, colorful patterns in the sky. It’s no secret that our world is under surveillance. It’s almost impossible to escape, and there’s no single answer to the question, “What can I do about it?”

The English artist, motion designer and animator Simon Russell found his own style of protesting the surveillance by drones, intelligence agencies and cameras. He created what he calls a self-initiated experimental 3D animation. Simon explains why he created this film: “When I started working on ‘Dysco’ I wanted to include important topics like the NSA affair, the Arab Spring in which freedom and dictatorships stood toe-to-toe, and the fact that I live in London, the city with the world’s most comprehensive surveillance system. I wanted to create an experimental film that showcased these topics and dealt with them accordingly. Now that the project is finished I’ve given it a very special definition – but I’m not sure the word can be found in any dictionary.”

Ever since elementary school, Simon had also wanted to create a short film that visualized music – something he was finally able to realize with Dysco. Simon used CINEMA 4D, which he has been using professionally for several years, to create a complex landscape of buildings, skyscrapers and concrete walls crowned with barbed wire – and surveillance cameras everywhere. Different types of drones fly between the buildings, and bright red LED lights add to the surveillance robots’ foreboding look and seem to bring them to life.

From the very beginning, Simon used layers in CINEMA 4D to keep the extraordinarily detailed scenes organized. He used the Projection Manager to create the backgrounds for the skyscrapers that appear in the distance. He modeled the objects as cleanly as possible and used precisely placed UVs, which made it possible to later bake the textures to further optimize the scene.

The gloomy scenario is eventually broken up by brightly colored abstract shapes that multiply to the rhythm of the music and arrange themselves in geometric structures in time to the beat. This is where the film really becomes experimental, and Simon realized most of these scenes using MoGraph and Thinking Particles. “XPresso was also used in many cases to create the numerous procedural effects,” explains Simon. “I wanted to create a film in which music and animation complement each other. This meant that I had to select the right music to achieve a dramaturgical balance for both. Using an existing composition was a problem because the dramaturgy of the animation would have had to be adapted to the music and all dynamic elements created to match. This is why I decided to compose my own music and let an experienced colleague edit it so it sounded more professional.”

Simon had waited 15 years to turn his idea into reality. “I never had the time nor the skills required to bring my thoughts to film. Now that I’ve used CINEMA 4D for several years and have had the opportunity to drastically improve my skills, creating a highly complex project like Dysco was quite easy to do,” explains Simon. The film was quickly named a “Staff Pick” at Vimeo – a well-deserved distinction!

With the UK economy beginning to show the grass roots of recovery, the BBC News Business Unit required a short animated 'sting' for use in its slot during the regular news broadcast. Creation of the 12-second sequence fell to Sophia Kyriacou, who's worked as a broadcast designer since 1997, originally with European Business News before moving to the BBC in London in 1998.

Her brief was to illustrate the unit's 'UK Economic Recovery' brand with an animation representing the feeling of improvement. To do this she conceptualized, modelled and animated a town seemingly made of paper, which unfolds itself as the cold blue sky is warmed by the dawn sun. It's a relatively simple approach but effective and aesthetically pleasing.

"The sting had to encapsulate the various areas of the affected economy from housing, retail, to the city and so on," says Sophia. "The concept of 'regeneration' had to be key to the overall feel and design."

She explains how the theme was derived from the concept of paper money. "Objects were individually created, simply unfolding one-by-one," she says, "re-growing, coming back to life, subtly bouncing back with energy. By interjecting an element of humor it gave a light-hearted feel to what has been essentially a difficult economic period. The white textured objects lit with sunlight express the start of a new dawn."

In designing the piece, Sophia had to bear in mind that the piece was targeted at the BBC Business News audience and had to withstand repeated viewing over a long period. The animation also had to sit within the BBC News branding, working with existing color schemes and typography - and she had to deliver a super-widescreen version at 5760 x 1080 that would run across three screens in the studio!

To create the unfolding effect Sophia turned to CINEMA 4D's Cloth tag: when applied to a mesh, the underlying structure becomes flexible, acting like fabric and, under the right circumstances, can be made to collapse completely.

Each object was built using a low-poly mesh, and then Sophia used the Knife tool to create some random segments. "Because I was going for the unfolded paper feel, all the geometry was triangulated and deliberately lacked detail," she says. "I changed all default Phong Angles to zero for that overall hard-edged look."

"Once I could see a pattern of the actual slices that worked, I kept the cuts consistent throughout," she explains. "I used a Cloth Simulation tag on the object and a disc for the floor that had a Cloth Collider Simulation tag applied. I played around with the keyframes on the Cloth tag and Gravity forces until I got the movement I was after."

Naturally the simulation didn't always go as Sophia expected: "I did encounter several funny moments when objects would appear desperately trying hard to collapse in one go, only to end up flopped sideways, bobbing off the ground like a bouncy castle. Other times if I overlooked the gravity, objects would fall in slow motion or fly away!"

When the collapsing movement looked right, the animation was cached to disc. But Sophia then needed a way to convert the information into keyframes so she could edit it. For this she turned to a little-known tool called Cappuccino, which captures mouse input in real-time, but which will also record point-level animation (PLA).

"Prior to this sequence I hadn't used the Cappuccino tool," she admits, "but it came into great use, as I was then able to play the cached dynamics and record live PLA on-the-fly. Once I had my keyframes I was able to reverse them and do pretty much whatever I wanted to do with them."

"The only things I had to keep tweaking until the collapse looked right were the settings within the Cloth Dynamics: Gravity, Flexion and Friction. Because all objects were naturally different they didn't all collapse identically every time, so the gravity had to be tweaked depending on size and shape."

If an object didn't collapse cleanly, Sophia would either hand-animate the final frames, or position the object below the floor, so that when it began to unfold, you didn't notice the initial stages of the animation.

The cloth simulations proved effective, but the final results weren't quite right: "They animated like paper but ended up looking smooth and lifeless," Sophia comments. "I wanted to create objects that had personality and I knew using PLA alone would not give me that, as all the keyframes are naturally linear."

To achieve the look of paper unfolding in succession, a Random Effector was added. This enabled Sophia to control the amount of distortion, and imbue the animation with the level of imperfection she needed. But she still wasn't happy: "Once I had the keyframes captured and the geometry randomized, I was still left with an animation that had little character and personality," she states. "Making each object bounce back to life with an element of humor was so essential to the overall sequence, both conceptually and technically."

CINEMA 4D's Delay Effector came to the rescue. By using the Spring Mode selection and controlling its strength, Sophia was able to bring a little life and soul to each object. "Both MoGraph effectors reacted with each other, which worked very well," she adds.
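The springy "bounce back to life" that the Delay Effector's Spring Mode adds can be modeled generically as a damped spring chasing the keyframed value. This is an illustration in plain Python, not Cinema 4D code; the function name and constants are our own:

```python
# Generic illustration (not Cinema 4D's implementation): a damped spring
# filter that lags behind a target value, overshoots it, and settles -
# qualitatively what the Delay Effector's Spring Mode does to keyframes.
def spring_filter(targets, strength=0.2, damping=0.75):
    """Filter a list of target values, adding lag, overshoot and settle."""
    value, velocity, out = 0.0, 0.0, []
    for target in targets:
        # Velocity decays (damping) and is pulled toward the target (strength).
        velocity = velocity * damping + (target - value) * strength
        value += velocity
        out.append(value)
    return out

# A step from 0 to 1 overshoots past 1.0 before settling back toward it,
# which is the "element of humor" bounce described above.
samples = spring_filter([1.0] * 40)
print(max(samples) > 1.0)  # True: the motion overshoots the target
```

Raising `strength` or `damping` makes the bounce livelier; lowering them gives a softer, quicker settle.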

With all the objects animated, Sophia imported them into a new scene. She was then faced with the task of choreographing the various animations, so they appeared in shot as the camera moved past. "I think for me the tricky part was positioning all the objects first and making sure I had a strong end frame," she explains. "Once I had that sorted I kept rendering very small hardware renders because they were incredibly quick to do and I could focus my time on the camera and getting the objects popping up on cue."

The final step was to make the objects actually look like paper. Sophia employed Subsurface Scattering to give the look of light dispersing through the material, saying, "The interiors felt like they were lit from the inside. This effect is seen more prominently through the gaps of each object."

To light the scene, Sophia used a Sun object and an orange spotlight, which glanced off the rooftops, giving the feel of a sunrise. "I wanted to express conceptually a positive fresh start of a 'new dawn'," she comments.

The sequence was rendered using the Physical Render engine, due to its support for Lens Distortion, Chromatic Aberration and Vignetting. "It also gave me the freedom to override the shutter speed and aperture," she adds. Global Illumination was also used, of which Sophia declares herself a huge fan: "Having done a lot of studio photography in the past, I have always been fascinated by how you can light an object and watch light and color bounce off and illuminate the surrounding objects and materials. It is real, it is a tactile solution, and for me GI almost mimics that same tactile feel."

"Without the use of Global Illumination this scene would not feel alive," she continues. "It had to encompass that tactile look and feel I was after but in a miniature world. Naturally when using GI you have to accommodate the longer render times, but the difference in lighting is quite dramatic, and it was important my sequence felt like a little paper town springing back to life."

Sophia did encounter some GI flicker in shadowed areas, but explains that she was able to minimize the effect by increasing the Record Density and Stochastic Samples.

To complete the animation, the rendered footage was imported into Adobe After Effects for color grading. "I wanted to take the viewer from an icy-cold scene to a more comfortable warmer end," says Sophia. "All grading was created by adding a warm gradation of orange towards the bottom and a blue gradation at the top for the sky. The scene was tweaked over time from cold to warm, enriched colors."

The final stage of the project was to create a super-widescreen version for use in the news studio. As the action was mostly centered, there wasn't too much work to do in terms of composition, although the camera had to be repositioned for certain shots, and pulled back to widen the view. "Some objects had to be replicated and turned around to cover the wider shot at the end," explains Sophia, "but other than that it was pretty simple. We're used to creating sequences that have to be interpreted from a full-frame sting to a Tri-Screen for the studio, to reinforce the brand and for consistency. I rendered the Tri-Screen anamorphic at 1920x1080 to keep the file size down."
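Rendering a 5760x1080 tri-screen "anamorphic" at 1920x1080 works because the pixels are squeezed horizontally at render time and stretched back on playback. A quick sketch of the arithmetic (plain Python; the helper name is our own, not part of the pipeline):

```python
# Hedged sketch: pixel aspect ratio needed so a squeezed anamorphic render
# fills a wider display when stretched back out on playback.
def anamorphic_pixel_aspect(display_w, display_h, render_w, render_h):
    """Display aspect divided by render aspect = horizontal pixel stretch."""
    return (display_w / display_h) / (render_w / render_h)

# The studio tri-screen (5760x1080) rendered at 1920x1080:
par = anamorphic_pixel_aspect(5760, 1080, 1920, 1080)
print(par)  # ~3.0: each rendered pixel is stretched 3x horizontally
```

The one-third-width render keeps file sizes (and render times) down at the cost of horizontal resolution, which is a reasonable trade-off for a background screen.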

The full 12-second animation took just four weeks to complete, and was achieved using CINEMA 4D running on a 12-core Mac Pro.

In May 2014, 'Paper Town' was named a finalist for a prestigious PromaxBDA award, in the category 'Art Direction & Design: News Program Bumper.' To see a full list of finalists, visit: www.promaxbda.org

PromaxBDA is the international association for entertainment marketing professionals. PromaxBDA leads the global community of those passionately engaged in the marketing of television and video content on all platforms, inspiring creativity, driving innovation and honoring excellence. The association represents more than 10,000 companies and individuals at every major media organization, marketing agency, research company, strategic and creative vendor and technology provider, and is considered to be the leading global resource for education, community, creative inspiration and career development in the media and media marketing sectors. For more information, visit www.promaxbda.org.

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Mon, 19 May 2014 15:38:00 +0200
Right About Now
https://maxon.net/en-us/news/case-studies/advertising-design/article/right-about-now/

Headlining the Coachella 2014 festival, Fatboy Slim mixed electronic dance music with hi-res CG visuals for the ultimate live show performance. Tim had already set up Plastic Reality and Plastic Pictures, and the content for this year's Coachella was to be provided by his new venture, The Happiness Labs - a studio focusing on the integration of shared experience through the latest crop of tech devices such as Leap Motion, Myo from Thalmic Labs and, of course, the mighty Oculus Rift. So, whether creating content for music events, experiential content for brands or looking at new ways of storytelling, Tim intends to be at the convergence of the new wave of tech and tools and the never-ending desire for a good story.

This year's team working on the show for Coachella included Chris Cousins, Joe Plant and Bob Jaroc. Bob was out on the tour while Chris and Joe were crunching out some stunning frames. For this year's production they also shot lots of footage in slow motion and had Mike Sansom from the Brightfire Pyrotechnics company help blow stuff up - not using particles or expressions, but black powder.

The content was created using four workstations with 16 CPUs for rendering the footage, with CINEMA 4D and After Effects driving the entire process. The Coachella staff built a set with the DJ booth integrated into a 96 square meter screen and the VJ would run the entire show using video being sent live from Fatboy's Serato setup, with Resolume running from the VJ position off-stage.

Tim explained how the key concepts came about: "Coachella originally approached Team Fatboy asking if we would like to do a show based on the four seasons. The set at Coachella is 60 minutes long so they were looking to split it into four parts and use a bunch of physical effects - fire, snow, rain - to accentuate the different seasons. We had a think about this and obviously loved the idea of the different physical effects but thought the four seasons might be a bit like doing opera. We started throwing some ideas around and realized we could re-work Fatboy's ‘Eat Sleep Rave Repeat' track into ‘Heat, Sleet, Rain, Repeat!' We got to keep the physical effects but had to incorporate them into Norm's global smash. Norman was placed in the middle of the screen with the display split into nine regions. This meant that we were able to build some additional content around this configuration, such as the fruit machine animation."

The fruit machine section was built by Chris Cousins, who commented on the value of "The quick and smooth motion blur, using MAXON's physical renderer, which helps avoid the strobe effect - especially important with large-scale displays. Also, the simple parametric setup via MoGraph cloners, which meant it was easy to create new combinations, swap out text, and animate bounce using simple sliders."

A big part of the show utilizes a 3D version of Fatboy Slim’s head built entirely in 3D, which you can see here…

They scanned Norman Cook with the guys at Centroid at Pinewood Studios and then added the textures and finishing touches in After Effects. This leads to the boom box video, where the four seasons effects were created to run along with the tune. It starts with paint splashes, which were created in RealFlow. Joe Plant, who built the boom box footage, went through the process: "The trick to the consistency of paint in RealFlow was a subtle blend of viscosity, density and surface tension. The transition of the paint was down to a high output rate, which passed through three separate vortex daemons. The surface of the boom box was then given a high stickiness value to make the paint stick, which in turn, with a modified gravity setting, produced the desired results."

The next stage of the video was fire, which was created in Phoenix FD. Once the required elements of the boom box were separated and incorporated into the simulation, a high fuel temperature combined with a high cooling rate and a varying output created the fire pulses and provided the desired results.

About 60% of the video elements were scripted and were launched within Serato, which has been Fatboy Slim's DJ weapon of choice since it incorporated video about six years ago. The files sat as QuickTime video within Serato and, when a pre-scripted track was selected, the video and audio worked in perfect synchronicity. If the pitch was changed, the video sped up or slowed down accordingly. The other elements, such as the rain and snow effects, were all operated by the VJ, Bob Jaroc. Tim added, "We had a script we had worked on with Norman but a lot of it was triggered manually and not on rails. Norm always likes to throw lots of curveballs into a show so we have evolved with that spirit in mind over the years - always be ready for the unexpected!"

At the heart of the process, though, was the CINEMA 4D and After Effects pipeline. Tim concluded, "Their widespread adoption throughout the creative industries is a reflection of the quality of results that can be achieved. And we find for speed and flexibility, they are the ultimate combination. The forthcoming era of deeper integration between CINEMA 4D and After Effects is really exciting, and we are really looking forward to seeing how this will enhance our workflow. We find them a joy to play with and encourage all younger artists who are working with us to learn this combination."

Mon, 05 May 2014 11:06:00 +0200
Ring My Bell
https://maxon.net/en-us/news/case-studies/visualization/article/ring-my-bell-1/

Not long after being introduced to CINEMA 4D, student artist Matthias Ries created a short film that features a complex animated character on a unique emotional journey. Matthias is studying media design at the University of Münster (Germany) and focused primarily on illustration in the first four years. His interest in 3D was sparked after taking a course in storyboarding and by the fact that many other students were working with 3D software packages. When the school offered a course for character animation and rigging with Arndt von Koenigsmarck as the teacher, Matthias jumped at the chance.

Initial curiosity quickly turned to full-blown enthusiasm and Matthias decided to develop a complete character and feature it in a short film for his semester project. Matthias put a lot of thought into designing his character. It went through various phases of evolution, ranging from an egg-shaped wombat-like character to the final design – a marionette with a jingle bell head. The final character design in fact was a reflection of Matthias’ self: overwhelmed by constant restlessness, in search of his own inner peace.

After putting his ideas to paper, Matthias started to model his character in CINEMA 4D, making sure it would be fully animatable. This included modeling the bell’s opening so that it could be transformed using the Morph Target function without compromising its shape. The character’s knees and coat had to be modeled correctly, e.g., so the legs looked correct when bent at the knee. Due to the coat’s bell-like shape, Matthias was not able to animate it using a cloth simulation, which also would not have provided the desired degree of control during animation. So Matthias simply developed his own bones setup for more control.

Matthias also developed his own rig for the rest of the character. The fact that he only had about six months of experience using CINEMA 4D is a tribute to Matthias’ skills as well as to how easy CINEMA 4D is to learn. Another important factor was the expertise of his trainer, Arndt von Koenigsmarck. After completing Arndt’s course, Matthias was able to get fast and comprehensive support from the MAXON support team. Matthias says that he “… got to know CINEMA 4D as a software that can be easily learned, whereas other applications virtually prevent beginners from learning how to use them!”

In fact, Matthias learned everything he needed to know on-the-fly whenever he needed certain skills for the project. Matthias explains how he learned the non-linear animation system: “I more or less stumbled across this feature when I began creating the first walk cycle. I noticed that it would be helpful and spent that day learning how to use the system. After only one day I was able to save the walk cycle to several Motion Clips that I then applied to the character in a different file and mixed with other movements."

This approach proved to be somewhat nerve wracking at times simply because of the lack of experience, which resulted in the odd beginner’s mistake. Once Matthias spent a whole day working on a new and useful feature he’d discovered in CINEMA 4D. “In the end, this helped me achieve an efficient workflow that in turn helped me make up for lost time, which was a great help whenever something went wrong or had to be fixed near the end of the project.”

After everything had been modeled and set up, rendering was done using network rendering on six Mac Pros. With render times of six to ten minutes per frame, it took six days to finish rendering – and all hard drives were full. “The system admins were pretty upset because I clogged up the computers with my external hard drives,” remembers Matthias.

The layered renderings were composited in After Effects and sound was added as a final touch; a local musician provided sounds that were recorded at his studio. The result of about six months of planning and development plus six months of CINEMA 4D training is very respectable and has been highly praised by everyone who’s seen the final animation. “3D animation didn’t play a major role in my studies, but now that I’ve learned how to use CINEMA 4D and know what can be done with it, I will definitely use it for future projects!”

Wed, 23 Apr 2014 10:20:00 +0200
Back From the Future
https://maxon.net/en-us/news/case-studies/games/article/back-from-the-future/

A soldier from the future interning at the office? Brennan Ieyoub uses CINEMA 4D to create an entertaining video clip for Ubisoft. It’s hard to imagine what a humorous take on Ubisoft’s new third-person shooter game Tom Clancy’s Ghost Recon: Future Soldier might look like. And you won’t have to, because San Francisco-based film and video producer Brennan Ieyoub has already done so. The custom video is called Ghost Recon: Future Intern, and he made it for IGN (Imagine Games Network) as a way to promote the Tom Clancy game prior to its release earlier this year.

Ieyoub, who worked for IGN for seven years before starting his own production company, Layer Media, in 2011, had two weeks to create the video in which a soldier straight out of the game gets a job as an intern at IGN. In addition to coming up with the concept and writing the script, he also did all of the 2D and 3D motion graphics using MAXON’s Cinema 4D and Adobe’s After Effects.

The goal was to create a funny parody that stayed true to the game while bringing in a bit of IGN’s office culture. With such a tight deadline to meet, Ieyoub spent one long day shooting footage. The task was made easier by the fact that Ubisoft had just finished producing a live-action short film to promote the game, so Ieyoub was able to use some of the film’s high-quality props and costumes that were still lying around the office.

A Student of Cinema 4D: Because he considers himself to be a “student of Cinema 4D” with much more to learn, Ieyoub wasn’t planning on using the software to create the visual effects for the video. He changed his mind, though, when a few things went wrong on shoot day, most notably the scene in which the helicopter flies over the shoulder of the Future Intern. The plan was to use his smartphone to fly a Parrot AR.Drone quadricopter into the shot and over the actor’s shoulder.

Even though Ieyoub had practiced quite a bit in his living room, come shoot day the drone took off and flew straight up into the ceiling before crashing to the floor in pieces. “I was like, ‘Oh my God! I’m going to have to somehow do this effect with Cinema 4D now,’” he recalls, laughing. After finding a free model of a Parrot AR.Drone online, he “Frankensteined together” a Parrot AR.Drone model with a toy mechanical claw model to create the drone UAV (unmanned aerial vehicle).

Cinema 4D also came in handy when Ieyoub needed effects to make the scene in which UPS boxes stack up on people’s desks look more comical and dramatic. After first trying to make the scene work by having a production assistant stand off-camera and throw boxes onto the desks, he decided to use Cinema 4D to model the boxes. “And then I used proxy geometry and physics to shoot boxes out of an emitter and land in a stack in a cool, funny way,” he explains.

Ieyoub attributes much of his success using Cinema 4D for this project to the many helpful tutorials he was able to find on YouTube. And, as an experienced After Effects user, he was grateful for the smooth integration between the two software packages, which allowed him to share a camera and jump back-and-forth quickly and easily. “I’m sure there are smarter ways I could have done things,” he says. “But at the end of the day, I thought it turned out really well and the people at Ubisoft were really happy with it, too.”

Read the full-length feature on this project on Renderosity!

[news-3890 | Tue, 15 Apr 2014 13:07:00 +0200]
Diageo gets into the spirit with CINEMA 4D
https://maxon.net/en-us/news/case-studies/visualization/article/diageo-gets-into-the-spirit-with-cinema-4d/

How RPM created a range of 3D spirit bottles indistinguishable from photos
By Duncan Evans

While it’s easy enough to create 3D bottles of liquid, it’s a whole different ball game when you are tasked with creating 3D renders of actual spirit bottles that have to be indistinguishable from actual photographs. That was the brief that brand specialist RPM faced in a commission from Diageo, the world's leading drinks company. For Scott Ramsay, Senior 3D Visualiser, and the 10 artists working on the project at RPM in the Old Treacle Factory in Shepherds Bush, the challenge was considerable.

Traditionally, creating images that make each bottle look as beautiful as possible requires an expensive photo shoot with sophisticated lighting. However, the sheer number of bottles involved meant the cost would have been prohibitive. Step forward RPM and CINEMA 4D with a CG-based alternative that offered both a lower cost and considerably more useful and flexible assets: once created, the bottles could be re-used in any other marketing scenario. The big question was whether Scott and the team could actually do it, because for the project to be successful, those CG bottles had to go up against an equivalent photo and be just as good.

Scott explained how the process started, "Working in CINEMA 4D, I started modelling using technical drawings and blueprints from Diageo’s asset library as reference. Labels and graphics were applied using the print-ready artwork files supplied by Diageo. This is where good print process knowledge comes in, because the gold inks, foils, varnishes and embosses that make the labels look so premium and rich are applied in a similar fashion, even using the choking, trapping and overprints a print designer will be familiar with. Knowing these would be rendered at a very high resolution, attention to detail was paramount and all of the tiny imperfections and dimples, seams and registration marks in each bottle were faithfully recreated using displacement and bump maps."

The displacement and bump maps were used for sub-polygon displacement because this allows RPM to update the graphics relatively easily when the brands and labels are updated, or for flavour variants. Smirnoff, for example, has many different variants all using the same core bottle, but with different kinds of embossing on the glass. It was also relatively easy to plot the position of subtle bumps, scratches and imperfections in the glass in relation to the label artwork once the bottles' UVs were unwrapped. For the future, RPM is looking at using normal maps and CINEMA 4D's Sculpt tools as an alternative. The team has created 36 bottles to date, but there are more in the pipeline, with simple bottles taking two days to model and the more complex ones up to five days.

The lighting stage was next, and this involved matching up the cameras in CINEMA 4D and those at an actual physical photo-shoot as Scott explained, "The lighting rig for the bottles was particularly important because the bottles needed to sit next to a range of beautifully photographed drinks and cocktails, blending in seamlessly. A Phase One camera was set up in a studio environment with a fill light, key light and two supporting lights. The photographer worked closely with a Mixologist and our Art Director, Leigh Butler, and Account Director, Alice Libaudiere, to photograph the cocktails that would sit next to our 3D bottles. I was on site too to record exact details including distance from the lens, focal length, lens details, lighting positions, textures and any other details relevant to recreating the environment in 3D later on. HDRI details weren't recorded – something we may consider next time. Subjects were shot on a wooden table (also replicated in 3D) against a backdrop we had designed and printed beforehand from our approved Mac Visuals."

How easy was it to translate those variables into camera and lighting settings in CINEMA 4D? According to Scott, it was very easy because almost all the real-world values had an equivalent in CINEMA 4D. He could have recorded much more than he did, but easily had enough information for the CG setup. More challenging were the materials and liquids, and getting the light to refract in the bottles correctly. Scott revealed how they approached it, "Truly realistic glass is something I've always thought to be the domain of V-Ray, but a solid understanding of how liquid and glass behave is perhaps more important. A lot of experimenting was required, primarily within the transparency channel of each shader, to get the correct effect. The fresnel effect was the most obvious adjustment available, but subtle bumps and displacements worked magically with the glass and liquid shaders to create the striking refractions and reflections. By outputting multi-passes, we were able to isolate the reflectance and refraction passes and amplify them for effect."

However, all of this detail proved a real strain at render time and to achieve hi-res results RPM used CINEMA 4D's NET Render to link up to six Mac Pro workstations and split the renders into a hundred tiles, recompiling in Photoshop afterwards. Even with this division of labour, typically used for animation, some of the most detailed bottles took two to three days to render. Results were intentionally muted and neutral giving their retouch team enough scope to adjust the colours and tones without having to re-render each time. A multi-pass render also allowed them to split the render into Photoshop layers. Refractions, reflections and shadows were all isolated for easier compositing in Photoshop.
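Splitting a frame into a grid of tiles and dealing them out to render nodes is straightforward bucket arithmetic. A minimal sketch in Python (illustrative only, not NET Render's actual scheduler; the hundred-tile grid and six nodes mirror the setup described above):

```python
def make_tiles(width, height, cols=10, rows=10):
    """Split a frame into cols*rows pixel rectangles (x, y, w, h)."""
    tiles = []
    for r in range(rows):
        for c in range(cols):
            x0 = c * width // cols
            y0 = r * height // rows
            x1 = (c + 1) * width // cols
            y1 = (r + 1) * height // rows
            tiles.append((x0, y0, x1 - x0, y1 - y0))
    return tiles

def assign_round_robin(tiles, node_count=6):
    """Deal tiles out to render nodes like a deck of cards."""
    jobs = {n: [] for n in range(node_count)}
    for i, tile in enumerate(tiles):
        jobs[i % node_count].append(tile)
    return jobs

tiles = make_tiles(8000, 6000)   # a hypothetical hi-res print frame
jobs = assign_round_robin(tiles)
```

Each node renders only its own rectangles, and the finished tiles are reassembled afterwards (in Photoshop, in RPM's case) by pasting each one back at its (x, y) offset.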

The end results were beautiful, realistic images of the bottles which could be re-imagined at any angle, in any lighting scheme and at any time. Scott concluded that, "It’s an incredibly versatile way of generating imagery. Diageo can now reap the rewards of the added benefits of a 3D asset: they are now a Diageo library asset for agencies to use on any campaign they design. The models are fully textured, front and back, top and bottom, and built to scale with a hi-res version and a lo-res version on separate layers. Lo-res versions have no bump or displacement maps, no fresnels and a low-poly count. The hi-res versions are fully resolved with texture maps of at least 8K and the scalable, smart objects mean they can be increased in size easily. The bottles can be used for any applications, from high resolution print output, to an animated GIF or a fully animated cinematic spot".

Duncan Evans is a freelance journalist, photographer and author.

RPM Ltd Website: www.rpmltd.com

[news-3833 | Thu, 27 Mar 2014 09:11:00 +0100]
Floods, Fish and a Flooded Valley

Austrian satire is often mordant and exaggerated. The Austrian town of Bad Fucking, pronounced ‘fooking’, takes it to the extreme, and CINEMA 4D helped create the effects for this film in truly Austrian satirical manner. Words that have a specific meaning in one language can have a quite different meaning in another: a term can be perfectly respectable in its language of origin and rather objectionable elsewhere. The Austrian author Kurt Palm took advantage of the fact that an Austrian town carried the name Fucking; for his story he elevated its status to that of a spa town, renamed it ‘Bad’ Fucking (Bad is the German prefix for spa towns) and relocated it to the Alps. He then wrote an outrageous whodunit that satirically pokes fun at the Austrian character. The result was a brutally mordant Alpine satire!

The book went on to become a great success and it didn’t take long until the decision was made to turn it into a movie. The seasoned director Harald Sicheritz, known for his work in both film and television, was selected to make the movie. Since the story included several fairly spectacular scenes, which could never have been realized using live footage, the Vienna-based computer graphics and animation specialists Cybertime were brought on board.

The road that leads to the tourist town of Bad Fucking winds along the edge of a narrow valley. At the very beginning of the movie, a huge avalanche blocks the road and makes it impassable. The avalanche, made up of boulders, mud and debris, crashes through the valley and flattens a car on its way. The team at Cybertime created this scene with the help of CINEMA 4D. “It was particularly important to be able to control the avalanche’s path,” explains VFX artist, supervisor and producer Günther Nikodim. “The Dynamics feature’s follow position settings were extremely helpful. They made it possible for us to use simple keyframe animation to define the avalanche’s path relatively precisely; the motion of individual elements was controlled using dynamics.”

The simulation was then baked and converted to keyframes. The keyframes were then reduced, and Alembic was used to export them to Realflow 2013, where the large boulders were used as collider objects for the fluid simulation. The meshes created in Realflow, which contained up to 25 million polygons, were then rendered together with the boulders and all the debris in the avalanche using CINEMA 4D’s Physical Renderer. Final compositing was done in Nuke.

The rest of the movie is no less spectacular: At one point, a police officer is swept away by a tidal wave several meters high that’s teeming with hundreds of eels. The eels were created entirely in CINEMA 4D using the integrated Sculpt feature in addition to other modeling tools. A simple rig was then created using Deformer objects and MoGraph was used to duplicate a total of about 3,000 eels. The water was simulated using Realflow via Hybrido2 and was imported into CINEMA 4D as a baked mesh. “We first experimented with transparencies and absorption for the water’s shading, which unfortunately resulted in extremely long render times,” stated Günther Nikodim. “In the end we omitted transparency entirely and instead used Subsurface Scattering with a long path length. To our surprise we were able to use this method to quickly create the muddy water we needed and in the desired quality.” In addition, reflections were rendered separately as Multi-Passes and then combined with SSS shading in Nuke. Realflow was used to create Splash & Foam particles, which were then rendered in CINEMA 4D.

In the end, the tidal wave leads to the closing scene in which the entire flooded valley of Bad Fucking can be seen. This live footage was edited extensively in post-production: mountains were removed and the slightly overcast sky was replaced by dark storm clouds. Color correction was used to make the scene even more sinister and foreboding. Most of this editing was done in Nuke and Photoshop. The edited material was then projected onto rough geometry in CINEMA 4D using the Camera Projection feature so that careful camera movements could be created to enhance the scene’s three-dimensionality even more.

A CG water surface was then created in CINEMA 4D that consisted of a simple plane with a bump map and a highly reflective material. The unique challenge presented by this shot was the seamless integration of live footage of a swimmer in a lake into the CG water. The waves created by the swimmer in the live footage continued outside of the picture and therefore had to be supplemented digitally. CINEMA 4D’s Formula deformer proved to be a fast, easy-to-control solution.
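A Formula deformer evaluates an expression of this kind per vertex; a rough Python stand-in for a circular ripple radiating from the swimmer (all constants are illustrative, not the production values):

```python
import math

def ripple(x, y, t, amplitude=0.05, wavelength=1.2, speed=2.0, decay=0.8):
    """Height offset of a circular wave radiating from the swimmer at the origin.
    Amplitude falls off exponentially with distance, so the digital waves fade
    out instead of covering the whole lake."""
    r = math.hypot(x, y)
    k = 2 * math.pi / wavelength            # spatial frequency of the wave
    return amplitude * math.exp(-decay * r) * math.sin(k * r - speed * t)

# sample the deformation on a small grid at t = 0.5 s
heights = [[ripple(x * 0.5, y * 0.5, 0.5) for x in range(8)] for y in range(8)]
```

Animating `t` per frame makes the rings travel outward, which is what lets the CG surface continue the swimmer's real ripples past the edge of the plate.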

Cybertime’s team of 3 artists completed the main scenes and innumerable smaller corrections and additions within only five months. A total of about 7 minutes of CGI scenes were created and rendered, which made it possible to add the desired exaggerated satirical components from the story line – which are surely a matter of taste. After all, satire is designed to provoke a debate with regard to its content. There’s no question about the quality of the CGI effects on the other hand – they’re excellent!

One of the attractions of using CG is that it is very effective at creating striking and memorable characters and scenes. That was the idea when CompareNI approached Streetmonkey to create a 30-second advertising slot for TV and online. The objective was to create a distinctive look and feel through the use of 3D animated characters that would provide a strong brand association, increase traffic to the website and expand the customer base.

The biggest challenge faced by Ian Lawrence, the animator/motion graphic designer for Streetmonkey, was the deadline of just five weeks for a complete 720p project. Ian explained, "The main technical challenges involved the creation of the characters, seascape environment, animation, and rendering within the project timeline."

The story features a salty old sea dog of a captain called Rod and his trusty seagull accomplice. When a customer checks online for a quote, the captain casts his net over a range of prices from companies all around the UK, including Northern Ireland, where the advert was to be broadcast. The overall technical consideration was to work out simple and efficient techniques for solving various problems, from creating the dynamic cloth fishing nets to an overall lighting and rendering solution that could keep the rendering of each frame to less than 15 minutes. This was important due to last-minute changes that inevitably occur.

First up was the water simulation, which needed to be choppy with lots of animation. Ian Lawrence detailed how and why it was done, "There were tight advertising deadlines, so the water was created with a large, flat plane to which three separate wind modifiers were applied. This resulted in a nice cartoony-looking sea movement. On the sea material there were also animated displacement noise and diffusion maps, which added a little bit more subtle detail to the movement. A proximal shader was also used in combination with animated noise shaders to achieve a white water foam effect around objects in the sea."
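The proximal-shader trick — whiter foam the closer a sea-surface point sits to a floating object — boils down to a distance falloff. A toy version, assuming a simple linear falloff radius (the objects and radius here are made up):

```python
import math

def foam_intensity(point, objects, radius=2.0):
    """0..1 foam amount at a sea-surface point: 1 right at an object's hull,
    fading linearly to 0 at `radius` world units away."""
    nearest = min(math.dist(point, obj) for obj in objects)
    return max(0.0, 1.0 - nearest / radius)

boat = (0.0, 0.0)
buoy = (4.0, 0.0)
print(foam_intensity((0.5, 0.0), [boat, buoy]))   # near the boat -> 0.75
print(foam_intensity((3.0, 0.0), [boat, buoy]))   # near the buoy -> 0.5
```

In the production setup this value would be multiplied with an animated noise shader so the foam ring breaks up and churns rather than forming a clean gradient.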

The main character, the captain, is all hat, teeth and beard. Fortunately, CINEMA 4D's own excellent Hair module provided a nice integrated and efficient way to create the look. The styling for the captain and the overall look and feel of the animation was developed through character sketches and feedback from the advertising agency for which Streetmonkey produced the ad. Ian added, "We wanted to ground the animation somewhere between modern 3D animation and a feeling of nostalgia, with robust characters and environments that almost look hand-crafted."

One of the more complex effects in the animation is the use of the fishing net to capture quotes, but, with CINEMA 4D, it worked out to be an easy task, as Ian revealed, "The cloth used in the fishing nets was a fairly simple setup, the net being a single cloth sheet with a hi-res fishing net in the alpha channel. Around the edges of the net, we belted on some floats that had a dynamics simulation applied to them, causing the net to fall and wrap around the collider objects on the land."

On other elements of the animation, the character tool for rigging helped a great deal and enabled Streetmonkey to concentrate on more creative aspects of the animation rather than get caught up in technical mumbo jumbo.

A basic GI setup was used to create the lighting, using Physical Sky and careful placement of additional non-GI lights to pick out certain areas and provide highlights. This kept the all-important render time down, and with six Mac Pros (2.66GHz, six-core Intel Xeon) they were able to render three to four scenes overnight, providing the clients with daily updates.

Once various render passes for each scene had been created, they were composited in After Effects. Ian explained how the finishing touches were applied, "We edited it down to the timings, graded the whole animation and applied a few post effects, including the steam on the coffee mug and the sea spray when the boat comes to a stop, both created with the Red Giant Particular plugin. The seamless export process of 3D cameras and lights from CINEMA 4D to After Effects meant this was easily done. At Streetmonkey we have used CINEMA 4D in our creative pipeline for over 10 years because of its ease of use, stability, and speed in turning around a project within the timeframe."

Duncan Evans is a freelance journalist, photographer and author.

For more work by Streetmonkey, please visit: http://streetmonkey.tv/our-work/

[news-3815 | Tue, 11 Mar 2014 11:16:00 +0100]
Between Two Worlds
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/between-two-worlds/

Ghost Town Media on the Making of the Video for Linkin Park’s ‘A Light That Never Comes’

Physical Meets Digital
Hahn and Parvini have long discussed how they might explore ways to have live images exist in digital space. For this video, they came up with an overall look that conveys the core concepts of a remix, Parvini says, in that “you’re taking something from somewhere else and repurposing it.”
Parvini scanned all of the band members in various positions and at different angles for use as raw 3D assets. In addition to the static 3D scans, Parvini and Ghost Town devised a system to capture sequential 3D objects, allowing the band to perform and have the performances played back in post—similar to how you would normally play back video in a non-linear editor.
Early conception consisted of finding images they responded to and weaving them together loosely to get an understanding of the look they wanted. Next, they created a map showing each separate district with Aoki’s district in the center. Ghost Town brought on director and 3D and VFX artist Noah Rappaport (http://vimeo.com/noahrappaport/videos) to lead the team in charge of planning and developing the city that serves as the main setting for the video.
Wrangling a Monster
During the two months they worked on the video, the biggest challenge Ghost Town Media faced was the sheer expansiveness of the piece, Parvini says. Because CG is ubiquitous these days, it’s easy to take for granted what it takes to create an entire digital world. But for a small production house, the reality is “you’re wrangling a monster,” he explains.
“We had to build the whole place brick by brick, and having to make everything, gather all of the assets and drop them in all scenes was really challenging. I think at one point we had about 70,000 buildings in our city setup, which is around the same as Manhattan.” (Watch the process video here.)
It helped that they were able to use CINEMA 4D’s MoGraph to clone many of the buildings they needed.
Combining stock models and custom-built building structures, Ghost Town used cloned instances in CINEMA 4D to allow for quick population of buildings across specific segments of the city. Using the instance cloner system after the buildings were populated allowed Ghost Town to quickly adjust positions and final angles.
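Conceptually, instance cloning means every building in the scene is just a transform referencing one of a few prototype meshes, so a prototype edit propagates everywhere at once. A stripped-down sketch (names, counts and jitter values are made up for illustration, not Ghost Town's actual setup — though the 70,000 total matches the figure above):

```python
import random

def clone_city(prototypes, cols, rows, spacing, seed=42):
    """Return (prototype_name, x, z, rotation) instances on a grid,
    with seeded jitter so the layout is repeatable between renders."""
    rng = random.Random(seed)
    instances = []
    for i in range(cols):
        for j in range(rows):
            proto = rng.choice(prototypes)      # pick one of a few prototype meshes
            x = i * spacing + rng.uniform(-2, 2)  # break up the perfect grid
            z = j * spacing + rng.uniform(-2, 2)
            rot = rng.choice([0, 90, 180, 270])
            instances.append((proto, x, z, rot))
    return instances

city = clone_city(["tower_a", "tower_b", "block_c"], cols=280, rows=250, spacing=30)
```

Because only transforms are stored per instance, repositioning a district or swapping a final camera angle is a cheap update rather than a rebuild.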
Cutting Render Times
Rendering was another major hurdle to overcome. Ghost Town used Dell Precision Workstations for the heavy rendering of the project, but even with their systems completely built out to handle the renders, GI rendering was simply too cumbersome and slow. After struggling with workflow for a few weeks, Parvini opted to upgrade from CINEMA 4D R14 to R15, even though he knew it was ill-advised to make that kind of change in the middle of a project.
Fortunately, the upgrade paid off. “Once we got our hands on R15’s new light caching system we saw dramatic improvements to the timeframes and render times, so we could use Global Illumination to get the look we wanted in a quarter of the time,” he says.
Read the full-length feature at Studio Daily.

[news-3747 | Mon, 03 Mar 2014 12:45:00 +0100]
Ben 10: Adventures in the Third Dimension

Vincent London transports the 2D cartoon into a futuristic 3D world. Ben Tennyson is a ten-year-old boy whose wrist-worn Omnitrix device enables him to change into alien creatures and fend off attack from extra-terrestrial forces.

The Ben 10 animated series first aired in late 2005 and ran for four seasons until 2008. This was followed by three seasons of Ben 10 Alien Force and two seasons of Ben 10 Ultimate Alien. The fourth instalment, Ben 10 Omniverse, has been running since 2012 and is now entering its fifth season, in which Ben comes face to face with Albedo, an evil clone created by the alien Galvan race.

To promote the new episodes, Turner Broadcasting and Cartoon Network turned to Soho-based studio Vincent London, which produced three spectacular slots for the campaign. "The project was intended to showcase Ben and his alter-ego within a dual cityscape environment," explains Creative Director John Hill. "We wanted to create a flexible futuristic world to host the epic fight scenes between Ben 10 and his new nemesis."

The work is a slick combination of 3D environments created with CINEMA 4D and 2D characters, which were drawn and animated in-house using Adobe Flash. The design was one of the key challenges, states Hill. "The 3D world needed to complement the iconic Ben 10 artwork and style and allow us to flexibly composite 2D character animation. The buildings and cityscapes had to reflect the good and the bad renditions of Ben 10 and also create awesome spaces for the fight scenes." Accordingly, the green and red outfits of Ben and his clone were used to color-code the environment.

With all the dynamic camera moves and shifts in perspective, the melding of 2D and 3D required thorough planning to ensure that all the elements matched up. "We worked up fairly detailed hand-drawn animatics," says Hill, "followed by more considered pre-vis animatics for each shot. This enabled us to sync up the interactions between Ben and the 3D scenes as well as plan overall shot composition and timings."

"CINEMA 4D was easy and quick to use when realising our hand-drawn cityscape designs," he continues. "It bought us time when we wanted to experiment with various sets and environments for the action."

A few of the 2D character shots were combined with the backdrops in post-production but Hill suggests that for the most part it was all done in the 3D realm. "Although some shots were composited in Adobe After Effects, we also rendered them out with the 3D scenes to get additional GI and shadow passes. The more you can do in 3D, the better the composite and final result, we find."

This workflow makes sense when you realize that the characters are often required to produce subtle reflections on the floor and cast shadows (like when Ben is running through the explosions). To achieve this, the team applied alpha-mapped animation onto a flat plane positioned within the scene and animated accordingly. They then used the Reflection, Global Illumination, Material, Diffuse and Shadow passes for use in the final composite.
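Recombining isolated passes in the compositor follows a standard pattern: the shadow pass multiplies the diffuse, while reflection and GI add on top — which is what makes boosting a single pass possible without re-rendering. A per-pixel sketch in linear color (an idealized additive recombine, not the team's exact After Effects layer stack; the sample values are invented):

```python
def composite_pixel(diffuse, shadow, reflection, gi, refl_gain=1.0):
    """Rebuild a beauty pixel from separate render passes (linear RGB).
    `shadow` is an occlusion multiplier per channel (1 = fully lit, 0 = black);
    `refl_gain` lets the compositor amplify just the reflection pass."""
    return tuple(d * s + refl_gain * r + g
                 for d, s, r, g in zip(diffuse, shadow, reflection, gi))

pixel = composite_pixel(
    diffuse=(0.40, 0.30, 0.20),
    shadow=(0.80, 0.80, 0.80),
    reflection=(0.05, 0.05, 0.10),
    gi=(0.02, 0.02, 0.03),
)
print(tuple(round(c, 2) for c in pixel))   # (0.39, 0.31, 0.29)
```

Since each pass enters the sum independently, the character planes' reflections and cast shadows can be dialed up or down in the composite without touching the 3D scene.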

Stylized explosions were handled similarly: "After animating them in Flash, we added them to the 3D scenes to help add GI lighting and shadows," says Hill. "We mixed 3D explosions with the hand-drawn 2D explosions to merge the styles."

An abstract cityscape of monolithic buildings makes an ominous backdrop to the action. "We sketched out quite a few cityscapes and building designs before settling on the self-illuminating futuristic style," Hill says, "as this worked best for atmosphere and premise. We used mainly geometry for the neon-style wireframes, as the render is always cleaner than using textures."

He adds, "The simple cityscape complemented Ben 10's graphic style well; we had to be careful not to add too much detail as this moved the design too far away from Ben's 2D artwork."

The angular buildings are lit by area lights to add subtle glows, with occasional flashes to make the city feel alive. However, the main light sources come from a network of flying cubes, which also give the city a sense of scale. These were simple boxes containing lights, combined with self-illuminating textures to add detail.

Lighting proved the hardest part of the project, admits Hill. "Atmospheric lighting with flicker-free GI is always time-consuming," he says, adding that the challenge was to "create subtle mid-tone detail in a night-time dystopian environment so as to not end up with a flat graphic look."

Naturally, CINEMA 4D's MoGraph tools were used for the floating cubes, with a little manual animation and keyframing to get the right results. "MoGraph is probably my favorite tool," declares Hill. "It's so flexible, and so much complex animation can be achieved with very simple scenes."

In one sequence, evil Ben's influence spreads through the city, creating giant cracks in the buildings and causing explosions. The team employed CINEMA 4D's integrated Dynamics to create the tumbling debris, then hand-animated ripples and deformations in the floors and walls for extra detail around the impact areas.

The Tron-style environment is full of bright objects and details that were required to glow, and it's usually best to add these in post-production for more control. "We used the material illumination pass as an EXR file sequence, and 32-bit linear compositing," explains Hill. "The After Effects glow effect does a pretty good job with high color depth. Blurred overlays in After Effects can also prove helpful for glows when treated well."
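A post glow of this kind is essentially threshold, blur, add — and it only behaves predictably in linear (32-bit) light, where super-bright values survive. A scalar-image sketch with a 3×3 box blur (parameters and image size are illustrative, not the production settings):

```python
def glow(image, threshold=1.0, strength=0.5):
    """Add a soft halo around super-bright pixels of a 2D luminance image.
    The bright pass keeps only energy above `threshold`; a 3x3 box blur
    spreads it; the blurred result is added back (linear light, so values
    above 1.0 are preserved rather than clipped)."""
    h, w = len(image), len(image[0])
    bright = [[max(0.0, image[y][x] - threshold) for x in range(w)] for y in range(h)]
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if 0 <= y + dy < h and 0 <= x + dx < w:
                        acc += bright[y + dy][x + dx]
                        n += 1
            out[y][x] += strength * acc / n
    return out

frame = [[0.1] * 5 for _ in range(5)]
frame[2][2] = 4.0                      # a neon highlight well above 1.0
glowed = glow(frame)
```

Working in gamma-encoded 8-bit instead would clip that 4.0 highlight to 1.0 before the blur, which is why the EXR pass and 32-bit linear compositing matter here.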

To sweeten the shots, lens flare effects were added to bright light sources and explosions. Again, these were added in post using After Effects plug-ins, although some were actual lens flares that were photographed by Vincent's team, or hand-made textures.

Depth of field is used to great effect, too, adding drama and helping to tie the 3D and 2D elements together. "Most depth of field was applied in After Effects but some shots were rendered using Physical Renderer's decent Depth Of Field function," says Hill.

The three promo slots for Ben 10 Omniverse were completed over the course of six to seven weeks. With various projects on the go at the same time, team numbers varied, but usually consisted of six to eight people: two to three Flash animators, two CINEMA 4D artists and two working on compositing.

The team were able to call on four 12-core Mac Pros and a handful of i7 iMacs as workstations and an overnight render farm. "We could usually render two to four shots in one evening depending on complexity and render engine," comments Hill, but says he has no idea about total render time: "We were rendering every evening for six weeks, but probably reworked each shot a few times, so I'm not sure how long for the final shot renders."

Certainly the end result was worth all the effort. The flawless combination of 2D and 3D elements produced three thrilling sequences with a unique aesthetic. Creative Director Hill is understandably satisfied: "The design process, from a hand-drawn cityscape design and a fight scene moment, to a full 3D animated scene, was really gratifying."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

[news-3753 | Mon, 03 Mar 2014 12:22:00 +0100]
The ABCs of Rebranding with CINEMA 4D
https://maxon.net/en-us/news/case-studies/visualization/article/the-abcs-of-rebranding-with-cinema-4d/

Design and motion graphics studio Already Been Chewed cooks up a fresh rebrand for Malibu Boats. Already Been Chewed (http://alreadybeenchewed.tv) had been tasked with creating and implementing a rebranding effort across numerous media platforms for Malibu and its sister company, Axis Wake Research. The goal was to illustrate the slogan “Life without limits” and accentuate the active lifestyle that Malibu and Axis can offer their customers.
ABC was chosen for the project because of the company’s history of working with other action sports brands, including Nike, Street League Skateboarding, and Supra Footwear. “We analyze what the competitors are doing and then try to do something totally different,” Damer explains. It was a bold approach, and they didn’t know if Malibu would go for it. But ABC figured that if they were going to win the year-long contract to create a series of catalog images, print ads and online videos using MAXON’s CINEMA 4D, it would be better to pitch ideas they would be excited to work on than to play it safe.
Creative realism
Damer and ABC’s art director Brad Wolf used CINEMA 4D for both the print and animation portions of the Malibu/Axis rebranding project, with the final looks and compositing completed in Photoshop for print and After Effects for video. For the Malibu boats campaign, Life Without Limits, ABC developed a visual that depicts a cubicle-bound office worker transforming into a wake surfer and escaping the limits of his 9 to 5 job into a surreal wake surfing experience behind a Malibu Wakesetter.
ABC set up a photo shoot with professional wake skater Brian Grubb to capture the sequence needed for the campaign artwork. After isolating the photos from their backgrounds, ABC set them up in CINEMA 4D as textures applied to vertical planes. “We used the luminance channel of the texture, as well as the alpha channel of the layer in Photoshop we cut out,” Damer recalls. “This allowed us to place the pictures of Brian into true 3D space within the scene and have the CINEMA 4D lights interact with the cut-out photos.” CINEMA 4D was also used to match the lighting from the original studio shot.
Graphic style
Already Been Chewed opted to introduce a more graphic style for both the Malibu and Axis catalogs after seeing the approach used in the footwear and auto industries but not for boats. One technique was the use of shard-like images repeated throughout the branding for Malibu. To create the shards, Damer used a Cloner object in CINEMA 4D to replicate a triangle shape. After cloning it along a low-poly version of a wake that they had modeled, renders were composited together in Photoshop to create the scene.
“CINEMA 4D allows you to use an object as the basis for your clones,” he says, explaining that a low-poly representation of a wake was created to serve as the basis for the clones to be duplicated across. Random effectors allowed the wave to take on a more organic feel combined with Plane effectors that allowed the transition from smooth “water” to the large wake that was created.
Axis Catalog
For Malibu’s sister company, Axis, ABC used CINEMA 4D to come up with a completely different look. Based on ABC’s tours of the factory, this concept put the focus on the craftsmanship of each boat and how each one is made by hand. Damer used CINEMA 4D to create the futuristic-looking factory and composited the final boat and workers into the shot. Unsure whether fluid simulation would get the liquid to do exactly what they wanted, Damer ended up combining 15 or 16 individual models of liquid rendered out of CINEMA 4D and using Photoshop to make them all form the back of the boat.
In addition to the rebranding effort, which will continue for Malibu and Axis throughout the rest of the year, Already Been Chewed has also done several animated features for internet broadcast with television versions coming out in 2014.
Read the full-length story on Gomediazine: http://www.gomediazine.com/insights/life-without-limits/]]>news-3749Thu, 20 Feb 2014 11:39:00 +0100Layer Media Uses CINEMA 4D for Promo Parody
It’s hard to imagine what a humorous take on Ubisoft’s new third-person shooter game, Tom Clancy’s Ghost Recon: Future Soldier, might look like.
A Student of CINEMA 4D
Because he considers himself a “student of CINEMA 4D” with much more to learn, Ieyoub wasn’t planning on using the software to create the visual effects for the video. He changed his mind, though, when a few things went wrong on shoot day – most notably the scene in which the helicopter flies over the shoulder of the Future Intern. The plan was to use his smartphone to fly a Parrot AR.Drone quadricopter into the shot and over the actor’s shoulder.
But even though Ieyoub had practiced quite a bit in his living room, on shoot day the drone took off and flew straight up into the ceiling before crashing to the floor in pieces. “I was like, ‘Oh my God! I’m going to have to somehow do this effect with CINEMA 4D now,’” he recalls, laughing. After finding a free model of a Parrot AR.Drone online, he “Frankensteined together” a Parrot AR.Drone model with a toy mechanical claw model to create the drone UAV (unmanned aerial vehicle).
CINEMA 4D also came in handy when Ieyoub needed to create some effects to make the scene in which the UPS boxes stack up on people’s desks look more comical and dramatic. After first trying to make the scene work by having a production assistant throw boxes onto the desks from off camera, he decided to use CINEMA 4D to model the boxes. “And then I used proxy geometry and physics to shoot boxes out of an emitter and land in a stack in a cool, funny way,” he explains.
Ieyoub attributes much of his success using CINEMA 4D for this project to the many helpful tutorials he was able to find on YouTube. And, as an experienced After Effects user, he was grateful for the smooth integration between the two software packages, which allowed him to share a camera and jump back and forth quickly and easily. “I’m sure there are smarter ways I could have done things,” he says. “But at the end of the day, I thought it turned out really well and the people at Ubisoft were really happy with it, too.”
Read the full-length feature on this project at Renderosity:

http://www.renderosity.com/ghost-team-cms-16798
Layer Media website: http://layer-media.com]]>news-3745Thu, 13 Feb 2014 09:58:00 +0100Wonder Woman
Rainfall Films’ new short film puts a modern spin on a beloved comic book heroine
By Meleah Maynard
In his introduction to Rainfall Films’ new Wonder Woman short on YouTube, Director Sam Balcomb says that he thinks “quite a number” of viewers will agree that “Wonder Woman is a character just as vital and crucial to our understanding of humanity as any other superhero…if not more so.” Judging by the fact that the film got over 4 million views in the first four days, making it the top featured video on YouTube’s home page and triggering an avalanche of exuberant media coverage, he’s probably onto something.
Wonder Woman is by far the most popular short film that Los Angeles, Calif.-based Rainfall Films has produced internally since the launch of the company in 2008. Relying on CINEMA 4D for all of the environments and After Effects for compositing, they worked on the film for most of 2013, fitting it in between paying jobs. Rileah Vanderbilt (Frozen, Team Unicorn) stars, battling bad guys in a modern-day city, as well as minotaurs in Themyscira, Wonder Woman’s island homeland – also known as Paradise Island.
Why Wonder Woman
Rainfall tries to do some kind of short film on their own every year, says Balcomb, the production/post company’s director, writer and producer. But they’re also commissioned for projects such as their first short, a trailer for The Legend of Zelda, which was created for IGN.
“We’re all huge gamers and comic book fans, and our original vision for the company was to be a production studio generating content online or for videos or feature films, so these projects help show that we love to tell stories and we can create our own stuff,” Balcomb explains (watch their new show reel here).
Because they didn’t have time to work on a project of their own in 2012, Rainfall decided to go all out and do something “really cool and fun that would push us as far as we could possibly go,” Balcomb recalls. The hope was to make a short that would be good enough for their show reel. Balcomb proposed that they do something based on Wonder Woman, and the team eagerly agreed.
“My wife is a huge Wonder Woman fan, so we have comic books all over the house and she has an encyclopedic knowledge of the character,” he explains. “We were talking and I realized that Wonder Woman’s background is infused with a lot of Greek mythology and that got me thinking about how we haven’t seen any of that in live action before.” Storyboarding began immediately, with CINEMA 4D being used to create the 3D animatic and everyone wanting the story to be told through action in a way that would be interesting to long-time fans.
Creating Themyscira
Knowing that they were working with a small team and a limited budget, Rainfall Films opted to shoot much of the footage of Wonder Woman (Diana Prince when she’s in her world) and other Amazonians fighting an advancing army of minotaurs on greenscreen. Prior to the shoot, Balcomb and the other artists modeled the mountains and buildings on the island of Themyscira from scratch in CINEMA 4D (watch a VFX breakdown here).
Modeling the buildings was the most fun and most challenging part of the project, Balcomb says, because the background models were highly detailed and took a long time to render using global illumination. Modeling those elements first allowed Rainfall to use CINEMA 4D’s camera to find the most interesting angles ahead of time. Some shots were rendered before the shoot so they could be used as a reference for lighting and placement of the actors.
An Anonymous City In Flames
The next big hurdle was figuring out how to shoot the city scenes. It would have cost too much money to shut down a whole city block to shoot an action scene. So they opted to go the green screen route again – this time on a sound stage in Burbank. “We watched the edit and it was nothing but green, but when we showed people the rough cut with music, people really responded well to that and wanted to see it over and over again,” Balcomb recalls. “That was great, but we couldn’t help thinking, ‘Oh God, we hope it’s as good once the visuals are in place.’”
To create a believable city, Balcomb headed into downtown Los Angeles at 5 a.m. one Sunday, set up his camera and tripod on a street with the “least LA look to it,” and shot hundreds and hundreds of stills. After altering some of the buildings and adding others using CINEMA 4D, he created a rough geometry of the buildings and camera-mapped the stills he took onto that geometry in order to get real parallax when the camera moved.
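Camera mapping of this kind boils down to projecting the rough geometry through the original shoot camera to find where each vertex lands in the still; the photo then sticks to the geometry and parallax appears when the CG camera moves. A hedged Python/NumPy sketch with a hypothetical pinhole camera (not Balcomb's actual setup):

```python
import numpy as np

# Hypothetical pinhole camera: focal length and principal point of the
# camera that took the still (in pixels).
focal, cx, cy = 800.0, 640.0, 360.0
K = np.array([[focal, 0.0, cx],
              [0.0, focal, cy],
              [0.0, 0.0, 1.0]])

# Rough building corners in camera space (illustrative values).
vertices = np.array([[-2.0, 1.0, 10.0],
                     [ 2.0, 1.0, 10.0],
                     [ 2.0, 4.0, 14.0]])

proj = (K @ vertices.T).T            # project through the shoot camera
uv = proj[:, :2] / proj[:, 2:3]      # perspective divide -> pixel coords

print(uv[0])  # left corner lands left of the principal point
```

Each `uv` row is the pixel in the still to sample for that vertex; assigning these as texture coordinates is the essence of camera mapping.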
Balcomb, along with digital artists Jason Schaefer and Nick Viola, composited the film in After Effects and the job took much longer than expected because many of the shots included tricky elements like a lot of hair and motion blur. “We had to fine-tune each shot because each one of them was so completely different – we couldn’t just copy and paste,” Balcomb recalls. Wonder Woman’s costume, which was designed by Heather Greene (Tron: Legacy), was also problematic in post because it was made of so much metal and it was hard to get the grain out and paint out wires.
Wonder Woman’s Revival
Balcomb says everyone at Rainfall has been overwhelmed by the positive response to the film and the exposure it received in Hollywood. Shortly after Rainfall’s video was posted, Warner Bros. released a statement saying how important it is to give Wonder Woman the big-screen respect she deserves and that fans are hoping for. And not long after that, they announced that Wonder Woman would be in the upcoming Batman movie.

For his part, Balcomb just hopes that anyone who brings Wonder Woman to the big screen pays attention to what people love about her character and they don’t dumb her down. “Our company just had the best year yet, thanks to this project,” he says. “Anytime we get to work on something awesome like Wonder Woman, it makes me so thankful to have this job. I'm really proud of what we've done so far, but it's safe to say the best is yet to come.”
Meleah Maynard is a freelance writer and editor in Minneapolis, Minnesota.Credits:
Starring: Rileah Vanderbilt • Clare Grant • Alicia Marie • America Young • Kimi Hughes • Christy Hauptman
Director of Photography: Andrew Finch
Key Grip: Duy Nguyen
1st AC: Nick Roney
Gaffer: Ryan Walton
Stunt Coordinator: Surawit Sae Kang
Stunt Team: Surawit Sae Kang • Joe Perez • Billy Bussey • Jason Brillantes • Kerry Wong
Costume Designer: Heather Greene
Costumer: Sarah Skinner
Consultant: GoldenLasso.net
Hair & Makeup: Anissa Salazar
Assistant: Yulitzin Alvarez
Music & Sound Design: Jeff Dodson
Vocalist: Raya Yarbrough
Sound Mastered at: Runsilent
Editor: Jesse Soff
Visual Effects Supervisor: Sam Balcomb
Compositors: Jason Schaefer • Nick Viola
Producer: Jesse Soff
Directed by: Sam Balcomb]]>news-3693Wed, 15 Jan 2014 10:36:00 +0100Heavy Metalhttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/heavy-metal/Matt Frodsham already had experience creating music videos, but the project ‘From Hell’ presented completely new challenges – ones he was able to master with the help of MAXON CINEMA 4D.
‘From Hell’ is a band whose music is not for the faint-hearted. They combine hard metal music with lyrics that are just as hard, which can also be aimed at the clergy or the Catholic Church. The band and their label went searching for an artist who could create a video in the style they wanted and found Matt, who happened to have some time available and decided to take on this challenging project. Matt had worked on music videos in the past, which he considered very elaborate. But he thought the song ‘Holy War’ was interesting and that it would be an exciting project to work on – a welcome contrast to his previous work. Matt corresponded with the client until the project had been clearly defined and got to work. Since the video’s overall look and mood had been set and Matt was given creative freedom for the rest, he decided to try out several new techniques.

Contrary to most other projects, this project’s premise was quantity over quality, due to the large amount of work on the one hand and the tight deadline on the other. Twelve different animals had to be modeled, textured, rigged and animated! Most of the modeling was done in CINEMA 4D, whose Sculpt and UV painting features were used extensively. The final look was created using 3D-Coat, after which Matt was able to start rigging. He used CINEMA 4D’s Character object, with which he was able to quickly add joints to all twelve models and animate them. The animals ranged from cockroaches to a skeleton horse and millipedes. Alongside the various Character object rigs, the automatic weighting feature also made Matt’s life easier. “Since the video’s scenes had to be cut very quickly, I didn’t have to add too much detail to the joint regions, and the automatic weighting was perfect for this purpose,” explains Matt.

Despite extensive planning and efficient modeling, the scenes nevertheless ended up being very large, which is why Matt created low-poly proxy models for all animated scenes in order to set up the animations in real time. Matt used the Sculpt feature and a Python script written by a friend to create several unique animation sequences. The script sequentially showed or hid separate sculpt layers during animation, which produced a very distinctive look, similar to Claymation.
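The script itself isn't public, but the idea it describes — cycling through sculpt layers so only one pose is visible per frame — can be sketched in plain Python (layer names and the hold length are hypothetical):

```python
# Show exactly one "sculpt layer" per frame, cycling through them to get
# a stop-motion / Claymation feel. In CINEMA 4D the script would toggle
# layer visibility; here we just compute which layer each frame shows.
layers = ["pose_a", "pose_b", "pose_c", "pose_d"]

def visible_layer(frame, hold=2):
    """Return the layer shown on a given frame (each held `hold` frames)."""
    return layers[(frame // hold) % len(layers)]

timeline = [visible_layer(f) for f in range(8)]
print(timeline)
```

Holding each layer for a couple of frames, rather than one, is what gives the deliberately "steppy" hand-animated feel instead of smooth interpolation.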

This project was indeed a unique challenge and a welcome change from his usual work. The new techniques he was forced to try out made it possible for him to extensively test CINEMA 4D’s new functions and create completely new effects. “Again, CINEMA 4D proved that it’s packed full of functions that leave almost nothing to be desired,” concludes Matt. “Without such a reliable tool, it would be impossible to create a 2-minute video filled to the brim with character animation in just 3 months. Especially considering the fact that I completed several other projects during this time as well!”

]]>news-3690Wed, 08 Jan 2014 13:51:00 +0100Queen of the Junglehttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/queen-of-the-jungle/Mirada on Creating the VFX for Katy Perry’s “Roar”
By Meleah Maynard
The music video for Katy Perry’s single, “Roar,” may have been shot at the Los Angeles County Arboretum and Botanic Garden, but it looks like the heart of the jungle thanks to an assortment of trained animals and the impressive array of visual effects created by Los Angeles-based Mirada. Working with Motion Theory directors Grady Hall and Mark Kudsi, Mirada had just three weeks to do all of the post-production for the nearly five-minute video.

Mindful of the challenging schedule, Mirada made the most of that time by employing a variety of software and hardware, including CINEMA 4D, After Effects, Houdini, Nuke and Flame. They also took the time to select the right artists, in-house and freelance, for every aspect of the project.

“We wanted to pay homage to the Jane of the Jungle style and genre, so finding the right artists for that look was crucial,” recalls Jonathan Wu, Mirada’s creative director. “Tools are great because they allow artists to express themselves, but the artists really bring things to life.”

Enhancing the Narrative
Though the post-production deadline was tight, the process was made easier by the fact that Mirada was actively involved in the making of the video from the beginning, starting with conception. And because Hall and Kudsi are well aware of what can be achieved in post-production, they planned their shots accordingly with input from the team at Mirada. “We started talking right away about the visuals we would need to complement the story, anything that would help reinforce the narrative,” says VFX supervisor Michael Shelton.

Shelton and Wu were also on set to partner with the directors on pinpointing instances where the narrative could be enhanced with effects. That kind of input on creative decisions made the project even more fun and challenging, Shelton says. It also helped ensure that they would have everything they needed when it was crunch time. “When you shoot something as ambitious as this, it’s a constantly evolving picture that reveals itself as the director makes decisions to improve the story and look of the video, so you have to be there to roll with the punches and figure out the best approach.”

By way of example, Shelton points to the moment when the crew realized that the green screen they had wasn’t large enough to accommodate the size of the elephant they were shooting. So he was able to troubleshoot on set, knowing it would need to be rotoscoped as soon as post-production started.

Animals are always a wildcard on set, Shelton explains, because, like kids, you can never be entirely certain what they’ll do. “It’s tricky because you have to wait and see what the animals can and can’t do,” he says, explaining that things like how an animal’s trajectory changed when jumping from platform to platform often had to be massaged in post-production. This was exactly the case when shooting the tiger, so the tiger and Katy were shot as separate plates and composited together for the final shot.

A Team Effort
In all, there were 139 shots in “Roar” that incorporated visual effects, including 2D animation, matte paintings, particle effects, 3D and a lot of heavy compositing. Artists primarily worked in teams, handing shots off to the next person in the pipeline as required. Hours were long, but what kept people going (beyond energy drinks, snacks and coffee) was the energy and sense of camaraderie that surrounded the project. “It was really a passion project because work like this doesn’t come around very often, so everybody really embraced it and did their best,” Shelton says.

One of the most striking visual effects in the video happens when Perry stoops by a pool of water and breaks into Roar’s empowering chorus for the first time. As she sings, blinking fireflies move gracefully around her before coming together in the air to form the head of a roaring tiger. After first studying footage of how fireflies move and light up and dim back down, Mirada opted to use CINEMA 4D to create different iterations of the fireflies because the 3D software would allow them to make changes quickly and easily.

“We knew we wanted the fireflies to hover and dissipate, but we needed to see what that looked like and make several changes, so we chose CINEMA 4D because it’s artist friendly and fast,” Shelton explains. Using several particle tools in CINEMA 4D, artists were able to alternate between MoGraph, Thinking Particles or X-Particles, whichever got the job done the fastest, depending on the shot. Mirada artists chose MoGraph for the ambient firefly shots because a basic random noise effector can animate several spheres around a general area. Setup was simple with few parameters and it provided realistic results, especially with CINEMA 4D’s Physical Render Motion Blur setting that added the firefly trail.
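The ambient-firefly setup described here — a noise effector drifting spheres around a general area, plus a blink cycle — can be approximated in plain Python. All constants and names below are illustrative, not Mirada's actual values:

```python
import math
import random

random.seed(42)

# Each "firefly" gets its own phase and frequency per axis; sampling
# smooth sinusoids approximates a noise effector drifting spheres
# around a general area.
fireflies = [{
    "phase": [random.uniform(0, 2 * math.pi) for _ in range(3)],
    "freq": [random.uniform(0.2, 0.6) for _ in range(3)],
} for _ in range(50)]

def position(fly, t, radius=2.0):
    """Smoothly drifting position of one firefly at time t."""
    return tuple(radius * math.sin(fly["freq"][i] * t + fly["phase"][i])
                 for i in range(3))

def brightness(t, period=3.0):
    """Blink cycle: ramp up over the first half of each period, dim back."""
    u = (t % period) / period
    return u * 2 if u < 0.5 else 2 * (1 - u)

frame = [position(f, 1.25) for f in fireflies]
print(len(frame))  # 50 firefly positions for this frame
```

In production the same idea is evaluated per frame by the effector; the point of the sketch is that a handful of per-particle random parameters is enough to make a swarm look organic.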

The tiger roar shot was a bit more complicated. To create the fireflies that buzz around the environment and then gravitate towards the center to the tiger’s head, a combination of X-Particles and Thinking Particles was used. X-Particles was faster to set up: only a volume emitter, an attractor and a few deflectors were needed to guide the fireflies to the center in a natural manner. Once the fireflies reached the center, they were killed off and Thinking Particles was used to emit from pre-rendered tiger footage to drive the firefly birth and color. Varying the intensity and direction in Thinking Particles made the roar very expressive and magical. After Effects was then used to add color and glow to the fireflies as well as add their reflections in the water.
CINEMA 4D was also used to help test techniques for the effect that called for a swarm of fireflies to gather and take the shape of a roaring tiger head. Once they’d come up with a plan, the Mirada team rigged the model in Maya using a combination of skinning and corrective shapes. When the animation was complete, the geometry was baked with Alembic and imported into Houdini for the articulate particle effects.

“The motion of the tiger roaring needed to be carefully considered as it would be the basis of the particles’ movement and determine the overall clarity of the image the swarm was meant to produce,” Shelton explains, adding that it was also important that the tiger have a regal feel to it. He also wanted the point of the actual roar to be in sync with the lyrics of the song. So combining the moving camera and the finite number of frames they had to work with when the tiger entered the frame was critical. “Luckily, the Mirada pipeline from Maya to Houdini made iterating between the two very quick,” he says.

After the tiger head had been animated in Maya and imported into Houdini, Mirada FX artists were able to scatter points on the tiger head and use them as attractor points for the particles, making the same number of points on the tiger as in the air. A custom tool was written to make the particles attract to the points and then release at a specified time. Houdini was able to control the brightness of each particle based on the Fresnel angle of the tiger to the camera.
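The two steps described here — pulling each particle toward its assigned attractor point, then driving brightness by the angle between surface and camera — can be sketched in Python/NumPy. This is a stand-in for the Houdini setup, with all data hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scattered target points on the "tiger head" surface and an equal
# number of airborne particles (as in the text, one attractor point
# per particle).
targets = rng.uniform(-1, 1, size=(200, 3))
particles = rng.uniform(-5, 5, size=(200, 3))

def step(p, t, strength=0.1):
    """Move each particle a fixed fraction of the way to its target."""
    return p + strength * (t - p)

for _ in range(60):
    particles = step(particles, targets)

# Fresnel-style brightness: points whose surface normal is perpendicular
# to the view direction glow brightest (normals approximated here by
# normalized positions).
view = np.array([0.0, 0.0, 1.0])
normals = targets / np.linalg.norm(targets, axis=1, keepdims=True)
brightness = 1.0 - np.abs(normals @ view)

print(np.abs(particles - targets).max() < 0.02)  # True: swarm has converged
```

"Release at a specified time" would simply mean switching `strength` to zero (or to a repulsion term) after a chosen frame, which is presumably what the custom tool controlled.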

Telling Great Stories
Mirada’s main goal, Wu says, is to tell good stories, whether they’re working on a music video, film, commercial or interactive project. And having the opportunity to work on something as high-profile as “Roar” was fantastic.

Both the song and the video have been widely publicized and praised, and the Mirada team is already hard at work on other projects. Wu is busy developing German author Cornelia Funke’s latest interactive storybook, while Shelton is currently leading a three-minute photo-real CG project. Reflecting on the “Roar” video, Wu describes the project as “a great experience for the entire team with great energy and all-around good vibes.”

Motion Theory website: http://www.motiontheory.com/]]>news-3545Mon, 06 Jan 2014 10:34:00 +0100Down-to-Earthhttps://maxon.net/en-us/news/case-studies/visualization/article/down-to-earth/Giving the Earth a starring role in a film was quite a challenge for artist Uli Henrik Streckenbach, but he used MAXON CINEMA 4D to find a unique solution that won him a prize at this year’s animago AWARDS.
The Earth below our feet: a topic that is more complex than it appears! When the IASS (Institute for Advanced Sustainability Studies) in Potsdam, Germany, gave Uli the job of creating a 5-minute informational visualization, he found himself confronted with the challenge of putting a faceless character into the limelight. What’s more, the content the film had to convey was so complex that a realistic depiction would in fact have been counterproductive, because it would not deliver the message in a visually concise manner.

This film was also part of a series of informational films that Uli had created, which meant that its look would have to match the previous films. These previous films had been created using a different 3D software – and this new film was going to be created using CINEMA 4D, which Uli had started using only months earlier. In short, the film’s look had already been defined and had to be maintained to ensure a coherent look for the series.

Uli mastered the task of depicting the Earth’s various layers by using a square section of Earth with stylized depictions of its different layers, such as vegetation, soil and subsoil. This square segment is then multiplied and pieced together like a puzzle to create larger areas while constantly maintaining the segmented look. CINEMA 4D’s Grid Array cloner and various Effectors were used to create this part of the animation.

Throughout the film, elements such as stones are ground up or decomposed, which was done using the Thrausi plugin for CINEMA 4D. This plugin makes it possible to break an object into a definable number of pieces, which can subsequently be re-used as part of a physical simulation. The free plugin Steady Bake was then used to bake the simulation into keyframed animation, which made subsequent tasks easier to handle, e.g., when moving these animations along the Timeline.

‘Let’s Talk About Soil’ was the first major project for which Uli used CINEMA 4D – less than three months after being introduced to the software. It was easy for him to implement the innovative realization and unique style that had already been developed for this project. The fact that Uli was able to so quickly create award-winning work with a new software application speaks not only for the artist’s outstanding abilities but for CINEMA 4D as well.

]]>news-3574Wed, 27 Nov 2013 12:11:00 +0100Love Is All Aroundhttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/love-is-all-around/For BBC One's 'Love 2013' series of channel idents, London design agency Vincent turned to CINEMA 4D's extensive MoGraph toolset
By Steve Jarratt
Britain's public service broadcaster, the BBC, has a rich history of channel idents spanning back to the 1950s. BBC One’s longstanding trademark was a globe of the world – initially created with mechanical models and mirrors and later using CG in the mid-1980s. After toying with hot air balloons and dancers in the 1990s and beyond, the circular motif reappeared in 2006, with a series of idents showing rings of kites, crop circles, a formation of swimming hippos, and so on. The series, entitled Circles, echoed the old BBC globe, with the 'O' standing as a symbol of unity.

The company responsible for these idents is Red Bee Media, a media management company based in London with offices across Europe and in Australia. To celebrate the spring schedule, and continuing an on-going 'Love 2013' theme, it turned to Vincent, a design agency in central London, whose work includes video game cinematics, brand idents, TV show intros and even visual effects for ‘Prometheus’ and ‘Quantum of Solace’. "We direct, write, design, animate and produce all our projects in-house," says director John Hill. "We're very hands-on. We are experienced in many disciplines, including live action direction, VFX, branding, art direction, design, 2D/3D animation and script writing."

The team at Vincent decided to use the circular theme in a very literal sense, creating an animated pin-cushion sequence of tubes arrayed and choreographed in sweeping circular patterns. The results are both photorealistic and abstract, but their apparent simplicity belies the work required to complete the project in just four weeks. When initially questioned about the number of polygons, director John Hill's one-word response was "huge"; asked which tools or methods were employed, his answer: "patience."

The project was handled by a four-person team running CINEMA 4D R14 on 12-core Mac workstations. As you'd expect, the MoGraph tools, specifically the Cloner in conjunction with various Effectors, were key in creating the sequence. "The pin cushion was created using the Cloner object," explains Hill, "with a series of effectors to control the height, width and colors. The Shader effector controlled most of the pin heights and color changes. Inheritance controlled a lot of the offset animations within selected groups of clones. And the Random effector was used to offset general timings and random heights of peripheral pins."

Hill highlights the Inheritance effector for its part in the project. "It was very useful to help offset timings within each group of animated clones," he says, "combined with the Shader effector to control heights and their color."

While the overall structure of the idents is relatively straightforward – using the cloner and effectors to automate the movement – the sheer volume of geometry meant that it was almost impossible to visualize the sequence in order to preview the timing. "The scene data was epic, and almost impossible to watch on-the-fly," explains Hill. "So we simply visualized/predicted in advance what the animation would look like in our minds."

To speed things up, the team used layers to group certain selections of pins, which they could then view and tweak independently. "We rendered out hardware previews for accurate timings and flow," adds Hill, "followed by cached low-resolution MoGraph renders for a more comprehensive preview." To get the specific look they wanted, the team then used XPresso to control those groups of clones that had specific hero animations.

However, not everything could be automated, such as the little flowers that blossom at precise times. "That was a bit difficult," admits Hill. "It's easy creating random timings for animations, but on-cue and ordered is quite difficult. We found ourselves doing this by hand, keyframing and using the Inheritance effector most of the time."

The animations were rendered out using Global Illumination on Vincent's in-house render farm, which consists of eight render clients. "Some sequences were rendered with the Physical and some with the Standard render engine," says Hill. "Standard is better for speed, but not as good for GI renders, so Physical was used for those sequences. Physical in R14 can be quite slow when you crank up the sampling settings to reduce the noise, so we needed a lot of rendering power to get these done in time. Because most of the sequences were quite evenly and broadly lit, we did have a few noise problems with the Physical Renderer that had to be smoothed out in post-production. Remove Grain plugins always help with this in the composite."

All of the idents look great but the nighttime sequence with the subtly glowing pins is particularly effective. "We find using lights within objects is a good, speedy workaround to help create natural GI-style lighting," commented Hill. "Varying lights’ color and intensity also helps break up the overall lighting ambience very well."

With the sequences rendered, the team then turned to Adobe After Effects to add the final polish. To aid in the compositing, Hill remarks how the animations were output as a range of render passes: "We made use of the Ambient Occlusion, Diffuse, Shadow, Specular, Reflection and Depth passes amongst others to help grade and separate areas of color," he says. "The camera data helped to add 2.5D color fills and gradients over the 3D rendered content." The team rendered out various mattes and then composited gradient color ramps as 2.5D overlays. "These worked on top of the 3D render to help transition between waves of color," adds Hill, "but you can do this easily now with the CINEWARE plugin in Adobe After Effects CC."
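Pass-based compositing of the kind Hill describes can be sketched as simple image arithmetic. Below is a hedged Python/NumPy illustration; the pass formula shown is a common additive convention, not necessarily Vincent's exact comp:

```python
import numpy as np

rng = np.random.default_rng(7)
h, w = 4, 4  # tiny stand-in "frames"; real passes would be full renders

# Hypothetical render passes as float images.
diffuse    = rng.uniform(0.0, 1.0, (h, w, 3))
ao         = rng.uniform(0.5, 1.0, (h, w, 1))   # ambient occlusion
specular   = rng.uniform(0.0, 0.3, (h, w, 3))
reflection = rng.uniform(0.0, 0.2, (h, w, 3))
depth      = rng.uniform(0.0, 1.0, (h, w, 1))   # 0 = near, 1 = far

# Common additive comp: occlude the diffuse, add the light passes, then
# use the depth pass to blend in a distance-based color grade (the
# "2.5D color fills" idea).
comp = diffuse * ao + specular + reflection
fog_color = np.array([0.9, 0.85, 0.8])
graded = comp * (1 - depth * 0.3) + fog_color * depth * 0.3

print(graded.shape)  # (4, 4, 3)
```

Because every pass stays a separate array until the final sum, each one can be regraded independently in the composite without re-rendering the 3D scene, which is the whole point of multi-pass output.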

Clearly this project pushed Vincent and CINEMA 4D pretty hard, but Hill is full of praise for the app: "Again, CINEMA 4D is by far the best 3D software for motion graphics and general creative flexibility," he says, calling it "stable, fast and intuitive." However it's possible that the new CINEMA 4D R15 would have saved a lot of time on this project: "The faster rendering and live networking is fantastic," claims Hill. "The ability to make use of peripheral computers to help with processing on-the-fly is really great – it should really speed up our workflow."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

Vincent London website: www.vincentlondon.com]]>news-3628Thu, 21 Nov 2013 11:18:00 +0100Speechless – but not Because the Cat’s got Your Tongue.
Creating a horde of characters and animating them is generally a job for a large studio – but the talented team at Job, Joris & Marieke studios did it themselves using MAXON CINEMA 4D.
Creativity can work in mysterious ways – as the short film ‘Mute’ proves. It was based on a cut Joris suffered while swimming: the cut looked like a mouth, so he started joking around and made his toe “talk”. Job and Marieke thought it was disgusting – but it gave them a great idea for a short film!

Job, Joris and Marieke are actually designers, but they all were yearning to do animation. After the basic storyboard had been finished, they turned to CINEMA 4D and started character development. They needed numerous characters who would each create their own mouth. In fact, the entire cast is made up of only three different characters whose look was varied using textures. “To find the right wardrobe for our characters, we researched people named Dieter, Klaus, Marcy and Cindy online. Surprisingly enough, people with these names had the exact look we wanted for our characters,” states a smiling Marieke.

Everything from a chainsaw to a kitchen knife, a hand mixer and a record was used to perform the operations. “Since we only had three base models for all characters, modifying the geometry to create each of these effects was not possible. We used sub-polygon displacement maps instead. This was perfect for creating individual mouths for each character. Even the guy who uses a chainsaw to create his mouth turned out just as we planned,” said Joris.

‘Mute’ looks very elaborate – and indeed it was no trivial undertaking considering that the team had never before created such a complex animation: four minutes of “cutting edge” animation with a heavy dose of dark humor. “Thanks to CINEMA 4D’s intuitive interface, we were able to quickly learn what we needed to know to animate our characters. It only took us about three months to finish the film,” remembers Job.

Since its debut, ‘Mute’ has been nominated for the ‘Golden Calf’ award at the Dutch Film Festival and was awarded top honors in the Dutch Playground Festival’s ‘Best Independent’ category. Surely this is only the tip of the iceberg, since ‘Mute’ will be shown at numerous other festivals in the coming months. We wish Job, Joris & Marieke the best of luck!
"Making of" video: www.vimeo.com/78911041

news-3514 | Mon, 18 Nov 2013 09:29:00 +0100
The Spirit of the Machine
https://maxon.net/en-us/news/case-studies/advertising-design/article/the-spirit-of-the-machine/
The confrontation between constant exposure online on the one hand and the need for privacy on the other motivated artist Mike Winkelmann, a.k.a. Beeple, to create a short film on the subject, which he realized using MAXON CINEMA 4D.

Beeple is an extremely prolific artist who creates a work of art every day, regardless of how small it is. He also works on larger projects that take longer to create and works as a web designer, all the while keeping a keen eye on the world around him and articulating what he observes in his art.

His latest work was obviously influenced by concern and contemplation, which Beeple successfully articulated using his unique style. He made a short film titled “Transparent Machines”, which deals with the problematic discrepancies between the extroverted exhibitionism taking place online and in social networks on the one hand and the constant admonishments for the protection of privacy online on the other. Beeple’s statement is clear: we are all transparent machines!

The short film shows an unbelievably complex arrangement of machine parts that unfold, open, interlink, move, bolt together and more: in short, they interact in every way imaginable. The mechanical movements create a wave across the machines. The camera follows the movement and reveals an increasing number of glass parts.

“The greatest challenge was to create such a complex object with so many moving parts that were also rigged and animated. The object was also developed ‘on-the-fly’,” said Beeple. “Since each part of the machine was an individual moving part, there was no way to automate the movements. It was a very time-consuming process.”

Before starting this project, Beeple did not have any experience with a linear workflow. This was also the first time he rendered an animation as single images in 4K resolution, which he then imported into After Effects to assemble the animation. There, he added camera movement to make the animation look unsettled – as if a hand-held camera had been used. He took great care in creating the glass material, using Subsurface Scattering and volumetric fog in the Refraction channel.

When the animation was rendered using V-Ray, Beeple decided on a very simple setup. “No lights, no GI, no sky, no Ambient Occlusion – not even omitting depth of field resulted in acceptable render times,” remembers Beeple. “The whole glass look was actually a mistake – I wanted an entirely different look. But when I saw the entire machine in the glass look it went ‘bang’ in my mind and the entire concept was changed,” laughs Beeple. “During production, I sometimes doubted that I could finish the project. The huge scene with countless assets and thousands of keyframes in the Timeline! But once again, CINEMA 4D lived up to its reputation. The clear interface, which offers quick feedback regardless of the number of objects, let me locate items I wanted to work on anywhere in that mountain of elements in the scene so I could fine-tune and accentuate them exactly the way I wanted to!”

MPC's Motion Design Studio, in collaboration with creative ad agency Boys and Girls and director Brett Foraker of production company RSA, was faced with the technical challenge of incorporating fast-moving CG elements into a single, 40-second camera take. The concept for the advert was that, through mobile phones, everyone has access to many different digital applications. Conveying the benefits of that digital lifestyle through the Three Mobile network was a challenge that started with HDRI and reference measurements taken on a cold street in Prague.

The hero figure of the film walks down a street, encountering other people and accessing social media, collecting music albums in Spotify, playing fantasy football and finally encountering a wall of Instagram photos.

The Motion Design Studio team, part of MPC, was led by Eliot Hobdell, with the efforts of 15 artists over a period of four weeks. CINEMA 4D R14 Studio was used for all the 3D elements in the 40-second spot, with the final piece put together in NUKE and then polished off in Flame.

Eliot commented, "It took a lot of R&D and Look Dev to get it all working, with various technical hurdles to overcome. In particular, we developed a bespoke voxel effect for the fantasy football section as well as sculpting multiple dynamic simulations for the Instagram section at the end. For all sections, we tried a lot of different looks and approaches before we settled on the final treatment."

The main focus throughout the spot was to get the real world and the CG world interconnected in a realistic fashion. The VFX work needed to fit seamlessly around the hero, as well as provide a driving force behind his journey down the street. In order to achieve this, the hero's sightline and physical position were precisely measured to control how the CG elements were fitted around him. Eliot remarked, "The director, Brett Foraker, is extremely experienced in this kind of VFX work. We took his original treatment and storyboards and worked with him to develop the best techniques and toolsets to use."

The whole ad was a single-take shot, which created some challenges. The final take that was used was more than three seconds over length, and needed to be time-warped to match the required 40-second slot. This meant there were time-warping artifacts that had to be removed throughout the film. The Motion Design Studio pipeline had to accommodate this time-warp right from the start. So, the 3D camera track was created from the original plate and then the time-warp being used on the footage was applied to it. Lead Flame operator Franck Lambertz built a custom NUKE script to do this.
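The retime itself is simple arithmetic: every frame of the delivered 40-second slot maps back to a frame of the longer source take, and the same mapping has to be applied to the solved 3D camera. A Python sketch of the idea (the frame rate and the 43.2-second take length are assumptions for illustration, not production figures, and the real tool was a custom NUKE script):

```python
# Map frames of a 40-second delivery slot back into a longer source
# take, so a camera track solved on the original plate can be
# time-warped the same way as the footage.
# Assumed numbers: 25 fps, 43.2-second over-length take.

FPS = 25
SOURCE_SECONDS = 43.2   # assumed length of the over-long take
TARGET_SECONDS = 40.0   # required delivery slot

def retime(target_frame):
    """Return the (fractional) source frame for a given target frame."""
    speed = SOURCE_SECONDS / TARGET_SECONDS   # 1.08x speed-up
    return target_frame * speed

def warp_camera_key(source_keys, target_frame):
    """Linearly interpolate one camera channel (sampled per source
    frame) at the retimed fractional frame."""
    f = retime(target_frame)
    i = min(int(f), len(source_keys) - 2)
    t = f - int(f)
    return source_keys[i] * (1 - t) + source_keys[i + 1] * t

# 40 s * 25 fps = 1000 delivered frames, pulled from 1080 source frames
last_source = retime(TARGET_SECONDS * FPS - 1)
```

At 25 fps the 40-second slot comes to 1,000 frames, which matches the frame count mentioned later for the marker clean-up. Because the retimed source frames are fractional, interpolated in-between camera positions are where the time-warping artifacts come from.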

There were a number of technical challenges to overcome, starting with camera tracking and match-moving. These were particularly difficult because there was a very shallow depth of field, a 180-degree camera swing and very little texture detail in the background. To overcome this, the camera track was done in sections that were then stitched together, mainly using SynthEyes and PFTrack. As the hero walked along, the original concept was for the icons to form in the air around his head, but this impacted his performance, especially when he was talking.

Instead, director Brett Foraker suggested there be an enhancement of the phone screen, in a heads-up display style. So, over the course of a week, they looked at numerous ways for how the icons could be re-worked into this shot. The trick then was to make the icons feel like they were connected to the swing of the phone in the hero's hand while walking, but also moving in the air around him.

Having generated a camera track for the shot, the Motion Design Studio then used tracking markers placed on the hero to generate an object track of his movement in relation to the CG camera. This was then adjusted by hand to give a tracking null for the icons, so they felt like they were rolling with his body movement while staying connected to the phone, which was out of shot. The other half of this section came as the hero encountered a group of young women, which initiated the social media world and showed him interacting with both the women and the CG elements while adding someone on Facebook. The challenge here was tracking the geometry onto the hero's shoulder and rotating the jacket texture to create the voxel effect for the Facebook thumbs-up. The geometry was match-moved and then CINEMA 4D's MoGraph was used to fill the volume with cubes, which were animated with effectors.
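The MoGraph step of filling match-moved geometry with a grid of cubes can be sketched in plain Python. In this hedged sketch a sphere stands in for the tracked shoulder geometry, and the regular-grid-plus-inside-test logic approximates what a Cloner in object mode does; none of the numbers come from the production.

```python
# Fill a volume with a regular grid of cube centers, keeping only the
# ones inside the shape - the same idea as a MoGraph Cloner filling a
# mesh volume with cube clones. A sphere stands in for the real
# match-moved shoulder geometry.

def fill_with_cubes(radius, spacing):
    """Return grid points with the given spacing that fall inside a
    sphere of the given radius centered at the origin."""
    centers = []
    n = int(radius / spacing)
    for ix in range(-n, n + 1):
        for iy in range(-n, n + 1):
            for iz in range(-n, n + 1):
                x, y, z = ix * spacing, iy * spacing, iz * spacing
                if x * x + y * y + z * z <= radius * radius:
                    centers.append((x, y, z))
    return centers

clones = fill_with_cubes(radius=1.0, spacing=0.5)
```

In the real setup the effectors then animate each clone's scale and position over time, which is what sells the voxel "build-up" of the thumbs-up.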

One of the more laborious tasks was painting out the tracking markers on the actor's shoulders, arms and wrists, as well as the phone, for the first two sections of the shot for every frame of the 40-second advert. This was no fewer than 1,000 frames and, along with the markers, the body and hair of the hero had to be rotoscoped as well, while the entire street was modelled and geometry-matched. To round it off, there was also lighting, texturing and compositing.

Music has become increasingly digital, and that's what had to be conveyed in the Spotify section. The idea was to enhance records that were dropped onto the set with CG extras. The album cover that is actually caught in the shot also needed to be replaced, and the sequence had to be modified to show the albums breaking up into pixels as they came into contact with the floor. This is where CINEMA 4D's aerodynamics simulation and an XPresso script came into play. Manual animation was used to match digital doubles to the original records, and an aerodynamics simulation was created for the extra records. A 2D planar track of the album cover that the hero catches was created, and an XPresso script was set up to swap the covers with exploding MoGraph clones for the pixels when the records made contact with the floor plane.
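The trigger described above boils down to a per-frame condition: while a record is above the floor it renders as itself; the frame it touches the floor plane, it is permanently swapped for exploding clones that inherit its cover texture. A hedged Python sketch of that logic (the object names and heights are made up, and the real version was an XPresso node setup, not Python):

```python
# Per-frame "swap on contact" logic, XPresso-style: while a falling
# record is above the floor it renders as itself; on contact it is
# replaced by exploding pixel clones carrying the same cover texture.
# Names and numbers are illustrative.

FLOOR_Y = 0.0

def evaluate(records, swapped):
    """records: {name: (y_position, cover_texture)}
    swapped: set of records already replaced by clones (the swap is
    permanent, so a bounce can't un-explode a record).
    Returns render instructions for this frame."""
    instructions = []
    for name, (y, texture) in records.items():
        if name in swapped or y <= FLOOR_Y:
            swapped.add(name)
            instructions.append(("pixel_clones", name, texture))
        else:
            instructions.append(("record", name, texture))
    return instructions

swapped = set()
frame1 = evaluate({"rec_a": (2.0, "cover_blue"), "rec_b": (0.0, "cover_red")}, swapped)
frame2 = evaluate({"rec_a": (1.0, "cover_blue"), "rec_b": (5.0, "cover_red")}, swapped)
```

Carrying the texture through the swap is what the article means by the script "handling the swapping of the textures from the covers they were replacing."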

This script also handled the swapping of the textures from the covers they were replacing. Of course, this did mean that the floor was then covered in albums that would have interfered with the next section with the footballers. To get round this, a digital matte painting of the floor was created and tracked into the shot.

One of the most challenging elements came next, as it featured fantasy football players as if they were pixelated computer game characters, running past the hero. Eliot explained the issues: "As time was short, we needed to develop a way of creating this shot using the players in the original shot. Tests showed that using a 2D pixelating effect on the plate wasn't going to work, as it didn't have any Z depth to the effect. Also, the alternative of creating digital doubles and animating them running past would have taken too long."

The solution featured using an XPresso script to drive a MoGraph setup. Eliot revealed: "From the Look Dev tests, we evolved a bespoke XPresso script that we were able to feed the roto outlines of the players into. This script then drove a MoGraph setup to divide a grid of clones evenly into the changing shapes of the players' outlines. This meant that the grids got smaller as the players moved further away. This was then match-moved to the players' Z depth as they ran through the shot. We also created textures for the players that expanded their outlines over the plate for the CG set up and eroded their outlines in the back plate. This meant that, when composited in, the characters had crisp edges against the plate."
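The core of that setup, dividing a fixed count of clones evenly across an outline whose on-screen size changes per frame, reduces to simple arithmetic: the cell size shrinks automatically as the outline's bounding box shrinks with distance. A Python sketch of the idea (the clone counts and outline sizes are invented; the production version was an XPresso script driving MoGraph):

```python
# Divide a fixed grid of clones evenly into a character outline whose
# on-screen size changes per frame: the cell size (the apparent "pixel"
# size) shrinks automatically as the player moves away from camera.
# Counts and outline sizes are illustrative.

GRID_COLS, GRID_ROWS = 10, 20   # fixed clone count per player

def layout_clones(outline_width, outline_height, outline_x, outline_y):
    """Return (center, size) for each clone cell tiling the outline's
    bounding box; outline_x/outline_y is the box's lower-left corner."""
    cell_w = outline_width / GRID_COLS
    cell_h = outline_height / GRID_ROWS
    cells = []
    for row in range(GRID_ROWS):
        for col in range(GRID_COLS):
            center = (outline_x + (col + 0.5) * cell_w,
                      outline_y + (row + 0.5) * cell_h)
            cells.append((center, (cell_w, cell_h)))
    return cells

near = layout_clones(100.0, 200.0, 0.0, 0.0)   # player close to camera
far = layout_clones(25.0, 50.0, 60.0, 10.0)    # same player, further away
```

Unlike a 2D pixelation filter, the cells here belong to tracked 3D geometry, which is what gives the effect the Z depth the flat plate lacked.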

The final part of the film required a wall that breaks up into Instagram pictures and flies up the street. This was mastered in CINEMA 4D by creating a CG wall from MoGraph clones, then sculpting dynamic simulations using MoGraph tags and wind generators. Added to all this was the ongoing cleanup from time-warp artifacts and tracking markers, as well as lighting, texturing and final compositing.

When the project was finished, Eliot commented, "This project was a perfect blend of creative and technical challenges and was a pleasure to work on from the initial pitch to the final delivery."

The director, Brett Foraker, was equally pleased with the result that the Motion Design Studio team and CINEMA 4D had created, remarking to Eliot, "You and your team did great. I was really happy with how we kept pushing the quality forward. This kind of collaboration is my favorite way to work by a mile!"

Duncan Evans is a freelance journalist, photographer and author.

You can see the finished film, called Three, Digital You, here:www.youtube.com

news-3542 | Mon, 28 Oct 2013 14:07:00 +0100
It’s Only Paper, Right?
https://maxon.net/en-us/news/case-studies/visualization/article/its-only-paper-right/
Pingo van der Brinkolev only wanted to create a new type of stock photo using MAXON CINEMA 4D, but things turned out differently.

What Pingo van der Brinkolev actually wanted to do was create a new type of animated stock photo, for which he had already created a design. What was special about his scenes was the origami-like look of the models, which all looked as if they were made of folded paper. Pingo also created looping animations for these folding objects, which means the observer can't tell when the animation begins or ends.

Pingo put a lot of work into the look of the paper, which he wanted to render flicker-free using GI within a realistic timeframe. Using a combination of QMC GI, backlighting and a balanced amount of secondary reflection, he was able to reduce the render time to about 5 minutes per frame. Pingo used the Cloner object to create the animation loops, and MoGraph was one of the most important tools for adding a slight amount of irregularity to the otherwise very smooth movements.

When Brinkolev presented his works as stock graphics, they were received very positively. The webmaster for the site on which the animations were offered was impressed. But nobody bought the animations! So his work wouldn't be for naught, Pingo added some camera movement, re-rendered the animations and created a short film with them that he posted on Vimeo. A few days later, the short film, named ‘It’s Paper’, was made a Staff Pick and went on to become a hit. Brinkolev received invitations to be a keynote speaker as well as numerous offers to collaborate on projects. In all, Pingo says the response and resulting opportunities were overwhelming, and adds with a wink, “… with very little effort!”
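A looping animation of the kind Pingo built, movement that ends exactly where it starts with a touch of per-clone irregularity layered on top, can be sketched numerically. This is a hedged stand-in, not his setup: the real version used the Cloner object and MoGraph effectors, and the sine-based wobble here is only an assumed way to get deterministic irregularity.

```python
# Sample a seamless animation loop: a fold angle driven by a whole
# number of sine cycles over the loop length, plus a deterministic
# per-clone phase offset for MoGraph-style irregularity. Because the
# drive completes full cycles, frame 0 and frame LOOP_FRAMES match
# exactly, so the viewer can't tell where the loop begins or ends.

import math

LOOP_FRAMES = 100

def fold_angle(frame, clone_index, cycles=2, wobble=0.1):
    """Fold angle in degrees for one paper element at one frame."""
    phase = 2 * math.pi * cycles * (frame % LOOP_FRAMES) / LOOP_FRAMES
    offset = wobble * math.sin(phase * 3 + clone_index * 1.7)  # irregularity
    return 45.0 * (math.sin(phase) + offset)

start = [fold_angle(0, i) for i in range(5)]
end = [fold_angle(LOOP_FRAMES, i) for i in range(5)]
```

The key constraint is that the irregularity term is itself periodic over the loop, otherwise the seam would show.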

Pingo van der Brinkolev's website: www.pingo.nu

news-3486 | Fri, 18 Oct 2013 13:22:00 +0200
HIVE-FX on How They Use CINEMA 4D for NBC’s Grimm
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/hive-fx-on-how-they-use-cinema-4d-for-nbcs-grimm/
With two seasons of NBC’s Grimm now behind them, Portland, Oregon-based HIVE-FX has explored all sorts of ways to turn people into monsters.

“We used to do an overall blend from human to creature, creating a cross-dissolving effect,” says Guy Cappiccie, HIVE-FX’s shot supervisor and lead creative for Grimm, explaining the innovative approach to creature morphing that landed the studio a coveted vendor spot on the series. “But now we’re blending and moving displacement based on shapes to help break up the characters’ transitions in a more organic way.” The result is a slower, more realistically detailed morph from ordinary-looking human to monster.

HIVE-FX specializes in doing VFX shots for the series that involve morphs, creatures, animals and hair. That might sound strange to those who don’t know the show, which is based on the fairy tales of the Brothers Grimm and centers around Portland homicide detective Nick Burkhardt (David Giuntoli).

Refining the Process

When HIVE-FX started work on season one of Grimm, only a handful of people on the creative team had experience with character animation, including Jim Clark, the company’s president. Things have evolved over time as NBC has gotten to know the team’s work, and Cappiccie has not only composited the finishing touches on characters, he has built a pipeline that enables HIVE-FX to create things they could only dream of not long ago. “We still go over some concepts, but we’re getting more control over the process,” says lead compositor James Chick.

Devising a pipeline that works has not been easy. Most creatures begin as photos of the actors taken from every angle. Next, the photos are used to create 3D replicas of their heads using tracking points as a reference. Pixologic’s ZBrush is used for 3D sculpting, and models are rigged and animated in Maya. CINEMA 4D is used for hair, surfacing, lighting and rendering.

Though Clark had long been working in CINEMA 4D and knew he wanted to use it for creature making, HIVE-FX didn’t know any artists who were using the software for that purpose. Undaunted, they hired Maya artists and taught them how to do specific things like use CINEMA 4D’s Hair feature.

To help smooth the path between Maya and CINEMA 4D, HIVE-FX hired the German software development team at2 GmbH to design a custom plug-in that takes the deformed mesh out of Maya and, on a per-point basis, caches it into a single file, allowing the point cache tags to be read inside CINEMA 4D so camera and point positions can be matched perfectly.

Making Monsters

With a few exceptions, the three preferred vendors who create characters for the series tend to work on a specific set of monsters to ensure character continuity. HIVE-FX gets about three to four weeks to work on each episode, and it is common for them to be working on two or three episodes at the same time.

Recently, Cappiccie’s team has been working on several characters, including Don Nidaria, a lion-like creature that passes as a rich but abusive man in the human world. After killing his wife, he hires a slimy attorney (who morphs into a goat creature) to get him off the hook. “We’re known for the way we do the interaction between hair and skin, which has been so important with some of our characters, like the lion,” says Chick. “We’ve pushed how we use Hair in CINEMA 4D since the beginning of the second season and we’ve made massive strides in how hair moves naturally by driving it dynamically using realistic 3D and object tracks.”

Their most disturbing new creature is a ghoulish Hexenbiest named Frau Pech that they created on their own. “NBC said they wanted another Hexen, so our modeler Jerod Bough went for it and created Frau,” Cappiccie explains. “I wanted her to be as gross as possible, so I had her throat be exposed, allowing us to see it moving around when she talks.”

Having never modeled characters with the kind of holes needed for the exposed throat, getting the rigging right was challenging. “The trachea is one separate piece, and it’s rigged in a way that allows individual control of the mouth movements,” he continues, adding, “[Those] stitches over her eye – they’re rigged to stretch with her eyelids so we can tighten and loosen them.”

Running a close second to Frau is the “saggy lady,” the creature that detective Burkhardt’s girlfriend Juliette (Bitsie Tulloch) morphs into in a nightmare dream sequence during season two. “Juliette normally doesn’t morph, but this is a dream and it’s also the most advanced morph we’ve created,” Cappiccie recalls. “This one really shows how we’re using displacement so you can see her body morphing slowly, starting with her hand and crawling up her arm to her neck.”

See some creature morphs from the show here: www.hive-fx.com

news-3482 | Thu, 10 Oct 2013 10:20:00 +0200
Ornamental Organ Decoration Created with CINEMA 4D
https://maxon.net/en-us/news/case-studies/advertising-design/article/ornamental-organ-decoration-created-with-cinema-4d/
Wood – the epitome of organic materials – is not the easiest material to bring into shape. But new paths were forged using the right tools and CINEMA 4D.

When the firm of Orgel Mayer was given the job of building a new organ for the philharmonic in Penza, Russia, they wanted to use new methods for creating various elements of the organ’s decoration. The organ was designed and built using traditional methods, but the decorations were first modeled in 3D and then milled out of wood using advanced milling machines. German-based 3D Holzdesign, which has many years of experience in this field, was commissioned to create the decorative wood elements. 3D Holzdesign specializes in creating wooden elements that are otherwise not feasible to produce. The company’s Andreas Weinzierl spent several years developing a method of milling based on virtual 3D models: he is able to use virtual templates to mill wooden objects, which is exactly what Orgel Mayer needed for its project.

So much for the technical side, all that was needed next were the virtual designs. In the past, this would be done by a wood sculptor who would work from sketches that he would painstakingly carve out of wood. The ornamental design was created by the team at Wiesbaden, Germany-based Augenpulver studios that specializes in 3D design. Paul Kramer and his team created the designs based on a scale CAD model of the organ. “The CAD model of the organ was easily imported into CINEMA 4D,” Paul recalls, “after which we created basic models for the areas in question, which were then sculpted using CINEMA 4D’s sculpting tools.”

3D Holzdesign took the models created by Augenpulver and milled the shapes out of special blocks of wood using their proprietary connection to the CNC milling machine. The wood was ‘special’ because it was old and had been stored and dried over a long period of time to ensure that its shape would remain constant, i.e., no shrinkage or distortion. The largest of these blocks was valued at more than €15,000. The finished designs could only be fitted on the finished organ, which was already standing in Penza, Russia – which meant there was absolutely no room for error! This was made even more of a challenge by the fact that the milling machine’s limitations and characteristics had to be taken into consideration. The diameter of the milling bit predetermined the size of the smallest possible notch or recess. Each tendril had to have its own guide spline within the model, along which the milling machine could orient itself. The radian measure, which defines the milling machine’s movability and angular position, also had to be considered.

“In the end we were all fascinated by the milled parts! Having virtual models on the monitor is different from holding the finished pieces in your hands. A fascinating experience that was overshadowed by the fact that we didn’t know if everything would fit as planned. But we were very elated when the call from Russia came and we were told that everything fit perfectly!”, remembers Paul Kramer. “But then, CINEMA 4D was the perfect tool for the job: Real units of measure, really good import tools, sculpting tools, and the Object Manager gave us an excellent overview of Generators and Deformers, which made it possible to quickly achieve the results we wanted,” concludes Paul and adds: “The team at Augenpulver looks forward to more projects of this type.”

Website Orgelbau Mayer: www.orgelbau-mayer.de

news-3420 | Thu, 26 Sep 2013 11:04:00 +0200
Bavarian State History Museum
https://maxon.net/en-us/news/case-studies/architecture/article/bavarian-state-history-museum/
200 entries were submitted for an architecture competition for a new museum in Regensburg, Germany – and the winning design was created with MAXON CINEMA 4D.

When the historic section of Regensburg was declared a World Heritage Site in 2006, it was seen not only as an award and an opportunity but also as a challenge for the future. Tourists from all over the world would want to visit Regensburg, but any new structure would have to meet a wide range of requirements: the right aesthetics, fitting into the traditional look of the historic section and effectively fulfilling its intended purpose.

The city of Regensburg was confronted with exactly these issues when it decided to build the Bavarian State History Museum in the historic section of town. This challenge was presented to architects who were invited to compete in a competition to design the new museum. Among the participants was Wöerner and Partner, who submitted their own design. The only thing that was missing was a fitting visualization of the design that had to be integrated into images taken of the proposed site. "wörnertraxlerrichter" decided to have the visualization done by Levin Dolgner, an experienced 3D artist, who was thrilled at the opportunity.

The project began with a photo shoot on site. The geometry, which was originally created in Allplan and subsequently edited in CINEMA 4D, was divided onto layers and set up for rendering. The geometry was adapted to the scene using a Camera Calibration tag in CINEMA 4D. BodyPaint 3D was used to edit the remodeled building’s UV coordinates and add a fitting structure to the façade. The lighting in the scene was then adapted to match that of the on-site photos. The scenes were rendered using CINEMA 4D’s Multi-Pass function, paying special attention to the depth pass, which would be used to create realistic-looking contrast in the compositing phase. The fact that layers for Ambient Occlusion and shadows could be defined, rendered and applied separately was also a major factor in creating a successful visualization.

For Wöerner and Partner and Levin Dolgner, the competition ended up consisting of three tasks. Phase 1: creation and realization of the design; Phase 2: editing the designs after winning the competition; Phase 3: a final VOF procedure that verified the winning entry’s feasibility and the submitting company’s ability to perform and its qualifications for realizing the project.

For Levin Dolgner, CINEMA 4D was a fundamentally important tool in all three phases; its flexibility and reliability proved to be invaluable assets.

"The goal was to create an engaging animation to excite the online design community about the prospect of being part of Heineken history," says Alex. "Vector Meldrew was hired as a director by the Knock Knock production studio based in Shoreditch."

Now, for readers outside the UK, we should explain that the name of Alex's company is derived from a British TV comedy show called 'One Foot In The Grave,' starring a character called Victor Meldrew – a perpetually surly character, for whom nothing ever seems to go right. Fortunately, Vector Meldrew (with an 'e') appears to be the complete opposite, with a growing portfolio and several awards under his belt.

Although he started out as a one-man operation, Alex is much more collaborative these days, pulling in other artists depending on the scale of his projects. For the 60-second Heineken promo, he joined forces with Sam Coldy, who provided illustrations, and James Doherty from Greeble.tv, as the team's Adobe After Effects guru.

Armed with the brief, Alex and his team set about planning the sequence. "There were a lot of storyboards and choreographing," he explains. "We even ditched some pretty solid ideas in order to refine things. The original concept involved over 100 transformations, but we decided to keep it to around four main sections."

Alex describes how the concepts for the different sections came about: "The first section is influenced by the artwork of one of the judges, design legend Joshua Davis. We loved his style of random generative graphics and use of repetition with geometric shapes. Everything else was inspired from various pieces of advertising history that we took from the Heineken Archives."

"One definite highlight was when Dan Chaput, the Creative Director of Knock Knock, beatboxed a brief to our sound designers to get his ideas across," says Alex.

With so many different styles being used, a range of CINEMA 4D's tools were employed. "A lot of the more bold colourful styles rely on a hint of ambient occlusion to give it a subtle punch," explains Alex. "For the more shiny elements, we tried to avoid Global Illumination and instead used the HDRI packs from Greyscale Gorilla. We used Thinking Particles to create the 'fireflies' effect and a lot of MoGraph cloners."

"I'm addicted to MoGraph," he adds, "but lately I've been looking more into XPresso. It'd be great to expand the tools in both of those areas."

However, Alex does admit that a lot of the sequence's success came down to clever compositing and multi-pass renders. "There's a lot of camera movement that may look like one shot," he says, "but is really just blending between compositions in After Effects. The blend camera tool in R14 also helped a lot to achieve the more extreme movements."

After Effects was also used in conjunction with Trapcode Particular for some particle effects, and Video Copilot's Optical Flares. But, as Alex states, "After Effects was only used to enhance CINEMA 4D renders; nothing in the film is pure After Effects."

The project took six weeks to complete, and was achieved using an eight-core Mac Pro plus a single render node. "The aim was to bring to life an otherwise inanimate object," declares Alex. "CINEMA 4D helped us bring our abstract concepts to life."

Steve Jarratt is a long-time CG enthusiast and technology journalist based in the UK.

news-3394 | Sun, 01 Sep 2013 13:00:00 +0200
The Credit Card World
https://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/the-credit-card-world/
Discover how The Studio at Smoke & Mirrors leveraged the power of CINEMA 4D to create an animated world made up of a massive number of credit cards.

When the London-based studio was given the task of producing an animated TV and digital commercial for MBNA, CINEMA 4D was the software of choice for creative director Dan Andrew. The concept was to show how credit cards had helped shape people’s lives and that MBNA was the power that made it all happen. This took the form of a single animated camera move through a world made of credit cards. Trigger points at certain positions along the journey through the 3D world launched specific animated vignettes, all with a sense of fun and purpose. The world was created predominantly from plain white credit cards and MBNA affinity cards. Simple 3D objects such as planes, cars, trees and clouds also populated the scenes.

The first technical challenge revolved around how many credit card objects were actually needed, as Dan Andrew explained: “We had to create a world made of a grid of credit cards and we ended up using 10,000 of them. These cards had to flip without intersecting each other or the floor, while keeping control of their pivot points and those of the textures. We wanted to have a single long camera move, so we had to keep the project light and tidy because we couldn't use any cuts.”

With a team of just two artists and a three-week deadline, CINEMA 4D and its MoGraph feature were the obvious choices. Everything had to be scripted carefully to work together, not to mention how large the scene became as more and more card structures were added and animated. Polygon and texture overheads were kept as light as possible by using Instances under a Cloner to lay down the main credit card grid. Nicola Gastaldi, the lead CINEMA 4D artist, explained why they did it that way: “We made it editable so we could change those main instances in other credit cards, or buildings made of credit cards, and then select some of these instances and animate them with a Fracture object and Plain or Shader effector.”

All along, overhead considerations were foremost in order to keep the project manageable. “To create the buildings, we used a Cloner and the instance of the credit card model. A Shader and a Color shader were used on the texture to keep the sides of the cards white, but not the top of the building. A light Random effector helped us, adding a little bit of realism to the stacks of cards. The Fracture object usually moves the pivot point to the center of the object, but not with Instances. So, using Instances was the best choice for our project, as we had total control of the cards and their axis centers. Each card was itself an Extrude object of just one instanced rectangle spline, so we could change all the spline points at once,” Nicola revealed.
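The memory argument behind using Instances can be illustrated outside CINEMA 4D with a minimal, purely hypothetical sketch (these classes are not the C4D API): thousands of placed cards all reference a single shared mesh, so the per-card cost is just a transform.

```python
class Mesh:
    """Shared geometry: stored once, however many cards reference it."""
    def __init__(self, vertices):
        self.vertices = vertices

class Instance:
    """A placed card: a reference to the shared mesh plus its own transform."""
    def __init__(self, mesh, position, rotation=0.0):
        self.mesh = mesh          # a reference, not a copy of the geometry
        self.position = position
        self.rotation = rotation

# One card mesh, 10,000 placements -- the grid size quoted in the article.
card = Mesh(vertices=[(0.0, 0.0), (8.5, 0.0), (8.5, 5.4), (0.0, 5.4)])
grid = [Instance(card, position=(x * 9.0, y * 6.0))
        for x in range(100) for y in range(100)]

# Every instance points at the same geometry object, so vertex data exists once:
assert all(inst.mesh is card for inst in grid)
print(len(grid))  # 10000
```

Because each instance keeps its own position and rotation, the cards can still be flipped and animated individually, which mirrors the control over pivot points described above.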

The other key element in the project was the very useful, and simple, character rigging module for the cat and the dog. In less than four hours they managed to fully model, rig and animate both of them. For the stadium scene, the team used Thinking Particles for the ticker-tape animation and dynamics for the bouncing ball, which was baked so they could precisely re-time the animation. To achieve a nice bouncing effect and some randomness in the animation of the buildings, the team chose to manually keyframe the Cloners that were creating them. A Shader effector and a Random effector on a simple, carefully keyframed Cloner were used to create unlimited animations in just a few minutes. For the camera, MoCam, a free setup that is part of the CS Tools, was used to help out.

The lighting system used throughout was GI, though this raised concerns that the system, a twin Mac Pro (3GHz dual-core Xeon) with 45 render clients, wouldn't be able to cope with the 15k-polygon scene. With 30 seconds of animation at 1080p, it made for a lot of rendering. Still, using NET Render, the render time of each frame was kept to a reasonable level, at around a minute for the simple scenes and up to 45 minutes for the complex ones. Then, just as The Studio at Smoke & Mirrors thought it had been nailed, on delivery day the client wanted to change the main texture on the credit card throughout the whole video. Thanks to the way it had been set up in CINEMA 4D, it was very simple to implement the change, then sit back and wait another 12 hours for it to render again.
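The render load quoted above can be sanity-checked with a little arithmetic. The article doesn't state the frame rate or the true per-frame average, so the figures below are assumptions (PAL 25 fps, and a hypothetical 10-minute average between the ~1-minute and 45-minute extremes):

```python
FPS = 25                      # assumed frame rate; not stated in the article
seconds = 30
frames = seconds * FPS        # 30 s of 1080p animation -> 750 frames

clients = 45                  # render clients quoted in the article
avg_minutes_per_frame = 10    # hypothetical average per frame

total_render_minutes = frames * avg_minutes_per_frame
wall_clock_hours = total_render_minutes / clients / 60

print(frames)                      # 750
print(round(wall_clock_hours, 1)) # 2.8
```

Under these assumptions the farm chews through the whole spot in roughly three hours of wall-clock time, which makes the 12-hour re-render on delivery day (with its heavier scenes) plausible.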

The project was then composited in Nuke and refined in Smoke before being graded using Resolve. The workflow from CINEMA 4D to Nuke worked brilliantly as camera data could be shared, as well as initial Nuke scripts being output straight from CINEMA 4D. Also, on this project, the team decided to render separate passes, rather than EXR files, due to the timing. However, they have used both methods with complete success.

You can see the MBNA advert running on television, on the web and also at The Studio at Smoke & Mirrors website!

Text: Duncan Evans Screens: Smoke & Mirrors

Smoke & Mirrors website: www.thestudio.smoke-mirrors.com]]>news-3331Thu, 29 Aug 2013 13:34:00 +0200PyrrhosomaA dark burrow, a toad, a luminous dragonfly. These are the ingredients for a new film project by Alex Lehnert and Josh Johnson, created with MAXON CINEMA 4D R15!Josh Johnson and Alex Lehnert are freelance 3D artists who both have an affinity for CINEMA 4D and use the program on a daily basis. In fact, they have never met personally but worked together very closely while creating David Lewandowski’s film Tiny Tortures. While working on this project, Josh and Alex decided to create their own short film, which ended up being the cornerstone for Pyrrhosoma.

Although they had never met in person before (or since), online communication made a successful cooperation easy. Based on an idea they developed together and a script written by Josh, the artists set out to create their gloomy fairytale, in which a toad in a burrow and a dragonfly that occasionally flies by play the main roles. In addition to serving as a promotional piece for both artists, the film was also meant to showcase the new features in CINEMA 4D R15.

The initial wave of enthusiasm was followed by a small shock. All assets had to be created for the film exactly as the script demanded: the toad, the burrow, stones, mushrooms, spiderwebs and a lot of vegetation. In addition to all the modeling that had to be done, render times had to be kept in check, considering that only three weeks were planned for production.

An essential part of the production was sculpting, which was used to create the wide range of organic shapes. Because Josh and Alex were cooperating via the internet, there were always hurdles to clear with regard to texture paths and materials, but these were quickly mastered using the new Texture Manager. And the new Bevel tool saved the day more than once, making it possible to quickly create optimized geometry at critical locations on the models. Because neither artist really sees himself as an animator, the animations for the toad were created using the new PoseMorph options.

The Physical Sky object was used in conjunction with the Physical Renderer to achieve the right mood. They first rendered to the Picture Viewer where they made liberal use of the filter options. This made it possible to quickly see if the lighting was as they wanted and if the balance of light and shadow was correct. Alex commented, “This is one of CINEMA 4D’s finer features: you can create images with a dynamic color spectrum without having to worry much about the technical aspects. Josh and I both have experience working with live film and we both know how it is when you’re constantly thinking about which lens should be used for the next shot in order to achieve the desired effect!”

About working with CINEMA 4D, Josh adds: “The program is not only completely intuitive to use, but is also a complete package in itself. Regardless of whether you’re modeling, painting textures, sculpting or animating, CINEMA 4D offers all the features you need in a single application. For example, Alex had never sculpted before. When it was time to fine-tune the various assets, he began sculpting for the first time and was able to achieve great results.”

So far, only the Pyrrhosoma teaser has been finished, which was rendered in time for this year’s MAXON user meeting using the new Team Render. The entire short film will be ready by the end of this year.

]]>news-3016Wed, 21 Aug 2013 16:35:00 +0200Ready to Race?https://maxon.net/en-us/news/case-studies/games/article/ready-to-race/Race games for iPads are nothing new. But simulating a scale-model racetrack is! And this is exactly what Thirdframe Studios did using MAXON CINEMA 4D.The team at Thirdframe Studios loves computer games like anyone else – and race games are especially fun. Everyone remembers the fun they had as kids playing with their racetracks. The rest of the crew at Thirdframe Studios also loved the idea, and creating a game for iPad and iPhone had been something everyone had always wanted to do anyway. But nobody had had the time to turn words into action. A few months later, when Andraz Logar had some time to spare, the idea of making the race game could finally be realized.

Andraz had a special look in mind: he wanted an isometric landscape that resembled Minecraft. The cars had to look like toys but be realistic and look like real 3D cars – even though they would appear as sprites in the game. Andraz decided to use his and his team’s software of choice, Cinema 4D, to create all graphic elements for the game, which would carry the name Groove Racer.

Because of their extensive experience with Cinema 4D, the team knew it could effectively be used to create numerous types of graphic elements for the game without having to arduously switch back and forth between other applications. After the 3D models had been completed, they were used to set up the scenes for the game. In the end only six large renderings were needed, which were cut down to size in post-production using the Slice tool so they could be integrated into the game engine. Cameras with parallel projection were used to create the isometric look.

“At first, the decision to create voxel-style graphics seemed to be a limitation,” remembers Andraz. “But using a series of simple cubes to create everything from trees to mountains and all sorts of other things using Cinema 4D turned out to be fun and the result is a real eye-catcher. Technical restrictions for the target devices meant that we couldn’t use real 3D models. We had to generate the graphics as sprites, i.e., tiny PNG files. All animations are made up of sprite sequences. Thanks to Cinema 4D we were able to meet all challenges this workflow presented.” Andraz adds with a smile: “With Cinema 4D you can always find a solution for any challenge!”
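Generating the sprite sequences described above boils down to computing per-frame rectangles on a larger sheet. Here is a generic sketch of that bookkeeping (an illustration only, not Thirdframe's actual pipeline; the sheet and frame sizes are made up):

```python
def sprite_rects(sheet_w, sheet_h, frame_w, frame_h):
    """Yield (x, y, w, h) rectangles for each frame in a sprite sheet,
    read left-to-right, top-to-bottom."""
    for y in range(0, sheet_h - frame_h + 1, frame_h):
        for x in range(0, sheet_w - frame_w + 1, frame_w):
            yield (x, y, frame_w, frame_h)

# A hypothetical 256x128 sheet of 64x64 car sprites -> 4 columns x 2 rows.
rects = list(sprite_rects(256, 128, 64, 64))
print(len(rects))   # 8
print(rects[0])     # (0, 0, 64, 64)
print(rects[-1])    # (192, 64, 64, 64)
```

Each rectangle would then be cropped out as a tiny PNG, which is what the game engine actually animates as a sprite sequence.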

Groove Racer in the App Store: www.itunes.apple.com/de/app/groove-racer]]>news-3332Mon, 12 Aug 2013 10:40:00 +0200Tiny Tortureshttps://maxon.net/en-us/news/case-studies/broadcast-motion-graphics/article/tiny-tortures/David Lewandowski uses CINEMA 4D for Flying Lotus video starring Elijah WoodIn addition to directing “Tiny Tortures,” Lewandowski wrote the treatment for the video and created many of the animations and visual effects using MAXON’s CINEMA 4D. Well-known for his work on TRON: Legacy, Lewandowski recently served as the lead graphics animator on Oblivion. He and Steven Ellison (aka Flying Lotus) connected via Twitter. “Steven wanted to learn more about animation so he put out a call saying he was looking for the head vampire of CINEMA 4D, and people directed him to me,” Lewandowski recalls, laughing.

The two got in touch and started spending time together in Los Angeles, talking about art and their shared love of Asian cinema, particularly Japanese animation. Nailing down the look of “Tiny Tortures” took several motion and camera tests, and Lewandowski admits to worrying a bit that the subject matter was too dark. “It’s a midnight, pure-black song, really, so we went with it,” he recalls.

Elijah Wood is a mutual friend of Flying Lotus and Lewandowski and readily agreed to star in the video, which opens with him lying in the dark on his bed next to a nightstand strewn with prescription drug bottles. Suddenly, everything from spare change and guitar picks to baseballs and a smartphone begins floating over to him, coming together to form a cyborg-like replacement arm. Concept artist Ben Mauro came up with the design for the arm and was one of several members of the freelance creative team Lewandowski pulled together for the video.

Imaginary Arm
Dustin Bowser supervised much of the VFX for “Tiny Tortures.” CINEMA 4D was used for animation and rigging, and because they were shooting in extremely low light, Lewandowski designed and built a rig that could work under those conditions. The practical arm nub that was fabricated from silicone for many of the shots was scanned using Agisoft Photoscan so that a digi-double could be swapped in for the CG arm shots. “I wanted to shoot this as dark as possible, single-digit foot-candles, and do effects over it, letting everything live in shadows,” he recalls.

In order to do that, Lewandowski upgraded the arm rig to include dimmable strips of LEDs to ensure good visibility without light pollution in the plate images. Additional LEDs were used in parts of the room in the shadows to add parallax information for PFTrack. Motion capture for the arm was pretty straightforward, Lewandowski says. “Sometimes we used a prosthetic arm stump and we digitally squeezed the pixels to remove where his real arm was concealed.”

Character technical director Bret Bays and rigging technical director Patrick Goski came up with rigging solutions to make the arm appear to be growing together. Ultimately, though, the CG arm rig was slow and difficult to work with, Lewandowski recalls. “It was a great rig but we had to model and buy so many assets for it and create the wires from scratch; it just wound up being pretty cumbersome.” Xrefs ultimately saved the day and made the rig more manageable, Lewandowski says.

Lewandowski also credits Goski with using CINEMA 4D’s sculpting tools to create the eye-catching shot of the coins floating out of a bowl on the dresser. The idea was to have the coins rise from the bowl, form a sort of double helix and then move across the room toward Wood. “Patrick did around 100 animations to get a double helix that still felt loose and abstract, and did the simulation with MoDynamics to really nail it.”

David Lewandowski's website: http://dlew.me
Making of video: http://dlew.me/Flying-Lotus-Process
Tiny Tortures video: http://vimeo.com/54585743]]>news-3303Wed, 31 Jul 2013 17:18:00 +0200Fairytale Castle 2.0https://maxon.net/en-us/industries/architecture/fairytale-castle-20/For architects it can be difficult to demonstrate one’s abilities to clients. Drawings and models are one possibility – a film in which a virtual structure plays the leading role, another. Juan Gayerre from Gayerre Infografia is an architect with heart and soul. However, in this profession, finding clients for projects can be an even bigger challenge than realizing them. Therefore, one of the most important things for architects to do is self-promotion.

3D graphics are increasingly replacing drawings and models in architectural visualization. Gayerre Infografia also uses 3D graphics, which it creates using MAXON CINEMA 4D. Together with his team, Gayerre planned the short film “Alone” for promotional purposes, wanting to go a step further than a series of photorealistic still images and create a complete short film.

Gayerre and his team selected plans for a dream house and began to create a detailed, textured model of it. After the model had been completed, the team began creating scene after scene in CINEMA 4D, going beyond merely depicting the house to practically exploring it.

Impressive camera movement is accompanied by calm piano music. The film reveals a luxurious residence with spacious rooms and glides across a terrace with swimming pool and through a narrow study with a high, bright ceiling. Here and there an occupant can be seen, sitting on the terrace, relaxing on the sofa or engulfed in a magazine in the study. This person is Juan Gayerre himself, whose live footage was integrated seamlessly into the virtual settings. In the film, the day passes and the house is shown from the cool morning setting through a marvelous sunny day and into a relaxing evening. The lights are turned on and a completely new atmosphere is created.

Architectural visualizations are often dry and lack enthralling imagery. Gayerre Infografia’s “Alone” is in a league of its own. The building is the protagonist, and the technically perfect realization in CINEMA 4D gives it an almost magical character. In any case, the film has used high-end architecture to successfully create a sensual visual experience for the viewer. The great amount of work required for this project is reflected in its five-month production time. The scenes made use of almost all major CINEMA 4D features, and polygon counts ranging from 200,000 to 300,000 did their part in pushing the computers to their limits. Not only do the numerous positive comments on YouTube prove that “Alone” was well worth the effort – it was also nominated for the 2013 Architects Award in the category “Non-commissioned”. Asked what he thinks could be improved in hindsight, Juan Gayerre answers: “We would have liked to book a more attractive model to walk around the house but due to our limited budget we had to make do with me …”

Gayarre Infografia website: www.gayarre.net]]>news-512Thu, 23 May 2013 10:48:00 +0200Astro Slayershttps://maxon.net/en-us/news/case-studies/games/article/astro-slayers/When you want to make a reboot of Asteroids with a strong Anime touch, you better have good GFX tools ready, like MAXON's CINEMA 4D!By: Meleah Maynard
What do you get when you combine the game play of Atari’s Asteroids with the graphic sensibility of Japanese Anime? It’s called Astro Slayers, and it’s the second game app produced by C4DGames, a sister company of Italian 3D and motion graphics house Digital TRX. Astro Slayers took about nine months to complete and is powered by the Torque 2D engine.

“Asteroids made history and marked all of us,” says Paolo Lamanna, co-founder of Digital TRX with Marco and Mad Bertoglio, explaining why he and six other artists on his team decided to create their own take on the iconic game. “I spent a lot of time playing it as a child and, at that time, those few lines of light were enough to carry us into space to live incredible moments.” The app was created with MAXON CINEMA 4D, After Effects, Photoshop and a bit of ZBrush.

Building on the simple structure of Asteroids, C4DGames modernized the look and added a cast of characters and basic plot. The date is 2050, and the galaxy is being pummeled by a destructive and seemingly unstoppable meteor shower. To protect the earth, the World Union of States launches the Astro Slayers program, which is headed by Dr. Azama, who is known for discovering the Astroparticle, a kind of spatial energy that gives human beings extraordinary abilities.

Azama begins by building an Astrobase and three Slayer Machines: Freccia, Queen-Star and Turtle. All three spacecraft are powerful enough to destroy meteorites and they’re flown by the only three pilots deemed qualified to fly spacecraft armed with Astroparticle-fueled engines and weapons. Players must help the three pilots, Khan, Luna and Ron, succeed at their mission to save the galaxy by taking out stray asteroids.

A Passion for 3D
Lamanna, who started out as an illustrator and cartoonist, has worked for many Italian and foreign publishing houses over the years. In addition to contributing regularly to Disney USA, and occasionally Disney Italy, he has also worked as a digital colorist for publishers and agencies in France and Belgium. For Lamanna, making the move to 3D felt like a natural progression. “It was a short step for me because my curiosity to explore the world of 3D was so great,” Lamanna recalls, adding that after considering different types of software, he chose CINEMA 4D. “I got my first release, which was maybe R6 or R7, from a magazine and then it was love.”

It was Lamanna’s desire to get more involved in 3D that inspired him to launch Digital TRX in 2011. The venture allowed him to bring together a CINEMA 4D team made up of many of the talented artists he had collaborated with on past projects, including two popular short films based on Grendizer, a super robot created by Japanese manga artist and sci-fi author Go Nagai that they all loved as kids. For both films, Grendizer Returns and Grendizer Arch-Enemy, the CINEMA 4D team worked closely with Ufo Robots.net, a website dedicated to all things Grendizer. Widely praised by Grendizer fans all over the world, the shorts were made with CINEMA 4D and After Effects and edited in Final Cut Pro. “This year is the 35th anniversary of the first Grendizer episode in Italy, and for 35 years I have followed Japanese animation with interest and admiration,” Lamanna says.

Saving the Galaxy
With Astro Slayers, which is available at iTunes for iPhone and iPad, the CINEMA 4D team was going for graphics inspired by artists including Go Nagai and Masakazu Katsura, as well as Steambot Studios and Dylan Cole, explains Lamanna. The app brings together two different techniques to look like one, and all of the 2D characters were made using Manga Studio and Photoshop.

After Effects was used for animation. All of the game’s structures and technological gadgetry were created in CINEMA 4D. “CINEMA made it easy to model things like meteorites and the spacecraft in a way that could be implemented in 2D,” he continues, “and with R14 we now have the sculpting functionality we need within CINEMA 4D.”

The team used BodyPaint 3D to get the cartoon-like graphics they wanted. Anime-style surfaces were painted directly onto UV maps in order to give the team the control and freedom they wanted. Outlines were reinforced for effect in post-production. Renders were exported with several passes and then composited in Photoshop or After Effects so they could be imported into the 2D game engine. “CINEMA 4D's internal render engine was indispensable because it allowed us to produce high-quality work in such a short amount of time,” Lamanna says.

Having the game objects in 3D allowed Lamanna and the other artists on the project to achieve sophisticated kinematics at particular moments in the game, such as spacecraft coupling. “With XPresso we could control the mechanical movements of the spacecraft with just a few sliders,” he explains.
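The idea of driving several mechanical parts from one slider can be sketched as a function mapping a single 0–1 value to coupled motions (a generic illustration with hypothetical part names and ranges; in XPresso this is wired up with nodes rather than code):

```python
def coupling_pose(t):
    """Map one slider value t in [0, 1] to a coupled docking sequence:
    the clamp opens during the first 40% of the slider's travel,
    then the arm extends during the remaining 60%."""
    t = max(0.0, min(1.0, t))
    clamp_deg = 90.0 * min(t / 0.4, 1.0)       # clamp rotation, 0-90 degrees
    arm_cm = 25.0 * max((t - 0.4) / 0.6, 0.0)  # arm extension, 0-25 cm
    return {"clamp_deg": round(clamp_deg, 2), "arm_cm": round(arm_cm, 2)}

print(coupling_pose(0.0))  # {'clamp_deg': 0.0, 'arm_cm': 0.0}
print(coupling_pose(0.4))  # {'clamp_deg': 90.0, 'arm_cm': 0.0}
print(coupling_pose(1.0))  # {'clamp_deg': 90.0, 'arm_cm': 25.0}
```

One scrubbed parameter thus produces a whole staged mechanical motion, which is why a few sliders are enough to animate a spacecraft coupling.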

While new games are definitely in Digital TRX’s future, they are currently porting Astro Slayers to a new version of Torque 2D and working on updates. Look for new features, including better player control, as well as new bonuses, weapons and levels. “We’re also studying the integration between CINEMA 4D and two other game engines, Unity and UDK, so in the future there will certainly be a new 3D game made with those engines,” Lamanna says.

All supporting images are copyright and cannot be copied, printed, or reproduced in any manner without written permission.

Meleah Maynard is a Minneapolis-based writer and editor. Contact her at her website.

Ratcliffe College Preparatory School, near Leicester, was founded in 1847 by Antonio Rosmini, a renowned educational innovator. The school has been a beacon of excellence ever since.

Ratcliffe is a Catholic school that welcomes children of all denominations and faiths whose parents feel they can benefit from and share in its ideals. The ethos of the school is that it caters to the development of the whole person, both academic and personal, through a wide range of extra-curricular activities. That means consistently high academic results, with top grades at A-level, as well as amenities such as a Steinway grand piano in the concert hall, amongst many other attractions.

Recently, the school was looking to expand its campus by adding a new building and classrooms to update the facilities on offer, but it needed to see how the designed spaces would work before building even began. The solution was to commission design firm Franklin Ellis Architects to design the buildings and create views that could then be signed off at the pre-planning stage.

Initially this was done with single image views. But stills aren't as good at showing how spaces work so the decision was made to create an animation. This could then also be used as a marketing tool for the school with the aim of hosting it on the website so that it was also available to parents and students.

Oliver Higgins, architect/visualizer at Franklin Ellis, explained, “The visualization and animation were very much design tools for us as well as the client, as they allowed decisions to be made very early, which otherwise would have had to be made much later in the process. This minimized the risk for the client and allowed for any potential issues to be worked out before the construction process began.”

It was important that the animation had a narrative, a journey through the spaces that were being visualized. Essentially, someone would need to be able to watch it and understand where the building was located in relation to the existing school, the general external massing and appearance, and finally, how the spaces would work within the building.

The initial sequence was relatively straightforward, but once inside the building, a technical challenge became apparent. All the spaces within the building faced a glazed courtyard. Almost all the camera movements included an element of this courtyard so that the spaces on the opposite side of the courtyard were always visible.

Oliver explained how this situation would normally be resolved: “Usually we would break the sequence down into smaller film sets that would allow us to keep scene size down and allow for high details. As most of the building was visible through this courtyard we needed to limit details in the background while having a high amount of detail in the foreground. Eventually, we decided the best way forward was to have a high-res and a low-res classroom space that was swapped into the main model, when needed, using the XRef function. When the view was in the background, we used the proxy function to load the low-res version and then switched to the standard XRef when in the foreground.”
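The high-res/low-res swap Oliver describes is essentially distance-based level-of-detail selection. A generic sketch of that decision (hypothetical asset names and threshold; this is not the XRef API itself):

```python
import math

def pick_lod(camera, obj_pos, threshold=20.0):
    """Return which version of an asset to load: the low-res proxy when
    the object is beyond the threshold distance from the camera, the
    high-res version when it is in the foreground."""
    dist = math.dist(camera, obj_pos)  # Euclidean distance, Python 3.8+
    return "classroom_lowres" if dist > threshold else "classroom_highres"

camera = (0.0, 1.6, 0.0)
print(pick_lod(camera, (5.0, 1.6, 5.0)))    # classroom_highres (foreground)
print(pick_lod(camera, (30.0, 1.6, 30.0)))  # classroom_lowres (far side of the courtyard)
```

In the actual project the swap was done per shot via XRef proxies rather than automatically per frame, but the trade-off is the same: detail where the camera looks, lightweight stand-ins everywhere else.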

The polygon count for the completed scene was on the order of 3.9 million, but if hi-res classrooms had been used for all the spaces, this would have rocketed to around 6 million, placing a significant strain on rendering the final animation. It was this feature that made the project possible in the time available, and is one that gets a lot of use at Franklin Ellis, as Oliver explained, “CINEMA 4D's XRef function is something that we use on a day-to-day basis here. It allows different versions of buildings to be switched almost instantly while still maintaining individual control of their own files.”

Towards the end of the animation there are detailed shots within the courtyard. Franklin Ellis used CINEMA 4D’s Hair feature to place grass throughout. Combined with global illumination, this produced a lovely, soft grass effect. To further detail the areas of grass, individual daisy models were scattered around using MoGraph. Oliver commented, “MoGraph is another invaluable tool as it allows us to clone items such as desks, chairs, small landscape objects, etc. It makes a process that would be very tedious almost instantaneous.”

To finish off, the external lighting was created using the Physical Sky object. A simple sun and sky was enough to provide the aesthetic required. The internals used a mixture of spot, omni and area lights. The main corridors were lit using omni lights at about 20% intensity. This provided enough ambient light in the space and, coupled with the rays of sunlight streaming in, provided a soft look to the image. A series of planes were used in the corridors with a simple white illumination to give the impression the light was coming from these objects. Similarly, the classrooms used 600x600 mm ceiling grid lights with an omni light under each to generate the lighting. Various spots were used when specific areas needed highlighting.

The final animation was three minutes long at 1280x720 resolution, but simply panning around spaces for that long makes it difficult to hold the audience's attention. To complete the presentation, silhouetted children and audio clips from similar, real-world spaces were added during post-production to better simulate the atmosphere. Coupled with a variety of pans and camera movements, this achieved the aim of sustaining interest for the duration of the animation.

Duncan Evans is a freelance journalist, photographer and author.

]]>news-5924Wed, 10 Apr 2013 09:50:00 +0200Bergwelt Grindelwaldhttps://maxon.net/en-us/news/case-studies/architecture/article/bergwelt-grindelwald-1/Architectural visualization professionals use MAXON CINEMA 4D to enthrall architects and their clients with structures that have yet to be built.The Grindelwald project was initiated by Capaighi Marketing GmbH and HRS Real Estate, and the team at Zurich-based Raumgleiter was given the job of visualizing it. Raumgleiter’s 3D software of choice is MAXON CINEMA 4D. The team travelled to the future construction site – which lies at an altitude of 1100 m – to gather reference material and background images. Since it was mid-winter and temperatures had sunk to -25°C, a drone was used to take aerial photos.

The goal was to capture the location’s unmistakable feeling as prescribed by the advertising agency: high-class, modern, stylish Alps. The blueprints supplied the basic parameters for the apartments themselves. But empty rooms and bare walls don’t impress clients. The apartments had to be furnished and given an inviting look, which included adding items such as covers, pillows, carpets, fireplace logs, drapes and many other accessories.

A unique challenge was the snow in the surrounding environment. In many instances, virtual snow had to be added to the 3D scenes. Snow is unique in its structure, the way it reflects light, the color of the light it reflects and its surface properties – all of which make the creation of virtual snow a challenge in and of itself.

The final renderings were sent to experienced interior architects and everyone waited anxiously. Anxiously because the team knew the architects would find additional inspiration in the perfectly rendered images. The result was a barrage of modification requests, all of which had to be implemented to the architects’ satisfaction.

Despite all the modifications that had to be made, the team at Raumgleiter was able to successfully complete the project in about six weeks. CINEMA 4D and its MoGraph and BodyPaint 3D features played a large role in fulfilling the clients’ wishes. Raumgleiter’s Christoph Altermatt on CINEMA 4D: “Thanks to the interface’s modular structure, the great workflow and clear arrangement, as well as the program’s excellent ergonomics, we were able to completely concentrate on the work at hand.”

]]>news-3161Mon, 18 Mar 2013 13:23:00 +0100Body Workshttps://maxon.net/en-us/news/case-studies/visualization/article/body-works/How an artist who once dreamed of being a bodybuilder became a successful medical animator thanks to CINEMA 4D and Arnold Schwarzenegger.It’s groundbreaking work that’s getting a lot of attention these days. So when medical animator Nick Shotwell was hired by Indiana-based DWA Healthcare Communications Group to increase the company’s capabilities and help them move from illustration into animation, he opted to create an apoptosis animation that could potentially attract new clients. The first hurdle? Convincing the company to buy a copy of MAXON CINEMA 4D, and then learning to use it in ways he’d never attempted to before. And he needed to learn fast.

“When I bought CINEMA 4D through work, we got an upgrade that included access to Cineversity’s tutorials and one-on-one training,” Shotwell recalls. “I contacted MAXON and they told me that one of their lead trainers lives here in town and could help me.” That trainer was digital game artist and designer Darrin Frankovitz who, as it turned out, lived very close to Shotwell. The two met and over the next two years did web-based trainings every few months.

MoGraph was the main tool Shotwell needed to learn for the apoptosis animation. In particular, MoGraph came in handy when he was attempting to depict a cell’s surface. “There are all of these little spheres on the surface of a cell and that’s a tricky thing for all medical animators, so you have to fake it somehow or you’ll crash your machine,” he explains. Frankovitz showed him how to use clones more densely near the camera while making the most of depth of field.
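The trick of concentrating clones near the camera can be sketched as a density weight that falls off with distance (a generic, deterministic illustration with made-up distances, not the MoGraph setup itself): distant clones get thinned out, and depth of field blurs that region anyway.

```python
def keep_weight(distance, near=2.0, far=50.0):
    """Density weight for a clone: 1.0 at or inside `near`, falling
    linearly to 0.0 at `far`. Clones can then be culled or thinned in
    proportion to this weight, keeping full density only near the camera."""
    if distance <= near:
        return 1.0
    if distance >= far:
        return 0.0
    return 1.0 - (distance - near) / (far - near)

print(keep_weight(1.0))   # 1.0 -- right in front of the camera, full density
print(keep_weight(26.0))  # 0.5 -- halfway out, half the clones
print(keep_weight(80.0))  # 0.0 -- beyond the falloff, culled entirely
```

This is why the fake holds up: the eye only scrutinizes the sharp foreground, so that is the only region that pays full polygon cost.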

Like other medical animators, Shotwell turned to the open source plug-in ePMV (embedded Python Molecular Viewer) when building and animating molecules in CINEMA 4D. Created by Graham Johnson and Ludovic Autin of Scripps Research Institute, ePMV allows users to run molecular modeling software inside 3D animation applications.

After finishing the apoptosis animation, Shotwell moved on to another challenging project that is still in progress – rigging a complex model of the human anatomy that was purchased online from Zygote. Again, he turned to Frankovitz for help. “Darrin has given me tips that you just can’t get from a video,” he says. “And I needed a lot of help because I had no idea what I was doing with rigging when I started this.”

The decision to take on the challenge of rigging such a difficult model of the human anatomy was not just motivated by work. Before becoming a medical animator, Shotwell had considered a career in bodybuilding, having been inspired in part by Arnold Schwarzenegger. As fate would have it, Schwarzenegger turned out to be the guiding force behind Shotwell’s medical animation career.

Shotwell met Schwarzenegger at a prominent bodybuilding and sports event and presented his idol with a montage of portraits he’d drawn of him. Schwarzenegger was so impressed that, soon after, Shotwell got a call asking if he could create some more artwork for him. “It was amazing,” recalls Shotwell. “He was my number-one idol and it meant a lot to me that he thought I was a talented artist. Had I not met him, I don’t know if I would have had the confidence to believe I could do art for a living and become a medical animator.”

Read the full-length feature on this project on Renderosity: http://www.renderosity.com/news.php?viewStory=16323
Watch the animation here: http://olr.dwainc.com/mmds/Apoptosis/index.html

KIA’s Colossal Catalog (14 Feb 2013)
https://maxon.net/en-us/news/case-studies/advertising-design/article/kias-colossal-catalog/
Challenging project for KIA Motors mastered by design studio Moss & Dew using MAXON Cinema 4D.

In the past, traditional photography was used to create exterior and interior shots of various KIA models, but for the new KIA Quoris the exterior shots were created entirely as CGI. Mr. Soonyup Song, head of the 3D communication team at Seoul, Korea-based Moss & Dew and an expert in 3D content and planning, saw this challenge as an opportunity.

The existing photo catalog was made up of images created using traditional photography, and this project demanded that images of the new Quoris be created as CG – with the same consistent look and within a tight project schedule. Modifications during the course of the project were also likely.

The introduction of a new car model and its marketing orchestration down to the last detail is an elaborate and costly process for any manufacturer, designed to ensure maximum impact. Photos of upcoming models shot in secret undermine the effectiveness of these marketing measures. The fact that the new KIA Quoris existed only as a virtual 3D model made it impossible to photograph. In addition to the cost savings and the enhanced creative freedom, the increased security for KIA's intellectual property rights was a key factor in the decision to use 3D renderings.

“We had to create photo-realistic, high-res images at a resolution of 60 cm x 40 cm, which meant that we had to pay very close attention to even the smallest details with regard to materials and lighting in order to achieve the desired look within the tight deadline,” Mr. Song recalls. “Because all images were being created for print, almost all models had to be high-res, which resulted in scenes with 30 million to 50 million polygons each. On top of this, many optional vehicle parts and colors had to be taken into consideration and the background had to be very appealing. We had to define a workflow that could handle large amounts of data effectively and also quickly generate previews so we could fine-tune the scene settings.”

Mr. Song decided to meet this challenge with his software of choice, MAXON Cinema 4D, which has been his primary tool for 3D content creation for several years. Both a low-poly and a high-poly version were created for each car model. The low-poly model was used while setting up and lighting the scene and was then replaced with the high-poly model for rendering.

“This project would have been impossible to complete within the allotted time without Cinema 4D’s XRef function,” says Mr. Song. “In the course of this project, numerous parts were assembled and modified based on original parts, and design changes were also made every now and then. We saved all parts in a single archive and replaced them with XRefs as needed, which made for a very effective workflow. When a design was modified we didn’t have to manually update each scene – all we had to do was modify the archived object and every scene was updated automatically! XRefs made it possible to dramatically speed up the project’s workflow and increase its efficiency,” explains Mr. Song.

Artist Jinill Kim also modeled the background in Cinema 4D to achieve the best match for the Quoris. The imagery for the new KIA Quoris – from modeling to rendering - was created entirely in MAXON Cinema 4D.

“Our client, KIA Motors, was very happy with the result and was very impressed by the power of Cinema 4D. We are proud of the fact that we successfully met this challenge, and a large part of the credit for this achievement goes to Cinema 4D,” states Mr. Song.

Cinema 4D’s basic renderer was used to supply quick feedback during the project’s creation. Final rendering was done using Cinema 4D’s fast and reliable Advanced Render feature.

Website KIA: www.kiamotors.com

Expanding the Reach of Medical Animation CG (25 Sep 2012)
https://maxon.net/en-us/news/case-studies/visualization/article/expanding-the-reach-of-medical-animation-cg/
Using CINEMA 4D, ePMV and After Effects, Cosmocyte is making the complex understandable.

You probably studied cell mitosis in grade school … Even if you do vaguely recall what cell division is all about, you’ve probably never thought of the process as something elegant, even beautiful. But step into the world of medical and scientific visualization and the process of mitosis suddenly becomes a work of art.

Maryland-based Cosmocyte has been creating stunningly vivid and accurate medical and scientific animations and illustrations since 2005. Using MAXON’s CINEMA 4D and Adobe After Effects, they do most of their work for academic and non-academic clients, such as Science magazine and the Stanford University School of Medicine.

“I know a lot of big medical animation companies use [3ds] Max and Maya but we like CINEMA 4D because of its flexibility and easy learning curve, and I know a broad cross-section of companies are going in that direction, too,” says Cameron Slayden, Cosmocyte’s founder and creative director.

Business has grown steadily over the last few years, making finding skilled scientific illustrators and animators who can use CINEMA 4D a top priority for Cosmocyte. To help, Slayden started a free, on-site training program last year through which scientific visualization specialists can learn the software.

Those who do well get hired as contractors. “We used MoGraph from the very first animation because they have to be able to wrap their heads around that to do good biological visualization,” Slayden explains. It’s the tinkerers who stand out, he says.

Vuk Nikolic, a 3D artist and experienced Maya user with a master’s degree in biology, produced “stellar” work from the start. “Vuk has a natural instinct for medical animation and he blazed through our CINEMA 4D training sequences very quickly,” says Slayden, pointing to Nikolic’s mitosis animation for which he used CINEMA 4D’s Fresnel, sub-surface scattering and displacement. “I experimented and created an interesting transparency of the nucleus membrane using the spectral shader in the transparency channel of the texture,” Nikolic recalls.

After Nikolic completed his CINEMA 4D training, Slayden brought him on board to help Cosmocyte create its own content. Specifically, they are collaborating on a series of educational animations on a variety of topics, such as “What is Cancer?” and “What is DNA?” “We see these as being deeply immersive educational modules that would be free to the public on YouTube,” says Slayden.

Because it helps to have a basic understanding of DNA in order to grasp the complexities of cancer, Slayden first asked Nikolic to create “the most complicated thing” he could think of, a 3D DNA translation, which involves getting from a gene to a protein via RNA (ribonucleic acid). “The process is divided into three stages: initiation, elongation and termination,” explains Nikolic, whose past clients include National Geographic and the History Channel.

The biggest challenge was building and animating the molecules in CINEMA 4D. He credits the open-source plug-in, embedded Python Molecular Viewer (ePMV) with making that daunting task possible. Created by Graham Johnson, a friend of Slayden’s and a frequent contributor to the annual National Science Foundation Medical Visualization Challenge, ePMV allows users to run molecular modeling software inside 3D animation applications.

With ePMV, Nikolic was able to bring the whole molecule into CINEMA 4D before animating and lighting it. “I had to put one file with five million polygons in different layers and make a proxy to animate and move through the interface,” he recalls.

Read the full-length feature on Cosmocyte at CGSociety: http://www.cgsociety.org/index.php/CGSFeatures/CGSFeatureSpecial/cosmocyte
Vuk Nikolic’s website: http://vuk3d.com

Business is War (12 Sep 2012)
https://maxon.net/en-us/news/case-studies/games/article/business-is-war/
Corporate greed takes on a new meaning in Syndicate, with in-game cinematics created by Vincent using CINEMA 4D.

"Climbing the corporate ladder is tough when the guy at the top is shooting at you. That's why you'll need all of your skills - and a real go-getter attitude - to take on today's release of Syndicate," says Electronic Arts (EA) of its 2012 remake of the classic 1993 action strategy game of the same name.

London-based directing team Vincent worked closely with the game developers and art directors at EA and Starbreeze in conceiving the overall styling and scriptwriting of the in-game cinematics and trailers. "Syndicate had a huge following in '93, with a popular cyberpunk graphic styling," said John Hill, Director at Vincent. "Our challenge was to create a new graphic language that would seduce the high-end gamer of today."

Using MAXON's CINEMA 4D software on Mac OS to create the 3D animation, Vincent used only dots and lines to create the imagery, without any depth-of-field or heavy polish added in post. The color palette was restricted to just black and white, plus orange as a highlight color. "We were conscious of the basic interpretation of a dystopian future and wanted to steer away from the usual distressed typography and glitchy styling of many current first person shooter games," explained John.

Set in the year 2069, everything in the Syndicate world is digitally connected. Consumers no longer require external devices, such as tablets or smart phones, to access data and services. Instead, a chip implanted in the consumer's head provides direct access to all aspects of modern life - from housing, to banking, to education. With unprecedented control over the "CHIP'd" civilians, the neural chip companies - called syndicates - wage war against each other for control of the masses.

Embarking on a brutal adventure of corruption and revenge, the game’s player adopts the role of Miles Kilo, a prototype agent for the EuroCorp syndicate. To illustrate the connectivity theme of the game, Vincent designed a family of signature wireframes in CINEMA 4D that would scale and connect like a fractal. The trick to producing the complex animation and advanced connectivity was to find the right mix of methodology. Expressions created in CINEMA 4D using the built-in XPresso editor played a key part, as did mixing camera data from various software to create a seamless linear camera.

"CINEMA 4D is a very capable, fast and stable 3D software package. It's intuitive and great for creative users," said John. "We wanted to create a dramatic and engaging prologue that would not only provide a background narrative to the game, but would also establish a visual language for all in-game graphics and promotional material."

"Syndicate was our most enjoyable project from last year. It's always vital to have a supportive and like-minded client to produce a good result," summed up John.

Since the introduction of the iPhone, iPad and Android phones and tablet PCs, the interactive children’s book segment has developed into its very own branch in the world of e-books. Among the many artists in this diverse genre, Matt Roussel stands out as one of the best illustrators and 3D animators.

For many years now, Matt’s work has appeared in many French magazines and newspapers that use his unique style to attract and entertain their readership. It’s been two years since Matt, together with Valentine Parguey, created the interactive children’s book “Scott’s Submarine”, which went on to become a hit in Apple’s App Store and on Google Play. The book reached #1 in the App Top 10 in France, Spain and Belgium and received rave reviews in the UK and Australia.

Matt continues his success story with his newest title, “Mademoiselle Daisy’s New Friends”, written by Amelie Sarn. This interactive children’s book was also done in Matt’s own unmistakable illustration style. The images look like miniature models in which papier-mâché figures move, which he created using CINEMA 4D. “An application that is user-friendly, works reliably and offers incredible power for modeling,” explains Matt.

Matt’s new app and both mini games, which are hidden in Daisy’s “Belle Epoque” world, were developed using the Corona Cross Platform Development Kit. Images and parts of images were split into three layers in CINEMA 4D and applied as separate background, foreground and object layers. The graphics were then exported to Corona, where functionality was applied, and the finished apps were compiled for the respective platforms.

As with his previous titles, Matt’s exquisite graphics and their special flair make “Mademoiselle Daisy’s New Friends” a joy to look at throughout the entire e-book. This visual experience is particularly enhanced on the iPad’s new Retina display – even better than a printed book.

Mill editor and longtime skater Ryan McKenna pored over skateboarding, snowboarding and surfing videos for inspiration while working alongside creative director Mario Stipinovich, design director Jeff Stevens and art director Emmett Dzieza on the spot. The result is a stylized world that artfully combines illustration, cubist geometry and motion with the help of MAXON’s CINEMA 4D, Autodesk’s Maya, After Effects and a bit of Houdini and XSI.

The Pantech phone’s shape steered the creative team toward designing around board sports related to the tour since it “would have been a hard sell to make it behave like a BMX bike,” Stipinovich says, laughing. After doing some research on various types of moves, the team presented a reference board of some angles from surfing and skateboard photography that they thought were particularly cinematic.

Once the board was approved they moved on to creating a more linear storyboard and a live-action edit that helped with workflow. Working in the same room for the duration of the project enabled The Mill’s core team to collaborate while still sticking to their tight deadline—just three weeks from pitch to finished 30-second spot. “We had an editor, a 3D artist working on previs, and Emmett and Jeff were building the skeleton of the thing,” says Stipinovich. All told, 26 artists worked on the spot.

Real-life moves became real tricks for the crossover phone in the hands of 3D animators Ross Scroble and Sam Crees, making it possible to lock an edit in the first week of production. “Ryan’s edit gave us the cut points and transitions, and he was able to sort from existing footage for about 75 percent of this thing,” Stipinovich says.

Next, the design team went to work creating three different environments for the skateboarding, skiing and surfing portions of the spot. Each colorful, abstract world was designed to flow easily into the next and the look was inspired in part by cubist paintings. It was freelance illustrator and designer Bryan Louie’s job to turn those modernist visions into illustrated environments the 3D artists could build off of.

Once the animation and look were finalized, the design team used CINEMA 4D to make the environments come alive. “It was great and it allowed us to projection map the illustrations onto the geometry we built,” Dzieza explains. “It was a seamless workflow because we were able to map illustrations onto the geometry so easily with CINEMA 4D’s Projection Man.”

This project was one of the first big jobs for which The Mill NY used CINEMA 4D. “And it really expanded our capabilities with CINEMA 4D a lot,” says Stipinovich, adding that projection mapping was used most often, but for some scenes they did unwrap the UVs so they could work with BodyPaint 3D.

Artists used CINEMA 4D’s Dynamics to get the icicles hanging from the railing in that scene to drop and fall to the ground. They created snow shaders and crystal rock shaders that came in handy for the skiing scene and CINEMA 4D’s Sketch and Toon helped them add a dynamic, graphic look. The Pantech phone was created with Maya and XSI.

Creating the wave for the surf scene turned out to be the most challenging thing to pull off, Dzieza says. “The smallest inconsistency with one of these particle animations and it just looks wrong, so it was a pretty big bite to take off for such a short amount of time and I think every artist on the project touched that scene.”

Find out more about this project: http://www.cgsociety.org/index.php/CGSFeatures/CGSFeatureSpecial/skate_pantech
Bryan Louie’s website: http://hellalouie.com

Breaking Point (28 Mar 2012)
https://maxon.net/en-us/news/case-studies/visualization/article/breaking-point/
An abstract visualization of an accident for a thesis project: viewed from a completely different point of view, the visualization maintains its dramatic character thanks to CINEMA 4D.

November 2010: A vehicle traveling at 90 km/h (56 mph) loses control in a slight left curve and hits a tree with a diameter of 25 cm (10 inches). The car's front end practically wraps itself around the tree trunk, the car is completely destroyed and the driver suffers broken bones and internal injuries. In the course of this detailed yet impersonal accident visualization, the human tragedy involved flashes through and, in the end, our own morbid curiosity for destruction prevents us from looking away. Viewing events from a completely different point of view and capturing the aesthetics of destruction in an entirely new way is what Christian Lerch wanted to achieve with his thesis project, ‘Oversights’.

Whether at fault or not, an accident is always a pivotal event in the lives of those involved. Having been the 'victim' of several accidents himself, Christian surely saw this aesthetic and somewhat abstract symphony of destruction as part of coming to terms with the events in his own life. His personal experience with accidents was most likely a catalyst for the breathtaking imagery he created in CINEMA 4D.

He wanted to base his visualization on a real-world event, so Christian searched an accident database for one that contained the destructive elements he was after. He selected an accident that involved injuries, but no fatalities or permanent harm.

Christian made liberal use of particle effects to animate the large amount of debris, which were controlled using a combination of Dynamics, MoGraph and XPresso. His programming experience made it possible for him to create custom C.O.F.F.E.E. scripts to achieve the desired effects. CINEMA 4D's Hair feature was also used in the creation of this project - albeit with a different purpose: Long hair was used to simulate dynamic wind currents or falling rain!

Sketch and Toon was used to render the visualization, with depth, shadow and specular elements rendered to separate layers. The final compositing was done in After Effects. Since this 180-second short film comprised a vast number of frames, NET Render was used to render across several computers.
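How vast a number of frames? The article gives the film's length but not its frame rate, so the following back-of-the-envelope count assumes a typical European production rate of 25 fps (an assumption, not stated in the article):

```python
duration_s = 180  # length of the short film, from the article
fps = 25          # assumed frame rate - the article does not state it
print(duration_s * fps)  # 4500 frames to render, at the assumed rate
```

Several thousand full-quality frames, each with multiple render layers, is exactly the kind of workload that justifies distributing the job with NET Render.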

Thanks to CINEMA 4D's comprehensive feature set and reliability, Christian was able to complete the project to meet his own - very critical - standards. This was, after all, a thesis project! Christian knew he could depend on CINEMA 4D since he had already worked with the application before: “I initially worked with other 3D applications but their steep learning curve caused me to waste my time with everything else except 3D work! About one year later a fellow student introduced me to CINEMA 4D. It didn't take long before we had produced our very first animated film, 'Paste of Love'". While working on his newest project 'Oversights', Christian also relied on the power and dependability of CINEMA 4D - and was rewarded with excellent results!

Christian Lerch’s online portfolio: www.ishowyousee.de

Ski Jumping in Vikersund (1 Feb 2012)
https://maxon.net/en-us/news/case-studies/games/article/ski-jumping-in-vikersund/
The world’s highest ski jump in Vikersund, Norway, has been shrunk down to size for a new mobile game. All game assets were created using CINEMA 4D.

The app’s creators, Agens AS, broke new ground with Vikersund Ski Jump: to date, no ski jump game for mobile devices had been created, and this was the first 3D game that Agens AS had developed. Because they wanted the app to run on all mobile platforms, every platform's specifications and limitations had to be accounted for, which made reducing polygons and efficient memory use top priorities. The challenge was to create a game with complex features using the least amount of geometry.

The ski jump was re-created as accurately as possible based on original blueprints, and the game precisely simulates realistic conditions with regard to the ski jump's length, dimensions and landing area. The athletes themselves, however, had to be designed with a minimalistic look to meet the limitations of all mobile devices.

The app was developed using Unity 3D, which has established itself as one of the top tools for programming content for mobile devices. Developers and artists worked at separate locations, which meant that game assets had to be exchanged between the teams on a regular basis to optimize each scene. Since the developers did not have CINEMA 4D at their location, assets were sent as FBX files. Optimizing the character animation to meet the requirements of a mobile game was particularly challenging, but with CINEMA 4D's excellent character and animation tools, this hurdle was easily cleared.

Despite all obstacles, the game was completed within the four-week deadline. Kim Lid, creative director at Agens AS, on his experience with CINEMA 4D: "CINEMA 4D is very easy to learn and intuitive to use! That said, CINEMA 4D is incredibly powerful and dependable. We've used the application almost non-stop over the past year and it hasn't crashed once!"

Vikersund Ski Flying app in the App Store: http://itunes.apple.com/us/app/vikersund-ski-flying/id501619003?mt=8

BadgerSpot - Social Networking in Augmented Reality (30 Jan 2012)
https://maxon.net/en-us/news/case-studies/games/article/badgerspot-social-networking-in-augmented-reality/
Social networks and mobile telecommunication: new media that’s longing for new content – and CINEMA 4D feeds the need.

It can take years or even decades until such glasses - and the information needed to feed them - are available. A taste of what is to come, however, is already available in BadgerSpot's iPhone app.

Key elements of this app are image templates that serve as posters, billboards or t-shirts. Placed at locations such as the entrances to restaurants or clubs, they create an "augmented reality spot". Viewed through the BadgerSpot iPhone app, the iPhone's touchscreen shows the user a modified version of reality: an augmented reality in which - at this stage of development - animated 3D characters live.

BadgerSpot, the creators of this app, are located in Memphis, Tennessee. They are constantly working on blurring the line more and more between reality and augmented reality. Currently, the effect is limited to billboards and posters displaying an entirely different content from reality when viewed with an iPhone. For example, a shadow silhouette on a billboard will be displayed as a completely modeled and animated 3D character.

All of the 3D characters, including a dog, a warthog and a frog, were created entirely in CINEMA 4D. Users are generally surprised when they see this augmented reality for the first time and wonder how it’s done. And BadgerSpot wants to take this concept much further with the introduction of the previously mentioned bulletin board, which will make it possible for neighborhoods, restaurants or nightclubs to create their own social networks. These networks can then be accessed via the user's iPhone, and soon with Android smartphones, through which users will be able to read and post information.

Everything else is currently still in development and all elements are, for the time being, comprised of comic-inspired 3D models in augmented reality.

These characters were created by Chris Magee, who is responsible for modeling and UV mapping in CINEMA 4D as well as texture creation in BodyPaint 3D. “I always shied away from creating character rigs but this has changed with the introduction of CINEMA 4D R13”, explained Chris. He is also looking forward to mastering many more rigs in CINEMA 4D R13!

More information on Renderosity: http://www.renderosity.com/badgerspot-s-augmented-reality-app-takes-social-networking-to-a-new-level-cms-15998

Designing Buildings On-The-Fly with Vectorworks and CINEMA 4D (30 Jan 2012)
https://maxon.net/en-us/news/case-studies/architecture/article/designing-buildings-on-the-fly-with-vectorworks-and-cinema-4d/
Franklin Ellis Architects designs a unique Science Priory for Repton School.

If you thought 3D animations of a building were just something snazzy for the marketing department to help sell a shiny new building that's in the process of being built, with everyone sipping champagne, think again. UK-based architecture firm Franklin Ellis Architects reports that clients now expect 3D visuals as part of the design process itself.

"Some clients struggle to interpret 2D drawings and feel more comfortable signing off a 3D image of the scheme," explains Oliver Higgins, Architect & CGI Visualizer at Franklin Ellis. Seeing 3D visuals during the design stage also gives clients an opportunity to give feedback on the design.

"Clients will always comment on the 3D visuals, which will influence the changes we make to the design," says Oliver. "The advantages to 'modeling on-the-fly' means that crucial decisions can be tested virtually before stepping onto site. In some respects, having the visual means that the entire design and construction teams are working towards the same end goal, there is no danger of people interpreting the 2D drawings differently."

There is one potential snag, however, in providing 3D visuals while the building is still being designed: If your CAD software does not play friendly with your visualization software, it can be time consuming switching back and forth between the applications to make the changes to the building design. Clients probably think you just press a button but in reality it's more complex. To ensure speedy 3D visuals, Franklin Ellis uses the powerful and well-integrated combination of Vectorworks and MAXON CINEMA 4D, using them for designing the building and visualizing it, respectively.

CINEMA 4D's support for Vectorworks includes import of all materials and lights, the preservation of the scene structure, and an intelligent update function. With the smart updater you can go back to Vectorworks at any time to modify the design, then update the changes in your CINEMA 4D visualization without losing the work already done in CINEMA 4D such as the addition of trees, people and, of course, animation.

"CINEMA 4D's integration with Vectorworks is extremely valuable to us as it allows the building to be refined in Vectorworks and constantly updated in CINEMA 4D in minimal time during the process," says Oliver. One project in which Franklin Ellis combined Vectorworks and CINEMA 4D to provide rapid 3D visuals during the design process was the new Science Priory at Repton School, England.

Notable former students of the leading school span all fields, from author Roald Dahl to Top Gear presenter Jeremy Clarkson and Adrian Newey, the only designer to have won Constructors' Championships with three different Formula One teams. The concept behind the unique Science Priory is to locate classrooms, laboratories and lecture rooms in a new science building so that the teaching rooms for physics, chemistry and biology will be in adjacent spaces and make use of shared resources such as IT and technical rooms.

Additional resources include a multimedia 'science cave' in which structures can be projected in three dimensions to make it possible for students to examine molecules, channels, organs and intra-cellular structures close-up. "We needed to produce an animation that visually explains the function of the building and at the same time holds the attention of the audience," recalls Oliver.

"It was decided from the outset that the animation would follow a storyboarded approach rather than simply flying a camera around the spaces. This would allow for more realistic camera movements and create a visual in which the viewer can engage." One of the main challenges facing Franklin Ellis was controlling the number of polygons within the scene. Architectural scenes are often polygon-heavy due to the complexity of foliage and other natural objects, coupled with the intricate features of the building itself.

One CINEMA 4D feature that helps with coping with high polygon counts is the XRef system, which enables a scene to be split up into multiple files that are grouped together in a single master file. The entire scene is accessible through the master file but you can go into any of the individual referenced files and make changes there without the overheads of the rest of the complex scene. "XRefs allowed separate files to be worked on without slowing down the main file," comments Oliver. "Scenes were turned on and off as needed."

Franklin Ellis also made good use of other tools that at first sound like they are not part of an architect's toolbox, such as the hair system, which is a favorite of character designers. "The hair system is a great asset to us as it allows for the easy creation of extensive lawns and grassed areas without creating millions of polygons in the scene," explains Oliver. And the MoGraph tools - popular for creating amazing motion graphics - can also play a part in architecture. Oliver reveals: "Combining MoGraph and XRefs allowed us to quickly and easily update furniture such as the chairs in the lecture theatre as the design progressed."

With 3D visuals now a firm part of the design process, Franklin Ellis is finding that it's not just the buildings themselves in which clients are interested. It's also the finer details. "More often we are also getting requests to model individual components of a building such as glazing systems. This additional modeling helps to explain the more complex junctions of the building to the client and contractors."

Almost any request from clients can be met with CINEMA 4D's full suite of 3D tools, even a singing dinosaur if your client really wants one. "CINEMA 4D is invaluable within our design environment. From the early concept model to the polished image, CINEMA 4D provides all the visualization tools we need and it keeps getting better," says Oliver.

More sample pictures: http://feillustration.co.uk

Lagoonia – A South Pacific Browser Game (2 Jan 2012)
https://maxon.net/en-us/news/case-studies/games/article/lagoonia-a-south-pacific-browser-game/
Casual games enthrall the masses – and CINEMA 4D supplies the graphics!

The story of a tragic airplane crash in the South Pacific that leaves crew members stranded on an idyllic island is nothing new. The list of films depicting this storyline is long, but these are in the end only re-hashings of the good old Robinson Crusoe story. The recurrence of this theme shows how popular the story still is when modern nuances are added.

Next to film and television, this type of island ballad has also made its way into the world of computer games and is now being heralded in the form of a browser game. The idea for Lagoonia, the title of the game, came from Innogames, a company that specializes in the development of browser games. The graphics studio Augenpulver was given the job of creating the graphics for the game based on Innogames’ game design. With a team of 2D and 3D graphics artists, Augenpulver set out to realize Innogames’ vision of Lagoonia.

The basic concept of the game is to have the player explore an unknown island world, survive using farming techniques, and re-invent various technical devices to ensure the survival of a steadily growing family group. First, the team at Augenpulver had to create a fitting landscape made up of graphic elements: coastal elements, the ocean in various colorations to depict different depths, fields and structures, vegetation, boulders, cliffs and other landscape features. Instead of a realistic look, the task was to create a colorful, cartoon-like set of graphics, which were first rendered in CINEMA 4D and then touched up in Photoshop.

After the landscape had been created, it was time for the characters. Like the landscape, these also had to have a cartoon-like look. Whereas the landscape graphics were static, each of the game's characters had to be animated. Since the game is viewed from an isometric angle, every animation had to be depicted from 8 different points of view. There were 4 classes of characters, each with 19 possible styles of dress and accessories, and 30 animations of 35 frames each were planned per character. Put into plain numbers, that comes to 638,400 frames! On top of that, additional variations for skin and clothing color were made in post-production, which resulted in a couple of million frames for the human characters alone. After going through its to-do list and realizing the amount of work that lay ahead, Augenpulver had to catch its breath. Fortunately, the team was working with CINEMA 4D and its reliable NET Renderer; the team even continued working on the project on the same computers on which the NET Renderer was running.
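The total quoted above follows directly from multiplying out the combinations given in the text; a quick Python sketch (using only the figures from the article) reproduces the number:

```python
# Frame-count estimate for Lagoonia's character animations,
# using only the figures quoted in the article.
character_classes = 4     # classes of characters
outfits = 19              # styles of dress and accessories per class
animations = 30           # animations per character
frames_per_animation = 35
viewing_angles = 8        # isometric views per animation

total_frames = (character_classes * outfits * animations
                * frames_per_animation * viewing_angles)

print(total_frames)  # 638400, matching the article
```

The post-production color variations then multiply this figure again, which is how the count climbs into the millions.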

The tools to be used were Photoshop, After Effects and CINEMA 4D - products with which the team had already worked in countless other projects. CINEMA 4D in particular had been a constant in most projects for years and was known for its dependability. For the Lagoonia project, many of the render jobs were done using NET Render, whose reliability allowed the team at Augenpulver to plan their work very accurately. The team rendered the images in separate passes using CINEMA 4D's Multi-Pass function and then composited them in Photoshop and After Effects making the fine-tuning and color adjustment phases child's play.
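Multi-pass compositing of this kind comes down to simple per-pixel arithmetic; the sketch below (plain Python with made-up grayscale values, not Augenpulver's actual pipeline) shows the common approach of multiplying shadow and occlusion passes over a beauty pass:

```python
# Minimal multi-pass compositing sketch: the beauty pass is darkened
# by multiplying in shadow and ambient-occlusion passes, pixel by pixel.
# Values are normalized 0.0-1.0 grayscale; real passes are RGB images.

def composite(beauty, shadow, ao):
    return [b * s * a for b, s, a in zip(beauty, shadow, ao)]

beauty = [0.8, 0.5, 1.0]   # hypothetical rendered pixel values
shadow = [1.0, 0.6, 1.0]   # 1.0 = fully lit, lower = in shadow
ao     = [0.9, 1.0, 0.7]   # ambient occlusion in crevices

print(composite(beauty, shadow, ao))
```

Keeping the passes separate is what allows each one to be color-corrected or re-weighted individually in Photoshop or After Effects without re-rendering.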

Thanks to good planning and dependable tools, it was possible to complete this logistic behemoth of a graphics project on time and to the customer's satisfaction. And this despite the art director's extraordinarily high demands, which meant sending everyone back to the drawing board more than once!

Website:

www.augenpulver-design.de/

Vienna Fly-Through (Mon, 05 Dec 2011)
https://maxon.net/en-us/news/case-studies/games/article/vienna-fly-through/
See Vienna from an entirely new perspective at the Viennese Prater. During the renovation of the Viennese Prater, the idea for a new attraction was born: a motion ride that takes visitors on a flight through this magnificent city and lets them experience several of the city's unforgettable sights in a very unique way. A motion ride is an attraction in which visitors stand on a hydraulic platform in a closed room. A film is shown on a large screen, in this case a flight through Vienna. While visitors watch the film that completely fills their field of view, the platform moves in sync with the movements on the screen, giving viewers the feeling of standing on a flying carpet.

Christened Vienna Airlines, the flight has become very popular and offers a completely new flight experience with its breathtaking maneuvers, showing even native Viennese new aspects of their hometown. Visitors always ask how this spectacular flight imagery was created, because not even a helicopter could fly between houses and as close to structures as is done in the film. The entire flight is in fact virtual: the makers simply created a digital version of Vienna!

Whereas the planning phase was relatively easy, the challenges lay in the actual production process: the flight route through and over most of Vienna would bring a great part of the city into the camera's view, an incredible number of buildings, monuments, landmarks, streets and fountains! Because the film for the motion ride was supposed to be photorealistic, all models had to be modeled and textured accordingly. Several low-poly objects were also created to get a better feel for the project's dimensions. The team of artists then made their way through Vienna and photographed all key areas and buildings the camera flight would show. The photographs were used to create the textures. The team was glad that the foliage had already fallen off the trees and that clouds covered the city, resulting in almost uniform lighting conditions, which made texture creation much easier. The overcast conditions meant that the photographs of the buildings contained almost no shadows. If the sun had shone, stark shadows would have been cast, which the team would have had to tediously remove, meaning several hours of retouching in an already tight schedule.

Finally, the team turned its attention to the first high-res object, the Urania building. This is where the film was to start and this location would also be used for the film's establishing shot. This was also the starting point from which the rest of the production was created. At the same time, the flight route was fine-tuned and adapted to the topography.

Another challenge was the camera's flight along the planned route, which had to be completed in a single take, without cuts or transitions. The entire flight was divided into sections that were subsequently fine-tuned. The renderer and computers were pushed beyond their limits, which meant that the project had to be sent to a render farm. This made it possible to render the film as desired and enabled the team to meet the production deadline.

Making of “Vienna Airlines”: www.immortal-arts.com/immortalartswp/?page_id=665

Scott’s 3D Submarine (Wed, 09 Nov 2011)
https://maxon.net/en-us/news/case-studies/games/article/scotts-3d-submarine/
No, it’s not yellow, but it is at least as appealing as the famous Beatles song. Matthieu was given the job of designing around 30 scenes as well as animation sequences. As always, he started almost every object with a simple cube and proceeded to extrude it; the basic shape was then fine-tuned using HyperNURBS. The subsequent scene creation was done according to storyboard sketches approved by Square Igloo and designed for optimal integration into the iPhone app. Scott’s Submarine was originally designed as an iPhone app and is already available for iPad and iPod, and an Android version is in the works.

Website Matthieu Roussel: www.mattroussel.com/
Website Square Igloo with Scott’s Submarine: www.squareigloo.net/en

History Alive with 3D! (Wed, 09 Nov 2011)
https://maxon.net/en-us/news/case-studies/visualization/article/history-alive-with-3d/
IM Innovations adds a new dimension to history lessons in Singapore schools. For the first time in Singapore schools, 250 elementary-grade students in 7 different classes at Pasir Ris Primary School experienced a new paradigm of learning using an integrated multimedia teaching resource. Instead of traditional classroom lessons, students discover old Singapore using 3D educational games, animation and quizzes. Students can wander around and explore a virtual environment set in 1819, when Singapore was only a fishing village. They encounter and can engage with key historical figures such as Singapore’s founding father Sir Stamford Raffles, Sultan Hussein and the Temenggong of Johore to learn about Singapore’s past. Using hyperlinks to these characters, students can watch animated scenes and learn how Sang Nila Utama, the Prince of Palembang, survived the stormy sea on his journey to Singapore and named the island Singapura, the Lion City, after seeing a lion in the forest. Students can play a mini-game to locate places in a village and quickly appreciate how challenging the past was by using simple clues like a dead tree instead of addresses. Quizzes are also used to reinforce the learning process.

“Singapore’s History Alive with 3D!” seeks to ignite interest in Singapore’s history among young learners. Mr Justin Pierre Arul, principal of Pasir Ris Primary School and IM Innovations’ project partner school remarked: “It’s hard for younger kids to appreciate history. With this project, history comes alive for them”. This novel project, which is being developed by IM Innovations over 20 months, is supported by Singapore‘s Media Development Agency and Temasek Polytechnic. Ms Adila Ong, the Head of Department for Character and Citizenship Education, Pasir Ris Primary, gave this feedback after the school rolled out the pilot phase of this program for their 250 students in early 2011:

“The teachers concurred and gave positive feedback that the package which leverages on 3D technology has indeed made the lessons for the Primary 4 Social Studies (SS) chapters more engaging for the students. Since the first lesson, the students have been more enthusiastic during the social studies lessons. Some students have even gone the extra mile in becoming motivated self-directed learners to find out more about the lessons covered in class through reading books or even surfing the web. They are more familiar with the various key characters and learning points. As the students were more engaged in their learning, very often during lesson closures, the teachers were able to elicit encouraging responses from the students about their learning that meet the lesson objective. As such, not only the students, but the teachers involved are also more convinced, encouraged and enthusiastic in seeing the positive outcomes in their students' learning through the effective use of the package in their classrooms.”

All the 3D digital assets are modeled, textured, painted, rigged and animated in Cinema 4D. These assets are then exported via Cinema 4D’s built-in FBX exporter to Unity 3D, a game development platform. The integration between Cinema 4D and Unity 3D is seamless: textures are applied automatically, new animation is added automatically, and model names and structures are maintained. Objects that are multiplied and arranged in Cinema 4D using the Instance object or the MoGraph Cloner object with Render Instance activated are recognized in Unity 3D as placeholders that a programmer can instance to populate the scene. This feature helps greatly in reducing the size of, and simplifying, complicated scenes such as the houses in a village.

Vincent Ong, managing director of IM Innovations, explained that this multimedia solution is designed to address a wide range of young students, from low to high academic ability. It is web browser-based and can run on multiple platforms such as Windows and Mac. It is easy to use, does not need a powerful hardware configuration and can be quickly deployed in many Singapore schools. This solution will eventually be available on iPad and iPhone as well as include stereographic features for large immersive displays.

Weblinks:
YouTube: http://www.youtube.com/watch?v=MktZ_zuLoHA&feature=youtu.be
IM Innovations: www.im-innovations.com
Pasir Ris Primary: www.pasirrispri.moe.edu.sg
Temasek Polytechnic: www.tp.edu.sg
Media Development Authority: www.mda.gov.sg

Through the Darkest Dungeons into the Distant Future with CINEMA 4D (Mon, 18 Jul 2011)
https://maxon.net/en-us/news/case-studies/games/article/through-the-darkest-dungeons-into-the-distant-future-with-cinema-4d/
German Augenpulver Design & Illustration studios use CINEMA 4D to create fantastic imagery for enthusiastic gamers! Augenpulver uses CINEMA 4D as its standard application for the creation of game graphics, a field in which it has established a good reputation through its work for several clients. Such was the case when the development team from Deck 13 approached Augenpulver to create UV coordinates and texture models for the game Venetica, an exciting and challenging job, not least because Deck 13’s art director knew exactly what he wanted and had very high expectations of the quality of the work. The 25 characters were created in both CINEMA 4D and BodyPaint 3D. The goal was to create stylized characters based on a detailed, realistic look. The desired result was achieved using Normal, Specular and Glossy maps, and the workflow was sped up by the fact that CINEMA 4D was able to display these properties authentically in the viewport. CINEMA 4D’s projection painting feature also allowed the team to paint directly onto rendered images, which made texturing critical regions much easier.

Another very interesting project was Lord of Ultima, a browser game based on the Ultima series of games. Many of the game’s buildings were created, textured, illuminated and rendered as raw versions in CINEMA 4D. The rendered images were subsequently edited in Photoshop to achieve the desired look. CINEMA 4D’s Multi-Pass rendering feature in particular proved very valuable for this project because it allowed selective corrections to be made very easily in Photoshop.

For the game Perry Rhodan, Augenpulver’s primary job was modeling, and the team was able to deliver top quality and meet tight deadlines thanks to CINEMA 4D. The complex futuristic environments required many objects to be created using multiple Boolean operations and subsequently modified with deformers, a feat CINEMA 4D handled without a single crash.

Finally, the project Dungeon Empires let Augenpulver really make its mark. Not only did they create the graphics but the entire design itself. Almost all locations and elements were created in CINEMA 4D, and all objects not created in CINEMA 4D were imported into the application and saved there. No other application offers easier export and management of complex elements.

CINEMA 4D has proven itself to be an excellent tool for the creation of game graphics, offering fast and reliable solutions for the creation of complex 3D graphics.

Deck 13, developer of Venetica: www.deck13.com

CINEMA 4D Brings Dinosaurs to Life (Fri, 10 Jun 2011)
https://maxon.net/en-us/news/case-studies/visualization/article/cinema-4d-brings-dinosaurs-to-life/
How do you make an ancient fossil pose in front of the camera? The team of animators at Soulpix in Hannover, Germany, used CINEMA 4D to make it happen. The job of breathing life into this fossil was given to the Hannover-based animation studio Soulpix. When production began, VFX supervisor and project lead Frank Sennholz travelled with the film crew to New Caledonia to coordinate the filming. In addition to the real-world background footage, with which the CG dinosaur would later be combined, HDRI shots were made with the help of a mirror sphere; these would later help illuminate the shots.

While the footage was being filmed in New Caledonia, the remaining team members began preparations for production. Digital impressions were made of the dinosaur that was to be created in CG. 3D artists and paleontologists worked together closely to incorporate the most up-to-date anatomical information into the model, using numerous photos of skeletons and fossils from the researchers’ archives as reference. Many corrections to the shape of the dinosaur’s anatomy were made by the 3D artists with the scientists at their side to ensure utmost accuracy.

After the scientists had given their approval to the models, the team began creating the textures, which had an average size of 8000 x 8000 pixels. All details had to look as authentic as possible. The integration of BodyPaint 3D in CINEMA 4D made texturing a very comfortable process since textures of this size have hardly any impact on the performance of CINEMA 4D. The artists were still able to edit textures in real-time and view the results as they worked. A great deal of creative freedom was possible in the creation of the textures because the paleontologists themselves could only make educated guesses as to the way they actually would have looked. The Layer Shader in CINEMA 4D was used to mix a myriad of procedural masks and textures, making it possible to create an individual material for each dinosaur.

To make movement and states of rest as authentic as possible, slow-motion shots of modern flightless birds were used as reference. In addition, fossilized footprints of various dinosaurs made it possible to draw conclusions about how the dinosaurs must have propelled themselves forward. First, various animations were made and added to a library to which each animator had access. Hence, various animated walk cycles were available, which were later mixed with other animations, e.g. with the roar of a dinosaur, using CINEMA 4D’s Motion Layer system. This made it possible to quickly create individual animations, which in turn helped the team at Soulpix meet the tight production deadline. The final animations were then baked using Point Cache, which allowed complex animations with inverse kinematics, Jiggle deformers and other physical effects to be rendered easily using NET Render.

A combination of Vray and CINEMA 4D was used to render the scenes. CINEMA 4D and Vray’s excellent connectivity made this the perfect combination. CINEMA 4D’s layer-based and very intuitive material system made it possible to quickly create Shaders for Vray. Even the dinosaurs’ extreme displacement and mass could be rendered quickly thanks to this seamless integration.

The rendered scenes were split up into several render passes. The beauty passes were completed exclusively with Vray and CINEMA 4D was used to render the shadow, Ambient Occlusion and hair passes. All masks for selective color correction in post-production and partial effects were also rendered in CINEMA 4D.

Depending on the number of dinosaurs in a scene and its complexity, up to 20 render passes had to be made, and render times varied from just 5 minutes to one hour. The entire post-production and the integration of the live shots was done in After Effects, including depth of field, motion blur and any last-minute color correction to give the renderings their final polish.

Info

www.soulpix.de

Stadiums, Dunes and MAXON CINEMA 4D (Fri, 04 Mar 2011)
https://maxon.net/en-us/news/case-studies/architecture/article/stadiums-dunes-and-maxon-cinema-4d/
HHVISION studios use CINEMA 4D to design the stadiums for the 2022 FIFA World Cup Qatar. Still images of the stadiums had to be rendered in large format for various presentations, and also had to be used in the creation of a five-minute animated visualization. AS & P turned to HHVISION and its team of specialists for the creation of this visualization.

The source files, which were created using the architecture software Allplan, were easily imported into CINEMA 4D and integrated into highly complex scenes. Since not only the stadiums had to be visualized but also the environment around each one, including greenery, parking lots, vehicles, surrounding terrain, bodies of water and hundreds of visitors, the decision was made to use proxies (instances that reference higher-resolution source objects). This made it easier to work with scenes in the viewport's real-time viewing mode (OpenGL), and it was also possible to adapt the complexity of the scene to the available system resources for the final render process.
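The memory savings from such proxies come from many lightweight placements sharing a single heavy source object, essentially the flyweight pattern; this small Python sketch (illustrative only, not Cinema 4D's actual API) makes the idea concrete:

```python
# Flyweight-style sketch of render instances: many placements share
# one heavy mesh instead of each carrying its own copy.

class Mesh:
    def __init__(self, name, polygon_count):
        self.name = name
        self.polygon_count = polygon_count  # the heavy shared data

class RenderInstance:
    def __init__(self, mesh, position):
        self.mesh = mesh          # a reference, not a copy
        self.position = position  # the cheap per-placement data

tree = Mesh("tree", polygon_count=200_000)
forest = [RenderInstance(tree, (x, 0, 0)) for x in range(500)]

# All 500 placements reference the same mesh object,
# so the 200,000 polygons are stored once, not 500 times.
print(len(forest), tree.polygon_count)
```

Because only the shared source object carries the geometry, swapping it for a low-poly stand-in adjusts the whole scene's complexity at once, which is exactly what makes viewport work and final rendering tractable.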

To achieve the desired results within the deadline, the team at HHVISION took advantage of numerous CINEMA 4D features, in particular MoGraph, the Layer Manager, the Stage object and the Surface Spread plugin.

This made it possible for HHVISION to beat the deadline and finish the project to the client's complete satisfaction. It can be assumed that the outstanding visualizations created with CINEMA 4D had a positive influence on the committee's decision.

Realization: HHVISION / Architekten Hoersch & Hennrich GbR

CINEMA 4D Helps Visualize Heavy Excavator (Thu, 20 Jan 2011)
https://maxon.net/en-us/news/case-studies/visualization/article/cinema-4d-helps-visualize-heavy-excavator/
What's the best way to introduce a 120-ton excavator at a major trade show when it only exists as a blueprint? Henning Weidhase supervised a team of three artists who used CINEMA 4D to model the LH120C and create an environment that reflected the excavator's real-life working surroundings. Details such as puddles, dirt, grime and graffiti helped make the scene look more natural. In the middle of it all stood the Liebherr LH120C, ready for action.

The scene's complexity and the hydraulic hoses were the biggest challenges the team faced. Henning Weidhase remembers, "We rigged the loading arm and the hydraulic hoses using XPresso so the movement of the excavator's arm would be realistic. In the end, the scene size reached a whopping 450 MB, with more than 4.6 million polygons - which was no problem at all for CINEMA 4D. Due to the tight deadline we had to meet we rendered the project without Global Illumination. We instead used Advanced Render's Ambient Occlusion feature, which itself produced fantastic results. What we got was a great-looking visualization that showcased the product as it really works in a realistic environment, which met and exceeded the manufacturer's expectations."
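At its core, a rig like this is an aim constraint: each hydraulic cylinder rotates to point at its anchor on the moving arm, and the piston's extension is simply the distance between the two mounting points. A minimal 2D sketch of that math in Python (an illustration of the principle, not the actual XPresso setup):

```python
import math

# 2D aim-constraint sketch for a hydraulic cylinder: given the fixed
# mount and the moving anchor on the arm, compute the angle the
# cylinder must rotate to, and its current stroke length.

def aim(mount, anchor):
    dx = anchor[0] - mount[0]
    dy = anchor[1] - mount[1]
    angle = math.degrees(math.atan2(dy, dx))  # cylinder rotation
    stroke = math.hypot(dx, dy)               # piston extension
    return angle, stroke

angle, stroke = aim(mount=(0.0, 0.0), anchor=(3.0, 4.0))
print(round(angle, 1), round(stroke, 1))  # 53.1 5.0
```

Evaluating this every frame as the arm moves keeps the cylinder and hose pointed at their anchors, which is exactly what a node-based rig automates.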

Liebherr International Deutschland GmbH: www.Liebherr.com

Rebirth of the Aureole by Michael Marcondes (Wed, 22 Sep 2010)
https://maxon.net/en-us/news/case-studies/architecture/article/rebirth-of-the-aureole-by-michael-marcondes/
Glass sculptures in the legendary restaurant on New York’s 5th Avenue. Using the works of 19th-century French artist William Bouguereau as a reference, Theory Engine created conceptual designs that were very well received by the client. Bouguereau’s paintings were modern interpretations of classical subject matter with a strong emphasis on the female body. “The poses and body expressions [Bouguereau] did were phenomenal and they had the life we were looking for,” explains Bram Tihany, owner of Theory Engine.

CG played a major role in this project from the very beginning. Creating and exploring the expressive motion of stroboscopic imagery could not have been done to this degree using any other method. “Because of the organic style required by the client we had to model intuitively and position the figures precisely,” explains Benjamin “Selwy” Leitgeb, the project’s digital sculptor, who breathed life and emotion into each character. “The high polygon count meant that a solution had to be found to reduce the number of polygons and add detail using displacements in Cinema 4D. Another important aspect was making the models physically correct for shading. I refined the figures using ZBrush to optimize them for shading. It was a challenge creating a hyper-realistic glass look because the shading is difficult to control; it had to look realistic without looking like plastic.” “In the end we used an extremely high render resolution, with image sizes exceeding 40k. Chromomeric prints, an amazing technology in themselves, will be made of the final results,” continues Michael.

All this was done using Cinema 4D’s comprehensive and stable standard tool set. “The fact that I had complete control over all features made it possible for me to achieve the final result,” says Michael.

According to Michael, Cinema 4D’s strengths with regard to this project lay in its ability to use traditional methods to create light and shadow without having to use GI and still achieve a realistic result at the desired resolution. The software met or exceeded all demands made of it.

Cinema 4D’s simpler tools in particular influenced the workflow. Elements such as the Layer Browser or tagging were very important.

“It’s difficult to explain to your wife why you are working overtime if all she sees on your monitor is naked women celebrating wine! But they were all only made of glass!” joked Michael.

Michael started working with Cinema 4D in 2007 and was hooked immediately. “The clear interface design and solid tools were a ‘wow factor’ for me. It didn’t take long to learn how to use the software and get to work.” The clearly laid out interface can even be deceiving, he admits, simply because you can’t imagine how powerful the software behind it really is. “As a freelancer I use numerous programs and I select the best one for each job. I can guarantee that Cinema 4D will be my software of choice for most projects.”

Michael remembers first sharing his Cinema 4D renders with the Cinema 4D community around the world. Everyone was very enthusiastic, and he was asked which software he had used to create the images. It was very rewarding to see how eager users were to use Cinema 4D once they realized the power it had to offer. “Thanks to Cinema 4D I was able to focus more on being creative than on the technical side of work. It was very refreshing!”

Bram Tihany: www.theoryengine.com

Villa TEO – Time, Space & Oxygen (Fri, 17 Sep 2010)
https://maxon.net/en-us/news/case-studies/architecture/article/villa-teo-time-space-oxygen/
Villa TEO is a meta design story: a seven-day fantasy voyage between utopia and a given reality, told by design historian Raymond Guidot and designer Jean-Baptiste Pontecorvo. “Creating a tangibly believable 3D world, starting with a few simple polygons, is challenging…,” explains Jean-Baptiste. To do so he used Cinema 4D’s wide variety of material and lighting tools to achieve the film’s realistic look. He wanted to visualize the architecture, interior spaces and other objects in a manner that lets readers dive into the material until they land on an atomic particle, as if in a different world. The images also had to match the descriptions that had already been written according to a schematic storyboard.

“I have to say that the intensive training I received at an accredited training center saved me a lot of time in the end,” says Jean-Baptiste, who has always met deadlines but never cut corners in doing so. “First, I modeled everything, then created the lighting, followed by the camera setup, and finally I created the materials and photo textures. I didn’t do a lot of editing in Photoshop; if a few images don’t look that realistic they can still have an interesting, poetic style.”

Currently the technology for the self-sustaining techno-ecological villa is still under development and the parties involved are negotiating with potential partners for future exhibitions. A business model is currently being developed. One thing is for certain for the designer: In the long term, the Villa TEO will be self-sustaining with zero emissions – or the villa won’t be built at all.

Cinema 4D’s stability was key for this render-intensive project, which includes 40 pages of large-format images in 300 dpi. Jean-Baptiste was also very impressed by the speed and number of features of Cinema 4D’s render engine, as well as by the seamless connectivity between Cinema 4D and ArchiCAD – which turned out to be very useful.

“While working on the project I showed a few of the rendered images to the co-authors of the written content. They were certain that the images had been retouched”, continues Jean-Baptiste. “This proved to me that the illusion can be convincing for those outside the field of 3D. Even the printer thought the house had really been built when he saw the images of the villa illuminated in HDRI.”

Jean-Baptiste had successfully created a one-of-a-kind universe used to illustrate an entire book. “It took me a long time to find a 3D software that I like and I’ve finally found it. Without a doubt, Cinema 4D is my software of choice, in particular because of its ease of use.”

With a project of this size, the untriangulation of CAD data on import was essential for achieving clean models. Thanks to the snap functionality, subsequent design modifications such as scaled windows in point mode could easily be made.

CINEMA 4D’s MoGraph module proved to be indispensible for the dispersion of the great number of plants needed throughout the landscape. Groups of trees were dispersed as render instances.

The visualization included various times of day and used consistent furnishings to make it appealing to the project’s selected target group.

“After becoming familiar with CINEMA 4D, its very intuitive workflow lets artists achieve the results we need for high-end, photorealistic architectural visualizations. Its comprehensive editing tools offer solutions for practically every modeling, lighting or animation problem”, explains Altermatt. “What’s more, we really value the fact that the user interface is so clearly arranged even though it contains such a great number of commands.”

DECC’s goal was to present this project in an entertaining style. One way they wanted to achieve this was with self-decorating apartments, which turned out to be quite a challenge to realize. The magically appearing furnishings and fixtures in the apartments were animated entirely using keyframe animation. This process had to be shown in a single camera move.

A six-member team at DECC needed two months to complete the entire 7 min. 37 sec. animation. During this time, however, the project had to be set up twice. “One week after we had completed the project and had delivered the finished animations our client informed us that he had permission to include a hotel into the structure. We had less than a month’s time to build, light, decorate, animate and render all outdoor scenes,” explains Alejandro Nogueira Jiminez Pons, CEO of DECC S.C. “But we still met the deadline.”

“Cinema 4D gave us all the tools to get the job done – easily and intuitively,” he continues. The entire project was completed using MAXON Cinema 4D’s Advanced Render module, HAIR and Cinema 4D NET Render render clients.

The 3 Multi-Pass layers were composited using After Effects and Final Cut.

Client: CH Arquitectos, www.charquitectos.com

Renderhouse: “The Power of 3D is Underestimated” (Wed, 23 Jun 2010)
https://maxon.net/en-us/news/case-studies/architecture/article/renderhouse-the-power-of-3d-is-underestimated/
“Investment in good visualizations is peanuts compared to the total cost of a project. It is easier to get potential buyers interested because you can present all the architectural qualities,” Ludwig Desmet declares. In the past five years his one-man business Renderhouse has earned its place at the absolute top of 3D visualization. “I don’t allow haggling over the prices of my visualizations. The image quality has to be top priority and I need job satisfaction,” Ludwig Desmet tells us. “Consumers of luxury products want to see precisely what they will be getting for their money. My clients want to showcase the high-class quality and added value of their projects. Of course the best way to make your case is with high-quality 3D images.”

Environment

Ludwig Desmet believes you cannot detach architecture from its environment. “Architecture never stands on its own. There is always an environment and you have to show that, too. If you are selling an apartment by the sea, show the sea. For a house in a quiet neighborhood, the peace and quiet are a big selling point. You need to be able to see it, by drawing the eye to the trees and greenery in the surrounding area, for example."

Own photographs

Desmet not only provides the most accurate rendering of the environment, he also wants to shed the right light on it.

“What is of great importance for a visualization? The right sunlight. Before I start working on a commission, I go to the site and make a series of pictures for the backgrounds. I research the orientation of the building beforehand so that I know at what moment of the day I can take the best pictures. As a consequence I also know the camera’s perspective and the exact lens used, so that I can reconstruct the viewpoint and the lens perfectly in Cinema 4D.”
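Reconstructing the real camera in 3D, as Desmet describes, largely comes down to matching the lens's field of view. A minimal sketch of that conversion, assuming a rectilinear lens and a full-frame (36 mm wide) sensor (both are assumptions on our part, not details from the article):

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view in degrees for a rectilinear lens.

    Assumes a full-frame sensor (36 mm wide) by default; pass the
    actual sensor width for crop-sensor cameras.
    """
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def focal_length_for_fov(fov_deg, sensor_width_mm=36.0):
    """Inverse: the focal length that yields a given horizontal FOV."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
```

A 50 mm lens on a full-frame body works out to roughly 39.6 degrees of horizontal FOV, which is the value one would dial into the virtual camera.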

Other eyes

Desmet studied professional photography earlier in his life.

“I think that as a photographer, you view 3D differently than an architect, for example. I use several of the primary principles of photography – such as composition, perspective, lighting – in my 3D work.”

Material study

Desmet’s 3D models have more than a commercial function. Architects often seek out his help to dress a single design in various materials, so that the 3D model becomes a decisive factor during the design process.

Quick return on investment

Furthermore, the 3D models also serve as a control of the structural design.

“Sometimes architects need to adjust their design after seeing the 3D visualization. There are often mistakes in the 2D plans: a window may be missing in an elevation, or the stairs may not be in a good place.”

“For the architect the investment has already paid for itself in those cases. If the faults had been revealed after construction, the cost would have been far higher."

“It also happens that architects adjust their design after seeing the 3D visualization to improve it aesthetically.”

Now and then Renderhouse is asked to develop a concept that will not be worked out by an architect until after the 3D illustrations have been made.

“I have been working on a project to redesign the site of a trading company for a while now. The client does not yet know exactly what he wants so we are exploring various options. As soon as we have a finished concept, we will present our idea to an architect.”

Small offices

Some of the clientele of Renderhouse consists of “smaller offices that do not make their own 3D drawings and prefer to leave it to a specialist. Most of my assignments are pure visualization work. Occasionally I am asked to make Quicktime VRs (QTVR). This is a very underestimated interactive tool that allows you to show the project in its entirety in a very attractive fashion.”

With QTVR you explore a project through 360 degrees. A British study shows that on architecture websites the pages with panoramas are the most-visited after the homepage itself.

One of the challenges for good architectural visualization is the right lighting. “The artificial sun needs to be in sync with the real sun in the background picture: the same morning, afternoon or evening lighting. The clouds and season also all have to be right,” Desmet explains. “Nothing looks worse than a north-facing elevation with bright sunlight shining on it.”
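The sun-matching Desmet insists on can at least be sanity-checked numerically. The sketch below uses a rough textbook approximation (Cooper's declination formula plus the standard hour-angle equations); it ignores the equation of time and atmospheric refraction, so treat it as a plausibility check for a render's sun direction, not as Desmet's actual method:

```python
import math

def sun_elevation_azimuth(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth in degrees.

    `solar_hour` is local solar time (12.0 = solar noon); azimuth is
    measured clockwise from north. Accurate to a degree or two at best.
    """
    lat = math.radians(latitude_deg)
    # Solar declination (Cooper's approximation, 1969)
    decl = math.radians(23.45) * math.sin(
        math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: the sun moves 15 degrees per hour from solar noon
    h = math.radians(15.0 * (solar_hour - 12.0))
    # Elevation above the horizon
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(h))
    elev = math.asin(sin_elev)
    # Azimuth, clamped against rounding error at solar noon
    cos_az = ((math.sin(decl) - math.sin(lat) * sin_elev)
              / (math.cos(lat) * math.cos(elev)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if solar_hour > 12.0:
        az = 360.0 - az
    return math.degrees(elev), az
```

At 50 degrees north on the summer solstice, for instance, this puts the noon sun due south at roughly 63 degrees elevation, which is enough precision to catch a render whose sun direction contradicts the backplate.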

From design to 3D

The source material Desmet receives usually consists of CAD files and cross sections. “In Cinema 4D I transform these into 3D. If it is the image of an exterior, I only put things in the interior that are visible from outside."

"For an interior I first create the wall volumes. After that I cover them with materials and look at the lighting of a scene first of all. Once that is completed I begin with the interior decorating. Sometimes I make the detail-renders when the client asks for them. For example, when it is relevant to include certain details for a project presentation in a brochure. To render things as accurately as possible I ask the client for lots of product references, such as the RAL colors that were used and the type of brick.”

Artistic versus realistic

Renderhouse is known for its photorealistic 3D models. “But I have great admiration for people who can make good visualizations using Sketch and Toon or the traditional way with paint and brush. A good artistic impression is a worthy alternative to photorealism.”

Long road

Desmet is one of the pioneers of high-end 3D visualization. He says Belgium still has a long way to go in the world of 3D. “There are some very good 3D artists in this country, but it is a very small group.”

The lack of good visualizations, in his opinion, is a result of the lack of interest in good architecture. “Admit it: Belgium is hardly a trailblazer in exciting architecture. And that has an effect on the field of visualization. Too often the view is: it can’t cost too much, and that applies to the real architecture as well as the visualization. The frustrating thing about 3D work is the never-ending discussion about money. In London the budgets for architectural visualizations are four to five times as high as in Belgium. So there is still a lot of work to be done here. Often a rotten Sketchup drawing of a project will do. I would rather have a beautiful hand-painted watercolor than bad 3D.”

INFORMATION

Renderhouse, www.renderhouse.eu

Kizo & Arscom Studio - Specialists in Hyper-realistic Fiction
https://maxon.net/en-us/news/case-studies/architecture/article/kizo-arscom-studio-specialists-in-hyper-realistic-fiction/
(31 May 2010)

More and more 3D artists are specializing in the creation of architectural visualizations. Zoran 'Kizo' Gorski (Arscom Studio, Croatia) is a seasoned CINEMA 4D user and is well-known in the 3D community. Last year he won rendering competitions at Evermotion, 3D Allusions and Zwischendrin. In addition, he has been a beta tester for VrayforCINEMA4D and recommended to Daniel Schild of DNS-Plugins the development of a parametric window plugin for CINEMA 4D.

Kizo was introduced to CINEMA 4D during his time as an industrial design student. ”Our education was focused on Vectorworks and formZ. My girlfriend was attending a graphic design course and was using CINEMA 4D. I was impressed by the renders that she produced and I immediately began playing around with the program myself.”

Architectural Visualizations

Kizo is one of the founders of Arscom Studio, which is made up of a team of four that specialize in architectural visualizations and web development. Their portfolio consists of a considerable number of tourism projects such as hotels, apartments, residential complexes, country estates and even a football stadium.

The studio employs two industrial designers, one architectural engineer and one informatics engineer. ”As a result of our diversity, communication with our clients is effortless,” states Kizo, who specializes in architectural and interior visualization.

Real Estate

”We established Arscom because of the rapid growth of the real estate market along the Adriatic coast in the past couple of years and the resulting high demand for architectural visualizations. Due to the economic crisis, the demand has shifted slightly. Consequently, we also started to focus on product visualization and illustrations.”

”Most of our visualizations have been created specifically for project marketing. Therefore, we always strive for an impeccable representation by using accurate geometry and realistic materials.”

”Creating a realistic and attractive environment is of the utmost importance to attract potential customers, which is why we also pay close attention to landscaping.”

Complex

Architectural visualization is more complex than product visualization: the environment and the models are more detailed, and higher-resolution textures are required. As a result, scene files demand more computational power and memory. Thankfully, CINEMA 4D can address the proxy functionality of Vray, which reduces the amount of memory required and allows you to render large scenes; since R11.5, CINEMA 4D has offered comparable functionality of its own. Arscom works with a variety of source materials and references, such as hand-drawn sketches or simple 3D models. Usually, a CAD file with 2D information is used. ”We use the CAD file as a 2D plan, onto which we build our 3D models.”

”We always request photo and material references from our clients. This allows us to accurately recreate the desired environment.”

Custom Textures

A few years ago, Arscom decided to primarily use custom textures.

”Our company wants to make a good impression. We will make custom textures if a project requires it. This is often the case, since textures are usually very specific to a particular object or building.”

”We UV-map and texture our geometry with BodyPaint 3D. At times we will edit the textures with other graphics applications. We continuously expand our image library with new images and we download sources from web sites such as www.cgtextures.com.”

”Procedural textures are also used, if possible, but this depends on the render engine used. Thankfully, VrayforCINEMA4D is tightly integrated into the program, which allows us to use all of the available procedural shaders.”

Evermotion Contest

The participants of the EVERMOTION contest had to texture and visualize a base model. Kizo won this competition by creating a unique composition with custom textures.

”I used an artistic approach. Due to the limited time available, I mainly focused on the composition of the image. Even the custom textures were created specifically to serve the composition. I used reflective surfaces to avoid having to use detailed materials; together with the lighting, they filled the scene's empty areas perfectly.”

”In order to stand out from the competition, I chose an unusual wide-angle camera position. The foreground distortion caused by the lens was too extreme, so I scaled a few objects to compensate for it.”

Plugin

Kizo also suggested to Daniel Schild of DNS-Plugins that he develop a parametric window plugin to improve and accelerate the modeling workflow.

3D Digital Model of Hamburg, Germany
https://maxon.net/en-us/news/case-studies/architecture/article/3d-digital-model-of-hamburg-germany/
(16 Apr 2010)

The State Geological Information and Surveying office, Hamburg's primary source for official cartography, geo data and surveying, presents its unique model of the port city: a digital, three-dimensional model.

The first level of detail depicts a very rough version of all structures (similar to simple building blocks) and does not take into account the actual topography. This rough level of detail is used to depict all of the city's roughly 330,000 buildings at once.

The second level of detail depicts a more exact version of all structures, including the shapes of the individual roofs, and includes an approximately 250 square kilometer (approx. 96 sq. miles) area with about 130,000 structures. The height of the structures and the shapes of their roofs were derived from aerial imagery. This level of detail also includes topographical information, which was generated with the help of a digitized landscape model.

These highly detailed models of the city's actual structures are a realistic re-creation of Hamburg's cityscape and can be used for a wide variety of purposes, including city planning, real estate investment and more. Development projects can be analyzed under realistic conditions, for example if they fit into the city's overall planning concept. Natural lighting and shadows can also be simulated, and the model can also be assigned realistic textures, vehicles, figures and trees, depending on the customer's wishes. Rendered results can be output as images, 3D pdf files or as movies.

Cinema 4D offers an excellent workflow and delivers outstanding results, in particular due to its ability to work with CAD data. Cinema 4D also offers excellent usability and performance, especially when working with very large and complex projects.

Landesbetrieb Geoinformation und Vermessung, Sachsenkamp 4, 20097 Hamburg (3D-Modelle)

Peugeot 4002 - Project Name "Lion"
https://maxon.net/en-us/news/case-studies/visualization/article/peugeot-4002-project-name-lion/
(24 Nov 2008)

The full-scale model of Stefan Schulze's car of the future will take your breath away.

We are proud to report that since then Stefan's stunning design has been turned into a full-scale model. And it's been turning heads at motor shows ever since!

The months have passed by since the Lion concept first came into being, but Stefan still has fond memories of the design process.

"CINEMA 4D is so user-friendly that it didn't take long until I got results that I could actually use.

"I based my design on a lion's head with the mane falling behind it. The lion's head fitted my concept, and it also fits in neatly with Peugeot's logo. There's also a hint of motorbike in the design, which you can see most clearly in the high position of the wing mirrors.

"What has winning the contest meant for me? Well, I've had lots of new experiences and met interesting people. I've had my five minutes of fame, you might say!

"By developing CINEMA 4D, MAXON also played a part in my win. CINEMA 4D has a fast workflow and it's easy to learn. But most of all you can use it on bog standard computers too. Unlike other 3D apps, you don't need a high-spec system.

"Also, other 3D apps suit technical people more than artists, whereas with CINEMA 4D it's the other way around. As a creative person you can be really imaginative in CINEMA 4D and get your designs out of it very quickly."

For more info on project "Lion" visit www.p4002.com, www.firstsignal.de or www.peugeot.com.

Bose Flagship Store in NYC Entirely Designed in Cinema 4D!
https://maxon.net/en-us/news/case-studies/architecture/article/bose-flagship-store-in-nyc-entirely-designed-in-cinema-4d/
(20 Nov 2008)

The world-renowned audio/video specialist Bose Corporation recently inaugurated its new flagship store in the newly opened AOL/Time Warner building in downtown Manhattan.

Designed by Ædifica, this project maximized the presentation and storage areas of this otherwise awkwardly configured space through the use of a large leather-clad curved wall and concealed storage cabinets behind the displays. Ædifica also co-developed the new communication strategy with Diesel Marketing.

Traditionally, Ædifica has always relied on CAD-based software as its core modeller, using CINEMA 4D for texturing and rendering.

Since 1999, Ædifica has used CINEMA 4D as its primary presentation tool. CINEMA 4D's rendering speed and rock-solid stability have permitted Ædifica to realize concepts ranging from small kiosks to a 10,000-seat stadium. For the Bose project, however, CINEMA 4D was used from the project's very beginning, even for the conceptual work. Walls, shelves and floor fixtures were designed in CINEMA 4D and exported to a CAD application where the final construction drawings were made.

Lead designer Stephane Bernier was very impressed with the precision and flexibility of CINEMA 4D's modeling tools: "Many things were simply much easier to create and modify with CINEMA 4D than with a CAD application."

Infos: Stéphane Bernier, B.Arch., Design Manager, Ædifica | ARCHITECTURE + DESIGN, www.aedifica.com, www.bose.com

Das Silo
https://maxon.net/en-us/news/case-studies/architecture/article/das-silo/
(20 Nov 2008)

°dilight was hired by the firm Aurelius of Hamburg, Germany, to visualize the retrofitting of an old silo.

in the retrofitting. This was a major reason computer visualization was used to get an impression of what the final result of the retrofitting would look like.

This particular silo was of interest to us since so many areas had to be visualized. It was quite unusual for a project of this type that the outer area as well as various inner areas had to be visualized. The visualizations were started prior to the start of construction and continued parallel to the project until its completion. The results of the visualization were used online, for print and for presentations.

"We valued CINEMA 4D from the very start for its smooth workflow, its very good import functinality and especially for its intuitive interface.", said project manager Ullrich (Yu Kei) Kopka. He continued, "The software is exceptionally stable -very important when working in a professinal environment - and so well-programmed that we've had it running on all workstations for over two years."

The °dilight team, comprised mainly of architects, began with computer visualization in the mid-80s and has been awarded over a dozen national and international prizes since the mid-90s.

In the little free time he has, "Yu Kei" tries to pass on his knowledge and experience to others. This gave birth to the website land.liquid-light.org, a collection of various tutorials which are available to the CINEMA 4D community.

Although dilight now covers the complete spectrum of new media, from filmmaking to content management systems, visualizations have remained a major part of their business.

Infos:

°dilight, Kopka Borgmann Melzer, www.dilight.com, www.das-silo.de, clouds.liquid-light.org

World Cup 2006 Arena Düsseldorf, Germany
https://maxon.net/en-us/news/case-studies/architecture/article/world-cup-2006-arena-duesseldorf-germany/
(19 Nov 2008)

bgp-design produced an animation of the new Arena Düsseldorf using MAXON CINEMA 4D and received an Animago Award in 2003 for its work.

The following elements had to be shown: the architecture and its surroundings, the different usages for sporting venues, convention and concert events as well as transportation connections using shuttle trains.

The project had to be completed within three months.

After filming had been completed, bgp-design created over 50 additional single image renderings with a resolution of up to 8,600 x 6,300 pixels for large scale reproduction.

In the end, a CD-ROM was created using Macromedia's Director. The entire film, selected images, QTVRs and a wealth of background information from the project were united into one medium.

Film and images of the Düsseldorf Arena were used for the website and various television broadcasts and print publications.

bgp-design Stuttgart

The offices of bgp-design offer product development, visualisation and trade show design. bgp-design was founded as a traditional design studio with a strong engineering contingent and has evolved into a broad-based service provider. Success through professional design in the above-mentioned fields is bgp-design's credo and the key to its success.

Singapore Students Excited About Working with Cinema 4D
https://maxon.net/en-us/news/case-studies/visualization/article/singapore-students-excited-about-working-with-cinema-4d/
(04 Aug 2008)

Since March 2005, educational institutions in Singapore have been offering an increasing number of 3D visualization courses that have sparked the interest of even the youngest CINEMA 4D fans.

Numerous 3D workshops have been completed at various regional educational institutions, and the enthusiasm shown by our younger students shows it is never too early to learn 3D animation.

IM Innovations offers these 3D seminars in their own 3D Media Studio facilities or on location at educational institutions with adequate computer facilities. Our students benefit from state-of-the-art hardware and software and our courses are tailored to the needs of each participant. Comprehensive, in-depth practical application and theoretical exercises are part of every course. Courses are offered for everyone from beginners to professionals looking to improve their existing skill level.

Vincent Ong, managing director of IM Innovations says, "3D graphics and animation used to be exclusive to professionals and some institutions of higher learning. But with falling IT costs and a whole new e-generation of youths and children, 3D animation can now be every student's cup of tea."

"One of the next things we are going to do is to reach out to the industries, not just in Singapore but also in the region, so together we want to be seen as the lighthouse for 3D technologies and applications."

The VPN project even raised eyebrows at television broadcaster Channel NewsAsia, who decided to broadcast a report about this successful project. Channel NewsAsia has offices in Singapore and Hong Kong and is the leading Asian news broadcaster, focusing mainly on Asian issues. Channel NewsAsia is considered to be the Asian language equivalent to "CNN".

The Admiralty Secondary School

In March of 2005, twenty students and four Admiralty Secondary School teachers were selected to get to know the world of 3D modeling and animation in the school's in-house 3D Digital Animation Studio. The project turned out to be a complete success. By the end of the scholastic year, twenty students and seventeen teachers had taken part in over sixty hours of 3D animation training.

Mrs Lim Ai Poo, Principal of Admiralty Secondary, shared the experience of their students: "The sessions gave us a glimpse into the endless possibilities for innovative teaching and learning. The software is user-friendly and easy to manipulate. In fact, we were amazed at the speed at which some of our students, namely Sec 1s and 2s, learned. We want to equip our students with cutting-edge skills in today's digital world. As for our teachers, they can make use of the newly learned skills to prepare dynamic interactive teaching courseware as well as act as trainers in 3D animation for the students."

The students were quickly able to apply the knowledge they acquired in these courses. They created a virtual school flag and a very informative geography video that demonstrates how rice is cultivated. The students are currently working on a virtual tour of the Admiralty Secondary School.

Law Wei, an Admiralty Secondary School student says, "Working with CINEMA 4D is very interesting and I want to quickly improve my skills. I recently created a sword and want to create many more objects. Maybe I can use them to create an animation."

The 3D animation course has been part of Admiralty Secondary School's curriculum since 2005.

The East View Primary School

When principal Veronica Tay heard about the VPN project, she was impressed by the possibilities 3D animation had to offer but was not sure her students were ready to handle such assignments. Nevertheless, she sent two of her most IT-savvy teachers, Mdm Kamalnoorzaman and Mdm Sarila, to test it out at a 3-day animation workshop. Soon another ten teachers signed up for the next animation workshop. To their amazement, acquiring these skills was much easier than they expected.

East View had been making use of video production to create short videos as a teaching aid. For example, when explaining different professions to students, the school could create videos of real people in these fields instead of simply relying on images in books. With the use of 3D animation, they could now bring their lessons to the next level, especially when combined with video.

In 2006, twenty students began a twenty-hour training program, which takes place over ten weeks with two-hour sessions after regular school. This training also takes place in the school's own computer lab.

Infocomm Development Authority (Singaporean government, www.ida.gov.sg)

Advanced Micro Devices - Hardware sponsor for 3D Media Studio

Hewlett Packard - Hardware supplier

The Dino Workshop

In addition to the various seminars it offers, IM Innovations also offers the Dinosaur Animation Workshop for students 10 years old or older. Students interested in taking part can register for free at: www.im-innovations.com/dinoWorkshopNoLinks.htm. The students' assignment is to create a prehistoric scene using CINEMA 4D. The scene should contain a dinosaur, trees and rocks or cliffs. This workshop is supported by the Media Development Agency (MDA) and takes place in the Singapore Science Center. To date, more than 1,000 students have taken part in this workshop, which was also reported on by Channel NewsAsia.