Through community flash fundraisers and a financial aid program, five low-income post-production professionals attended Editfest in Los Angeles this month.

The Blue Collar Post Collective (BCPC) provided five emerging post-production professionals the opportunity to attend Editfest LA, an annual series of panel discussions held in Los Angeles since 2008 by the American Cinema Editors (ACE). This year’s Editfest LA featured editors from Breaking Bad, The X-Files, Star Wars, and more.

Through BCPC’s Professional Development Accessibility Program (PDAP), Indiana native Hillary Lewis was selected by committee from a pool of applicants to travel to Los Angeles for the event. At least 80% of BCPC’s revenue, generated through vendor and community donations, ticket sales, and other grants, is allocated toward this financial aid fund in order to make attending important industry events more accessible to everyone. Lewis was awarded airfare and hotel, as well as a ticket for Editfest provided through a partnership with BCPC board member Bobbie O’Steen and ACE.

Lewis, an emerging short-form editor in Indianapolis who graduated from Indiana University in 2013, explained she would not have had the opportunity to explore career opportunities in Los Angeles and make valuable connections without BCPC’s financial aid.

“As an emerging young editor, the opportunity to attend Editfest LA through the PDAP was invaluable. It exposed me to new facets of the post-production industry, introduced me to new people and positions I had not known, and gave me a clearer sense of what it means to be a part of this industry’s culture. This experience brought me into a strong, supportive community I would not have had access to in my hometown, and I hope BCPC can continue to fund people like me to attend these life-impacting events,” Lewis stated.

Lewis also spent time touring short-form, reality, and scripted television and film facilities at Viacom and The SIM Group in Hollywood, shadowing offline and online editors, assistants, workflow supervisors, and camera technicians.

In addition to the PDAP recipient, BCPC’s community of members “passed the hat” to see if they could send a member to Editfest with Lewis. The flash fundraiser was so successful that four additional qualifying members local to LA were also provided tickets: Isabel Yanes, Luke Palter, Clarence Deng, and Joaquin Elizondo. Through a collaboration with community leaders Lawrence Jordan and Richard Sanchez, BCPC was also able to connect member Graham Palme with a ticket donated on behalf of Sanchez and Jordan’s upcoming course “Master the Workflow”.

“Opportunities like these can be a game changer for someone’s career. They have been for mine. There are great storytellers and artists and technical geniuses all over the world. With the PDAP, we want to help some of them achieve things they couldn’t in their current circumstances,” BCPC West VP Chris Visser added.

The Professional Development Accessibility Program, which collects applications from full-time post workers whose income is below the median income for their city of residence, will continue to bolster Hollywood’s efforts to increase inclusiveness among its below-the-line workers, creating a bridge between the industry and the diverse membership of the Blue Collar Post Collective. By bringing new faces to major events, we are reminding the wider industry that all professionals, including low-income earners, have voices of equal value and importance to the post-production community.

Creativity: Innate Talent or Learned Skill?

At the most recent SIGGRAPH (Special Interest Group on Computer Graphics) conference in Los Angeles, I had the opportunity to spend time with a delegation from Shenzhen, China. Shenzhen is a large, modern city of over 18 million people not far from Hong Kong.

At the core of our conversation was the subject of creativity. The Chinese professionals confessed that while the students who graduated from their art colleges were very competent and talented artists and designers, they couldn’t create anything that people wanted to watch! They asked me how they could teach their students to be creative.

Most of us think that you either are creative or you aren’t, that you were born with this talent or it passed you by. But that’s not true. Everyone can learn to be creative, though it requires a couple of key factors and perhaps some new habits.

Rethinking Creativity as a Skill

Did you already know how to read when you were born? Unless you are severely dyslexic, you learned how to read by building the requisite skills. You also probably lived in an environment with books and newspapers, with parents and siblings who read and who encouraged you to read as well.

Encouragement is a key factor in growing creativity. As educator and TED speaker Dr. Ken Robinson says, if you ask a room full of five-year-olds if they are artists, they all say yes. If you ask a room full of adults, few will raise their hands.

Dr. Carol Dweck, a professor at Stanford and one of the world’s leading researchers in the field of motivation, focuses on why people succeed and how to foster success.

The key is mindset: do you have a growth mindset or a fixed mindset? Do you think you are born with creative talents or not? Maybe you can’t draw like Rembrandt, but that doesn’t mean you aren’t creative. Everyone can learn to be creative.

However, if that’s true, why are the Chinese having problems?

In order to enhance your creativity, you need three things:

1) Become your own teacher.

Learn from everyone but explain things to yourself and to others. In many cultures, teachers are the “sage on the stage” and not the “guide on the side.” Only the teacher is an expert, never the student. If you can’t explain something to another person, you have no idea if you really understand it.

2) Be open.

Be open to new ideas and new ways of thinking. In a culture where “the nail that stands up must be pounded down,” openness is not valued. This is why the first rule in improvisation is “yes, and…”: all new ideas need to be considered and expanded upon even if they fail. In fact, failure should be celebrated. Some cultures and workplaces don’t value this trait either.

3) Challenge authority.

You have to ask questions and challenge assumptions. Just because it’s never been done before, or we’ve always done it that way, doesn’t mean there aren’t at least several other, maybe better, ways to solve a problem. Again, if you aren’t encouraged to do these things, your creativity becomes squelched.

So, my response to my Chinese colleagues was to look at what their culture valued. Are people allowed and encouraged to try new things even if they fail? Are they permitted to question authority? Do companies value their employees as the “elevator assets” they are, or are people treated as fungible?

Many in the creative technology industries assume that creativity lies in the tools. But the value is in the artist and the creative use of those tools. For example, I occasionally meet a student in high school who will say “I’ve been using Maya since I was in 6th grade.” My response is always “And what does Maya make for you?” And while they are fumbling for an answer, I pick up my pen. “I’ve been using this pen since I was in elementary school, and I’m still not a novelist. I must be using the wrong pen!”

The tools are getting better and will continue to eliminate some jobs. If your job can be routinized and standardized, then someday soon a machine will do it. So what will you bring to the table? What is it that a machine can’t do? A recent article in Fast Company stated, “The more you’re required to personally help other people, the less likely you’re going to be replaced by a robot. If your job requires negotiation, or a high degree of creativity, there’s also less risk.”

Creativity is a mindset, not a talent. When we understand that creativity is a way of thinking that blends our imagination with the world around us, then true innovation can exist.

_________________________________________

Kathleen Milnes is a member of the HPA Board of Directors and the Assistant Chair of Digital Media at the Otis College of Art and Design in Los Angeles. She spent most of her career in film, television and commercial production, public policy, and workforce development. kmilnes@otis.edu

19 July 2017

Tackling the Format Explosion at the second annual HPA Tech Retreat UK

The HPA Tech Retreat UK is fast becoming a must-attend event for the cognoscenti of professional content production in Europe, as it already is for the US community who meet in Palm Springs each year.

Those who gathered to retreat at Heythrop Park Resort in rural Oxfordshire for three days in July came from locations as diverse as France, Poland, and East and West Coast America.

Unlike at trade shows where most of these senior folk meet, there is a real chance at the event to relax into proper discussions and connect and re-connect with contacts. In addition to the programming, continuous opportunities for conversation and exploration took place at breakfast roundtables, cocktail receptions, lunches, dinners and parties.

Cutting across much of the agenda, and front of mind for everyone from CEOs to engineers, is the explosion in formats and versions needed to serve the international market on ever-multiplying platforms.

Setting the scene, delegates enjoyed an invigorating keynote from Eric Pearson, Home Entertainment Supervisor at Pixar Animation. He explained how, with a team of just seven, they created a remarkable 7,482 new shots for international versions of Cars 3. This picture localisation entails catering to the nuances of culture, which Pixar takes extremely seriously by making artistic changes to frames or whole sequences. For example, it regularly substitutes culturally and linguistically appropriate text in background newspaper headlines to ensure a joke or plot line is followed.

“We’re creating an experience for the Mandarin or Thai speaker so they can be lost in the movie as if it were made natively in their language,” said Pearson. “This dramatically increases the complexity but we think it’s worth it.”

“For the same amount of money you spend on a transcription house you can use machine learning to deliver speech to text, localization and an almost infinite other variety of data tasks – and you end up with richer content,” said Josh Wiggins, CCO, GreyMeta.

Machine trained automated speech to text may not yet be good enough for BBC One, admitted Stephen Stewart, VP, Global Content Operation, BBC Worldwide. “But if you have an opportunity to push content where it’s not economically viable at present and you can inform people about a subject they would not otherwise have seen, then it is worth it.

“Machine Learning is getting there,” he added. “We can expect to see artificial intelligence encroaching more and more on the content creation, production and delivery ecosystem in a very short time.”

Lydia Gregory of Jukedeck demonstrated two music tracks – one composed by human, one by computer – illustrating that the line between art and science is already blurring.

SMPTE Fellow and BBC standards lead Andy Quested chaired a discussion of the format minefield that went into creating BBC Natural History series Planet Earth II.

“If you’re going to the ends of the world you want whatever you do to be futureproofed,” explained producer Elizabeth White. “What we didn’t know then was how it was going to be post-produced, so we recorded with no real knowledge it was going to be finished as UHD, let alone an HDR product.”

This staggeringly complex show was made over four years, shot on at least 16 formats, and accumulated a 400-to-1 shooting ratio.

BBC R&D’s Andrew Cotton explained how, as the series was being post-produced, the broadcaster helped devise the HLG format of HDR in order to serve both legacy and new TV sets with dynamic range.

The event began with a series of expert reports on VR, AR and MR. While there is exciting work being done, the tenor of discussion was that the industry needs to take a reality check.

“Perhaps the biggest problem is that there is no audience for VR yet,” said Zillah Watson, a former current affairs producer, who is now editorial lead on future content and storytelling projects for BBC R&D. “We haven’t got a way of distributing VR to an audience to find out what they want from the experience.”

She said the industry has come a long way in terms of creating hard news programming in 360° since the BBC’s first news experiment from the Calais migrant camps in 2015 —but it was clear that there are still challenges to overcome before VR news goes mainstream.

“360° has been justified by the broadcast news industry as a gateway to VR. It is not. I question if there is any evidence that watching 360° will make a user want to watch on a VR headset. 360° video on mobile or in browsers will not drive people to VR. If we don’t create a good content ecosystem that people want to explore and view and we don’t make headsets better, then the whole thing won’t work,” said Watson.

Evidence that VR can attract a positive response from audiences was provided by BT Sport’s Andy Beale. He shared the background to the live-streamed VR experience for the UEFA Champions League final earlier this year.

“We’re not doing VR just because we can but only if it adds value,” said Beale. “Rather than saturate viewers every week with it we want to keep it as a tool for big occasions.”

One of the best received sessions was a call to action to extend racial and gender diversity across the industry. IBC project manager Jay Sakallioglu moderated a talk with Geoffrey Okol of ITN Productions, multi-cam operator Abigail Dankwa, and BAFTA’s Emma Perry, rejecting tokenism and calling for a pro-active stance to encourage greater range within craft and technician levels. “Diversity is not only common sense; it helps media companies adapt to the fast-paced environment, capturing ideas and delivering on innovation,” said Okol.

The HPA AWARDS: Call For Judges, Creative Categories

The HPA Awards honor the best work from our industry’s finest artists and companies. Are you interested in helping us find award-winning entries? Categories include color grading, editing, sound editing and visual effects for features, television and commercials. Ideally, judges are working in the categories or have expertise in the craft. You’ll join a high-caliber cadre of HPA Awards judges, and your contribution of time and expertise helps make this wonderful show what it is.

Please send us your contact information, credits or IMDb link, and we will make sure there’s no conflict of interest. Judging takes place at facilities in the Los Angeles area, usually entails one evening of commitment, sometimes two, and begins in mid-August.

LOS ANGELES — The Hollywood Section of SMPTE®, the organization whose standards work has supported a century of advances in entertainment technology, will host a demonstration of classic movie sound technology at its monthly meeting, scheduled for Tuesday, July 25, in Hollywood.

The event will include a live performance by Joe Rinaudo, founder of the Silent Cinema Society, on an American Fotoplayer (provided by the Academy of Motion Picture Arts and Sciences). The Fotoplayer is a type of player piano used in movie theaters during the silent era to provide sound effects and music.

Motion picture archivist Bob Heiber will deliver a presentation on restoring magnetic soundtracks from the 1950s. He will also screen sequences from 70mm Todd-AO and CinemaScope 55 productions, including Oklahoma! and The King and I.

“Movie sound has undergone an incredible evolution since the early days of cinema,” said Jim DeFilippis, chair of the SMPTE Hollywood Section. “Our July meeting will provide a wonderful opportunity to experience what movies sounded like before the sound era, as well as when widescreen and stereophonic sound first hit theaters.”

The Hollywood Section of SMPTE® was originally organized as the West Coast Section in 1928. Today, as its own SMPTE Region, it encompasses more than 1,200 SMPTE Members with a common interest in motion-imaging technology in the Greater Los Angeles area. The Hollywood Section offers free meetings monthly that are open to SMPTE Members and non-members alike. Information about meetings is posted on the Section website at www.smpte.org/hollywood.

About SMPTE®

For more than a century, the people of SMPTE (pronounced “simp-tee”) have sorted out the details of many significant advances in media and entertainment technology, from the introduction of “talkies” and color television to HD and UHD (4K, 8K) TV. Since its founding in 1916, SMPTE has received an Oscar® and multiple Emmy® Awards for its work in advancing moving-imagery engineering across the industry. SMPTE has developed thousands of standards, recommended practices, and engineering guidelines, more than 800 of which are currently in force today. SMPTE Time Code™ and the ubiquitous SMPTE Color Bars™ are just two examples of SMPTE’s notable work. As it enters its second century, SMPTE is shaping the next generation of standards and providing education for the industry to ensure interoperability as the industry evolves further into IT- and IP-based workflows.

SMPTE’s global membership today includes more than 7,000 members: motion-imaging executives, creatives, technologists, researchers, and students who volunteer their time and expertise to SMPTE’s standards development and educational initiatives. A partnership with the Hollywood Professional Association (HPA) connects SMPTE and its membership with the businesses and individuals who support the creation and finishing of media content. Information on joining SMPTE is available at www.smpte.org/join.

All trademarks appearing herein are the properties of their respective owners.

Burlington, MA– Avid® (Nasdaq: AVID), a leading media technology provider for the creation, distribution and monetization of media assets for media organizations and individual creative professionals, today announced the findings of the inaugural Avid Customer Association (ACA) Vote. The ACA Vote gave Avid’s preeminent customer community the unique and unprecedented opportunity to directly influence Avid’s future offerings. The findings on emerging technology and new business requirements also provide valuable insights into the media industry’s future plans and challenges in relation to cloud computing/virtualization, IP networking and content delivery, 4K/UHD in mainstream broadcast, multiplatform content delivery, and virtual/augmented reality.

The ACA Vote set a precedent for the media industry by giving ACA members the opportunity to weigh in on their most important requirements and ensure that Avid continues to deliver new or improved offerings that will positively benefit the community, demonstrating a deeper collaboration between Avid and its community. Over 6,500 unique voters from over 4,000 organizations in 109 countries participated in the vote. Spanning the areas of creative applications, workflow solutions and emerging technology, it uncovered what will most significantly impact the future performance and success of Avid’s customer community.

The ACA Vote revealed that the vast majority of media professionals (71.7%) are considering moving some part of their infrastructure or workflow to the cloud over the next two years—the most popular being remote access workflows (15.8%). Just 4.8% are considering moving their entire infrastructure and workflow to the cloud, highlighting the important role that hybrid cloud deployment models will play in the media industry’s journey to the cloud.

A hybrid approach will also be important to the industry’s transition to IP. Just over half of respondents (50.9%) are considering hybrid SDI/IP connectivity for new investments. 26.6% of media professionals are considering IP-only connectivity. Dynamic scalability is the most popular reason for considering IP video/audio (36.6%), followed by new high-bandwidth productions like UHD (28.8%) and format-agnostic workflows (16.3%).

High-resolution media formats are firmly taking hold, with the majority of media professionals (64.6%) expecting to implement 4K/UHD across their organization within the next two years. OTT or internet delivery is by far the most prevalent delivery mechanism for 4K/UHD (50.7%), followed by theatrical/venue viewing (21.6%) and satellite or cable delivery (13.6%). Just 9.9% said terrestrial broadcast is their most prevalent form of 4K/UHD delivery. The biggest challenge to adopting 4K is the burden on storage capacity (31.6%), followed by the cost of adding/upgrading 4K capabilities (30.5%), and the negative impact on the real-time performance of creative apps (24.7%).

While most media professionals (73.3%) are creating content for multiple platforms, less than a third (32.3%) use a single online video platform for social media content distribution. The majority (67.7%) use each social media service’s own online video platform, making content distribution cumbersome and inefficient. The top two drivers for investing in multi-platform content production are reaching new audiences (37.8%) and maximizing audience engagement (37.7%).

While more than half of media professionals (58.4%) said that virtual and augmented reality are important to their strategic growth plan, the vast majority (82.3%) aren’t yet sure which business models to consider, and most (63%) have no plans to implement VR/AR over the next two to three years. The most appealing applications of VR/AR are entertainment (23.1%), live events (21.2%), gaming (20%) and film (19.4%). 15-30 minutes is seen as the ideal length for VR/AR programming (29.4%), followed by 5-10 minutes (25.5%), less than five minutes (18.7%), feature length (16%), and one hour (10.5%).

“The ACA Vote represents a new phase of customer participation in Avid’s future direction, building on the deep community partnership with our customers and users,” said Avid President Jeff Rosica. “I am proud of our community for reaching this exciting milestone and applaud the ACA Executive Board of Directors, who oversaw this process. The results of the ACA Vote will directly influence innovations for the MediaCentral® Platform, the industry’s most open, tightly integrated and efficient platform designed for media, and ensure that the ongoing development of our comprehensive tools and workflow solutions for media creation, distribution and optimization continue to support what is most important to our customers and their creative, technical and business requirements.”

The newest software release for Avid NEXIS is now available to all new and current customers, delivering greater bandwidth for fast, reliable workflows and support for Avid Pro Tools to optimize professional audio production workflows

BURLINGTON, MA – Avid® (Nasdaq: AVID), a leading global media technology provider for the creation, distribution and monetization of media assets for global media organizations, enterprise users and individual creative professionals, today announced the availability of Avid NEXIS®, the world’s first and only software-defined storage platform for media. Powered by the MediaCentral® Platform, the most open, tightly integrated, and efficient platform designed for media, Avid NEXIS and Avid NEXIS | PRO systems now provide the fastest, most efficient and reliable workflows for real-time media production, including highly intensive professional post-production and broadcast environments. With support for Avid Pro Tools®, Avid NEXIS also enables new collaborative shared storage workflows for professional audio production.

Unrivalled performance, scalability, and reliability for every production environment

For larger post and broadcast environments, Avid NEXIS | E4 and E2 enterprise-class storage systems offer greater performance for 4K/UHD, color grading and finishing workflows. New high-performance storage groups (HPSGs) deliver up to 28.8 GB/s of bandwidth in a single Avid NEXIS system, providing the throughput needed to handle full-resolution media for online editing workflows. For smaller environments, Avid NEXIS | PRO professional-class storage provides the industry’s most comprehensive collaborative capabilities while also delivering real-time 4K performance at up to 2.4 GB/s, all at an even more cost-effective price point. Avid NEXIS and Avid NEXIS | PRO also support real-time creative team collaboration using not only Avid Media Composer® but other editorial and creative tools including Adobe Premiere Pro CC, Apple Final Cut Pro X, DaVinci Resolve and more, as well as allowing for easy integration with third-party asset management systems.

New collaborative workflows for audio production

Avid Pro Tools, the industry-standard digital audio workstation, is now qualified on Avid NEXIS, enabling audio creative teams to leverage the industry’s most efficient and powerful media storage environment. With Pro Tools combined with Avid NEXIS, users can share projects on a centralized pool of media storage, turning work around faster by eliminating the time wasted moving files between different systems.

Major new Avid NEXIS features include:

High-performance storage groups, with each media pack capable of data rates up to 50% faster at 600MB/s, providing the performance needed for high-volume 4K/UHD, HD, and bandwidth-intensive media workflows.

Scalability enhancements that double the capacity of Avid NEXIS | Enterprise systems to support up to 48 media packs across a single scale-out networked system. Customers can mix and match a combination of Avid NEXIS | E5, E4, and/or E2 engines, providing nearly 3PB of storage capacity and 28GB/s of high-performance bandwidth.

The new version of Avid NEXIS including support for Pro Tools is now available to new customers as well as current Avid NEXIS owners with an active annual support and software maintenance plan. For more information, visit www.avid.com.

14 July 2017

AMPAS at Work on Next ACES Version

At NAB 2015, the Academy of Motion Picture Arts and Sciences officially launched the ACES (Academy Color Encoding System). Now, says AMPAS Science and Technology Council managing director Andy Maltz, more than two years later, ACES has been adopted by innumerable product manufacturers and used on at least 100 feature films, from Guardians of the Galaxy 2 to Woody Allen’s Café Society. Marvel, Screen Gems, NBCUniversal and Netflix are among the studios that have committed to the standard. The Academy also launched ACESCentral.com, an online forum on which 700+ users discuss and seek support on ACES questions from online mentors.

It’s about time for some changes. “We always said there wouldn’t be a next generation ACES until ACES 1.0 was widely adopted,” says Maltz. “Right around the two-year point of ACES 1.0, it became more than apparent that it was time to start moving towards enhancements and extensions.” After years of serving as co-chairs of the ACES project, Starwatcher Digital principal Jim Houston and RFX president Ray Feeney stepped down, making way for Marvel Studios vice president of technology Annie Chang as incoming ACES chair and HBO director of production R&D Rod Bogart and EFILM vice president of imaging science/technical director Joachim Zell as vice chairs.

Joachim Zell

“Annie, Rod and I are all using ACES as a tool in our day-to-day production,” says Zell. “So we will also be able to talk about what does or doesn’t work, and guide it in the right direction to make it an end-to-end system.”

The first version of ACES has largely been a success. “Everything worked the way we expected it to work,” says Maltz. But, despite the fact that nothing in ACES 1.0 is “broken,” Maltz and the ACES project team became increasingly aware that some portions of the standard hadn’t been as widely adopted as others, and that some tweaks and enhancements were required for ACES to reach its ultimate destination. “Our goal is to get all six major studios to declare publicly that all deliverables should come in as ACES deliverables,” says Maltz.

As the first step, says Zell, they identified 15 different constituent groups, including the major studios, post production houses, DITs, cinematographers, VFX professionals, colorists, manufacturers and producers. Zell reveals that they have just conducted a meeting with Disney, Fox, Paramount, Sony, Universal and Warner Bros. “We all met in one room, and although it is too early to talk about what we’ve discovered, I can say that the studios are the ones who benefit most from a common standard in terms of look and color management, so we expected they would give positive feedback,” says Zell. “The studios definitely support the mission the Academy is going for by inventing ACES and bringing it to the market.”

Maltz enumerates aspects of ACES 1.0 that need to be tweaked. First is adoption of ACES’ metadata file, dubbed ACES clip. “You can use that metadata carrier to better communicate how to reproduce the colors,” says Maltz. “We’re adamant about that; it’s required for archiving. This has to happen for people to be able to get what they want.” The ACES team is also looking at the Look Modification Transform (LMT), an easier implementation of custom looks. Third is the Common LUT. “To communicate a look you need a standardized version,” he says. “We didn’t anticipate a programmatic or algorithmic description of a look, and one new requirement is that people need to use the algorithmic description, like shader language.”

Zell adds that the process of interviewing groups impacted by ACES will provide a roadmap for going forward. “The outcome will help us understand where ACES is at the moment and where people want it to go,” he says. “What we will have learned in the discovery phase will help us get to ACES Next or ACES 2.0.”

“It was a high-octane crew,” says Ticknor. “As a team, we blended well, everyone contributed and the ideas flowed seamlessly.”

The Dolby Atmos mix was completed on Sony Pictures’ William Holden Stage.

Although Spider-Man is well known to movie fans around the globe, Homecoming introduces several new elements to the character. Notably, Spidey gets a new suit, designed by Tony Stark, that is outfitted with a variety of sophisticated tech including a detachable drone. “It was an interesting challenge to create the sound for the drone,” recalls Norris. “Steve came up with the idea of using a toy noisemaker that produces a high-pitched whistle. We used that and a palette of other effects to give the drone its personality.”

Later in the movie, Peter dons an older Spider-Man suit that he had created himself. It required a slightly different sound treatment. “Jon Watts wanted the web coming from the old suit to sound a little less modern than the web that shoots from the new Stark suit, which is hot and cool,” notes Ticknor. “We ordered a couple of 5,000-foot rolls of magnetic tape and let them unravel. They created a whooshing sound that became our old-school web.”

One of the most intricate blends of sound effects was applied to the winged suit worn by supervillain Vulture. The massive device changes over the course of the film, acquiring new features and becoming more menacing. The team employed a mix of mechanical sounds for its metal feathers and wings, and jet turbines for the roar of its engine. “We used samplers to stack sounds together and shape them to create a sense of movement that mirrored the action on the screen,” explains Lamberti. “It was a lot of fun!”

Hecker spent much of his time recording Spider-Man’s signature foot, body and suit movements, featured throughout the film. His task was anything but routine. “Jon Watts wanted Spider-Man to be stealthy, ninja-like,” Hecker notes. “He’s light on his feet, acrobatic. In one scene, he climbs up the side of a building, opens a window and enters a house. He’s upside down on the ceiling, crawling on his hands and feet. That’s all Foley.”

Hecker used a special neoprene shoe to recreate Spider-Man’s light footfall. “I wanted to capture his character through his footsteps and body movements,” he explains. “He’s sometimes moving very fast, sometimes fighting, sometimes sneaking. It’s important to convey the emotion of the scene through movement.”

Hecker also worked with Norris, Ticknor, O’Connell and Lamberti on custom sounds for Vulture’s winged suit.

During mix sessions, O’Connell and Lamberti blended thousands of custom sounds with dialogue and music to produce the finished soundtrack. “The sound editorial team was very well organized and that made it easy to swap things out and get it to picture,” says Lamberti. “The mix went very smoothly. Kevin, who’s a legend in the business, had a great overview of the entire soundscape and kept an eye on the big picture. The finished mix is clean and articulate. You can hear everything; nothing is overwhelming or underdone.”

O’Connell credited Norris, Ticknor, Lamberti and Hecker with providing an abundance of technical and organic sounds that help bring the world of Spider-Man to life. He also offered high praise for composer Michael Giacchino. “He did a fantastic job with the score; it was well-balanced and right on the money,” O’Connell says. “It was a dream come true for a mixer.”

“With so much great visual material to work with, we could focus on Jon Watts’ vision in delivering an experience that audiences will remember for a long time,” O’Connell adds. “Spider-Man: Homecoming is more than an action film. There were scenes where we could have gone crazy with sound effects and music, but, instead, we did our best to stay true to the story and keep the focus on Peter with respect to the world around him.”

Blackmagic Design recently released Blackmagic Camera Control, a free iPad app that lets customers remotely control their URSA Mini Pro cameras via Bluetooth, along with Camera 4.4 Update for URSA Mini Pro cameras. The new Blackmagic Camera Control app is based on the open protocol for URSA Mini Pro cameras that Blackmagic Design demonstrated at NAB earlier this year.

Camera 4.4 Update can be downloaded free of charge from the Blackmagic Design website. Once installed, customers can download the Blackmagic Camera Control iPad app from the Apple app store.

All URSA Mini Pro cameras feature built-in Bluetooth connectivity, which until now has not been enabled. The built-in Bluetooth allows customers to send and receive commands from up to 30 feet away. Once the camera is paired with the iPad, customers can remotely power the URSA Mini Pro on or off, change all major settings, adjust and add metadata using a digital slate, and trigger recording. The Blackmagic Camera Control app is perfect for customers who need to control cameras in hard-to-reach places, such as on cranes or drones, or in underwater housings.

To make URSA Mini Pro’s Bluetooth support even more flexible, Blackmagic Design has developed a new, open protocol and is publishing a developer API, along with sample code, for customers that wish to build their own camera control solutions. This free API and sample code will be available later this summer.
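The forthcoming API builds on the compact binary message framing Blackmagic has already published for camera control: a small header (destination, payload length, command ID) followed by a category/parameter/type/operation payload, padded to a 32-bit boundary. As a rough sketch of the kind of message a custom control solution might assemble, the helper below packs a “set focus” command. The field values follow the publicly documented protocol at the time of writing, but they are illustrative only and should be checked against the final API and sample code when released.

```python
import struct

def ccu_set_focus(normalised: float, destination: int = 1) -> bytes:
    """Pack an illustrative 'set focus' camera control message.

    Field values (category 0 = lens, parameter 0 = focus, type 0x80 =
    5.11 fixed-point) follow Blackmagic's published camera control
    protocol document; verify against the official API before use.
    """
    # Focus value as signed 5.11 fixed-point, little endian
    data = struct.pack("<h", int(normalised * 2048))
    payload = bytes([
        0x00,  # category 0: lens
        0x00,  # parameter 0: focus
        0x80,  # data type: fixed16 (5.11 fixed point)
        0x00,  # operation: assign value
    ]) + data
    # Header: destination, payload length, command ID 0, reserved byte
    header = bytes([destination, len(payload), 0x00, 0x00])
    packet = header + payload
    # Pad the whole message to a 32-bit boundary, as the protocol requires
    packet += bytes(-len(packet) % 4)
    return packet

print(ccu_set_focus(0.5).hex())
```

A real client would write a packet like this to the camera's Bluetooth characteristic once paired; the transport details are exactly what the upcoming developer API and sample code are meant to cover.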

In addition to the Blackmagic Camera Control app, Blackmagic Design has also released Camera 4.4 Update, which enables Bluetooth functionality and adds new preset timecode options on URSA Mini Pro cameras. The update also adds compatibility with the Canon 18-80mm T4.4 lens for iris, focus and record trigger control, along with improved EF, PL and B4 support, improved digital slate functionality, and improved zebra stripe overlays on URSA Mini 4K cameras.

“URSA Mini Pro has become incredibly popular because of its amazing image quality combined with broadcast features and controls, built in ND filters and interchangeable lens mount,” said Grant Petty, Blackmagic Design CEO. “The new Blackmagic Camera Control app and open API means that the possibilities are truly endless. Customers are going to have the tools they need to build completely custom remote control solutions of their own design using the new Bluetooth support!”