Food cold chain technologies such as computer vision, artificial intelligence and data analytics are tracking fresh produce from farm to truck to store in an effort to reduce food waste.

The moment a strawberry is picked in the field, it begins to decay. From there, it’s a race to deliver it fresh to the consumer. This was easier a few generations ago when most people worked in agriculture and lived close to food production. Distributing food today is more complex as more consumers rely on supermarkets to get food.

Today, delivery of perishable food relies on what’s known as the food cold chain, a vastly complex system for moving food from farm to fork while maximizing the quality and longevity of crops.

“The goal is straightforward and pretty simple,” said Dan Hodgson, a partner with Linn Grove Ventures, a Fargo, North Dakota-based agricultural venture capital group. “The environment around that crop, whether it’s on a truck or a plane, has to be just right all through its journey — it’s in the journey where it gets complex.”

Keeping strawberries at the right temperature, humidity and airflow is only one problem. The fruit also needs to be delivered to the right markets in the right amounts on the right days, where it will actually be purchased.

The food cold chain keeps strawberries fresh from farm to store to table.

“Managing quality means managing many different people at each stage of distribution and the different speeds it happens at,” Hodgson added.

“Sensors and cloud computing are helping to really get a handle on it.”

That’s where a smartphone app for food inspectors can help, according to AgShift, an agtech AI firm in Santa Clara, California. The company uses algorithms to assist with food inspection at different stages of distribution.

“Let’s say we’re looking at 20 strawberries,” said Miku Jha, CEO of AgShift. “Two different inspectors might come back with two different results. What the technology does is to help those inspectors make more objective observations.”

Photographing produce and sending the pics to the cloud for analysis allows AgShift to leverage computer vision and deep learning algorithms to assess the quality of produce each time it’s inspected on its journey.

“Digitization and automation really make an impact on efficiency,” Jha said.

More accurate inspections give sellers better insights into the shelf life and pricing of specific shipments of produce. Knowing the quality of each box of strawberries — and other perishables — serves as a baseline for many kinds of decisions in the food cold chain.
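AgShift’s models aren’t public, but the downstream step, turning per-item defect scores from a vision model into a single objective grade for a sample, can be sketched in a few lines. The `grade_sample` helper, its thresholds and its grade labels below are hypothetical illustrations, not AgShift’s actual algorithm.

```python
# Hypothetical sketch: converting per-berry defect probabilities (as a cloud
# vision model might return them) into one objective quality grade.
# Thresholds and grade labels are illustrative assumptions.

def grade_sample(defect_scores, reject_threshold=0.5):
    """defect_scores: one probability-of-defect per inspected berry.
    Returns (defect_rate, grade)."""
    if not defect_scores:
        raise ValueError("empty sample")
    # Count berries the model considers defective.
    defective = sum(1 for s in defect_scores if s >= reject_threshold)
    rate = defective / len(defect_scores)
    # Map the defect rate onto a simple letter grade.
    if rate <= 0.05:
        grade = "A"
    elif rate <= 0.15:
        grade = "B"
    else:
        grade = "C"
    return rate, grade

# A 20-berry sample with one clearly defective berry.
rate, grade = grade_sample([0.02, 0.9, 0.1, 0.04] + [0.03] * 16)
```

Because the same thresholds apply to every inspection, two inspectors photographing the same 20 berries would get the same grade, which is the kind of objectivity Jha describes.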

Tech Before Planting

The food supply chain starts well before seeds are planted in the ground. Produce farmers map out every inch of their fields with GPS technology, identifying in advance the factors that could affect planned crops.

Taylor Farms uses a conveyor system to package lettuce more quickly. Image courtesy of Taylor Farms.

“We have to predict literally a year in advance what our needs are going to be because we contract for all of our lettuce and broccoli acreage,” said Bruce Taylor, CEO of Taylor Farms in Salinas, California.

“If we lose a customer, we’ve got too much lettuce. If we gain customers too fast, we run short. And so it’s a huge planning exercise.”

Taylor said the planning is so precise that the operation plants its greatest volume of head lettuce a few months ahead of the biggest salad day of the year in the United States: Mother’s Day.

“It’s important we nail the harvest on that one,” said Taylor.

Taylor Farms is known for being an early adopter of technologies like computer vision, cloud computing and robotics in the field, especially its move toward automated harvesting.

Self-Driving Food Distribution

Could driverless trucks one day deliver produce? “It would be a wonderful thing,” said Steve Geraci, director of fleet operations at Certified Freight Logistics, a company that provides produce distribution for Costco in Santa Maria on the California central coast.

Not only would it increase efficiency but it also would address labor shortages.

“Recruiting drivers is such a nonstop issue for us because over-the-road trucking is a lifestyle and it’s not for everybody,” he said.

Geraci points to governmental policy decisions that could take years to sort through before food distributors get into driverless truck technology. However, there’s a lot to be said for the advancements in the refrigeration units on Costco’s trailers.

“They have a lot of computing power,” he said. “They’re constantly monitoring the engine, the temperature, moisture and the inflow and outflow of the air circulating in the box.”

The information is transmitted to the cloud where the current status of the unit can be accessed in real time while the truck is on the road.
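The kind of reading a trailer’s refrigeration unit might push to the cloud can be sketched as a simple JSON payload. The field names, setpoint and alert rule below are illustrative assumptions, not any vendor’s actual telemetry format.

```python
import json
import time

# Illustrative sketch of reefer-unit telemetry. The 40 F setpoint and
# 4-degree tolerance are hypothetical values for demonstration.
SETPOINT_F = 40.0
TOLERANCE_F = 4.0

def build_payload(unit_id, temp_f, humidity_pct, airflow_cfm):
    """Package one sensor reading as JSON, flagging out-of-range temps."""
    alert = abs(temp_f - SETPOINT_F) > TOLERANCE_F
    return json.dumps({
        "unit": unit_id,
        "ts": int(time.time()),
        "temp_f": temp_f,
        "humidity_pct": humidity_pct,
        "airflow_cfm": airflow_cfm,
        "temp_alert": alert,
    })

# A reading that drifts 6.5 degrees above setpoint triggers an alert.
msg = build_payload("trailer-17", 46.5, 88.0, 1200)
```

Once a payload like this lands in the cloud, a dispatcher can see the unit’s status in real time while the truck is on the road.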

Food Cold Chain and Solving Waste

“We grow and produce enough food on the planet to feed 10 billion people,” said John Mandyck, chief sustainability officer at UTC and co-author of Food Foolish: The Hidden Connection Between Food Waste, Hunger and Climate Change.

“We live on a planet of 7 billion and only about 6 billion are getting enough food so that, right there, tells us that we have a 40 percent inefficiency in our system, where about 40 percent of our food never makes it from our farm to our fork.”

Solving food waste through greater cold chain efficiency would not only help meet the nutritional needs of the world’s population by 2050 but would also, according to Mandyck, address significant carbon emissions.

“If you measured that as a country, food waste would be the world’s third largest emitter of greenhouse gases,” he said.

Mandyck envisions reducing food waste by introducing basic food cold chain technologies to parts of the world, such as India and Africa, where it’s not unusual for half the harvest of a crop like tomatoes to be lost in shipping.

Using the internet of things, farms can better manage food waste. Image courtesy of New Holland.

“If those farmers had access to capital to use a small refrigerated truck, they could get 100 percent of the tomatoes to market, get better return on their investment and feed more people,” said Mandyck.

Mandyck’s team has helped design a simple truck cooling system for India and China, where the technology can get food to those underserved. While these aren’t the intelligent cloud-synced trucks UTC develops for the North American and European markets, these lower cost refrigeration trucks maintain 40 degrees Fahrenheit, which can prolong the life of perishable foods.

Perhaps the biggest waste of food happens in the richest countries where farmers sometimes let crops rot because they can’t get a profitable price, supermarkets discard surplus produce, and consumers buy too much food and end up tossing it.

“What we’re doing is helping to connect the dots with people who are focused on this,” said Kevin Fay, executive director of the Global Food Cold Chain Council. “What we found is there are a lot of programs on food waste, but not a lot of them are working together.”

Fay believes that solving food waste is a huge deal, one that unlocks solutions to other major global issues.

“Seventy percent of our fresh water supply around the globe is used for agriculture,” said Fay. “If we’re wasting 40 percent of the food we grow that means we’re wasting 30 percent of our fresh water supply.”

New technologies like cloud databases and AI can help cut food waste. Fay points to consumer apps such as YourLocal, which alerts users about supermarkets offering discounted food that could otherwise be discarded.

“By connecting the dots, you’re able to feed more people, reduce greenhouse gas emissions, reduce waste of fresh water and improve the economic lives of society around the world,” said Fay.

Future in Farm Data

Time-to-destination is one variable that many companies like to optimize in the supply chain, and even more so in the food cold chain, said Giby Raphael, director of logistics and asset management solutions at Intel’s Internet of Things Group.

Data is a key driver in improving the food cold chain.

Making faster airplanes, ships and trucks only helps to a certain point, said Raphael. Feeding more into the pipeline helps achieve speed, but at the expense of excessive wastage and cost.

Thus, a holistic approach to optimization is needed and the ability to extract high quality data from the end-to-end supply chain is key to reducing food waste.

“There needs to be a myriad of technologies and standards that come together to make this possible,” he said. “The technology is enabling cheap, granular and high quality data through the ‘internet of things.’”

Sara Menker, CEO of New York-based Gro Intelligence, takes a similar global view in using data to decrease food waste.

“What excites me is making data about the food ecosystem a utility,” Menker said.

She said information is one of the most powerful assets in farming, distribution and marketing. Yet getting access is one of the biggest challenges. The information exists — yet there simply is no central portal to get at it cost effectively.

Menker’s organization takes large amounts of global agricultural data on the entire food ecosystem and cost effectively synthesizes it for users, including consulting firms, hedge funds, governments and other groups.

Hodgson of Linn Grove Ventures agrees that information is the key driver in improving the food cold chain.

“The more I know about the soil and its conditions, the health of the crops while they’re growing, the condition of a crop after harvest when it’s on a truck — all of it means the higher quality product I can bring to market.”

Best of Drone Cinematography 2017

In its third year, the drone film festival spotlights the rise in quality aerial filmmaking techniques around the world.

Inside the dark, historic Roxie Theater in San Francisco, the distinct buzz of a drone drew all eyes upward. A dozen audience members instinctively threw their arms up to form a circle, instantly creating a racing obstacle course for the tiny LED-lit drone zooming around inside the crowded movie house.

It was an unplanned, attention-grabbing warm-up act well suited to kick off the third annual Flying Robot International Film Festival (FRiFF), which brought together a diverse, fun-loving audience to experience the best in drone cinematography from around the world.

FRiFF 2017 brought the best in drone films to the Roxie Theater.

Twenty-five films from a dozen countries were screened on Nov. 16. The night’s big winner was 2D Run – Mixed Motion Project, which received Best of Show, Audience Choice and Best of WTF LOL category awards.

More than any other mini film screened that night, 2D Run had people swaying, ducking and jumping out of their seats to follow the protagonist: a daredevil, stop-at-nothing parkour runner. Like a real-life cross between Super Mario Bros. and Jungle Run, the hero jumped from roof to roof through cities, towns and decaying buildings, then hustled through lush landscapes into the sea and back at an unrelenting pace.

“Filmmaking techniques and pilot skills have really matured,” said FRiFF founder Eddie Codel. “The perspective of that piece is top down. I think it comes off so well because of the unique perspective, and how the filmmakers use that to their advantage during unexpected moments when the actor is seen running up and down the side of a building.”

Each year, Codel brings together film festival fans, drone enthusiasts, filmmakers, tech industry workers and students fascinated by drones and visual storytelling. This year, a festivalgoer brought a Tiny Whoop, one of the smallest FPV drones, and entertained the waiting crowd with a few impromptu aerial performances.

Codel also saw a more spontaneous and daring approach to drone filmmaking, one at least one judge called controversial, in Flight of the Year by Paul Nurkkala, which took Best FPV Racing and Freestyle.

“That film is one single take conducted by an extremely talented FPV pilot navigating an obstacle course in and around a moving train,” said Codel. “There are at least a dozen moments in that film where the pilot could have lost control of his racing drone and his film would never have been made. It’s a testament to his skill as a pilot.”

PlayerUnknown’s Battlegrounds Reaches Esports Major League

The hit game known as PUBG gets its professional esports debut at Intel Extreme Masters Oakland.

Cole Robbins usually stays up late playing games on his computer, but in early November he was struck by insomnia. The next day, he left his small Midwestern town to compete in his first professional gaming competition: the 2017 Intel Extreme Masters (IEM) in Oakland, California.

With $200,000 in prizes up for grabs, Robbins was one of 80 athletes competing Nov. 18-19 at Oracle Arena, home of the defending NBA champions, the Golden State Warriors.

Robbins’ team, the Miami Flamingos, is part of esports history, competing in the first ESL tournament to feature the runaway hit game PlayerUnknown’s Battlegrounds (PUBG). Although still technically in beta, PUBG has taken the internet by storm, selling more than 20 million copies en route to the big stage.

Competition was fierce at IEM Oakland. Photo credit: Carlton Beener.

“With PUBG, we are seeing the birth of a new esport,” said Michal Blicharz, vice president of Pro Gaming at ESL. “This is going to be the first competitive competition for PUBG that takes place with professional teams.”

From Beta to Big League

To transform PUBG into a professionally played game, the players and organizers had to overcome a few hurdles.

Since it attracted more online players than any other title in history, PUBG’s developer Bluehole has been scrambling to satisfy an audience hungry for high level competition. At present, the game lacks many of the basic features expected from professional esports, including leaderboards, ranked competitive games and a matchmaking system that pits players against others with the same skill level.

Those omissions haven’t deterred fans. Players who wanted a competitive experience forged ahead on their own. Soon after the game launched in March, a group of serious players formed the PUBG EXP. Community, a Discord channel that gives PUBG enthusiasts a way to form teams, connect and compete. Many of the teams playing at IEM Oakland, including the Miami Flamingos, formed there.

An esports team psychs themselves up at IEM Oakland. Photo credit: Carlton Beener.

Organizers of IEM, an event that has shaped esports for the past 12 years, watched PUBG quickly grow from a fun pastime into an enormous community of players.

“At IEM, we are trying to balance the historical context of esports with the up-and-coming games,” said Tim Hansen, IEM’s program manager. Hansen said that even though PUBG is unproven as an esport, the excitement surrounding the game was impossible to ignore.

Competition is Crucial

Hosting a PUBG stadium event is logistically challenging. In the game, up to 100 players compete simultaneously on a battleground that continually shrinks. IEM Oakland featured 20 teams of four players each, along with 12 referees overseeing the matches.

And there’s even more action elsewhere at the event. Alongside a tournament for perpetual mainstay Counter-Strike: Global Offensive, the IEM Oakland event also features a regional qualifier for the VR esport The Unspoken, made by Insomniac Games, as part of the Intel and ESL’s VR Challenger League.

“We’ve never had a match this big in IEM history,” said Blicharz. “We’ve never had 80 PCs all on the floor. This is obviously unprecedented.”

Fans will get to watch up to eight different camera views of PUBG action. Photo credit: Carlton Beener.

Because esports are a spectator sport and the PUBG match will have 80 players skirmishing on a perpetually shrinking battlefield, the audience presents another potential headache for organizers.

“This by far is the most complex game to observe across all of esports,” said Blicharz. “Each match contains 20 different stories of individual teams fighting to survive.”

The broadcasting crew has been practicing its live camera coordination for months. Blicharz said that they will have up to eight different people capturing the action and fusing together the footage to make the match presentable to a live audience.

He said “observers” will operate invisible cameras inside the game, while the director calls out the parts of the battlefield to highlight. As in professional football or basketball broadcasts, the crew can show a wide array of views, from third-person dioramas to battles seen from a player’s point of view.

The athletes are dealing with their own challenges too: Running through grueling marathon practice sessions every day isn’t easy. But in those days and nights leading up to IEM Oakland, Robbins’ mind was drawn deeply into team strategies.

“We practice all day long,” he said. “We’ve come here to do our best.”

Functional Safety Will Protect Passengers in Autonomous Cars

The same processing elements built into industrial machinery, robots and drones will help keep passengers safe in autonomous cars.

When people think about autonomous cars, their most pressing concern is safety.

A 2016 MIT study found that nearly half of people say they would never purchase a fully autonomous vehicle. Among their reasons: They don’t trust that autonomous vehicles will be safe. They wonder: Computers glitch all the time. What happens when the driver behind the wheel is digital rather than physical?

Here’s a fact that might calm some nerves: Unlike human drivers, computer drivers will be able to recognize mistakes before even making them.

Known as “functional safety,” this capability is one of the underlying principles that make autonomous driving safer than manual driving, according to Riccardo Mariani, chief functional safety technologist in Intel’s Internet of Things Group.

“Autonomous vehicles make decisions independently of humans,” said Mariani. “That’s why it’s vital to architect functional safety from day zero.”

Functional safety is built into autonomous cars from day zero.

The designers and engineers behind countless everyday objects employ functional safety to protect consumers from glitches of all sorts, according to Mariani. It keeps stoves and furnaces from getting so hot they start fires; it keeps automatic doors from slamming shut when people walk through; it turns off the lawnmower when it tips over.

“It’s not just what is visible — it’s what is invisible,” Mariani said about safety features.

In autonomous vehicles, functional safety will ensure that the car brakes when there’s a cat in the middle of the road, but not when there’s a piece of trash blowing across it, Mariani said. It makes certain that the car doesn’t speed up instead of slow down in a traffic jam, or suddenly go into parking mode when driving on the highway.

It’s like the airbags in a traditional vehicle, Mariani said. “You don’t want the airbag to explode in your face if there is not an accident.”

Banishing Bugs

Modern cars — including semi-autonomous and fully autonomous vehicles — each have hundreds of millions of lines of code controlling everything from braking and accelerating to steering and signaling.

What cautious consumers might not realize, however, is that human drivers also have bugs.

In fact, human error causes 94 percent of serious car crashes, according to the National Highway Traffic Safety Administration (NHTSA), which says autonomous vehicles could save the lives of more than 30,000 Americans who die in motor vehicle crashes every year.

For that reason, it argues, safety isn’t why people should put the brakes on autonomous driving — rather, it’s the reason to step on the gas.

Like NHTSA, automakers are accelerating toward autonomy. And with manufacturers such as Ford and BMW promising to have fully autonomous vehicles ready by 2021, skittish consumers may have to get comfortable sitting in the backseat sooner than they think.

“When you have an electronic system, you want it to do what you expect it to do,” Mariani said.

“So, you put intelligence inside it that helps it understand, ‘Oh, I’m getting ill, and since I’m getting ill, either I’m going to shut down or I’m going to do something else.’ Autonomous cars will save a lot of lives if we keep them safe so they can accomplish their mission.”

Holistic Safety

What makes functional safety so effective in autonomous vehicles, Mariani said, is that it takes a holistic approach to occupant protection.

Consider a house that’s located on a fault line, for example. If it’s designed and built from the ground up with seismic reinforcements spanning the entire structure, it will likely stand up to earthquakes. However, if protective measures are applied piecemeal post-construction, it probably won’t.

The same principle holds true in autonomous vehicles, which are strong because safety is baked into them. It’s part of the cake, not the icing, Mariani said.

Although automakers employ rigorous testing to debug vehicle hardware and software in the first place, incorporating functional safety into self-driving cars from day zero is imperative. In addition to adopting a specific lifecycle that follows each step of hardware and software development, vehicles need to be equipped with mechanisms that perpetually self-test to ensure they’re working properly, even during random failures.

“Functional safety in hardware and software is necessary because it provides the foundation of safety. But it is not sufficient by itself,” Mariani said.

“We must add on top a multi-agent safety that can determine the kind of maneuvers the car will make and the kind of sensing errors that could lead to accidents.”
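The perpetual self-test Mariani describes can be illustrated with a known-answer check: the unit computes something whose correct result is known ahead of time, and falls back to a safe state when the answer is wrong. The state names and checksum values below are invented for illustration and are not part of any real vehicle platform.

```python
# Minimal sketch of a functional-safety self-test: a known-answer check
# that degrades the system to a safe state on failure. All names and
# values here are hypothetical.

SAFE_STATE = "limp_home"

def self_test(compute_checksum, expected):
    """Run a known-answer test; return True if the unit is healthy."""
    return compute_checksum() == expected

def next_state(healthy, current="normal"):
    """A healthy unit keeps operating; a faulty one falls back."""
    return current if healthy else SAFE_STATE

# A unit whose checksum matches keeps driving normally.
ok = self_test(lambda: 0xBEEF, 0xBEEF)
state = next_state(ok)

# A unit whose checksum is wrong switches to the safe fallback mode.
bad = self_test(lambda: 0xDEAD, 0xBEEF)
fallback = next_state(bad)
```

In a real vehicle this loop would run continuously in hardware and software, which is the “I’m getting ill, so I’m going to shut down or do something else” behavior Mariani describes.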

International auto industry standards dictate that fully autonomous vehicles must detect or safely manage 99 percent of faults that could potentially threaten occupants’ safety, according to Mariani, a key author of the ISO 26262 functional safety standard.

“It is about ensuring we can control the gap between the reality designed by engineers and the ultimate reality,” he said.

The most important question about future cars isn’t: Who will be driving? Rather, it’s: Will they keep people safe?

Thanks to functional safety and multi-agent safety, Mariani said, consumers can feel confident that they will.

Esports Get Physical in VR Challenger League

A new breed of cyber-athlete is breaking ground in VR esports, proving not all video games are played sitting down.

Advancing in the Virtual Reality Challenger League (VRCL) takes more than excelling at video games. At the North American regional qualifier last month in San Jose, winning required something unique: a high tolerance for pain.

Will Romero, who plays for the team Dyskovr, injured his shoulder while hurling a virtual disc in the game Echo Arena, a three-on-three multiplayer game that’s played like hockey in zero gravity. Romero’s teammate Kevin Douglas once stubbed his finger so hard that he felt lightheaded and missed practice. Another teammate, Dave Fox, suffered a soft tissue injury under his rib cage from swinging his arms too hard.

“The physicality on the stage was exciting,” said Lisa Watts, who leads Intel’s involvement in the VRCL. “Players were putting their whole body into it.”

For VR esports, player injuries like these are just growing pains in the VRCL’s sprint to bring esports to new players and audiences. While traditional esports are played with a keyboard, joystick and/or mouse, virtual reality (VR) uses motion-tracking technology that involves the full body.

Games like Echo Arena require a new kind of cyber-athlete that could change esports forever.

“The technology is early, but we’re laying the foundation,” said Stuart Ewen, the senior innovation manager at the ESL Gaming Network, which, together with Intel and Oculus, is producing VRCL in five cities around the world.

“This tournament is just the beginning.”

VR Esports Take Shape

VR esports are already making waves. Eight of the best Echo Arena teams squared off during the annual Oculus Connect event this year in pursuit of bragging rights and a $20,000 purse.

[Read: Esports Athletes Get Ready to Rumble in VR Challenge]

VR esports players get physical in this new breed of active games.

Elsewhere at the event, demos of new VR games were available for the audience to play — many of them tailored for esports. Ubisoft’s Space Junkies, for example, crosses laser tag with jetpacks.

Echo Arena may be the perfect game to usher in this new era. In a three-versus-three match, teams compete for possession of the virtual disc, attempting to toss it into the opposing goal. It closely resembles sports like soccer, another great spectator sport. Players jump, crouch and extend their bodies, performing acrobatic shots and behind-the-back passes while their avatars respond accordingly inside the game.

For a long time, pairing VR and esports didn’t seem likely. Standalone VR headsets are still growing the install base necessary to support a larger community. Moreover, multiplayer VR games are still new, lacking the years of refinement of an esport juggernaut like League of Legends.

That’s why professional leagues like VRCL are so important for taking these exciting performances onto a larger global stage. After electrifying San Jose, the VRCL traveled to Hamburg for the European qualifier in Echo Arena, drawing more than 37,000 online fans. Next year, the winning teams from both tournaments will clash at the Intel Extreme Masters (IEM) World Championship in Katowice, Poland.

“There’s a lot of interest in the VR esports space among players, technology creators and game developers,” said Watts. “As we look to potentially expand the league into next season, our goal is to compete on the big stage alongside larger esports.”

VR esports players competed in Echo Arena at Oculus Connect in San Jose.

As VR esports go forward, Watts said that the tournament will strive to strike the right balance between physicality and finesse. The goal is to attract players interested in VR esports and players who come from a more traditional gaming background.

Watts said that VR esports could welcome a broad audience, pointing out that “Echo Mom” Sonya “Hasko” Haskins, a mother of five children, competed in the San Jose event.

Each event generates massive amounts of data that Watts and organizers can use to both better understand how people navigate virtual environments, and to develop new VR experiences and technology.

VR Challenger League Competition

The San Jose competition came to life on a giant screen that displayed nine different views of the matches, as a crowd of approximately 1,000 people snacked on nachos during stops in the action. The event was also streamed on Twitch and Facebook Live.

“We practiced our butts off to get here,” said the formerly injured Fox, who was squeezing a pair of hand strengtheners while waiting for his match to begin. “This is our chance to be part of something big.”

During the final round of the tournament, the top-seeded team Eclipse put on a devastating offensive performance.

“We run up the score, we play as hard as we can and we don’t take it easy on anybody,” said Kerestell Smith, the team’s captain, before the match.

Eclipse regained control in the second game, but with less than a minute to go, Phangasms rallied with a fast break for a chance at the tying score. The whole crowd tensed. When the shot missed, the room let out a collective sigh.

The three Eclipse teammates dropped their VR equipment and draped their arms around each other’s shoulders in a victory stance. Before hoisting the trophy, they staggered center stage for a post-game handshake, their bright blue jerseys dark with perspiration.

“You can’t be afraid to sweat,” said Smith, who trains at the gym three times a week with power lifting and cycling. “Just playing the game is not enough.”

ComplexCon Celebrates Technology and Creativity

ComplexCon 2017 is a wrap, but the second annual world’s fair for technology, fashion and music left a lasting impression.

The internet came to life as roughly 50,000 creators and consumers flocked to the Long Beach Convention and Entertainment Center for the second annual ComplexCon.

Equal parts convention and music festival, the event lived up to its billing as a world’s fair for today’s generation. Interactive experiences, musical performances and countless retail activations filled the center.

Attendees may make the trek to Long Beach for the music or fashion, but cultural icon and ComplexCon founder Marc Ecko asserts that technology is what powers the entire experience.

“Technology is the subtext of everything — it is woven into nearly every creative expression,” he said. “It’s the oxygen that our creative culture breathes.”

Nothing Conventional About This Convention

The difference between ComplexCon and a traditional convention was apparent the moment attendees set foot in the arena lobby. There they came face-to-face with a massive video wall featuring ComplexCon host committee member Takashi Murakami’s custom motion art.

With neon signage and graffiti art decorating the venue, live music courtesy of an A-plus lineup of hip-hop artists drew the crowd to the Pigeons and Planes stage.

Skateboarders showed off their skills on waist-high ramps, while ballers shot hoops on the convention’s half court. Tattoo artists used their talents on attendees looking to go home with a permanent memento, while those seeking a more temporary style update headed over to the pop-up beauty salon.

The bright, bold cartoon art of tokidoki stood in stark contrast to the black-and-white motif of the Adidas booth and the rich wooden floors and walls of the G-Star RAW installation.

Meanwhile, Intel’s neon-dominated installation transported attendees to a bustling street in Tokyo, weaving cutting-edge technologies into a single interactive exhibit.

The Intel space was defined by two walls featuring the art of Argentinian motion graphics designer Esteban Diacono and anime-influenced Los Angeles-based graphic designer and artist McFlyy. Intel RealSense technology enabled attendees to trigger animations on both walls using gestures, such as a simple thumbs-up.

Immersing Linkin Park Fans

Those interested in dipping a toe into the emerging world of virtual reality (VR) could do so at Intel’s immersive Linkin Park experience. Surrounded by silhouettes of band members, viewers used handheld controllers to trigger a custom Linkin Park video clip.

Band members Joe Hahn and Mike Shinoda made cameo appearances to share the experience with fans.

“Not only is this a great fan experience, it’s an homage to the band and its tech-forward, innovative spirit,” said Rajeev Puran, manager of Intel’s commercial VR and AR experiences. “Digital media is just another way the band continues to create.”

The combination of neural networks, transfer learning and Intel Xeon processors made it all possible, according to Shardul Golwalkar, an Intel technical marketing engineer. However, the speed of the process belied its complexity — just minutes after saying cheese, attendees walked away with a work of art.

Unique photo ops were a trend at ComplexCon. Attendees hit up Old Spice’s Puppy Party installation to have their portrait snapped with an adorable pup. GIPHY, the popular online database of GIFs, welcomed visitors to its kaleidoscope photo booth as well as an after-event sharing page for psychedelic ComplexCon shots.

Year two of ComplexCon may now be a wrap, but Ecko hopes the experience has a lasting impact on all who attended.

“I think the interactivity of it and the sort of Willy Wonka-ness of it is inspiring for people, not just as consumers but as creative thinkers,” said Ecko.

“I hope they take that energy back to the projects they’re working on and use it to connect with people.”

Drone Data Assists Santa Rosa Fire Recovery

An Intel commercial drone pilot is called to the scene after the Northern California fire to aid the search and rescue team.

Andrew Georgopoulos has seen the horrific aftermath of the 1945 atomic bombings of Japan only in photos. But as he drove through Sonoma County following the 35,000-acre Tubbs fire, which recently killed more than 20 people and destroyed more than 5,600 structures, he couldn’t help comparing the charred scenery to Armageddon.

With a commercial-grade Intel Falcon 8+ drone in the trunk, Georgopoulos was escorted by police through charred neighborhoods of Santa Rosa, California. His assignment was to digitize the aftermath from the sky, so first responders could leverage computer technology to speed the recovery efforts.

“It looked like a nuclear bomb had gone off,” said Georgopoulos, a pilot on Intel’s drone team in the New Technology Group.

“As far as the eyes could see, everything was flattened. Metal safes were literally melted. Cars were burned to the ground, except for their frames.”

Getting the Call

A month before the fires, Intel CEO Brian Krzanich announced at InterDrone 2017 a program with the California-based Menlo Park Fire Protection District to support its research into how drones can improve emergency response and first responders’ situational awareness.

As acre upon acre of wineries and neighborhoods burned in Northern California in October, the Menlo Park Fire District called on Intel to assist with aerial assessment of the scene.

Georgopoulos drove three hours north from Intel headquarters in Santa Clara to Sonoma County, where he joined the Alameda County Sheriff’s Office. Together they used the drone to observe the destruction from above and capture images and data used to determine search and rescue efforts.

Georgopoulos flew drones for the entertainment industry in Los Angeles before coming to Intel a year ago. But this was the first time he had flown an eight-rotor unmanned vehicle over several scorched neighborhoods, taking high-resolution photos of areas fire engines couldn’t reach in a timely manner.

Hundreds of his drone photos were then stitched together using the Intel Insight Platform. This allowed his teammates in Santa Clara to create orthomosaics, which are a series of individual photos digitally matched up to form a new composite image. Adjusted for topographic relief, lens distortion and camera tilt, orthomosaics can be used to measure true distances.
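What makes an orthomosaic measurable is the ground sample distance: how many meters of ground each pixel covers in a straight-down photo. Below is a minimal sketch of that standard photogrammetry relation; the camera numbers are illustrative assumptions, not figures from the Intel Insight Platform.

```python
def ground_sample_distance(sensor_width_mm, focal_length_mm,
                           altitude_m, image_width_px):
    """Meters of ground covered by one pixel in a nadir (straight-down) photo.

    GSD = (sensor_width / focal_length) * altitude / image_width
    """
    return (sensor_width_mm / focal_length_mm) * altitude_m / image_width_px

# Hypothetical values (not from the article): a 13.2 mm-wide sensor behind
# an 8.8 mm lens, flying at 60 m, producing 4000-pixel-wide images.
gsd = ground_sample_distance(13.2, 8.8, 60, 4000)   # meters per pixel
print(round(gsd * 100, 2))  # -> 2.25 (cm of ground per pixel)
```

A distance measured on the mosaic is then simply pixels multiplied by the GSD, once lens distortion and camera tilt have been corrected as described above.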

On the left is one of thousands of drone images eventually stitched together to create mosaics of neighborhoods (right) to help firefighters assess damage and plan search and rescue efforts.

The data was also tagged with GPS coordinates so fire departments could then use those images to assess damage and find possible survivors.

Making it Safe for Fire Dogs

The second day, Georgopoulos flew over a Santa Rosa neighborhood with a thermal camera on the drone to determine ground temperature. Fire officials needed to know if it was safe enough for their dog patrol, tasked with sniffing out an elderly man who refused to leave his home during the fire.

Data from the drone indicated it was safe to send out the rescue dogs, and, sadly, they soon found the man’s remains.

“That was a tough day,” recalled Georgopoulos.

Orthomosaics are a series of individual photos digitally matched up to form a new composite image of neighborhoods burned in the fire.

He said drones are providing information in minutes, and in search and rescue situations faster response time can save lives.

“With these thermal and mapping payloads, we can map and dice data and 3D models to help search and rescue in ways they’ve never had before.”

A tiny drone buzzes across a garden. It briefly pauses on a flower to collect its pollen before continuing on to the next plant. At a time when many of nature’s pollinators are dying off, could the future of the global food supply rest on these miniature flying robots?

Eijiro Miyako, a chemist at the Nanomaterials Research Institute of the National Institute of Advanced Industrial Science and Technology in Japan, seems to think so.

That’s why he built a small drone capable of artificially pollinating plants. Equipped with a sticky gel and some horsehair on its belly, the unmanned aerial vehicle (UAV) can catch and release pollen grains as it moves from plant to plant.

“This drone project is one of my lifeworks for saving the world,” said Miyako, a tenured senior researcher. “I believe it will possibly revolutionize agriculture. I believe it will also be a game-changing technology in the near future. That’s my dream as a scientist.”

His proof of concept used a toy drone, purchased from Amazon, to pollinate Japanese lilies. In theory, these robotic pollinators could soon operate autonomously using artificial intelligence (AI) and GPS to work together to pollinate a garden or a farm en masse.

While Miyako admits that the technology is still in its infancy, his prototype comes at an important juncture for pollinators.

Vanishing Bees

Bees are declining at a dangerous rate with possible causes including habitat loss, pesticides, pollution, invasive species, pathogens and climate change, according to a United Nations group. This population drop is a cause for alarm because bees are vital to the global food supply.

Chemist Eijiro Miyako designed a drone prototype with a sticky belly to artificially pollinate plants. Image courtesy of Eijiro Miyako.

Approximately 75 percent of the world’s food depends on pollination by more than 20,000 species of bees, plus other pollinators including flies, butterflies, beetles, moths, wasps, birds and bats. Nature’s pollinators also sustain nearly 90 percent of the planet’s wild flora, so the bee decline puts the world’s ecological diversity at risk.

Dave Goulson, a professor of biology at the University of Sussex who has been studying bumblebees for the last 25 years, explained why bee populations are decreasing.

“You’ve got bees that are hungry, infected with foreign diseases and are being poisoned all at the same time,” said Goulson. “It’s not really surprising that sometimes they die. It’s the combination of stresses that’s killing them.”

Beekeepers are beginning to witness the effects of these factors firsthand. Kenneth Boyce, a beekeeper based in Central New York, has seen one invading pest jeopardize his hives. The Varroa mite, a parasite that originated in Asia and then spread throughout the world among bee populations with no immunity to it, is considered one cause of colony collapse disorder.

“It’s much harder to keep bees nowadays than it was in the past,” said Boyce, who has been keeping bees for 17 years.

“We had losses of 10-15 percent per year. Now with Varroa on hand, and the viruses that they vector, losses of 60-70-80 percent isn’t unheard of,” said Boyce.

“That’s on a yearly basis and it’s awfully hard to sustain your population of bees when you’re having annual losses of over 50 percent.”

Plan Bee

In the wake of the bee population decline, other problem-solving minds have come to the table. Anna Haldewang learned about the issues afflicting bees while watching the documentary Vanishing of the Bees. Inspired by the plight of the bee, the then-student at Savannah College of Art and Design made the issue the focus of her industrial design project.

“It’s sad when there’s something that’s so small and doesn’t have a voice and we’re frantically trying to figure out why they’re declining. I thought, ‘Gosh, I really want to design something for this,’” said Haldewang.

The result, titled Plan Bee, also examines the possibility of artificially pollinating plants with a drone. Unlike Miyako’s prototype, which uses a sticky gel to accumulate pollen, Haldewang’s design will attempt to gather pollen with suction.

For Haldewang, the solution is more than just about pollination. It’s about creating a connection between a person, the drone and nature in a way that helps bring awareness to the bee’s predicament. Her model is inspired by the color of the bee and the forms that surround the insect.

“It’s in the shape of a hexagon and that shape pays homage back to the hive. And if you turn the drone upside down it looks like a flower,” said the recent college graduate.

Plan Bee is currently in development and Haldewang hopes to have her design out in the next few years.

Saving the Bees

While both Haldewang’s design and Miyako’s prototype hope to take a load off a beleaguered population of bees, their ideas are not without skeptics.

Goulson questions the costs, resources and environmental impact of building an army of robotic pollinators. The professor claims it would take a fleet of at least 3 trillion drones just to replace honeybees.

He hopes people never write off bees and focus instead on conserving the Earth’s living insects, arguing that society has a moral obligation to look after the environment and the organisms with which humans share the planet.

Bees and other pollinators sustain 75 percent of the world’s food supply.

“They give us honey as well. Why would we want to replace them with robots?”

Haldewang, however, doesn’t envision her design as a replacement for bees.

“Technology can be an extraordinary complement to us and nature, and my goal is to show that both can work in harmony.”

NFL and Intel bring football fans an immersive experience of highlights and commentary from a handful of 2017 games.

There’s nothing like the sights and sounds of an NFL game, but a revolution in post-game content is pulling football fans closer than ever to football’s biggest plays. Cutting-edge virtual reality (VR) technology puts fans on the field, behind the end zone or on the sidelines to view the best plays from a variety of perspectives.

The NFL teamed up with Intel to create new VR experiences for five select games during the 2017 season. Instead of just watching the highlights, viewers get to be right there with the host and analyst, taking part in the post-game analysis. VR stats and graphics help add to the post-game experience.

“We are bringing fans to the stadiums in virtual reality, transporting them to experience the content in a way that’s more immersive than watching it on a 2D screen,” said David Aufhauser, managing director of strategy and product for Intel Sports Group (ISG).

By downloading the free Intel True VR app, viewers can get an up-close view of the most spectacular plays, alongside commentary from NFL personalities and game analysts. Through a head-mounted display (HMD), fans can switch camera views, so they can watch the play from a variety of angles.

“True VR enables fans to experience the story of the game,” said Aufhauser. “You can experience it from different angles and locations – from the sidelines to the end zones. In VR, fans feel like they’re actually part of the action.”

For games in stadiums equipped with freeD technology, fans can also get 360-degree highlights, to see plays from unique perspectives around the field.

How it Works

To capture the game highlights in VR, camera “pods” are set up in various locations around stadiums. Each pod is equipped with six pairs of lenses (12 cameras in all) that capture a stereoscopic view: 180 degrees of action plus depth, so viewers feel like they’re at the event.
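The depth that paired lenses recover follows from the classic two-camera relation: an object’s apparent shift (disparity) between a lens pair shrinks as distance grows. A minimal sketch of that pinhole-stereo formula, using made-up numbers rather than Intel’s actual camera specs:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: a 1200-pixel focal length and a 10 cm baseline
# between a lens pair; a feature shifted 24 px between the two views
# sits roughly 5 m from the cameras.
print(round(depth_from_disparity(1200, 0.10, 24), 3))  # -> 5.0
```

The same relation explains why a wider baseline between lenses improves depth precision for far-away action on the field.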

“It’s a full 3D stereoscopic experience with graphics and data that immerses fans,” said Aufhauser. “They get to experience the intensity and amazement of the game up close and from different angles.”

How Does True VR Work?

True VR was developed when Dr. Sankar (Jay) Jayaram, chief technology officer at Intel Sports and a diehard Seattle Seahawks fan, wondered: What if you could use VR to put people in the stadium to watch the game? What if you could bring a live VR experience to spectators anywhere in the world?

“People told us live VR could never be done,” Jayaram said. “People couldn’t imagine how we could process all the data in real time.”

For its first attempt, the team set up a huge rack of computers and cameras at an NBA game in 2010.

“It took us a month and a half to process the data to produce a five-minute playback,” said Jayaram.

“In a typical game we might be capturing 40 to 50 gigabytes of data per second,” he said. “That’s enough to fill up your cell phone’s memory in 8 seconds.”

“It took a lot of hard work from our engineers and a lot of thinking through the process and figuring out how we can maximize the use of computing and graphics and CPU, and GPU, and algorithms,” he said.

NFL 2017 Highlights in VR

Intel’s partnership with the NFL is making post-game VR video-on-demand content available for free after five games in the 2017 season. This example of the digitization of sports is a way to make watching games more immersive and intimate.

“Everyone has a personalized experience,” Aufhauser said. “You get to see what you want to see, not just what the camera dictates.”

Post-game VR highlights packages are available for the following games: Texans at Bengals (Sept. 14), Rams at 49ers (Sept. 21), Bills at Jets (Nov. 2), Chargers at Chiefs (Dec. 16) and Colts at Ravens (Dec. 23).

Anyone who downloads the True VR app also gets access to custom VR experiences in a variety of other sports and events.

Daring Game Designer Evokes Emotion in VR

To pull more people into virtual reality, game developer Robyn Tong Gray designs experiences that strike memorable, emotional chords.

“My favorite thing about the game, which may sound terrible, is that it makes people cry,” said Robyn Tong Gray, an indie game developer and the designer of the game A•part•ment.

The game tells the story of Nick Connor, a comics artist going through a difficult breakup with his girlfriend. Players explore the wreckage of their relationship by uncovering the memories scattered throughout their apartment.

When it debuted at the Game Developers Conference in 2015, misty-eyed fans flooded Gray with their own stories of failed romance. One man even buttonholed her with the story of his messy divorce.

“I like to create amazing virtual experiences that draw people in with emotions,” said Gray, who has produced 12 standalone virtual reality (VR) experiences, including the haunting Sisters, which has been downloaded more than 3.5 million times. Players explore a haunted house for clues to the sisters’ mysterious past.

After developing A•part•ment as a graduate student project at the University of Southern California (USC) Institute for Creative Technologies, Gray doubled down and began leading the charge to bring emotional content to VR.

Immersed in Learning

Gray entered VR through gaming. As a preschooler, she was fascinated by her teenage brother’s skill at playing Super Nintendo. After graduating from high school, she enrolled in the undergraduate game studies program at Carnegie Mellon because her older brother was an independent game developer.

“I felt that if he could do it, then I could too,” said Gray. “As a woman in technology, I don’t feel like I’ve run into many barriers.”

“I had a lot of opportunities because of USC that other people might not have had. I was very lucky,” said Gray.

At school, she was surrounded by a faculty of VR pioneers, including her mentor Mark Bolas, who helped create the first VR system for NASA in the 1980s.

Gray was captivated by the experimental research at the university. In the mixed reality lab, she built a system that reconstructed real-life objects in VR, allowing users to interact with the physical world while wearing a headset.

Then in 2014, she helped to produce Leviathan with Intel, plucking a giant flying whale from the pages of a steampunk graphic novel by Scott Westerfeld and bringing it to life in VR.

Crafting Emotional Moments in VR

Otherworld Interactive, the VR studio that Gray co-founded in 2014, is tucked away in a concrete-laden industrial park in Culver City, next to California’s roaring 405 freeway.

The studio is deep in development on its latest project, Taco Sloth. More than 10 Android phones lie scattered across Gray’s desk. She has been painstakingly testing the game on each one.

Taco Sloth is a game about a sloth who runs a taco truck, where the player must meet the ever-increasing demands of a hungry mob. The game is indicative of the type of content Gray believes VR is lacking.

“With VR, you see a lot of the same tropes that you see in traditional games,” she said. “My hope is to cultivate an audience for a wider variety of experiences.”

The studio’s output is eclectic, ranging from interactive haunted houses to VR music videos about the fall of civilization. To keep users emotionally engaged, Gray fills her games with everyday situations rather than big, grandiose scenes. Little details from daily life, such as ordering a chorizo burrito, create experiences that feel immersive and real.

“Storytelling in VR is really about finding small moments that evoke empathy in your audience,” she said. “Life is made up of small moments.”

Teaching Others to Create

Understanding how to generate an emotional response in players requires a great deal of empathy — it means knowing what will connect with different people. Because of that, Gray keeps her team diverse. Otherworld strives to maintain a balanced ratio of male and female employees, she said, which allows for a wider range of creative voices.

That drive to include women extends to Gray’s personal life, too. Earlier this year, she hosted a game design boot camp sponsored by Intel — for eight weeks during the summer, she mentored a group of future female developers.

Gray mentors other women, helping them to get a start in VR.

“It was really inspiring to work with people who are coming to this medium with fresh eyes, who don’t really know yet what they can or can’t do,” she said.

By leading new voices into the industry, she is helping to expand VR’s emotional intelligence.

Eddie Codel transmits his love of drones into the Flying Robot International Film Festival, a celebration of aerial cinematography.

A live DJ spins infectious funk tunes in the backroom as Eddie Codel fidgets with a laptop in his movie set turned collaborative workspace in San Francisco’s Mission District. He’s eager to edit fresh drone footage shot at this year’s Burning Man festival.

“Drones are super fascinating,” he said. “They’re robots, but in some way they’re extensions of our souls.”

That wide-eyed, open-minded approach connects him to an expanding community of drone cinematographers, racers and enthusiasts.

That fascination is fed by the out-of-body feeling he gets flying a drone. It’s a sensation that strikes at the heart of all humanity. Floating above it all, seeing what a drone sees, locks him inescapably in the moment.

“Maybe humans have always wanted to be birds, to have that perspective of flying,” he said. “Drones do that in a way that really couldn’t be done before.”

Not many people can fly airplanes or helicopters, but relatively inexpensive drones let anyone experience the sensation of flying. Codel said drone innovation is lowering barriers to entry, so enthusiasts don’t have to hack together an ultimate filmmaking rig. Gradual price drops are making it easier for more people to buy a camera-equipped drone that shoots HD or 4K video.

So far, he’s lost four drones. Each loss was painful at the time, but he has no regrets.

Eddie Codel’s favorite spot to fly drones is by the Bay Bridge in San Francisco.

“If you don’t lose a drone or two along the way, you’re not doing it right.”

The Ultimate in Remote Control

Codel’s path to drone geekdom started at an early age, when he was into remote-controlled cars and helicopters, then model rocketry. After college he worked in IT and learned about data networks before becoming a shutterbug, digital videographer and go-to guy for livestream events.

Today, the quintessential jack-of-all-trades, tech tinkerer, bohemian artist and visual storyteller is captivated by drones. So much so that he founded the Flying Robot International Film Festival (FRiFF), held each November at San Francisco’s historic Roxie Theater, just up the street from his workspace.

The ah-ha moment for Codel came in 2012, when he saw a friend shoot a video using a first-generation drone at a pool party in Palm Springs.

“It was the first time I had seen somebody with a quadcopter, and the brilliant seven-second video he created blew me away,” he said.

The idea of doing something creative and original drove him to buy his first drone in early 2013. It was a DJI Phantom 1. Soon after, he bought a gimbal, GoPro camera, video down link transmission system and monitor.

“All that stuff had to be wired in and soldered,” he said. “It was all very piecemeal at the time. Today, all that stuff is integrated in a little package. You can just use your phone to see what’s going on.”

His second ah-ha moment hit in 2013, after he brought his first drone to the Burning Man festival in Nevada’s Black Rock Desert. He captured steampunk and space-age tribal scenes, making the gathering even more surreal from the sky. It wasn’t until he got back home and reviewed the footage that he realized “it was epic.”

He compiled the best shots into one video and uploaded it to YouTube. It went viral, attracting a couple million views within the first week.

Codel edits his drone videos in a quintessential San Francisco office near the historic Roxie Theater.

“That made me realize I should pursue drone videography more seriously.”

Since then, experiencing that bird’s-eye, drone’s-eye perspective, the aerial view as he calls it, has been an almost constant state of mind.

He returned to Burning Man this year with several drones, ranging from the small DJI Spark (with artificial intelligence powered by Intel’s Movidius) to a very large cinema rig that holds cameras with interchangeable lenses.

“I was more deliberate about what I captured. I knew the best time to shoot was at sunrise and sunset. In past years, it was mostly about waiting until the wind died down so I could fly.”

Flying Robot International Film Festival

Improving skills and better, easier-to-use tools are driving an explosion of high-quality drone cinematography. The rapid rise in creative drone videos sparked Codel’s idea to found FRiFF in 2015 to celebrate drone storytelling from all over the world.

“I love making drone videos and photography, but there are a lot more people better than I am,” he said. “I wanted to provide a place for novice and experienced filmmakers alike to come together.”

Twenty-five nominations from 11 countries are competing in the 2017 FRiFF on November 16. Host of the third annual event, “the internet’s” Justin Hall, will announce two winners for each category: Cinematic Narrative, Epic Landscape, Drones for Good, FPV Racing and Freestyle, WTF LOL, Student Films, and Promotional. Films must be five minutes or less.

Last year, the festival received 180 submissions from more than 40 countries. The 2016 Best of Show winner was Moon Line, a night time skiing short by France’s Frédéric Rousseau.

One of his all-time favorite films was the 2015 FRiFF winner All Away by Colin Solal Cardo of France.

“It was about two lovers dancing, and the drone becomes part of the choreography,” said Codel. “The drone is as much an actor as the actors are, and it’s all done in a single shot.”

Stunning Memories

Initially, Codel was only interested in photography and video, but now he uses drones for 360-degree image capture. For example, he flew his drone above the iconic 1939 World’s Fair building on Treasure Island to capture about 100 photos that he stitched together using his laptop to create a 3D model. He enjoys messing around with the building in the digital world.

His Johnny-on-the-spot good karma often leads to an unexpected bonanza of eye candy. This summer, a walk downtown to get a close look at the construction of a new skyscraper resulted in Codel becoming one of the first to fly a drone from the top of the Salesforce Tower, now the tallest building in San Francisco.

The skyscraper is just a few blocks up from his favorite place to fly, the waterfront overlooking the Bay Bridge.

“Flying along the waterfront, where moving water and land intersect is super compelling.”

Beyond aerial cinematography, Codel sees the potential for drones to help humanity. They can deliver blood to hospitals or medicine to remote places; assist with search and rescue; help fight fires; or map areas affected by hurricanes.

“People call them many different things: Flying lawn mowers, eye in the sky, flying robots, UAVs, UASs,” he said. “But in the end, I think drone is the right word and I’m sticking to that.”

Drone Cinematographers Capture Unsullied Places

Historically, cinematographers used helium balloons, cranes and helicopters to capture scenes from above. But when affordable camera-equipped drones hit the market, aerial photography reached a new stratosphere of storytelling.

Drones are portable, relatively easy to command and can capture high-quality footage. They can hover closer to the ground than a helicopter and deftly maneuver through compact spaces.

They’re also bringing intriguing storytelling perspectives to a new breed of filmmakers like Nestoras Kechagias and Athanasia Lykoudi, known for their spectacular moving vistas of their homeland, Greece.

Under the moniker Inva + Sla, the web and motion designers took their experience in the advertising industry to a new level when they began using drones to tell stories.

Through drone photography, they found creative freedom, unencumbered by the limitations and bureaucratic approval cycles so common in their industry.

Easier than Ever

Rapid technological advancements and affordability are making drones a must-have device. According to the Consumer Technology Association, 2.4 million personal drones were sold in the U.S. alone in 2016 — more than double the 1.1 million purchased in 2015.

Festivals of Drones

Inva + Sla’s aerial cinematography is well-known on the drone film festival circuit. They’ve racked up numerous nominations and awards at established international competitions including FRiFF, DroneUp IFF, Los Angeles Drone Film Festival and InterDrone Film Festival for their drone work on Marinière, 2016 Digitized – Opening Titles and Reflecting on Peloponnese.

“Competing in Flying Robot was our goal since the first day that we got our first drone,” said Lykoudi. “It happened to be our first stop to see quality drone videos and we learned a lot from previous winners.”

Their latest work was submitted to this year’s competition while 2016 Digitized – Opening Titles was recently selected as a finalist for the first-ever Los Angeles Drone Film Festival, the sister event to the New York City Drone Film Festival. The film also won the “Promotional” category at FRiFF 2016.

Inspiration from Above

Gleaning inspiration from sources as varied as movie and TV opening credits, which tell their own stories, music videos and animated GIFs, Inva + Sla create an atmosphere of aerial surrealism that is omnipresent in their videos.

“We tend to like intros like the Halt and Catch Fire series that harmonize motion with sound and tell a story through symbolic abstract shapes,” said Lykoudi.

Location is another significant factor in planning aerial projects. Their homeland’s geographical landscape comprises contrasting terrains: coastlines and mountain ranges coexist in small spaces.

“We found ourselves scouting locations in Greece that we would otherwise not consider exploring,” said Lykoudi.

When setting out to rediscover the entire country from scratch and unearth unknown gems, the duo avoided iconic locations and commonplace views.

“We needed to find beauty in remote and unsullied places and show a new perspective to overlooked locations,” said Lykoudi.

While inspiration is a critical motivator of their work, the team believes imagination is the most important aspect of filmmaking. It helps elevate a filmmaker’s craft, especially when limited to the tools at hand.

For Inva + Sla’s earliest videos, the gear was what Lykoudi called very low budget and based on what was available at the time. They used a DJI Phantom 3 Advanced, a Panasonic Lumix G3 and a borrowed GoPro HERO4 for Digitized 2016 – Opening Titles.

Since then, they’ve upgraded nearly all of their gear.

“We now shoot on a DJI Phantom 4 Pro, a Panasonic Lumix G85 and acquired microphones, lights, extra filters, lenses, sliders and more,” said Lykoudi. “Still not fully professional but much more capable than before.”

For editing and compositing, they use Adobe Premiere and After Effects.

Future Drone Technology

Kechagias and Lykoudi come up with new ideas daily, but often can’t execute them because the technology isn’t yet available. Sometimes they only have to wait a few months to purchase what they need to create the desired effect. Other times, they need to delay their plans, sometimes indefinitely. In some instances, “that is a good thing because we discover new ways to achieve the desired outcome.”

Lykoudi sees lots of opportunities for drone cinematographers to extend their skills and bring high-quality visual storytelling to a wide variety of industries.

“The advertising industry is already shifting toward aerial shots and there is huge potential for drone-only stories that can be used to promote brands,” she said. “We have a feeling this is just the beginning and storytelling will shift dramatically to a new way of filming.”

She and Kechagias have trips planned to Morocco, even though flying drones is still prohibited, and Greenland, where they expect to encounter a radically different landscape from Europe.

This dynamic drone duo believes that creating extraordinary work requires the ability to dream, the discipline to set goals — and the gumption to give it their best shot.

Johannes Saam’s early love of movies sparked his journey from computer-generated imagery artist to frontiersman of real-time VR experiences.

When most people enter their teenage years, they’re just vaguely starting to consider college majors. Johannes Saam, on the other hand, already knew exactly what he wanted to do and was well on his way to making it happen.

“When I was 14, I fell in love with visual effects and special effects. Movie magic, basically,” said Saam. “I wanted to know how special effects were made, how you could fool people’s eyes.”

He plunged into the wealth of information available online to find out how 3D effects worked and began learning how to execute the “magic tricks” that mesmerized him on the silver screen.

In the decades since he was first inspired by cinema, he’s gone on to craft the dusty, post-apocalyptic world of Mad Max: Fury Road and bring the wonder of deep space to Earth in Prometheus.

Academy Award-winning Johannes Saam has loved movie magic since he was a teen. Photo courtesy of Framestore.

His handiwork made the escapades of super-spy Ethan Hunt look thrilling in Mission: Impossible – Ghost Protocol. The super powers of Thor and Captain America effectively straddled the line between incredible and convincing on the big screen thanks to Saam’s expert touch.

In fact, Saam’s work has pushed the boundaries of what the film industry could do with visual effects to such an extent that the Academy of Motion Picture Arts and Sciences honored him with a Technical Achievement Award in 2014.

Saam is one of the premier developers and visual effects artists in the industry, according to Raj Puran, Intel’s manager of commercial virtual reality (VR) and augmented reality (AR) experiences. Puran got a firsthand look at Saam’s talent when he seamlessly blended photorealistic and augmented elements to completely recreate an entire floor at the Smithsonian American Art Museum for SAAM VR, a collaboration with Intel.

“Johannes isn’t just an artist. He’s someone who is always trying to progress the technology used by his art form,” said Puran.

Saam applies existing visual effects to new mediums, such as VR, as well as crafts entirely new tools, giving artists the ability to create wholly unique experiences.

“It’s kind of like saying, ‘OK, I’m a painter, but I need my paint brush to do this.’ And instead of hoping someone else will create that paint brush, Johannes figures out how to make it himself.”

Saam and his collaborators won the award for inventing a technique known as deep image compositing, which allows effects artists to assign depth information to flat images so they can be layered within one another.

“Imagine you have a character that’s inside a cloud,” Saam explained. “With classic compositing operations, you can either put the character in front or behind the cloud. With deep image compositing, because we know the depth of the character and the multiple depths of the cloud layer, we can put the character inside the cloud.”
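Saam’s cloud example can be sketched in a few lines. This is an illustrative toy, not production compositing code: each pixel is a list of (depth, color, alpha) samples, merging two deep pixels is just a sort by depth, and flattening applies the standard front-to-back “over” operation. The single-channel “color” and the sample values here are invented for the demo.

```python
def over(front, back):
    """Standard front-to-back 'over' for (color, alpha) pairs."""
    fc, fa = front
    bc, ba = back
    return (fc + bc * (1 - fa), fa + ba * (1 - fa))

def merge_deep(pixel_a, pixel_b):
    """Merge two deep pixels by interleaving (depth, color, alpha) samples."""
    return sorted(pixel_a + pixel_b, key=lambda sample: sample[0])

def flatten(deep_pixel):
    """Collapse a deep pixel to a flat (color, alpha) value, front to back."""
    result = (0.0, 0.0)
    for _depth, color, alpha in deep_pixel:
        result = over(result, (color, alpha))
    return result

# A "cloud" with layers at depths 2 and 6, and a character at depth 4.
cloud = [(2.0, 0.3, 0.3), (6.0, 0.3, 0.3)]
character = [(4.0, 0.8, 1.0)]

# The character's sample lands between the cloud layers: "inside" the cloud,
# something a simple front/behind ordering of flat images cannot express.
combined = merge_deep(cloud, character)
print([depth for depth, _, _ in combined])  # [2.0, 4.0, 6.0]
```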

Not only does this technique produce a visually impressive result, it also saves time and money by allowing an animator and an effects artist to work on the same shot simultaneously.

Saam’s ability to innovate isn’t limited to software, though. He also helped develop best practices and workflows in tandem with a 24-camera array that allows users to record immersive video with six degrees of freedom. Using the Surround 360 camera system, filmmakers can capture movement forward and back, up and down, and left and right, as well as rotation around all three axes.
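Those six degrees of freedom amount to three translations plus three rotations. A minimal, hypothetical sketch (the field names and the yaw-only movement helper are illustrative, not part of the Surround 360 system):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    x: float = 0.0      # forward/back translation
    y: float = 0.0      # left/right translation
    z: float = 0.0      # up/down translation
    roll: float = 0.0   # rotation about the forward axis
    pitch: float = 0.0  # rotation about the left/right axis
    yaw: float = 0.0    # rotation about the vertical axis

    def move_forward(self, distance: float) -> None:
        """Translate along the current heading (yaw only, for brevity)."""
        self.x += distance * math.cos(self.yaw)
        self.y += distance * math.sin(self.yaw)

pose = Pose6DoF(yaw=math.pi / 2)  # facing 90 degrees to the left
pose.move_forward(2.0)
print(round(pose.x, 2), round(pose.y, 2))  # 0.0 2.0
```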

Relentless Youthful Persistence

Saam’s journey from 3D wunderkind to a pioneer in VR has taken him across the globe, but it all started when he landed a position at Scanline VFX, a film effects company in his home country of Germany.

“It was a matter of being at the right conference, talking to the right people and being very annoying,” said Saam. “I landed almost all of my jobs because of my unrelenting passion. That was how I got started.”

With a foot in the door of the moviemaking industry, he knew that establishing a solid educational foundation was critical for him to excel. After a few years with Scanline, he left Germany to attend Bournemouth University in the U.K.

Saam likes working with more interactive technologies like VR. Photo courtesy of Framestore.

“I needed to go back to teach myself properly,” said Saam. “I wanted to deepen my knowledge of programming, actually get a proper understanding of physics and math.”

With his master’s in computer animation in hand, Saam accepted a technical director job nearly 10,000 miles away in Australia at Rising Sun Pictures. He expected to only spend a season down under before returning to Europe, but ended up staying much longer.

“I just wanted to go there for the summer, and seven years later, I was still there.”

Moving to the Virtual World

During those seven years, Saam worked at nearly every visual effects studio in Australia, but the continent would prove to be just a pit stop along his journey. Next, he would travel to the city that was perhaps predetermined as his ultimate destination way back when he was just 14: Los Angeles.

According to Saam, “L.A. is where all the movie magic happens,” so once again, he traveled halfway around the world, where he found his new passion.

“I’d done film work for long enough and achieved something with that, so I decided to focus more on real-time content and virtual reality as a new challenge,” said Saam.

“I fell in love with it straightaway, so that is what led me to the more interactive technologies.”

Saam believes VR needs good storytelling to be compelling. Photo courtesy of Framestore.

Two years ago, that love landed him a position at Framestore, a creative studio that houses the most awarded VR team in the world, where he has been able to immerse himself in the cutting-edge medium as a senior creative developer.

Throughout his transition from film to VR, Saam never lost sight of how to reproduce that movie magic that first enticed him so many years ago.

“Just because you’re doing VR doesn’t mean you’re making something amazing and new,” said Saam. “You need a combination of the right story and the right technologies to make an immersive and compelling experience.”

Clearly, this ability to balance the creative and the technical has served Saam well throughout his career, and Puran doubts the uber-talented artist will ever relinquish his position at the cutting edge of virtual effects.

“Johannes is just one of those guys that pushes boundaries. It’s not just about, ‘Hey, check out this cool content I made.’ It’s about pushing the limits of the cameras, of the computing platforms, of everything,” said Puran.

“He knows there’s so much more out there and is on a quest to dig deep.”

Art is a conduit to bring cultures and ideas together, to cross-pollinate and offer new perspectives. Three of the world’s leading contemporary artists — Jeff Koons, Marina Abramovic and Olafur Eliasson — collaborated with Acute Art, the first virtual reality (VR) arts platform, to develop the relationship between human expression and technology.

The wider mission of Acute Art is to bridge art — as it’s known in the physical world — into the new, disruptive realm of VR, bringing technology-driven art to a new level in immersive experiences.

VR provides an unparalleled canvas for storytelling, allowing people to explore beyond their physical limitations. Koons, Abramovic and Eliasson shared their unique approaches to working in the developing VR medium at the Symposium Stockholm’s Brilliant Minds conference in Sweden.

As the most immersive technology to date, VR has the ability to provide new depths of connection, experiences and information. However, technology — particularly VR — also has the capacity to isolate. The artists explored all of these aspects and more.

Virtual Reality as Shared Reality

When Eliasson first shared his VR creation with his 11-year-old and 13-year-old children, their initial question was: “Could we go in together?”

“I think the industry is fast to respond to this collective desire. I hope to use VR to bring people together,” said Eliasson.

Eliasson, a Danish-Icelandic artist known for sculptures and large-scale installations, works with ephemeral materials such as light, water and air to create his art.

“At first I was afraid of it [virtual reality] being escapism, and I could see how it could become isolating,” Eliasson said.

“At the same time, I think one of the drivers is focusing on what is the ethical and moral compass of culture, and how can we contribute to that from where we [as artists] stand?”

Eliasson’s VR piece “Rainbow” recreates the beauty and mystery of encountering a rainbow through a digital process. The virtual rainbow mimics the real experience in nature, only appearing when light and water droplets are aligned from a certain point of view.

Olafur Eliasson focused on the collaborative nature of VR in his immersive experience. Photo courtesy of Acute Art.

“We had the challenge to recreate the physics of the real world in the virtual world,” said Dado Valentic, chief creative technologist at Acute Art.

“We had to research the behavior of the water droplet — what happens when the light particle hits it and refracts, creating a rainbow in effect.”
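The droplet physics Valentic describes can be checked with a short numerical sketch. Assuming a refractive index of about 1.333 for water and a single internal reflection, light concentrates at the angle of minimum deviation, which is why the primary rainbow appears roughly 42 degrees from the point directly opposite the sun:

```python
import math

def deviation(theta_i, n=1.333):
    """Total deviation (radians) of a ray that refracts into a droplet,
    reflects once inside, and refracts back out."""
    theta_r = math.asin(math.sin(theta_i) / n)  # Snell's law at entry
    return math.pi + 2 * theta_i - 4 * theta_r

# Scan incidence angles from 0.1 to 89.9 degrees in 0.1-degree steps
# and find where the deviation bottoms out; rays pile up there.
angles = [math.radians(a / 10) for a in range(1, 900)]
d_min = min(deviation(t) for t in angles)
rainbow_angle = 180 - math.degrees(d_min)
print(f"primary rainbow at ~{rainbow_angle:.1f} degrees")  # ~42.0
```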

As a nod to his children’s initial question about VR, Eliasson’s piece is social in nature: multiple viewers can enter the space at the same time. The experience also incorporates handheld controllers as an additional layer of interaction.

“Neural and muscular interfacing is what made VR so incredibly exciting for me — the relationship between my brain and my physical activity,” said Eliasson.

“VR gave a really radical answer to a physical relationship with your own body and an interface. The work encourages the user to become active and produce your own world.”

Giving New Meaning to “Social Media”

With VR acting not only as a medium, but also as a distribution platform, the doors are wide open to the socialization and conversation of art in this immersive format.

“I find the abundance of information at our disposal today extremely attractive,” said Koons. “I would choose this moment to live just for that reason.”

Koons’ VR piece “Phryne” (pronounced “fry-nee”) captures his ethereal vision for creating an imagined world on a blank three-dimensional canvas. His inspiration is a revered woman of the ancient world by the same name, created in the spirit of the pastoral arts.

Koons invites the viewer to meet Phryne, a metallic ballerina, in a peaceful garden. She interacts with and guides the viewer through her world, taking on the role of a muse.

Through VR, Jeff Koons brings a metallic ballerina to life. Photo courtesy of Acute Art.

“We had to recreate this passion in manufactured chrome to bring the ballerina to life,” said Valentic. “We used a dancer from New York Ballet, who performed all the moves that were then captured on a motion-capture stage.”

For Koons, VR is yet another tool to share his visions. His point of view embraces an open mind for the intersection of technology and popular culture.

“I think it’s wonderful that people can pick up a [smartphone] and feel the freedom of creativity, like with Snapchat,” said Koons. “To create something solely for an aesthetic is extremely emotionally pleasing — and also just fun.”

Inciting Action Through Immersion

As much as art can bring people together, it can also be a wake-up call, inciting action. Abramovic’s work is synonymous with human confrontation. Since the beginning of her career in Belgrade during the early 1970s, her work has explored trust, vulnerability and connection in the human experience through performance art.

That’s exactly what Abramovic’s VR experience elicits. In “Rising,” she brings the viewer face-to-face with the reality of humanity’s role in global warming.

In the piece, Abramovic herself — as a life-like avatar — issues an immersive call to action that asks the viewer to save her from drowning, trapped in a glass tank slowly filling with water.

The piece transports the viewer to a landscape of melting glacial ice caps. From the glass tank, Abramovic asks viewers to make commitments to help reverse the human effects of global climate change. For every commitment made, the water level in the tank decreases. Without any commitment, the water rises until Abramovic drowns, the experience ending in the destruction of the environment.
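The mechanic described above amounts to a simple feedback loop. A hypothetical sketch (not Acute Art’s code; the rise and drop rates are invented):

```python
def step(level, commitments_made, rise=0.05, drop=0.10):
    """Advance the water level one tick; 1.0 means the tank is full.
    The level creeps up on its own and falls with each commitment."""
    level += rise - drop * commitments_made
    return min(max(level, 0.0), 1.0)

level = 0.5
level = step(level, commitments_made=0)  # no pledge: the water rises
level = step(level, commitments_made=2)  # two pledges: the water falls
print(round(level, 2))  # 0.4
```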

“Performance is all about process. I can push my body’s limits as far as it can go. We are in the 21st century — things change. Being introduced to virtual reality, I understood that the possibility is enormous,” said Abramovic.

Marc Ecko’s festival of creativity, arts, music and design is all about collaboration and creativity.

When Marc Ecko brought the world the inaugural ComplexCon in 2016, he aimed to bring creative people from out behind their screens and encourage them to connect and collaborate IRL.

“Most of our interactions are transactional and ephemeral,” said Ecko, who built a cultural empire upon a youth spent designing clothing and creating graffiti art in suburban New Jersey. He created the world-famous Ecko Unltd — both fashion brand and cultural zeitgeist — and later founded Complex, the leading media platform for youth culture.

“We really wanted to take the idea of the Complex ecosystem and use it to host all these other great collaborators and creatives,” he said. “We set out to create some sort of tentpole event that could provide more meaningful engagement with our audience.”

ComplexCon is a weekend-long event that’s part music festival, part exhibition and 100 percent of-the-moment. The event was credited with bringing the internet to real life, and this year (Nov. 4-5), Ecko plans to do it again with ComplexCon 2017.

Beneath the Surface

Attendees may make the trek to Long Beach for the music or fashion, but Ecko asserts that technology is the vehicle that makes much of the ComplexCon experience possible.

“Technology is the subtext of everything — it is woven into nearly every creative expression,” he said. “It’s the oxygen that our creative culture breathes.”

Several installations and panels at ComplexCon will shine a spotlight on the cutting-edge innovations that are poised to shape tomorrow’s artistic expression.

The “What is Money?” ComplexCon(versation) will take a look at the impact cryptocurrencies like bitcoin will have on the future of commerce, and Shopify is debuting 10 digitally native brands at the ComplexCon Marketplace, giving attendees a glimpse at the tech-powered future of retail.

Los Angeles-based technology and design studio incite’s massive sequencer BABS will be onsite for those who want to channel their inner music-maker, and attendees will get a chance to immerse themselves in the world of Linkin Park through One More Light, a new VR experience from Spatialand and Intel.

“The most forward-thinking creators of our current generation and beyond will be at ComplexCon,” said Rajeev Puran, manager of Intel’s commercial VR and AR experiences.

“Being in the same space as those creative people really helps us to understand how the next generation of computing will take place.”

A Cultural Icon

“Collaboration” is the theme of this year’s ComplexCon, and Ecko said he was inspired to choose it after witnessing the collaborative spirit at work firsthand during ComplexCon 2016. He saw creators sharing ideas with one another and absorbing feedback from fans and consumers, while teams worked together internally to pull off exclusive sneaker and apparel drops and complex installations.

More than 200 brands will be onsite to deliver exclusive drops, pop-up shops and retail activations, giving attendees the chance to experience the latest in food, fashion and entertainment. Then, in the evenings, they’ll be able to let loose to sets by N*E*R*D, Gucci Mane, M.I.A. and dozens of other musical performers.

While Ecko’s mind is currently focused on the finer details of this year’s event — What will attendees want to see first? Will the lines be long? — he anticipates his “cultural world’s fair” to live on long after November, noting its potential for expansion into new markets in both the U.S. and internationally.

However, a ComplexCon in every nation is not his ultimate goal.

“My intention is not quantity; it’s quality,” said Ecko. “I don’t know where exactly ComplexCon will go, but if I have my way, it will always be made with love and it will be special.”

Drone Photographer Inspires Different Point of View

Syracuse University student and author of The Handbook of Drone Photography wants everyone to get a new perspective on life.

For a son of journalists, having a critical eye and strong sense for what makes a good story may be second nature. But it wasn’t until drones came into his life that Chase Guttman’s earth-bound passion for photographic storytelling truly took flight.

“Drones are really like this brave new world,” said Guttman, an award-winning travel photographer and author of The Handbook of Drone Photography.

“They help me appreciate the breadth and scale of the world around me.”

By the time he turns 21, Guttman will have visited five continents, all 50 U.S. states and 70 countries around the world. He was named Young Travel Photographer of the Year three times, a World’s Top Travel Photographer by Condé Nast Traveler, a Rising Star by Instagram and won the 2017 Walter Cronkite Award for Excellence in Storytelling and Exploration. His work has appeared in National Geographic publications, Condé Nast Traveler, Travel + Leisure, Lonely Planet, The Guardian, CNN, BBC, ESPN and other major media outlets.

Drones are pushing his passion of photography to new levels and pulling him into new pursuits.

The Syracuse University senior is the president of the Skyworks Project, a student research and development group exploring the technological and social possibilities for drones. It’s where he blends his skills and passion for photography to shape the future of drone technology.

“It’s really just a bunch of drone nerds appreciating the world around campus and beyond,” he said.

The adventure seeker is always looking for an excuse to escape campus and fly his camera drone around the Syracuse region — at places like the New York State Fair, Lake Ontario, Niagara Falls, Ithaca’s Gorges, Quebec City, the New York State Capital and Thornden Park.

Drone Awakening

At an early age, Guttman was bent on finding new ways of looking at people, places and things.

“Everything has been shot to death, so the way to really make a photograph or video stand out is to find an interesting perspective,” he said. “That perspective is what makes drone photography so fascinating.”

With hundreds of millions of photographs uploaded every day to Facebook, he said drones are allowing people to capture and share more mesmerizing photos.

A nearby tribal leader and local sheep shepherds converge on the cliffs of Maletsunyane Falls in Lesotho. Photo credit: Chase Guttman.

Before drones, when standing would no longer do the trick, he hoisted his tripod in the air like a fly fishing rod to get more intriguing angles on his subjects. When drones came along, he said the technology was a game changer.

“I could basically be anywhere and the camera could be somewhere else. I could see things from totally new vantage points, and it opened up my eyes to different ways of telling stories. Drones have led to the democratization of perspective,” he said.

The self-proclaimed technology nerd always strove to be the first person on his block to try out new things. He got his first starter drone in 2014, when he was 17.

“I learned how to fly really by crashing into trees around the park. Now I’m using an Inspire 1 Pro by DJI to fly everywhere.”

Drone Perspective

For Guttman, drones own the airspace that’s just out of reach of the longest selfie stick and the lowest hovering helicopter.

“I like a range that’s not too high, where everything looks compressed in the photo, and not too low, where the perspective hovers just above ground level,” he said. “That sweet spot is really where I’ve lived and have developed my craft over time.”

Piloting a drone over Mont Saint-Michel in France led to one of Guttman’s most extraordinary experiences of his life.

“There’s an amazing abbey that’s like an island marooned out in the sea during high tide,” he said. “It was spectacular to see with my own eyes at sunset but the photograph captured from above with my drone was majestic.”

He sees drone technology evolving quickly, making drones more capable and easier to use, which is allowing anyone to become an excellent drone photographer.

“There are sensors for obstacle avoidance and features that allow you to track people or objects around you, making videography much more of a breeze,” he said. “Battery life, range and all these different features that make drone photography so much more powerful and impactful are getting better all the time.”

Overnight snowfall dusts an iconic Manhattan roundabout, where Christopher Columbus presides over the spot from which all official distances to New York City are measured. Photo credit: Chase Guttman.

[Related story: AI Makes Drones Smart, Easy for Photographers]

His book provides strategies and techniques for taking spellbinding aerial photographs. The book has led to many opportunities for Guttman, including TV interviews and lectures at the National Arts Club and the George Eastman Museum.

“I just want to keep going out there and telling people how drones can really, truly change the world,” he said.

Becoming a Flying Shutterbug

He suggests beginner pilots learn to fly with a cheap trainer drone then upgrade to a more expensive drone once they’ve mastered the basics. He said having a good laptop and editing software are essential.

“Once my drone is safe, I upload my photographs first to an external hard drive then use Adobe Lightroom on my laptop to edit before publishing onto social media platforms like Instagram.”

Shutterbugs curious about drones should abandon any apprehension, he said.

“The number one thing you could do is just get out there, explore and develop your craft. Research where you want to go and plan a variety of different shots.”

Autonomous Cars Spark an Evolution in Interior Car Design

On the autobahn to autonomous cars, experts from BMW, Carnegie Mellon University and Intel examine how autonomous cars are sparking innovation in interior car design, and helping drive a $7 trillion passenger economy.

In the not-too-distant future, fully autonomous vehicles will operate without a driver. This will radically transform how people get from A to B, sparking a “passenger economy” fueled by creature comforts not yet found in today’s top-of-the-line luxury cars. When no one needs to be in the driver’s seat, what will cars of the future look like?

The oncoming autonomous car revolution is driving researchers to reshape the inside of cars to be more like living rooms, offices or sleeping pods.

“We love our cars because they are emotional possessions,” said Hans-Joachim Faulstroh, head of Interior, HMI and User Experience at BMW Research in Munich, Germany. Faulstroh and his team study human-machine interface (HMI), examining the relationships people have with their cars.

“We have a deep relationship with them,” said Faulstroh. “We take care of them and they take care of us.”

The BMW i Inside Future sculpture shows an example of what cars of the future might look like. Books are tucked into shelves, and the front seats swivel all the way around and have their own audio systems – the person in the front seat could be listening to Beethoven, while the person in the back seat rocks out to Black Sabbath.

The car controls – including the temperature, music and connecting to incoming calls – are managed via BMW HoloActive Touch, a floating hologram that is accessible from anywhere in the car.

“BMW HoloActive Touch is a kind of switch system for the future,” said Faulstroh.

He explained that when a phone call comes in, the “driver” sees a floating image of the caller’s face hovering in the air.

BMW’s world headquarters in Munich, Germany.

“It’s not located in the cockpit like it is today. It’s more free-floating so you can access it even if you’re not looking forward.”

Faulstroh said BMWs of the future will have different modes. In “ease mode,” the driver might opt to take a nap on a long, boring stretch of road. But when drivers want to regain control and feel the joy of driving on a twisting route through a forest, they could choose “boost mode,” which puts them firmly back in the driver’s seat.

“The car interior of the future is your living space,” said Faulstroh. “It opens up dramatically new possibilities.”

Are We There Yet?

The technology inside autonomous vehicles may be new, but the dream of self-driving vehicles has been around for a while, said Dr. Stan Caldwell, associate professor at Carnegie Mellon University (CMU) and executive director of CMU’s Traffic21 Institute, which is devoted to developing smart transportation systems.

BMW’s Hans-Joachim Faulstroh believes autonomous car interiors will evolve into modern living spaces.

“It was at the 1939 World’s Fair that General Motors announced its vision for the first autonomous vehicle,” said Caldwell. GM’s original idea for driverless cars was that they would navigate streets paved with electromagnetic strips, like trains riding on rails.

“It wasn’t very long after that we began thinking, ‘Wouldn’t it be great if we didn’t have to actually operate them?’”

Caldwell says the big commercial push for autonomous vehicles began in 2009 when Google hired DARPA Grand Challenge alumni to develop the first publicly available driverless car by 2020. Its fleet of Waymo driverless cars has since logged more than 3 million self-driven miles on public roads.

“I think we’ll look back one day and say, ‘When did we stop driving?’ and not really know,” Caldwell said. “It’s going to happen gradually as the technology becomes less expensive and more publicly accepted.”

That’s what happened with cellphones, computers and traditional automobiles. People didn’t immediately get rid of their typewriters, landlines or horse-drawn carriages — they adopted their replacements incrementally until one day it was hard to imagine a world without them.

BMW is exploring the personal relationship between people and their cars.

Establishing Trust, Sparking Passenger Economy

A survey conducted by the Consumer Technology Association found that 70 percent of respondents are interested in testing a driverless car, and 62 percent would replace their existing vehicle with a self-driving alternative.

“We might be able to build the perfect car from a technology standpoint — and it could drive perfectly and keep you safe all of the time,” said Jack Weast, an autonomous driving expert at Intel. “But if we don’t feel psychologically safe, then we’re not going to use the service or buy one of these cars.”

Weast believes that with the advent of autonomous cars, people will focus more intently on how the car interior looks and feels. Everything from the interior design to the software systems will become much more important.

“Today, if you don’t like the look of the exterior of a car, you’re probably not even going to look inside,” said Weast. “But that might get turned completely around when the interior becomes a work space or living space.”

A recent study from Intel and analyst firm Strategy Analytics explores the yet-to-be-realized economic potential when today’s drivers become idle passengers. The study predicts that the “passenger economy” will eclipse the “sharing economy,” following an explosive trajectory from $800 billion in 2035 to $7 trillion by 2050.
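Those two data points imply a steep but checkable growth rate. A quick sketch of the arithmetic (the calculation is ours, not the study’s):

```python
# Growing from $800 billion in 2035 to $7 trillion in 2050 implies
# roughly a 15.6% compound annual growth rate.
start, end, years = 0.8, 7.0, 2050 - 2035  # values in $ trillions
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # implied CAGR: 15.6%
```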

At BMW, Faulstroh said the commitment to the “ultimate driving machine” hasn’t wavered. Instead, it’s getting more personalized.

“You are the ultimate driver in a BMW,” he said. “We’re just helping you get the best of both worlds.”

Anyone lifting heavy loads risks crippling back problems. What if people could wear a smart device equipped with sensors and data analytics technologies that coaches and assists with proper lifting techniques?

So-called “ErgoSkeletons” from startup StrongArm Technologies are doing just that, and they could lead to fewer on-the-job injuries. These smart harness belts help workers maintain a safe body posture while lifting or moving heavy objects. Additional technologies allow the wearable to track a worker’s movement and provide almost real-time ergonomic data that can help workers and managers avoid injury.

The top cause of workplace injury is overexertion, which includes lifting and carrying activities, according to the National Safety Council. Nearly 13,000 U.S. employees are injured each day, illustrating the need for prevention strategies.

The V22 ErgoSkeleton tackles those challenges by transferring weight from the upper body to the legs, helping to reduce common back injuries and arm fatigue, according to Matt Norcia, chief marketing officer at StrongArm Technologies.

He said the technology reinforces proper lifting techniques so factory workers can safely lift objects and carry them comfortably for longer distances.

“It redistributes the load over your shoulders and into your core muscles, so you’re using the right muscles to perform the lifts, thus reducing risk,” Norcia said.

Spreading the Load

Workers strap on the V22 like a backpack, securing it around the waist. Additional cords extend from both shoulders of the brace, and workers attach the clutch cords to their hands.

When lifting, the clutch on the hands engages, transferring the load of the object through the cord, over the shoulders, and down through the spine and the hips. It gets people to use their full body, not just the arms and lower back.

FUSE, a wearable sensor on the ErgoSkeleton, tracks data on workers’ performance, using machine learning to help identify injury risks and help workers follow safer lifting procedures.

Real-Time Data Tracking for Optimizing Safety

“This system essentially captures, evaluates, and delivers guidance to both employers and employees on how they can avoid injuries and perform better,” Norcia said.

He said the FUSE device collects data 12.5 times a second from sensors on the ErgoSkeleton. A processor then analyzes that data and delivers real-time insights and on-body haptic feedback to each worker, helping them to lift objects safely.

The FUSE platform generates a “safety score,” which is a single risk grade that’s a weighted metric based on all the different factors that might contribute to risk in an industrial environment.
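As a rough illustration of how a single weighted risk grade like this might be computed, here is a minimal sketch. The factor names, weights, and scoring formula are assumptions for illustration only, not StrongArm's actual metrics; the article confirms only the 12.5-samples-per-second rate and the idea of a weighted composite score.

```python
# Hypothetical sketch of a weighted "safety score": each risk factor
# (names and weights here are assumptions, not StrongArm's real metrics)
# is normalized to 0-1 and combined into a single 0-100 grade.

SAMPLE_RATE_HZ = 12.5  # FUSE samples sensor data 12.5 times per second

# Assumed risk factors and their relative weights (must sum to 1.0).
WEIGHTS = {
    "back_flexion": 0.4,
    "twist_velocity": 0.3,
    "lift_rate": 0.3,
}

def safety_score(readings: dict) -> float:
    """Combine normalized risk readings (0 = safe, 1 = maximum risk)
    into a single 0-100 score, where higher means safer."""
    risk = sum(WEIGHTS[k] * readings[k] for k in WEIGHTS)
    return round(100 * (1 - risk), 1)

# A worker with moderate back flexion and lift rate scores 80.0.
print(safety_score({"back_flexion": 0.2, "twist_velocity": 0.1, "lift_rate": 0.3}))
```

A running score like this gives managers a comparable baseline per worker or per site, which matches the article's description of the score being used "as a baseline for improvement."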

“We use that score as a baseline for improvement,” Norcia said.

“It becomes a tool for optimizing the workplace, workforce and work process with accurate data.”

Bringing Data-Based Decision Making to Ergonomics

General Electric (GE) began piloting StrongArm’s FLX and V22 systems at seven of its sites worldwide in 2016, according to Sam Murley, digital acceleration leader for GE’s Environment, Health and Safety (EHS) group.

He said GE saw it as an opportunity to try something that applied a digital solution to managing ergonomics while also easing the burden on the workforce.

He said employees using the technology conduct pick and pack operations that involve different body movements, and many perform both repetitive and highly complex tasks at GE distribution sites.

Murley refers to the FUSE platform as a data-based decision-making tool. Ergonomic data is bringing new insights about environmental health and safety for workers, facilitating and prescribing intervention to those who need it the most.

“We can now use all that information to identify and make pre-informed decisions on key ergonomic areas we should be focused on,” he said. “It helps the sites selectively prioritize.”

It’s also sparking serious dialogue at GE.

“Conversations and decisions between health and safety experts at the sites and employees are now hinged on data coming from the FUSE wearable and data platform,” said Murley.

The data helps determine whether the sites need to change worker behavior or the work environment to prevent injuries, he said. Simply put, the technology raises employee awareness of safety issues. It’s provided employees with near real-time digital coaching.

Although the technology has only been used for a short time, Murley believes the real-time data could help employees and managers identify where they need to stop or intervene before an injury occurs. Plus there’s an immediate added benefit.

“Employees are more aware of their lifting behaviors when wearing the FUSE sensor,” said Murley.

Working closely with StrongArm is allowing Murley to fine tune the tech for GE’s needs.

“How can we take changing safety scores that meet certain thresholds and immediately alert the site leads?” he asked, describing how ergonomic data can be used in new ways. “How can we make this more proactive?”

Quantifiable feedback from GE field testing is helping Murley strategize with StrongArm about new features and improvements.

Innovation at New Lab

StrongArm Technologies is one of more than 90 startups at New Lab, an 84,000-square-foot former machine shop in New York City’s Brooklyn Navy Yard. New Lab provides wide-ranging resources for companies in emerging fields such as robotics and artificial intelligence (AI).

New Lab provides resources for multiple tech startups. Photo courtesy of New Lab.

StrongArm and other New Lab members prototype in the Innovation Workshop, which is sponsored by Intel and Lenovo. It’s where New Lab members can use leading-edge technologies for AI object detection, the Internet of Things (IoT), big data, high-performance computing and even drone technologies.

Norcia said that using New Lab resources to develop products like FUSE and V22 gives StrongArm access to these evolving technologies, allowing the company to more quickly bring new and innovative ways to ensure the health, safety and productivity of the industrial workforce.

Brazilian game studios working on major league projects like Horizon Zero Dawn signal a rise of game development in Latin America.

One of the best-selling games this year, Horizon Zero Dawn wowed players with tough battles against giant killer robots. While digital chrome-plated dinosaurs are not a typical Latin American export, these majestic metal monsters were made in Brazil.

“Brazilians are discovering that we can make really high quality games here,” said Thiago de Freitas, CEO and founder of the game development studio Kokku, which was charged with building 3D models of the robotic beasts.

When Horizon Zero Dawn shipped in February, Kokku became the first Brazilian studio to work on a major current-generation video game. It likely won’t be the last: around the country, other studios are following in its footsteps.

Thanks to recent incentives by the Brazilian government to promote the exportation of games, the future looks bright for game developers in Brazil.

Played But Not Made

This scenario was almost unthinkable until now. Though Brazilians are passionate about playing video games, notoriously high import tariffs nurtured a culture of knockoffs and counterfeits.

In one famous example, the arcade game company Taito of Brazil opted to make imitations of American pinball tables when faced with outrageous import taxes. Williams Electronics’ fabled Black Knight table became Cavaleiro Negro at the hands of copycats.

These practices set the scene for a place where games were played but not made.

“Fifteen years ago, if you wanted to be a game developer, you had to study and go abroad. There were no companies here,” said Paulo Luis Santos, vice president of Abragames, the Brazilian Game Developers (BGD) association for game studios.

However, these days game development in Brazil is rising. More than 45 universities offer classes in game development, and the BGD website lists 83 professional game companies creating games in Brazil. Kokku, the country’s largest studio, employs anywhere between 40 and 90 developers depending on current needs.

Incentivizing Game Development

Part of this turnabout happened when the Brazilian government began incentivizing game development. The BGD export program, for instance, pays for Brazilian game creators’ admission into key industry networking events, including the Game Developers Conference in San Francisco and Gamescom in Cologne, Germany.

Kokku’s Freitas is a networking pioneer in the export program. Without the funding, his company would have never had the chance to build ferocious robots in Horizon Zero Dawn. After establishing a good rapport with Guerrilla Games, the game’s lead developer studio based in Amsterdam, Kokku was asked to collaborate on the project.

In Freitas’ view, the deal not only helped to establish his company, but planted a seed within the fledgling Brazilian games industry.

“We are learning to become better developers from the biggest companies,” said Freitas. “If someone leaves our company and founds their own, they will take what they learned with them, and know how to make great games.”

ANCINE, Brazil’s national film agency, runs another program opening doors for Brazilian gaming greatness, promoting contests for burgeoning developers. The program selects the most promising games, providing funding in exchange for a potential return on the profits.

Last year, the studio Mad Mimic was awarded funding through a local state agency in São Paulo. The studio developed No Heroes Here, a four-player game in which players defend castles by blasting medieval enemies with cannonballs; it was recently released on Steam.

Before entering the contest, the freshman studio was somewhat adrift, dithering among several projects without making much progress.

“This was the first time making a game,” said Luis Fernando Tashiro, CEO of Mad Mimic. “We had no deadlines. This process helped us get organized.”

In addition to easing the financial stresses of running a gaming studio, the ANCINE contest compelled Mad Mimic to grow from amateurs into professional developers. The contest required them to create a game design strategy, establish proper deadlines and write a business plan.

Contests like these encourage young talent to reorient themselves toward advancing the Brazilian game industry. Even those who don’t win are driven toward professionalism.

As a result, Brazilian developers are gaining visibility in the international gaming industry. In October, 300,000 Brazilian gaming fans and major game publishers like Sony and Microsoft gathered at the Brasil Game Show in São Paulo, compared to a turnout of 68,000 at this year’s Electronic Entertainment Expo (E3) in Los Angeles.

“The whole market has their eyes open to Brazil,” said Freitas. “They consider us a very creative people. We have creativity in our blood.”

Making Sense of Virtual Reality

The proliferation of new virtual reality (VR) technologies and immersive experiences is happening so fast that many people still haven’t grasped what exactly VR is.

“A lot of the time, the terms ‘virtual reality,’ ‘augmented reality’ and ‘360-degree video’ are mixed up and used interchangeably, which is not entirely correct,” explained visual effects guru Johannes Saam.

If anyone is capable of clearing up misconceptions about VR, it’s Saam.

The Academy Award-winning creative developer has been at the forefront of the visual effects world for more than a decade. His work has transported viewers everywhere from post-apocalyptic Australia in Mad Max: Fury Road to the mythical home of the Norse gods in Thor.

He’s now the senior creative developer at internationally renowned visual effects company Framestore, where he has spearheaded projects running the VR gamut.

By the end of 2017, VR is expected to be a $7 billion industry, and over the next four years, that number is projected to increase tenfold. Despite its skyrocketing growth, VR is an evolving Wild West technology that many people still don’t quite understand.

Like an enthusiastic professor of VR, Saam explains the similarities and differences between single viewpoint 360-degree video, immersive VR experiences and the real world overlays of augmented reality (AR).

360-Degree Video: Gateway Drug to VR

Though not as immersive as VR or AR on the spectrum of virtual experiences, 360-degree video serves a unique purpose, according to Saam.

“We like to call 360-degree video the ‘gateway drug’ to virtual reality,” he said. “These videos give you the freedom to look wherever you want, but they usually don’t offer a lot of interaction and/or movement within the space.”

These 360-degree videos fall into two categories: monoscopic and stereoscopic.

Monoscopic videos can be viewed on a platform like YouTube and navigated using a mouse. Stereoscopic videos require the use of a VR headset and are navigated by looking from side to side or up and down.

While watching a 360-degree video, like those Saam helped Facebook create earlier this year using the Surround 360 camera, the viewer can explore an entire world, but just from the point in space chosen by the filmmaker.

With the Intel True VR Game of the Week app, fans can personalize their view of select MLB games. Photo credit: Intel Corporation.

As an example, a 360-degree video taken during an event like the Super Bowl might let a viewer feel like they are standing on the 50-yard line. They’re in the middle of the action and can look to the sidelines, up into the stands or at either end zone, but their place within the environment is completely determined by the position of the camera during filming.

However, that limitation doesn’t prevent 360-degree videos from being a step up from standard videos shot using traditional cameras.

“They have the potential to be immersive because rather than looking at a video on a flat screen, you’re getting transformed into an experience,” said Saam.

Experiences like the Intel True VR Game of the Week use a Samsung Gear VR headset app to bring Major League Baseball fans into the action. Fans can see the game from the players’ perspective by selecting up to four camera angles per game, or they can watch a produced VR broadcast experience.

Computer-Generated World of VR

For those not content with the passive experiences provided by 360-degree videos, it’s time to step into VR.

“Virtual reality picks up where 360-degree video ends,” Saam explained. “Not only can you rotate your head in a swivel, you can actually move around in the space, either by teleportation or by physically moving.”

“Virtual reality is about total immersion,” said Puran. “It leads you to believe that what you’re seeing in the headset is another world. I see people with headsets on expressing how real something looks in VR or how they feel like they are actually inside a volcano in Iceland.”

Systems that power VR have different needs than traditional PCs, he said. Using the Intel Core i7 processor, VR-ready computers have the power to deliver rich 360-degree 3D visuals, precise controls and immersive 3D sound. Intel Optane Memory helps load data quickly, shortening latency time for a smooth immersive experience.

“In a sense, technology substitutes your current reality, your current view of the world with a simulated one. It’s transformational,” said Puran.

During a VR experience, viewers not only feel like they are in a new environment, they also have the opportunity to choose how they want to navigate that world.

For example, in the LUMEN VR experience Framestore helped create for Time Inc., the viewer navigates a psychedelic forest, controlling everything from the pace at which the foliage grows to the color of the sky above them.

“You get to have a real-time engine-driven experience that allows you to interact with the items within the environment,” said Saam.

Enhancing Reality with AR

Unlike the fully constructed environments experienced in VR, augmented reality blends the digital and the natural worlds using a device such as a headset, smartphone or tablet.

“In an AR experience, the real world isn’t closed off,” explained Saam. “It’s still visible, but augmented with 3D graphics. Again, you can move freely around the space, but you still have the real world as a backdrop.”

Using an AR system like Microsoft’s HoloLens or the anticipated Magic Leap, people can bring fantastical elements into their everyday world, fighting space invaders in their living room or cradling a tiny elephant in their hands.

AR utilizes several methods that are available today and some methods still in development, according to Puran.

“There are holographic screens that can have images projected on to them to display something on stage,” said Puran, pointing to the Royal Shakespeare Company’s use of AR to engage audiences during The Tempest.

“There is also technology that utilizes your phone and its camera to give a live view of the world you live in, and interjects virtual or holographic objects into that view,” Puran said. Pokémon Go or ARKit-enabled experiences from Apple on the iPhone are examples.

“Then there is projection to a set of glasses or head-mounted displays. These make it easy to view augmented objects or characters in front of you as if these augmented things are part of your existing world,” explained Puran. ”This gives people more freedom of movement, including the use of both hands.”

AR also has a variety of more practical applications. Users have the option to check the weather or filter through emails without needing to pull out their laptop, and AR can add an entirely new dimension to shopping as well.

Using a tablet, the real world is augmented with graphics and information about local businesses.

“Augmented reality is really great if you want to see how something you might buy would look in your world,” said Puran.

“You can put an IKEA couch in your living room and see how it would look. The reality is, it’s a hologram, but you can preview exactly how it would look in your living room.”

Evolving Virtual Technology

As more examples of 360-degree video, VR and AR reach the mainstream, Saam expects people will understand the differences and nuances between these tech-powered experiences.

As for now, Saam said even VR developers don’t fully grasp it all.

“It’s a whole new language that we’re still trying to define,” he said. “It’s not set in stone yet.”

“People are still making up rules, breaking them, making up new rules. It’s still a little bit of the Wild West with this technology, but that’s also the exciting part of it.”