ReadWrite - Nvidia
http://readwrite.com/tag/Nvidia
Copyright 2015 Wearable World Inc.
http://blogs.law.harvard.edu/tech/rss
Tue, 31 Mar 2015 13:48:16 -0700
How Nvidia Aims To Game Self-Driving Cars<!-- tml-version="2" --><p>Nvidia’s press conference Sunday was like a tale of two missions: Announce its new Tegra X1 “mobile super chip,” a processor so powerful it could put Xbox One-worthy graphics on a smartphone. And then reveal where the company wants to put it first…in mobiles, yes, but of the auto variety.</p><p>According to CEO Jen-Hsun Huang, the Tegra X1—built on the Maxwell architecture it unveiled last year—is twice as fast as its lauded predecessor, the Tegra K1. The tiny chip is also energy efficient, which should make it a natural fit for mobile devices. </p><p>Too bad it’s not heading to smartphones or tablets. But mobile’s loss could be automotive’s gain, because the chip could power Nvidia’s vision of self-driving cars, using a system of sensors and cameras. </p><p><strong>Chips Ahoy</strong></p><div tml-image="ci01c3d20000019512" tml-image-caption="" tml-bad-render-layout="inline"><figure><img src="http://a2.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2MDY2MzA0NDgwNTMw.jpg" /><figcaption></figcaption></figure></div><p>The Tegra X1, Nvidia claims, can deliver a teraflop of computing power. For comparison, that would give the world’s fastest supercomputer from 2000 a run for its money. Although that may not seem smoking fast by today’s standards, the company claims the eight-core, 64-bit chip does not lack for performance.</p><p>To illustrate the X1’s chops, Nvidia showed off a demo of a smartphone running a video built with Unreal Engine 4, a tool used to build graphic-intensive games.
The demo worked well, which presumably speaks to the chip’s capabilities.&nbsp;</p><div tml-image="ci01c3d23f100199de" tml-image-caption="" tml-bad-render-layout="inline"><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2Mjk0NDc0NDg4Mjg2.jpg" /><figcaption></figcaption></figure></div><p>Unfortunately, Nvidia doesn’t think smartphones can handle the X1’s computational power yet. So instead, it's taking the X1 to carmakers.</p><p>The company announced the Drive CX, a “digital cockpit computer” that brings simulated graphics, realistic finishes—like bamboo or aluminum—and contextual data to gauges, maps and in-dashboard displays.</p><div tml-image="ci01c3d1fdb001efe2" tml-image-caption="Nvidia X1 CES 2015" tml-bad-render-layout="inline"><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2MDY2MDM1OTM3MjUw.jpg" /><figcaption>Nvidia X1 CES 2015</figcaption></figure></div><p>Nvidia also unveiled Drive PX, a new platform powered by a pair of X1 chips. Its 2.3 teraflops of computing power can process the streams from sensor- and camera-festooned cars to enable autonomous driving.</p><p><strong>Driving The Future Of Smarter Cars</strong></p><div tml-image="ci01c3d247d001efe2" tml-image-caption="" tml-bad-render-layout="inline"><figure><img src="http://a3.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2NDE5MDI4NTExMzYz.jpg" /><figcaption></figcaption></figure></div><p>Automobiles will boast more processing power “than anything you currently own today,” Huang said.&nbsp;</p><div tml-image="ci01c3d2474001c80a" tml-image-caption="" tml-bad-render-layout="inline"><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2NDE2NjEyNjIwNzY2.jpg" /><figcaption></figcaption></figure></div><div tml-image="ci01c3d2476001c80a" tml-image-caption="" tml-bad-render-layout="inline"><figure><img
src="http://a2.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2NDE2NjEyNjg2MzAy.jpg" /><figcaption></figcaption></figure></div><p>To explain what he meant, the exec veered into somewhat academic territory, over-explaining the nature and merits of computer learning, neural networks and specifically “GPU-accelerated learning”—a fancy way of describing processor-intensive image recognition technology that can interpret results and make decisions.&nbsp;</p><div tml-image="ci01c3d248c0012a83" tml-image-caption="" tml-bad-render-layout="inline"><figure><img src="http://a3.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTI3MTc2NDIzMzIzNTE4OTg2.jpg" /><figcaption></figcaption></figure></div><p>But his enthusiasm, and his company's vision, were plain: Nvidia sees X1-powered cars that can park themselves, drive on their own and not only stop for animals but even tell you what breed of dog has skipped into your path. The chipmaker believes it has the super-fast processor capable of the sort of detailed graphics necessary for split-second decisions. </p><p>Audi appears to agree. The carmaker joined Nvidia on stage to wax poetic about autonomous cars and graphics-festooned vehicle interiors—hinting that our rides may be on the verge of accelerating into the future.&nbsp;</p><p><em>Photos by Adriana Lee for ReadWrite</em></p>
A leading graphics processor company wants to level up your ride.
http://readwrite.com/2015/01/05/nvidia-tegra-x1-chip-cars-driving-self-driving
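The "teraflop" claim in the article above is easy to sanity-check with back-of-envelope arithmetic. The core count, clock and half-precision throughput below are assumptions drawn from Nvidia's published Tegra X1 specifications, not figures from this article:

```python
# Rough estimate of the Tegra X1's claimed teraflop of computing power.
# Assumed specs (not stated in the article): 256 Maxwell CUDA cores at
# roughly 1 GHz, each retiring one two-wide FP16 fused multiply-add per cycle.
cuda_cores = 256
clock_ghz = 1.0
ops_per_fma = 2      # a fused multiply-add counts as two floating-point ops
fp16_lanes = 2       # FP16 instructions operate on two values at once

gflops_fp16 = cuda_cores * clock_ghz * ops_per_fma * fp16_lanes
print(f"{gflops_fp16:.0f} GFLOPS")  # 1024 GFLOPS, i.e. roughly one teraflop
```

Doubling up, as the two-chip Drive PX does, lands in the same ballpark as the 2.3-teraflop figure Nvidia quotes.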
Play
Mon, 05 Jan 2015 06:21:14 -0800
Adriana Lee
5 High-Tech Companies That Are Driving the Future of Cars<!-- tml-version="2" --><p><em><a href="http://readwrite.com/series/drive">ReadWriteDrive</a> is an ongoing series covering the future of transportation.</em></p><p>I used to think the future of cars was all about better powertrains and fuels. If the auto industry could only make a transition away from petroleum-based internal combustion—so I thought—and toward electricity (and other low-carbon fuels), then we could drive happily ever after. </p><p>Then, the transition actually started happening. It’s far from complete, but the stricter federal fuel economy standards that passed two years ago put automakers on a path toward an average of 54.5 mpg by 2025. That was a monster piece of legislation that will result in a massive shift toward battery-powered vehicles—from conventional hybrids to pure electric cars. </p><p>But it turns out that changes in powertrains and fuels were only a precursor to a more monumental (pending) transportation revolution.
It will be based less on how our cars are powered and more on IT: networked map-based sensor-driven computer technology.</p><p>To better understand how automotive technology is morphing with personal computing and web-based systems,&nbsp;here's a look at five companies that are making this revolution happen.&nbsp;It’s not surprising that only one of the companies, Tesla Motors, has any resemblance to a traditional auto manufacturer—although its similarities don’t go much further than the fact that it produces a product with four wheels.</p><h2>Tesla: It’s All About the Batteries</h2><p>As bond-investment expert Jeff Gundlach said in July, and repeated on CNBC yesterday, Tesla’s value—and transformative potential—is <a href="http://www.businessinsider.com/gundlach-elon-musk-is-still-in-the-wrong-business-2014-7">not in the Model S</a> or other upcoming electric vehicles. "<a href="http://www.businessinsider.com/jeff-gundlach-on-tesla-november-24-2014-11">Tesla is all about the batteries</a>," he said. </p><p>Gundlach sees Tesla’s ginormous future $5 billion battery factory as the company’s way forward, making Tesla a primary supplier of batteries to the auto industry and eventually for use in buildings. The success of the Model S showed that Tesla could successfully play the auto industry’s game.</p><p>But we’re talking about the re-invention of cars for a long century ahead. What Tesla illustrates is that cars, to varying degrees, will be battery-powered transportation devices. (By the way, Tesla is also showing how cars can be sold in retail outlets that bear more resemblance to an Apple store than to today’s auto dealerships.)</p><h2>Nvidia: Superprocessors On Board</h2><p>Meanwhile, Nvidia, the Santa Clara, Calif.-based chipmaker, continues to invest in its <a href="http://hexus.net/ce/news/automotive/77325-nvidia-audi-show-most-advanced-in-car-technology/">car-based technology</a>.
As I posted on ReadWrite a year ago, more than four million cars today already <a href="http://readwrite.com/2013/12/04/nvidia-tegra-gpu-connected-cars">have Nvidia's Tegra chips on board</a>, with another 25 million in the pipeline.</p><blockquote><p><strong>See also: <a href="http://readwrite.com/2013/12/04/nvidia-tegra-gpu-connected-cars">The Supercomputer In Your Driveway</a></strong></p></blockquote><p>Last week at the Los Angeles Auto Show, Nvidia showcased <a href="http://blogs.nvidia.com/blog/2014/11/18/nvidia-audi/">an Audi TT using an array of Nvidia processors</a> to run the vehicle’s instrumentation, infotainment and navigation functions. These systems are upgradable, so many of the vehicle’s driver-facing functions can be refreshed throughout the ownership cycle—rather than every five to seven years with a new model.</p><p>Last week, Audi also announced that its “Piloted Driving” technology—a significant step toward self-driving—was approved for production. Those assisted driving capabilities are made possible by Nvidia’s Tegra K1 mobile supercomputer. </p><h2>Google and Here: It’s Really About the Maps</h2><p>Tesla and Nvidia will continue to re-shape cars into high-tech processor-managed battery-powered mobility machines, but these large transportation devices obviously have wheels and operate in physical space. </p><p>Enter Google. As both Slate and IEEE Spectrum recently noted, the <a href="http://www.slate.com/articles/technology/technology/2014/10/google_self_driving_car_it_may_never_actually_happen.html">key to self-driving cars</a> is the <a href="http://spectrum.ieee.org/robotics/artificial-intelligence/the-unknown-startup-that-built-googles-first-selfdriving-car">mapping of streets</a>.
So, while Google’s funky little self-driving prototype car is the poster child for autonomous vehicles, I have my eyes on all those Google Maps cars already roaming the street capturing images and LIDAR-based data to create a detailed map of roadways.</p><blockquote><p><strong>See also: <a href="http://spectrum.ieee.org/robotics/artificial-intelligence/the-unknown-startup-that-built-googles-first-selfdriving-car">Why Google's Driverless Car Is Evil</a></strong></p></blockquote><p>Even if Google manages to commercialize some type of self-driving car in the next five years, I don’t expect the search giant to compete on safety, comfort and performance. Traditional carmakers are not going away as manufacturers of vehicle platforms. But they will become ever-more reliant on software that situates those conveyances in the real world. </p><p>That’s why I put not only Google, but Here—a Nokia company—on this list. You don’t need me to explain the myriad ways that Google is deeply entrenched with maps and other location-based data. But readers might be less familiar with Here, which describes itself as “the largest and most highly trained team of mapmakers on the planet, with over 6,000 people in 55 countries.” </p><p>Here maps are already used in four out of five factory-fit car navigation systems. Here customers include Toyota and Garmin. Last week, Here launched its “Predictive Traffic” product—a way to forecast traffic as far as 12 hours into the future based on predictive analytics and cloud computing. Here collects what it calls “probe points”—more than 70 billion of them per month (1.3 trillion over the past decade)—to make this work.</p><h2>Uber: Moving From Cars to Mobility Webs</h2><p>You can rightly object to the way <a href="http://readwrite.com/2014/11/19/uber-error-in-judgment-tech-execs-behaving-badly">Uber attacks journalists or uses personal information</a>, but it’s hard to overstate the impact of the company and its app on mobility.
Uber is the king of on-demand mobility services—what we used to call taxicabs, car rentals and other ways to get around that don’t involve owning a car. </p><blockquote><p><strong>See also: <a href="http://readwrite.com/2014/11/19/uber-error-in-judgment-tech-execs-behaving-badly">An Uber Error In Judgment: When Tech Execs Behave Badly</a></strong></p></blockquote><p>Mobile computing, mapping and disintermediation allow this to happen. But what’s driving it are titanic economic forces such as urban congestion and the rise of the so-called shared economy. For the purposes of this discussion, the critical trend to follow is a fissure in the long-held sacrosanct relationship between driver and car.</p><p>For the first time, we can glimpse what mobility might look like if trips—getting from A to B—become the indivisible economic and functional unit of mobility, rather than the complicated, painful process of buying, fueling, maintaining, financing, insuring and parking an automobile.</p><p>You might disagree with my selection of Tesla, Nvidia, Google, Here and Uber as the top five movers and shakers in mobility. That’s cool; tell me in the comments who I missed.</p><p>The players will change over time, and niche start-ups will have a role. But as we move forward with the Drive series, this will be our focus: the hardware, software and business models used for connected, electric, sensor-controlled computerized map-based networks of mobility.</p><p><em>Lead photo by <a href="https://www.flickr.com/photos/ryc-behindthelens/15546459598">RyC - Behind The Lens</a></em></p>
Some are not who you think they are.
http://readwrite.com/2014/11/26/5-tech-companies-future-of-cars
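Here's probe-point figures quoted in the article above invite a quick plausibility check. The arithmetic below uses only the numbers the company cites (70 billion per month, 1.3 trillion per decade):

```python
# Sanity-checking Here's probe-point figures from the article.
per_month = 70e9                       # 70+ billion probe points per month
decade_at_current_rate = per_month * 12 * 10
avg_per_month = 1.3e12 / (12 * 10)     # implied by the 1.3-trillion decade total

print(f"{decade_at_current_rate / 1e12:.1f} trillion")  # 8.4 trillion
print(f"{avg_per_month / 1e9:.1f} billion per month")   # ~10.8 billion
# At today's 70-billion-a-month pace, a decade would yield 8.4 trillion
# points, far above the stated 1.3 trillion total, so the collection rate
# must have grown sharply toward the end of the decade.
```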
Work
Wed, 26 Nov 2014 14:09:03 -0800
Bradley Berman
The Supercomputer In Your Driveway<!-- tml-version="2" --><p></p><div tml-image="ci01b28190c0026d19" tml-render-position="right" tml-render-size="medium"><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTIyMzAyNTk2NzEwMDM0NzEz.jpg" /></figure></div><p><em>ReadWriteDrive is an ongoing series covering the future of transportation. In December, this series is presented by <a href="http://bit.ly/18P5Iai">Buick Regal</a>.</em></p><p>Nvidia, the Santa Clara, Calif.-based chipmaker, is well known for inventing the graphics processing unit, or GPU. Its chips have long souped up gaming PCs, laptops, workstations and supercomputers. But you might be surprised to learn how the company is revving up cars to become sophisticated, sensor-driven, connected mobility machines.</p><p>Four million cars today already have a 21st century tiger in the tank—in the form of Nvidia's Tegra chips. Another 25 million cars are in the pipeline—by virtue of relationships with a long list of high-end luxury auto brands like BMW/Rolls Royce, Volkswagen/Audi, and Aston Martin. Japanese and American car companies can't be far behind.</p><p>Nvidia’s automotive development kit, called Jetson, is an under-the-hood car-stereo-sized box that provides all the I/O connectors a modern car needs, including USB, Ethernet and HDMI. Nvidia system-on-a-chip processors—essentially fully functional, self-contained computers—power the instrument clusters, navigation and infotainment.
It’s what provides the computing horsepower for the giant dual-touchscreen console on the Tesla Model S.</p><p>Nvidia provides some of the most complex 3D rendering for games and technical design—so it has a decisive head start in the brave new world of automotive computing. Other players—Qualcomm or even Apple—will make a similar transition from consumer electronics to cars, according to Thilo Koslowski, an analyst of vehicle information and communication technology at Gartner.</p><p>“Yet Nvidia is pushing the envelope,” said Koslowski. “It’s like a Ferrari versus a Volkswagen.”</p><h2>Where Sensors Meet The Road</h2><p>As cool as those vivid digital dashboards are, they amount to child’s play compared to what Nvidia has in mind for its automotive-grade Tegra processing systems. High-performance processors allow car companies to design virtual vehicle prototypes on screen, and then run precise aerodynamic simulations in virtual wind tunnels or accurate ersatz road testing of traction control systems or crash events. Now, the technology is being used to interpret and integrate an ever-widening stream of data from sensors.</p><p>The list of critical components on today’s cars now includes cameras, radar, sonar, and laser sensors, or lidar.&nbsp;</p><blockquote tml-render-position="right" tml-render-size="medium"><p><strong>See also: <a href="http://readwrite.com/2013/10/15/seven-ways-3d-lidar-is-transforming-our-physical-world">Seven Ways 3D Lidar Is Transforming Our Physical World</a></strong></p></blockquote><p>“A CPU [central processing unit], GPU, image processor, audio processor, and video processor are all baked into this tiny thing,” said Danny Shapiro, senior director of automotive at Nvidia, referring to a component the size of a thumbnail, embedded on a board not much bigger than a playing card. That board houses the memory and components needed to make the device function like a standalone computer.
I spoke with Shapiro on the sidelines of the Connected Car Expo at the 2013 Los Angeles Auto Show.</p><h2>How to Author a Car</h2><p>“It’ll run on Linux or Windows or Android,” explained Shapiro. “Now, software gets laid on top of this incredible processor power to do whatever the automaker wants it to do.”&nbsp;</p><p>Nvidia also supplies a wide range of software libraries that implement common algorithms, thus speeding up development time for automotive programmers and designers. Nvidia provides the hardware and software, but not the apps—the same kind of relationship the company has with gaming companies like Electronic Arts, Ubisoft and Valve.</p><p>UI Composer, Nvidia's authoring system, speeds up design of instrument clusters with built-in 3D objects like gauges and dials, and scripts for how they move. These aren’t canned screens—more like runtime engines.</p><p>“This comes right out of gaming,” said Shapiro. Unlike gaming graphics—designed for fun—the rendering and interpretation of data from car-based systems (whether in the car’s development, and even more importantly when carrying you and your loved ones on the road) has to be 100 percent accurate. It’s a matter of life and death.</p><p>The technology also means much greater customization for car owners. So you don’t like the dashboard layout on the 2014 Audi A3? No problem. Download the instrument display from the vintage 1970 Audi 100, rendered with such amazing realism that you’ll think the speedometer is purely analog.
The system accesses engine speed and performance data directly from the car’s CAN bus computer network.&nbsp;</p><p>Moreover, automotive superprocessors mean that car companies—generally considered technolaggards taking three or four years to develop a new engine or transmission—can start to keep up with the pace of innovation in consumer electronics.</p><p>“Pop out the old module and pop in a new one. Each vehicle model, year after year, can have a more powerful system without redesigning it,” said Shapiro. “Just like what happens with phones.”</p><h2>Car Talk</h2><p>We could discuss what this means for richer navigation and streaming media in the car—cool stuff, but it’s far more intriguing to consider the vehicle’s core functions, such as computer-based accelerating, braking, and steering.</p><p></p><div tml-image="ci01b2819160008266" tml-image-caption="One application of high-end graphics processing is software that can analyze a driver’s condition." tml-render-position="left" tml-render-size="medium"><figure><img src="http://a5.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTIyMzAyNTk5Mzk0MjU4MjAx.jpg" /><figcaption>One application of high-end graphics processing is software that can analyze a driver’s condition.</figcaption></figure></div><p>The Holy Grail is object detection and natural language processing, so we can get our Hasselhoff on. At the 2013 CES electronics extravaganza in Las Vegas, Audi announced traffic-jam assist, which uses cameras and radars to detect congestion, and with driver approval, engages advanced cruise control to maintain a constant and safe distance from the car in front of you, while automatically steering inside your lane. This signals the future direction of the Nvidia onboard platform.&nbsp;</p><p>At that show, Nvidia demonstrated integration of Google's Street View feature in the Audi A7’s navigation system.
Nvidia is expected to come back to CES in January 2014 to announce advances, bringing vehicle autonomy one step closer to reality.</p><p>Imagine the processing power needed for this scenario coming soon: The car’s processing unit is running the dashboard, rendering speed and engine functions as you like, while the ultra-realistic, Street-View-fed navigation system guides you to a destination set by voice command, as the kids enjoy a streaming Netflix movie. A camera aimed at the roadside detects speed limit signs, and presents them on the dashboard. Radar and lidar data run through algorithms 30 to 60 times a second to keep track of all traffic, differentiating between other cars and, say, a kid running across the street—readying the car to apply the brakes as necessary.</p><p>Meanwhile, an inward-facing camera handles driver state monitoring.</p><p>“The same kinds of processing power used for pedestrian detection will do blink detection, and run the algorithms to determine if somebody is distracted or falling asleep,” said Shapiro. “Who just got in the car? Let’s adjust the seats, radio and mirrors for that person.” Talk about a smart car.</p><p>This scenario is eminently possible, but not without very fast and efficient processors.</p><p>“We’re developing what is essentially a mobile supercomputer for cars that can handle all of this,” Shapiro said.</p>
High-performance system-on-a-chip processors are leaping from the desktop to the blacktop.
http://readwrite.com/2013/12/04/nvidia-tegra-gpu-connected-cars
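The article above mentions dashboards that pull engine speed straight off the car's CAN bus. As a rough illustration of what that involves, here is a minimal sketch decoding the standard OBD-II engine-RPM parameter (PID 0x0C); the frame bytes are invented for the example, not captured from a real car:

```python
# Decoding engine speed from a CAN/OBD-II response, as a dashboard app might.
# Uses the standard OBD-II formula for PID 0x0C: RPM = (256*A + B) / 4.

def decode_rpm(data: bytes) -> float:
    """Decode engine RPM from an OBD-II mode-01 PID-0x0C response payload."""
    a, b = data[0], data[1]
    return (256 * a + b) / 4.0

# Hypothetical two-byte payload 0x1A 0xF8 (made up for illustration):
frame = bytes([0x1A, 0xF8])
print(decode_rpm(frame))  # 1726.0 RPM
```

A production system would sit behind a CAN driver and filter frames by arbitration ID; this sketch only shows the payload math.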
Web
Wed, 04 Dec 2013 09:01:00 -0800
Bradley Berman
Nvidia Finally Gets Faces Right - Until They Open Their Mouths<!-- tml-version="2" --><div tml-image="ci01b2821ae0018266" tml-render-position="center" tml-render-size="large"><figure><img src="http://a2.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMTkwMjIwNzAyMzEw.png" /></figure></div><p>Nvidia has just about pulled off the trick of rendering computer-generated human faces -- in real time -- that won't make viewers squirm. At least so long as they don't grimace. Or try to talk.</p><p>The graphics chip maker Nvidia said on Tuesday that it had teamed up with the University of Southern California to develop two sets of simulation technologies designed to improve rendering and simulations in video games, one for oceans (Wave Works) and one for faces (Face Works).&nbsp;</p><p>The faces technology is the big deal here. At certain moments during a demonstration at its <a href="http://www.gputechconf.com/page/home.html">GPU Technology Conference</a>, Nvidia's virtual "Ira" transcended the so-called "<a href="http://tvtropes.org/pmwiki/pmwiki.php/Main/UncannyValley">uncanny valley</a>" and made me think that the virtual head on stage was an actual, living person.</p><p>It's been a long time coming.</p><h2>Graphics Chips: Not Just For Graphics Any More</h2><p>Years ago, Nvidia, Rendition, 3Dlabs and others helped transform the PC with the introduction of 3D graphics, from which evolved PC gaming, CAD animation, video production and a number of other creative enterprises.
Nvidia's chief executive Jen-Hsun Huang has been an evangelist of sorts, helping to push Nvidia into the enterprise space with <a href="http://slashdot.org/topic/datacenter/nvidia-launches-vca-appliance-tips-maxwell-volta-gpus/">integrated machines</a> that use its graphics processing units (GPUs), as well as into smartphones and tablets with <a href="http://slashdot.org/topic/datacenter/nvidia-tegra-to-add-cuda-parallel-processing-technology/">new versions of its Tegra chips</a>.</p><p>"Over the last 20 years, this medium has transformed the PC from a computer for information and productivity to one of creativity, expression and discovery," Huang said in his opening keynote. "The beauty and the power of interactivity this medium allows us to connect with ideas in a way that no other medium can. And the GPU is the engine of this medium."</p><p></p><div tml-image="ci01b2821b80028266" tml-image-caption="The original 1992 Alone in the Dark, a PC gaming classic. Source: KentuckyFriedPopcorn.blogspot.com"><figure><img src="http://a3.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTIyMzAzMTkyNjM2NjgxNDk3.jpg" /><figcaption>The original 1992 Alone in the Dark, a PC gaming classic. Source: KentuckyFriedPopcorn.blogspot.com</figcaption></figure></div><p>The fundamental building block of the GPU is the polygon, also known as the "triangle" - groundbreaking games like <a href="http://en.wikipedia.org/wiki/Alone_in_the_Dark_(video_game)">Alone in the Dark</a>&nbsp;created 3D characters out of polygons that players could easily distinguish. Today, however, faster processors have allowed those 3D polygons to become so small that they can't be seen by the naked eye.
Those 3D surfaces can be colored, textured and even "bump-mapped" to break up the regularity of the image, improving realism.</p><p>At the same time, GPUs have become physics engines, modeling everything from how light passes through and reflects off of objects - <a href="http://www.cs.unc.edu/~rademach/xroads-RT/RTarticle.html">ray tracing</a> - to applying real "physics" to objects as they fall and bounce. Tracking particles as they move, such as smoke or water, is also part of the equation. That's the kind of computational power that supercomputers tap into - and in February,<a href="http://www.nvidia.com/titan-graphics-card"> Nvidia launched its Titan card</a>, using the same GPU technology that the world's fastest supercomputer, <a href="http://slashdot.org/topic/datacenter/too-much-bling-delays-worlds-fastest-supercomputer/">ORNL's Titan</a>, uses.</p><h2>Face Works, Ira And The "Uncanny Valley"</h2><p>For a time, both Nvidia and its chief rival, ATI Technologies (now part of AMD), used, well, virtual dolls to demonstrate the realism of their graphics technology and appeal to hormone-fueled gamers. AMD's Ruby is a thing of the past, but Nvidia's fairy-like Dawn appeared in Huang's keynote. The showcase for 2002's GeForce FX line, Dawn was created to embody "cinematic computing" and turned heads with impressive attention to detail, realistic hair and dynamic lighting effects. But Face Works and Ira are the future.</p><p></p><div tml-image="ci01b2821c10036d19"><figure><img src="http://a5.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMTk1NTg5NDExNDMw.png" /></figure></div><p>Nvidia's Face Works was developed in conjunction with USC's Institute for Creative Technologies, which helped develop&nbsp;<a href="http://gl.ict.usc.edu/Data/LightStage/">LightStage</a>, a high-speed illumination system designed for human-scale subjects consisting of 6,500 white LED sources.
Essentially, Huang said, a person marches into a giant sphere, where the subject is photographed from 253 different directions. Each image is matted onto a black background, and compiled into a 3D object. Face Works allows each object to be modified, or "stretched," to simulate speech and movement.</p><p></p><div tml-image="ci01b2821cc0018266" tml-image-caption="A shot of a model using ICT&amp;#039;s Light Stage technology."><figure><img src="http://a2.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMTk4MjczOTYyNTk4.png" /><figcaption>A shot of a model using ICT&amp;#039;s Light Stage technology.</figcaption></figure></div><p>It's not easy. "Simulating an ocean is hard; simulating a face is harder," Huang said.</p><p>Humans are wired to instinctively spot things that are a little off, and that reaction, dubbed "the uncanny valley," ironically kicks in the <em>more</em> realistic a simulation gets. Basically, some people get creeped out by CGI that looks a little <em>too</em> realistic, but not quite realistic enough to be fully convincing.</p><p></p><div tml-image="ci01b2821d80016d19"><figure><img src="http://a1.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMjAxMjI2NTUwNTUz.png" /></figure></div><p>Ira demonstrates the problem. As these images show, Ira looks quite normal - fully human, actually, under certain lighting conditions. What Face Works does is model light as it enters the skin, reflects, and diffuses through the skin's surface. Slight imperfections - a freckle, skin pores - add to the realism.</p><p>But the illusion often breaks when the 3D model moves, as you can see in the keynote video below (the ocean modeling begins at about 9 minutes in, Ira and Dawn appear about 16 minutes in).
Essentially, Ira looks eerily realistic when motionless, but when he grimaces (and, above all, talks) we begin to pick up on how his facial expressions aren't quite lifelike.</p><p>Still, recent games like <a href="http://www.rockstargames.com/lanoire/agegate/ref/?redirect=">L.A. Noire</a> became famous for their realistic depictions of human faces, and "reading" expressions became a <a href="http://www.gamefaqs.com/boards/929170-la-noire/62786090">gameplay mechanic</a>. Years ago, getting those right at all was an amazing accomplishment. We're now at the point where companies like Nvidia get it right most of the time. "All of the time," it seems, will soon be within our grasp.</p><p></p><div tml-image="ci01b2821e20038266"><figure><img src="http://a2.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMjAzOTEwOTA1MTEz.png" /></figure></div><h2>Wave Works: Splash!</h2><p>Nvidia's ocean demo, meanwhile, uses Wave Works to tap into Titan for what the company called the most realistic ocean simulation ever. Most water simulations paint the ocean as a flat surface, with random ripples distorting it. Objects that "float" on top, like a ship, might not actually move in response to the ocean's undulations.</p><p></p><div tml-image="ci01b2821ec0006d19" tml-image-caption="Nvidia&amp;#039;s &amp;quot;Wave Works&amp;quot; models a gale at sea."><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,w_620/MTIyMzAzMjA2ODYzNjk1MTI5.png" /><figcaption>Nvidia&amp;#039;s &amp;quot;Wave Works&amp;quot; models a gale at sea.</figcaption></figure></div><p>Wave Works, however, uses 20,000 "virtual sensors" on a ship model to gauge water pressure and respond to how close the water comes to the hull. And Wave Works even models spray, tracking 100,000 "spray particles" as they move through the air.
The Nvidia software can model an entire&nbsp;<a href="http://en.wikipedia.org/wiki/Beaufort_scale">Beaufort scale</a>&nbsp;of wind speed,&nbsp;dialing up everything from a sunny day to a near-hurricane, Huang said. And as the ship moves, it crashes through the waves, being tossed up and down. This simulation, at least, was completely convincing.</p>
New technologies from Nvidia and USC make CGI faces look like you'd expect them to - until they move and get all creepy.
http://readwrite.com/2013/03/21/nvidia-finally-gets-faces-right-until-they-open-their-mouths
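The "virtual sensor" idea the article above attributes to Wave Works can be illustrated at toy scale: sample the simulated wave surface under a few hull points and apply an upward force wherever a point sits below the water. Everything here (the single sine wave, the spring constant, the hull points) is illustrative, not Nvidia's implementation:

```python
import math

def wave_height(x: float, t: float) -> float:
    """One sinusoid standing in for a full ocean wave spectrum."""
    return 0.5 * math.sin(0.8 * x + 1.2 * t)

def buoyant_force(hull_points, t, k=1000.0):
    """Sum spring-like buoyancy over sensor points below the wave surface."""
    total = 0.0
    for x, z in hull_points:           # (horizontal position, point height)
        depth = wave_height(x, t) - z  # positive when the point is submerged
        if depth > 0:
            total += k * depth         # push up in proportion to submersion
    return total

# Three hypothetical sensor points along a hull (Wave Works uses 20,000):
hull = [(-2.0, -0.2), (0.0, -0.3), (2.0, -0.2)]
print(buoyant_force(hull, t=0.0))
```

Run per frame, a force like this makes the ship rise and pitch with the waves instead of sliding over a flat plane.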
Play
Thu, 21 Mar 2013 03:30:00 -0700
Mark Hachman
Nvidia Plans New HQ For Its Expansion Beyond The GPU<!-- tml-version="2" --><div tml-image="ci01b281ed20008266" tml-render-position="center" tml-render-size="large"><figure><img src="http://a4.files.readwrite.com/image/upload/c_fill,cs_srgb,dpr_1.0,q_80,w_620/MTIyMzAyOTkzNDU3NTEzMDYy.jpg" /></figure></div><p>We're only two months into 2013, but it has already been a big year for graphics-processing giant <a href="http://www.nvidia.com/page/home.html">Nvidia</a>. At the Consumer Electronics Show in January, <a href="http://readwrite.com/2013/01/09/at-ces-2013-nvidia-project-shield-valve-piston-offer-peeks-at-gamings-future#feed=/search?keyword=nvidia">the company unveiled Project Shield</a>, its first foray into console gaming hardware. And in February, it was revealed that Nvidia is helping the team behind the <a href="http://readwrite.com/2012/08/07/can-startup-ouyas-crowd-sourced-gaming-console-challenge-sony-microsoft-and-nintendo#feed=/search?keyword=ouya">Ouya</a>, the wildly successful Kickstarter campaign for an Android-based $99 gaming console, <a href="http://www.engadget.com/2013/02/13/ouya-nvidia-lovefest/">max out its Tegra 3 processor</a>.</p><p>Now Nvidia, which started as a 3-person team 20 years ago, has announced that it's outgrown its headquarters in Santa Clara, Calif.&nbsp;<a href="http://blogs.nvidia.com/2013/02/nvidia-to-build-a-new-home-20-years-after-our-founding/">In a post on the company's blog</a>, co-founder Jen-Hsun Huang unveiled plans to build a new complex across the street from the current HQ, designed by architecture firm <a href="http://www.gensler.com/">Gensler</a> with a team headed by prominent architect Hao Ko.</p><h2>Losing Ground To AMD</h2><p>It's no secret in the industry that by purchasing&nbsp;<a href="http://www.amd.com/us/Pages/AMDHomePage.aspx">rival graphics chip-maker ATI,
chip-giant AMD</a> has in recent years been steadily pushing Nvidia out of one of its prime markets - gaming GPUs. The GameCube, Wii U and Xbox 360 all went the ATI/AMD route (<a href="http://www.extremetech.com/gaming/147577-xbox-720-gpu-detailed-merely-a-last-gen-radeon">and the Xbox 360's successor will reportedly follow the same path</a>).</p><p>With the pressure mounting, Nvidia gambled big and moved fast this year, attempting a home-console disruption with its partner Ouya. So far, the plan seems to be working. The Ouya, which runs on a souped-up Nvidia Tegra processor, has everyone talking, be it about <a href="http://kotaku.com/5984339/looking-for-a-launch-game-for-ouya?tag=ouya">the system's launch games</a> or the fact that <a href="http://kotaku.com/5982391/ouyas-hardware-will-be-updated-every-single-year?tag=ouya">CEO Julie Uhrman wants to release a hardware update every year</a>. And Project Shield, a strange hybrid device that fits a flip-out screen to a full-sized controller, <a href="http://gigaom.com/2013/02/09/android-this-week-project-shield-packs-a-punch-optimus-g-pro-goes-big-runkeeper-revamped/">has generated some well-earned buzz</a>.</p><h2>New Markets?</h2><p>Beyond gaming, Nvidia is also <a href="http://www.nvidia.com/object/tegra-automotive.html">expanding into vehicle add-ons</a> - driver assistance technology, navigation and in-car entertainment hardware - and has its chips in space&nbsp;<a href="http://www.nvidia.com/object/marsrover_success.html">through a partnership with NASA</a>.</p><p>The new HQ is meant to signify the company's new aggressiveness - and free up some desks in the old office, which Huang wrote is getting a little tight now that the company has grown to 8,000 employees across more than 40 sites.</p><p>In a mock-up provided by Gensler, the new office looks like something out of one of the many science-fiction games Nvidia chips have powered over the years.
Featuring two large triangular glass structures and what looks like a sprawling, hedge-filled perimeter, the new design looks flashy enough to match the company's confidence. There's an even cooler reason behind the three-sided geometry: the triangle is the fundamental building block of computer graphics.</p>

Graphics processing chip-maker Nvidia plans to build a new headquarters in Santa Clara, Calif., to house its move beyond the GPU market.
http://readwrite.com/2013/02/20/nvidia-plans-new-hq-for-its-expansion-beyond-graphics-processing
Wed, 20 Feb 2013 12:57:00 -0800 Nick Statt