When the Khronos Group announced Vulkan at GDC, they said that the API would arrive this year, a date intended to under-promise and over-deliver. More recently, fans were hoping that it would be published at SIGGRAPH, which officially began yesterday. Unfortunately, Vulkan has not been released, though it still holds a significant chunk of the news. Also, it's not as if DirectX 12 has a commanding lead at the moment: its headers have been public for only a few months, and its code samples are less than two weeks old.

The organization made announcements for six products today: OpenGL, OpenGL ES, OpenGL SC, OpenCL, SPIR, and, as mentioned, Vulkan. They wanted to make their commitment to all of their standards clear. Vulkan is urgent, but some developers will still want the framework of OpenGL: bind what you need to the context, then issue a draw and, if you do it wrong, the driver will often clean up the mess for you anyway. The briefing was structured to make it evident that OpenGL is still on their mind, which is likely why three OpenGL logos greeted me in their slide deck as early as possible. They are also gathering and closely examining feedback about who wants to use Vulkan versus OpenGL, and why.

As for Vulkan, confirmed platforms have been announced. Vendors have committed to drivers on Windows 7, 8, and 10, Linux (including SteamOS), and Tizen (OS X and iOS are absent, though). Beyond all of that, Google will accept Vulkan on Android. This is a big deal because Google, despite its open nature, has been avoiding several Khronos Group standards. For instance, Nexus phones and tablets do not ship with OpenCL drivers, although Google isn't stopping third parties, like Samsung and NVIDIA, from rolling it into their own devices. Direct support of Vulkan should help cross-platform development and, more importantly, target the multi-core, relatively slow-per-thread processors of those devices. It could even be of significant use for web browsers, especially on sites with a lot of simple 2D effects. Google is also contributing support from its drawElements Quality Program (dEQP), a conformance test suite that it acquired back in 2014. Google will expand dEQP to cover Vulkan, giving developers more consistency between devices -- a big win for Android.

While we're not done with Vulkan, one of the biggest announcements is OpenGL ES 3.2, and it fits here nicely. Around the time that OpenGL ES 3.1 brought compute shaders to the embedded platform, Google launched the Android Extension Pack (AEP). This absorbed OpenGL ES 3.1 and added tessellation, geometry shaders, and ASTC texture compression to it. It also created more tension between Google and cross-platform developers, who felt that Google was trying to pull its developers away from the Khronos Group. Today, OpenGL ES 3.2 was announced and includes each of the AEP features, plus a few more (like “enhanced” blending). Better yet, Google will support it directly.

Next up are the desktop standards, before we finish with a resurrected embedded standard.

OpenGL gained a few new extensions. One interesting addition is the ability to assign locations to multi-samples within a pixel. There is a whole list of sub-pixel layouts, such as rotated grid and Poisson disc, and this extension apparently allows developers to choose between them, as certain algorithms work better or worse with certain sample geometries and distributions. Vendor-specific extensions probably covered this for a while, but now it is a ratified one. Another extension enables “streamlined sparse textures”, which helps manage data where the unpopulated entries outweigh the populated ones.

OpenCL 2.0 was given a refresh, too. It contains a few bug fixes and clarifications that should help adoption. C++ headers were also released, although I cannot comment much on them, as I do not know the state that OpenCL 2.0 was in before now.

And this is when we make our way back to Vulkan.

SPIR-V, the intermediate representation that runs on the GPU (or other offload device, including the other cores of a CPU) in OpenCL and Vulkan, is seeing a lot of community support. Projects are under way to let developers write GPU code in several interesting languages: Python, .NET (C#), Rust, Haskell, and many more. The slide lists nine that Khronos Group knows about, but those four are pretty interesting. Again, this means you can write code in the aforementioned languages and have it run directly on a GPU. Curiously missing is HLSL, and the president of the Khronos Group agreed that it would be a useful language; the ability to cross-compile HLSL into SPIR-V would mean that shader code written for DirectX 9, 10, 11, and 12 could be compiled for Vulkan. He expects that it won't take long for such a project to start, and one might already be happening somewhere beyond his Google abilities. Regardless, those who are wary of the C-like GLSL and HLSL shading languages might find C# and Python a bit more their speed, and those options seem to be happening through SPIR-V.

As mentioned, we'll end on something completely different.

For several years, OpenGL SC has been on hiatus. This group defines standards for graphics (and soon GPU compute) in “safety critical” applications. For the longest time, this meant aircraft; the dozens of planes (which I assume means dozens of plane models) that adopted the technology were fine with a fixed-function pipeline. It has been about ten years since OpenGL SC 1.0 launched, which was based on OpenGL ES 1.0. SC 2.0 is planned for 2016 and will be based on the much more modern OpenGL ES 2 and ES 3 APIs, which allow pixel and vertex shaders. The Khronos Group is asking for participation to direct SC 2.0, as well as a future graphics and compute API that is potentially based on Vulkan.

The devices that this platform intends to target are aircraft (again), automobiles, drones, and robots. There are a lot of ways that GPUs can help these devices, but they need a good API to certify against. It needs to withstand more than an Ouya, because crashes could be much more literal.

Yvo de Haas, who has a degree in mechanical engineering from Windesheim University of Applied Sciences in Zwolle, Netherlands, creates props, robots, and other objects as a hobby. Previous creations include a joystick-controlled turret from Portal, GLaDOS, and a Fallout 3-style Pip-Boy.

The latest project was a Fallout 4-style Pip-Boy that accepts a smartphone, with an LG Nexus 5 shown in the demo video, above. It also contains a (non-functioning) cassette player at the top, which takes Fallout-style tapes... so unfortunately you cannot pretend that your Vault Dweller is obsessed with Thriller. This model is currently available on the website for anyone with time and access to a 3D printer. The work is licensed under Creative Commons 4.0 Attribution ShareAlike, so you can use and modify the model however you like, as long as you share your alterations in a similar fashion (and assuming that you also don't violate Bethesda's trademarks in any way -- even though Haas' license permits commercial usage, Bethesda won't).

A second model (the “Accurate version”) is still in progress. This one is supposedly intended to be used with an embedded computer like a Raspberry Pi. It sounds like you will need to install a bare display and other components to make it work, but that will probably be clearer when it is published.

The Register tried out the new Pebble Time, which features a colour e-paper screen with Gorilla Glass for better visibility outdoors, a battery that lasts a full week, waterproofing to 90', and all for a $200 price tag. With over 8,000 apps available, the device offers most of the functionality of the Apple Watch for a fraction of the price. Certain features it lacks, such as a heart rate monitor or GPS, can be added via Smartstraps, which not only keep the watch on your wrist but add functionality as well. The improvements were noticeable, but The Register preferred last year's Steel; if you are in the market for a smartwatch, you might be wise to hold on, as the new Pebble Time Steel is due out in the near future.

"I love what Eric Migovsky has done with the Pebble by creating an antidote to modern smartwatches. The two generations of Pebble so far have been useful, durable and practical – qualities which elude the over-specced and costly Apple and Android kit."

When it graduated from high school, Microsoft was voted “least likely to have an open relationship with itself”. Well, who's laughing now, member of the Yearbook Committee? You thought you were so clever, sitting in the back of the late bus for students in extra-curricular activities, giggling as you doodled in your Five Star binder. Even though they always hogged the Windows seat, maybe they would have opened it up for a little fresh air in the summertime had you taken the time to ask.

While Cortana is first and foremost a Windows 10 feature, it will appear on iOS and Android as well. Peter Bright of Ars Technica got in on the pre-release, invite-only beta and walked through the features. He notes that, while many have complained about crashes, his experience wasn't marred with stability issues. On the other hand, because Cortana is not as deeply integrated into the operating system, despite the laundry list of permissions it requests, he expects that most users looking for a digital assistant will look to Google Now on their Android devices, even if they use Cortana on Windows 10.

There really wasn't a whole lot of note in the article though, at least in my opinion. There are a few interesting screenshots, but it basically looks like someone grafted the Cortana fly-out menu from Windows 10 onto a fullscreen mobile device. Even though I already saw the similarities in the Windows 10 Technical Previews, it is funny to see it so explicit.

Getting smaller features allows a chip designer to create products that are faster, cheaper, and consume less power. Years ago, most of them had their own production facilities but that is getting rare. IBM has just finished selling its manufacturing off to GlobalFoundries, which was spun out of AMD when it divested from fabrication in 2009. Texas Instruments, on the other hand, decided that they would continue manufacturing but get out of the chip design business. Intel and Samsung are arguably the last two players with a strong commitment to both sides of the “let's make a chip” coin.
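For a sense of why designers chase those smaller features, here is a naive sketch (mine, not anything from the companies involved) of ideal geometric scaling, which assumes every linear feature shrinks in step with the node name. Real processes, especially with modern marketing-driven node names, fall well short of this ideal:

```python
def ideal_area_scaling(old_node_nm: float, new_node_nm: float) -> float:
    """Ideal area ratio for a die shrink: if every linear feature shrinks
    with the node name, area scales with the square of the ratio."""
    return (new_node_nm / old_node_nm) ** 2

# A 28nm -> 16nm shrink would ideally cut a chip's area to roughly a third,
# meaning more dice per wafer (cheaper) and shorter wires (faster, less power).
ratio = ideal_area_scaling(28, 16)  # about 0.33
```

In practice, a "16nm" process does not deliver this full shrink, which is part of why comparing node names between foundries is so contentious.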

So where do these chip designers go? TSMC is the name that comes up most. Any given discrete GPU from the last several years has probably been produced there, along with several CPUs and SoCs from a variety of fabless semiconductor companies.

Several years ago, when the GeForce 600-series launched, TSMC's 28nm line suffered shortages, which kept GPUs out of stock for quite some time. Since then, 28nm has been the stable workhorse for countless high-performance products. Recent chips have been physically huge, thanks to the maturity of the process granting fewer defects. The designers are anxious to get onto smaller processes, though.
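The link between process maturity and physically huge chips can be illustrated with the classic Poisson yield model, in which yield falls exponentially with die area times defect density. The defect densities below are hypothetical numbers for illustration only, not TSMC figures:

```python
import math

def poisson_yield(die_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Classic Poisson yield model: the fraction of dice with zero killer
    defects, assuming defects land independently and uniformly on the wafer."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

# The same 6 cm^2 monster die on an immature vs. a mature process:
immature = poisson_yield(die_area_cm2=6.0, defect_density_per_cm2=0.5)  # ~5% good dice
mature = poisson_yield(die_area_cm2=6.0, defect_density_per_cm2=0.1)    # ~55% good dice
```

Big dice are punished disproportionately by defects, so a chip that would be economic suicide on a fresh process becomes viable once defect density drops.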

In a conference call at 2 AM (EDT) on Thursday, which is 2 PM in Taiwan, Mark Liu of TSMC announced that “the ramping of our 16 nanometer will be very steep, even steeper than our 20nm”. By that, they mean this year. Hopefully this translates to production usable for GPUs and CPUs early on; AMD needs it to launch its Zen CPU architecture as early in 2016 as possible. Graphics cards have also been stuck on 28nm for over three years. It's time.

Also interesting is how TSMC believes that they can hit 10nm by the end of 2016. If so, this might put them ahead of Intel. That said, Intel was also confident that they could reach 10nm by the end of 2016, right until they announced Kaby Lake a few days ago. We will need to see if it pans out. If it does, competitors could actually beat Intel to the market at that feature size -- although that could end up being mobile SoCs and other integrated circuits that are uninteresting for the PC market.

Following the announcement from IBM Research, 7nm was also mentioned in TSMC's call. Apparently they expect to start qualifying in Q1 2017. That does not provide an estimate for production but, if their 10nm schedule is both accurate and representative of 7nm, that would put production somewhere in 2018. Note that I just speculated on an if of an if of a speculation, so take that with a mine of salt. There is probably a very good reason that this date wasn't mentioned in the call.

Back to the 16nm discussion, what are you hoping for most? New GPUs from NVIDIA, new GPUs from AMD, a new generation of mobile SoCs, or the launch of AMD's new CPU architecture? This should make for a highly entertaining comments section on a Sunday morning, don't you agree?

Wondering what is hot in the mobile world right now? Well, you can see what The Tech Report thinks are the best mobile computing devices in their latest roundup right here. They have four recommended tablets, ranging from the low-cost Google Nexus 7, which is still a hit after years on the market, to the much more expensive and brand new iPad Air 2. Of the convertibles they recommend, two happen to be Surface machines from Microsoft, and they split their laptop recommendations between those for general usage and two designed specifically for gaming. Rounding out the list are four phones and a look at what is coming down the pipeline in the near future; what you won't find are any Chromebooks.

"In this edition of our mobile staff picks, we chose our favorites from the current cream of the crop in tablets, convertibles, laptops, and phones."

OnePlus is a Chinese smartphone company founded by Pete Lau, formerly the Vice President at Oppo. Their first phone was basically invite-only for most of its lifespan, but that was justified for a flagship-quality phone at $299 USD. The OnePlus One was first available in April 2014. Their follow-up is the OnePlus 2, go figure, which will be formally announced on July 27th.

Several announcements lead up to that date, though. One day, OnePlus stated that the announcement will be done in VR, and they are selling Google Cardboard for “free” outside of the $5 shipping fee. Another day, they announced that the price will be “under $450 USD”. Today, they announced that the OnePlus 2 will have 4GB of LPDDR4 RAM, matching the capacity of the ASUS ZenFone 2. It will also contain the Qualcomm Snapdragon 810, which should be able to support OpenGL ES 3.1 and Vulkan (whenever that arrives).

It makes you wonder what's left for July 27th, besides the release date. My guess is that day.

The insides of the third-generation Kindle Paperwhite match the Voyage, with a 1GHz Freescale i.MX6 SoloLite chip, as do the outsides, with a new 300ppi screen. Connectivity covers Wi-Fi, with a 3G model also available, and there is a brand new font called Bookerly. If you are in need of an eReader, and are not in Canada where you could get the Tegra 4-powered Kobo Arc 7, you should head over to Techgage and see if the new, improved Paperwhite is the solution you should choose.

"Amazon has just revealed its third-gen Kindle Paperwhite e-reader, and while it doesn’t offer a substantial upgrade over the previous model, it does iterate on what was already a fantastic device. With a 300 ppi screen and brand-new Bookerly font at-the-ready, there’s not much to dislike with this e-reader."

When I was in my Physics program, there was a running joke that the word “Nano” should be a red flag when reading research papers. This one has graphene and nanoparticles, but it lacks quantum dots and looks to be privately funded by a company, so we might be good. Kidding aside, while I have little experience with battery technology, they claim to have surrounded silicon anodes for lithium batteries with a layer of graphene.

This addition of graphene is said to counteract an issue where silicon expands as it is used and recharged. The paper, which again is the first source that I have seen discuss this issue, says that other attempts at using silicon add vacant space around the anode for future growth. If you can keep the material at the same volume over its lifespan, you will be able to store more electricity in smaller devices. I wonder why Samsung would want something like that...

9to5google is reporting specs of the upcoming Moto G refresh, and it looks like the phone will carry over the internals of the current Moto E with a Snapdragon 410 SoC, and add an improved 13MP camera.

The current Moto G has been a favorite for many as a low-cost unlocked option (and one that runs mostly stock Android), and the adoption of the faster SoC with integrated (Cat 4) LTE baseband is a necessary move to update a device that in its current iteration is limited to 3G data speeds. It is interesting that the SoC would only match that of the $149 2015 Moto E (reviewed here), but it makes sense from a financial standpoint if the rumored Moto G is to be sold at or below its current $179 price point.

There is certainly stiff competition in the midrange smartphone market, bolstered considerably by the recently released ASUS Zenfone 2 (reviewed here as well) which starts at $199 unlocked; and with devices like the new Zenfone offering full 1080p screens the rumored choice of the Moto G’s existing 5-inch 720p screen returning in 2015 might be another indication that this new phone will feature a very aggressive price.

The alleged 2015 Moto G photo (image credit: 9to5google)

The phone is also rumored to ship with Android 5.1.1, which would carry on the recent tradition of Motorola phones running the latest versions of Android. All of this is unconfirmed information based on leaks, of course, but regardless of its final form, more options are always welcome in the $200-and-under unlocked phone space - and this year is shaping up to be a good one for consumers.

The new Sony Xperia Z3+ is a tiny bit thinner than the non-plus model, at 146x72x6.9mm and 144g compared to 146x72x7.3mm and 152g. The display is unchanged, a 5.2" IPS screen with a 1080x1920 resolution, but the processor received a significant upgrade: it is now a 64-bit octa-core Qualcomm Snapdragon 810. The phone ships with Android 5.0, and The Inquirer got a chance to try it out. The new processor handles 4K video perfectly, and the phone feels snappier overall compared to the previous model; check out their full experience here.

"SONY UNVEILED its latest top-end smartphone, the Sony Xperia Z3+ this week, with an updated, slimmer design, which has a lighter and sleeker frame compared with its predecessor, the Xperia Z3."

So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.

The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch, which was mostly described as a way to reduce rendering cost by dropping resolution in areas that will be warped into a lower final, displayed resolution anyway.

VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that NVIDIA has neither a keynote at E3 this year nor even a place at someone else's keynote as far as we know, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes of course)?

All three will be quad-core parts that can range between 12W and 35W designs, although the A8 processor does not have a 35W mode listed in the AMD Dual Graphics table. The FX-8800P is an APU that has all eight GPU cores, while the A-series APUs have six. The A10-8700P and the A8-8600P are separated by a couple hundred megahertz in base and boost CPU clocks, and 80 MHz in GPU clock.

Also, we have been given a table of AMD Radeon R5 and R7 M-series GPUs that can be paired with Carrizo in an AMD Dual Graphics setup. These GPUs are the R7 M365, R7 M360, R7 M350, R7 M340, R5 M335, and R5 M330. They cannot be paired with every Carrizo APU, and some pairings only work in certain power envelopes. Thankfully, this table should only be relevant to OEMs, because end-users are receiving pre-configured systems.

Broadwell launched as a dual-core only option, which resulted in some high-performance notebooks opting to stay with Haswell CPUs. With the introduction of quad-core versions of the new Broadwell chips for mobile, MSI has jumped on the bandwagon to offer a few different options. Of the 20 new notebooks offered by MSI, 18 of them are powered by Intel Core i7 chips.

1920x1080 with this model seems low, especially considering the obscene amount of VRAM (8GB per card on a laptop? Really?). Still, this notebook has excellent external monitor support with dual mini-DisplayPort outputs, though HDMI is limited to version 1.4.

MSI has also introduced a refreshed GT72 Dominator with NVIDIA G-Sync (covered here), and this new version also features USB 3.1. And for the more business-minded there is the premium PX60 Prestige, now refreshed with Broadwell Core i7 as well.

These refreshed notebook models will be “available immediately” from MSI’s retail partners.

MSI has announced a new version of the GT72 gaming notebook featuring NVIDIA G-SYNC technology.

Like the current GT72 Dominator Pro G, this features NVIDIA GeForce GTX 980M graphics, though this announced version has 8GB of GDDR5 (vs. the previous 4GB) powering its 17.3” display. The G-SYNC implementation with this notebook will allow for variable refresh between 30 and 75 Hz, and since the existing GT72 is a 1920x1080 notebook also featuring a GTX 980M, it might seem unnecessary to implement G-SYNC, though this would ensure a smoother experience with the newest games at very high detail settings.

Based on the current GT72 Dominator Pro G we can also expect an Intel Broadwell Core i7 mobile processor (the i7-5700HQ in the current model), and these notebooks support up to 32GB of DDR3L 1600MHz memory, as well as up to 4 M.2 SSDs in RAID 0.

MSI is also announcing development, in partnership with eye-tracking company Tobii Technology, of a “fully integrated eye-tracking notebook” for gamers, and MSI will have prototype notebooks at Computex to demonstrate the technology.

We’ll post additional details when available. Right now full specs, as well as pricing and availability, have not been revealed.

Launching with Computex this week, ASUS has a set of three new ROG (Republic of Gamers) notebooks for potential mobile gamers to take a look at. First up are the G751JT and G751JY machines, which feature Intel Core i7 processors (likely Haswell) and GeForce GTX 980M discrete graphics. After the recent announcement of G-Sync for notebooks, it should be no surprise that this updated G751 will feature an impressive 75 Hz 1920x1080 screen that supports variable refresh gaming!

ASUS G751JT/JY Notebook

For those more interested in a thin-and-light gaming machine, ASUS has the ROG G501. This will be available with either 2560x1440 or 3840x2160 resolution displays and will feature Intel Core i7 processors, again without specifying whether those are Haswell or Broadwell based. ASUS claims that the G501 "features dual independent fans and copper heat sinks to ensure efficient thermal management for smooth and stable performance even at high loads."

ASUS G501 Notebook

Finally, the ROG GL552 looks to be a more standard gaming rig with a Haswell-based Intel processor, nondescript "discrete graphics", and an "optional" solid state drive. The GL552 will feature an "easy-access design for additional storage and memory upgrades."

ASUS GL552 Notebook

Look for more details on these notebooks and hopefully reviews very soon!

ASUS has announced the newest version of their Transformer Book 2-in-1, and the T100HA features an Intel Atom Cherry Trail X5 series quad-core processor and will run Windows 10 when released later this year.

From ASUS:

"ASUS Transformer Book T100HA is the successor to the best-selling Transformer Book T100TA 2-in-1, and combines the power of a stylish 10.1-inch laptop with the convenience of a super-slim tablet. This new iteration has up to 14 hours of battery life, and has an ultra-thin 8.45mm chassis that weighs just 580g. It has a metallic finish and is available in Silk White, Tin Grey, Aqua Blue and Rouge Pink.

The T100HA is powered by a choice of quad-core Intel® Atom™ ‘Cherry Trail’ X5 series processors, and has 4GB RAM and a USB Type-C port. This device comes pre-installed with Windows 10 and will be available in the third quarter of 2015."

If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync, thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days, but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." Well, that day has finally arrived with the beginning of Computex.

G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the desktop and notebook technologies are different. Going forward, NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news, as notebook gaming often runs at lower frame rates than you would find on a desktop PC because of lower-powered hardware driving comparable (1080p, 1440p) resolution displays.

Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.

NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components, the GPU, the monitor scaler, and the monitor panel, that all needed to work together if VRR was going to become a high-quality addition to the gaming ecosystem.

NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs hand-pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could do things to improve the user experience, such as implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allows them to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.
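The low-frame-rate, frame-doubling idea can be sketched in a few lines. This is a toy model of my own, not NVIDIA's actual (proprietary) module logic; the 30-75 Hz window simply matches the range quoted for these notebook panels:

```python
import math

def vrr_refresh(frame_rate: float, panel_min: float = 30.0,
                panel_max: float = 75.0) -> float:
    """Toy model of low-frame-rate compensation: when the game renders
    slower than the panel's minimum refresh, repeat each frame enough
    times to keep the physical refresh inside the supported window."""
    if frame_rate >= panel_min:
        # In range: refresh tracks the frame rate, capped at the panel max.
        return min(frame_rate, panel_max)
    multiplier = math.ceil(panel_min / frame_rate)
    return frame_rate * multiplier

# At 24 fps the panel refreshes at 48 Hz, showing each frame twice,
# so the game still appears smooth and tear-free.
```

The player never sees a new image more than once, but the panel stays inside its electrical comfort zone.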

All of this is required because of the incredible amount of variability in the monitor and panel markets today.

But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to handpick the panel, and the GPU directly interfaces with the screen instead of passing through a scaler chip. (Note that some desktop monitors, like the ever-popular Dell 3007WFP, did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control, because NVIDIA can validate the panels with the OEMs.

As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.

Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (though this applies to 75 Hz panels as well), the most demanding gamers are going to want that same promise of minimal latency.

At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen.

This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.