News Posts matching "RX Vega"

Sapphire over the weekend officially launched its cost-effective custom-design Radeon RX Vega 56 graphics card, the Pulse Radeon Vega 56 (model: 11276-02), which began appearing at European e-tailers in late January. The card combines a custom-design short-length PCB, roughly the length of AMD's reference R9 Fury board, with a beefy custom-design cooling solution that features two large aluminium fin-stacks ventilated by a pair of 100 mm double ball-bearing fans.

The card offers out-of-the-box clock speeds of 1208 MHz core, 1512 MHz boost, and 800 MHz (1.60 GHz HBM2-effective) memory, against AMD reference clock speeds of 1138 MHz core and 1474 MHz boost. At its given memory clock, the bandwidth on offer is 409.6 GB/s. The "Vega 10" silicon is configured with 3,584 stream processors, 224 TMUs, and 64 ROPs. The card draws power from a pair of 8-pin PCIe power connectors; display outputs include three DisplayPort 1.4 and one HDMI 2.0. Sapphire intended this SKU to occupy a close-to-reference price point, a notch below its Nitro+ series; however, in the wake of the crypto-currency wave, market forces will decide its retail price.
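The quoted 409.6 GB/s follows directly from the 800 MHz memory clock and Vega's 2048-bit HBM2 interface. A minimal sketch of the arithmetic (the function name is ours, not an AMD or Sapphire tool):

```python
def hbm2_bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a double-data-rate memory type."""
    effective_mtps = mem_clock_mhz * 2        # HBM2 is DDR: two transfers per clock
    bus_width_bytes = bus_width_bits / 8      # bits -> bytes per transfer
    return effective_mtps * bus_width_bytes / 1000  # MB/s -> GB/s

# RX Vega 56: 800 MHz HBM2 on a 2048-bit bus
print(hbm2_bandwidth_gbps(800, 2048))  # 409.6
```

The same formula yields the 204.8 GB/s figure quoted later for the single-stack (1024-bit) HBM2 on Intel's "Kaby Lake-G" parts.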

Primitive shaders are lightweight shaders that break the separation between vertex and geometry shaders, promising a performance gain in supporting games. Initially announced during the Radeon RX Vega launch, the feature has been delayed again and again. At one of its 2018 International CES interactions with the press, AMD reportedly announced that it has cancelled the implicit driver path for primitive shaders. Game developers will still be able to implement primitive shaders on AMD hardware using a (yet to be released) explicit API path. The implicit driver path was the more interesting technology, though, since it could have provided meaningful performance gains to existing games and cut down a lot of developer effort for games in development. AMD didn't state the reasons behind the move.

To explain the delay, some people speculated that the primitive shader feature was broken and unfixable in hardware. That doesn't seem to be the case now that we are hearing about upcoming API support for it, so this can also be read as good news for Vega owners.

TechPowerUp today released the latest version of GPU-Z, the popular graphics subsystem information and diagnostic utility. Version 2.7.0 comes with a handful of important bug fixes and updates to its internal modules. To begin with, we've updated the NVFlash module that lets GPU-Z extract the video BIOS from graphics cards; the newer NVFlash supports BIOS extraction from some of the newer NVIDIA graphics cards, such as the GTX 1070 Ti. We've also fixed an incorrect video memory amount reading on AMD Radeon RX Vega graphics cards. TMU and ROP counts and OpenCL status on AMD "Polaris 21" GPUs are fixed, as is the incorrect labeling of a memory clock sensor on NVIDIA GPUs. GPU-Z will no longer prevent system shutdowns and reboots on Windows 10 Fall Creators Update.

It looks like Intel has achieved the design goals of its new Core i7-8705G multi-chip module, built in collaboration with AMD. Combining a 4-core/8-thread "Kaby Lake" CPU die with an AMD "Vega" GPU die that has its own 4 GB HBM2 memory stack, the ruthless duo puts similarly-priced discrete GPU setups to rest, such as the combination of an 8th generation Core processor and an NVIDIA GeForce MX 150. More importantly, entry-level discrete GPU combinations with high-end mobile CPUs have a similar power/thermal envelope to the i7-8705G MCM, but with a significantly larger PCB footprint.

Dell implemented the Core i7-8705G in one of its latest XPS 15 2-in-1 models. The device was compared to an Acer Swift 3 (SF314-51), which combines a Core i5-8250U processor with GeForce MX 150 discrete graphics; and a Dell XPS 13 9370, which implements an 8th generation Core processor with Intel's workhorse graphics core, the HD 620. The three devices squared off against each other in the "Rise of the Tomb Raider" game benchmark. The i7-8705G averaged 35 frames per second (fps), while the MX 150 barely managed 24 fps. The HD 620 ran a bored intern's PowerPoint slideshow at 9 fps.

NVIDIA, through the changelog of one of its Linux driver releases, may have spilled the beans on an as-yet-unannounced, unreleased product. The company's Max-Q variants of its graphics cards typically trade performance for power efficiency, placing the designs somewhat more optimally on the power/performance curve. The fact that NVIDIA is looking to bolster the efficiency of its GTX 1050 with a Max-Q design is likely aimed at competing with the performance level of the already announced Intel + AMD EMIB design, where an Intel CPU is paired with a discrete, Vega-based AMD GPU and its accompanying HBM2 memory stacks in a small, extremely power-efficient package (when compared with current designs).

The folks at Notebookcheck expect the 1050 Max-Q to perform about 10 to 15 percent slower than the standard 1050 and 1050 Ti, respectively, with TDP likely ranging between 34 W and 46 W. NVIDIA is aiming at the same market the AMD + Intel EMIB collaboration is going after (thin, light, adequate-performance solutions).

Intel did the impossible in 2017 by collaborating with rival AMD on a product, after decades of rivalry. The new Core i7-8000G series processors are multi-chip modules that combine quad-core "Kaby Lake" CPU dies with discrete AMD Radeon Vega GPU dies that have their own dedicated HBM2 stacks. With performance-segment notebooks and sleek AIO desktops building momentum for such products, Intel sees a future in building its own discrete GPUs - at least, dies that can replace the AMD Radeon IP in its Core G-series processors.

With former AMD Graphics head Raja Koduri switching to Intel amidst rumors of the company investing in discrete GPUs of its own, details emerge of the company's future "Arctic Sound" and "Jupiter Sound" graphics IP, which point to the possibility of them being discrete GPU dies based on the Gen 12 and Gen 13 graphics architectures, respectively. According to Ashraf Eassa, a technology stock commentator with "The Motley Fool," both "Arctic Sound" and "Jupiter Sound" are discrete GPU dies that connect with Intel processor dies over EMIB, the company's proprietary high-density interconnect for multi-chip modules. It could be a long wait leading up to the two, since the company is still monetizing its Gen 9.5 architecture on 8th generation Core processors.

Sapphire took to CES 2018 to showcase one of the most elusive products in recent times: a custom variant of AMD's RX Vega graphics cards. Sapphire went to great lengths to keep the entire affair under as many wraps as possible, even going so far as to book an entire room to showcase its RX Vega Nitro+, lying still, like a mirage, on top of its anti-static plastic wrap.

There's something eerily beautiful about this graphics card: not for its exquisite backplate and faceplate designs alone, nor for the custom Sapphire-cut exhaust port; it's really the condition of a somewhat "unicorn" type of product. Some people believe they're out there, but sightings in their natural environment (read: correct MSRP) are so rare that they are brought up as mass deliriums. The most interesting tech product at CES 2018 based on this story alone, surely.

Today, Intel launched the latest and most powerful Intel NUC to date, based on the newly announced 8th Gen Intel Core i7 processor with Radeon RX Vega M graphics. The new Intel NUC (formerly code-named Hades Canyon) brings this powerful new processor and graphics solution into an incredibly tiny 1.20-liter system. Great for VR enthusiasts and workload-heavy content creators, it will be Intel's smallest premium VR-capable system in the market.

The new NUC will come in two versions: NUC8i7HVK and NUC8i7HNK.

The NUC8i7HVK is based on the unlocked version of the new 8th Gen Intel Core processor with the Radeon RX Vega M GH graphics, giving overclockers the ability to take the system to higher levels.

AMD at CES shed some light on its 2018 roadmap, while taking the opportunity to preview its graphics and CPU projects up to 2020. Part of the 2018 roadmap was the company's already announced, across-the-board price cuts for its first-generation Ryzen processors. This move aims to increase the competitiveness of its CPU offerings against rival Intel, taking advantage of the blue giant's currently weakened position due to the exploit saga we've been covering. It should also enable inventory clearing of first-gen Ryzen processors, soon to be supplanted by the new Zen+ 12 nm offerings, which are expected to receive a 10% boost to power efficiency from the process shrink alone, alongside specific improvements to their performance-per-watt profile. These are bound for market introduction in March, and are already in the process of sampling.

On the CPU side, AMD's 2018 roadmap further points towards a Threadripper and Ryzen Pro refresh in 2H 2018, likely in the same vein as the consumer CPUs we just talked about. On the graphics side, AMD focused users' attention on the introduction of premium Vega offerings in the mobile space (with HBM2 memory integrated on-interposer as well), which should enable the company to compete against NVIDIA in the discrete graphics space for mobile computers. Another very interesting tidbit announced by AMD is that it will be skipping the 12 nm process for its graphics products entirely; the company announced that it will begin sampling 7 nm Vega products to its partners, but only in the Instinct line of machine-learning accelerators. We consumers will likely have to wait a little while longer until we see some 7 nm graphics cards from AMD.

Today, Intel is launching a first-of-its-kind processor: the 8th Gen Intel Core processor with Radeon RX Vega M Graphics. Packed with features and performance crafted for gamers, content creators and fans of virtual and mixed reality, it expands Intel's portfolio thanks to its optimization for small form factors like 2 in 1s, thin and light notebooks, and mini PCs.

Among the devices launching with this processor: new thin and lightweight 2 in 1s from Dell and HP as well as the most powerful NUC Intel has ever introduced. The new 8th Gen Intel Core processor will come in two configurations:

Intel revealed specifications of its upcoming "Kaby Lake + AMD Vega" multi-chip module, the Core i7-8809G, on its website. A number of these specs were already sniffed out by Futuremark SystemInfo, but the website sheds light on a key feature - dual integrated graphics. The specs sheet confirms that the chip combines a 4-core/8-thread "Kaby Lake" CPU die with an AMD Radeon RX Vega M GH graphics die. The CPU is clocked at 3.10 GHz, and SystemInfo (from the older story) confirmed that its Turbo Boost frequency is up to 3.90 GHz. The L3 cache amount is maxed out at 8 MB. The reference memory clock is set at dual-channel DDR4-2400. What's more, the CPU component features an unlocked base-clock multiplier.

Things get interesting with the way Intel describes its integrated graphics solution. It mentions both the star attraction, the AMD Radeon RX Vega M GH, and the Intel HD Graphics 630 located on the "Kaby Lake" CPU die. This indicates that Intel could deploy a mixed multi-GPU solution that's transparent to software, balancing graphics loads between the HD 630 and RX Vega M GH depending on the load and thermal conditions. Speaking of which, Intel has rated the TDP of the MCM at 100 W, with a rider stating "target package TDP," since there's no scientifically-correct way of measuring TDP on a multi-chip module. Intel could build performance-segment NUCs with this chip, in addition to selling it to mini-PC manufacturers.

Ahead of its Q1-2018 launch after a CES reveal, Intel's Core i7-8709G multi-chip module (MCM) was picked up by Thai PC enthusiast and tech vlogger "TUM APISAK," revealing some of its first specifications as read by Futuremark SystemInfo, a hardware-detection component common to various Futuremark benchmark suites. The "Kaby Lake-G" MCM combines a quad-core "Kaby Lake" CPU die with an AMD Radeon "Vega M" graphics die that has a dedicated HBM2 memory stack on-package.

Futuremark SystemInfo puts out quite a few specs of the i7-8709G, beginning with its 4-core/8-thread CPU based on the "Kaby Lake" micro-architecture, which is clocked at 3.10 GHz with 3.90 GHz Turbo Boost; Radeon RX Vega M (694C:C0) graphics core with 4 GB of HBM2 memory across a 1024-bit memory bus; with its GPU engine running at 1.19 GHz, and memory at 800 MHz (204.8 GB/s memory bandwidth); although the core-config of the iGPU remains a mystery. We recommend you maximize the video below for legible details.

MSI rolled out the Radeon RX Vega 56 Air Boost and Air Boost OC graphics cards. The two are based on the same board design as the RX Vega 64 Air Boost series the company launched last week. The quasi-custom design card combines an AMD reference-design PCB with a custom-design lateral-flow cooler by MSI that's similar in design to AMD's cost-effective reference cooler. Adding to its effectiveness is the heavily perforated rear I/O bracket.

The base model sticks to AMD reference clock speeds of 1156 MHz core and 1471 MHz boost; while the OC variant ships with 1181 MHz core and 1520 MHz boost. Both cards leave the HBM2 memory clock untouched at 800 MHz. The cards draw power from a pair of 8-pin PCIe power connectors; display outputs include an HDMI 2.0, and three DisplayPort 1.4 connectors. The base variant sells at USD $399, with the OC variant going for $439.

The Sapphire RX Vega Nitro+ series of graphics cards features a triple-fan, 2.5-slot design and a whopping 3x 8-pin power delivery system - and yes, you read that right, this applies to both the Vega 64 and Vega 56 models. The increased thermal headroom provided by the substantial cooling solution, and the beefed-up power delivery system, mean Sapphire is shipping these graphics cards with a hefty 12-14% base-clock increase over AMD's reference models, making these the fastest (in frequency) factory-overclocked RX Vega graphics cards money can buy. The cards also ship with dual BIOS, a fan header for a side-panel or front-panel fan whose speed you want under the graphics card's control, and a VGA support plate - a smart move by Sapphire, considering the RX Vega 64 Nitro+ comes in at almost 1.6 kg.

LG Electronics (LG) will introduce its latest LG gram notebooks that deliver superior portability, enhanced powerful performance and convenience features. Since the incredibly lightweight notebook line debuted in 2014, LG has consistently surprised consumers by maximizing portability without sacrificing performance. The 2018 LG gram notebooks push the boundaries of portable computing with improved mobility and durability, as well as upgraded processors and more versatility.

"The new LG gram PCs have been designed for those users who want an all-round, high performance notebook with maximum portability," said Tim Alessi, head of product marketing at LG Electronics USA. "The 2018 gram series ticks all the boxes for users who want versatile and lightweight notebooks with faster processing capabilities."

At a press event, AMD confirmed that its 2nd generation Ryzen desktop processors will debut in Q1-2018 (before April). It also clarified that "2nd Generation" does not equal "Zen2" (a micro-architecture that succeeds "Zen"). 2nd Generation Ryzen processors are based on two silicons, the 12 nm "Pinnacle Ridge," which is a GPU-devoid silicon with up to eight CPU cores; and "Raven Ridge," which is an APU combining up to 4 CPU cores with an iGPU based on the "Vega" graphics architecture. The core CPU micro-architecture is still "Zen." The "Pinnacle Ridge" silicon takes advantage of the optical shrink to 12 nm to increase clock speeds, with minimal impact on power-draw.

AMD is also launching a new generation of chipsets, under the AMD 400-series. There's not much known about these chipsets yet; hopefully they feature PCIe gen 3.0 general-purpose lanes. The second-generation Ryzen processors and APUs will carry 2000-series model numbering, with clear differentiation between chips with an iGPU and those without. Both product lines will work on socket AM4 motherboards, including existing ones based on AMD 300-series chipsets (requiring a BIOS update). AMD is reserving "Zen2," the IPC-increasing successor of "Zen," for 2019. The "Matisse" silicon will drive the multi-core CPU product line, while the "Picasso" silicon will drive the APU line. Both these chips will run on existing AM4 motherboards, as AMD plans to keep AM4 as its mainstream-desktop socket till 2020.

Four months of silence after what can only be classified as a premature announcement, ASUS has finally put up the product pages for its custom RX Vega 56 and 64 graphics cards, marketed under the Strix branding. Yield and packaging issues, as well as differing chip characteristics between different AMD packaging partners, have greatly affected time-to-market for RX Vega's custom designs, which were sorely needed to improve on some of the reference cards' shortcomings. Sadly, the product pages are just that - product pages - and lack the holy trinity of graphics card information: clock speeds, pricing, and availability.

AMD today announced the brand title of its 2017-yearender driver release, Radeon Software Adrenalin Edition, which is named after the Adrenalin Rose. Scheduled to release some time in mid-December, under version number 17.12 WHQL, the drivers are expected to introduce performance enhancements across the board for GPUs based on the "Polaris" and "Vega" graphics architectures (Radeon RX 400 series, RX 500 series, and RX Vega series), while introducing new features.

AMD today put out its fourth Radeon Software release of the month, the Radeon Software Crimson ReLive 17.11.4 Beta. These drivers come with optimization for "Doom" VFR, and Oculus Dash Open Beta. The drivers fix an issue with certain levels of HBCC size adjustments causing system instability on machines with Radeon RX Vega series graphics cards. It also fixes a system hang noticed when switching display modes on "Star Wars Battlefront II" on CrossFire machines. Also fixed, are incorrect clock and power values being reported on some machines with RX Vega series graphics cards. Grab the drivers from the link below.

GIGABYTE has a custom-design Radeon RX Vega series after all, with the company announcing the RX Vega 64 WindForce 2X and RX Vega 56 WindForce 2X graphics cards. These cards combine a 100% custom-design PCB by GIGABYTE, with a large WindForce 2X cooling-solution that the company is debuting with these cards. The cooler features a split aluminium fin-stack heatsink to which heat drawn by 8 mm-thick copper heat-pipes is fed; ventilated by a pair of large 100 mm fans, which stay off when the GPU is idling. The heat-pipes make direct contact with the GPU and HBM2 stacks, while a base-plate conveys heat drawn from the VRM MOSFETs.

The back-plate has a copper center-plate and a flat heat-pipe of its own, drawing heat from the PCB via non-electrically-conductive thermal pads. The two fans blow air onto the heatsink, but one fan spins clockwise to do this, while the other spins counter-clockwise. The custom-design PCB features a 13-phase VRM, and draws power from a pair of 8-pin PCIe power connectors. Both cards come with factory-overclocked speeds, with the engine-clock boosting up to 1560 MHz, while the memory clock is left untouched. The card features an unusual display connector loadout, including three each of DisplayPort 1.4 and HDMI 2.0 ports, all located on the rear panel. The company didn't reveal pricing.

TUL Corporation, a leading and innovative manufacturer of AMD graphics cards since 1997, has launched the new PowerColor Red Devil RX VEGA 64 and Red Devil RX VEGA 56, opening up a new generation of the graphics card market. The VEGA series is for extreme gamers looking for the highest resolutions and highest framerates at maximum video settings.

The PowerColor RX VEGA graphics are designed to deliver exhilarating performance in the latest DirectX 12 and Vulkan game titles. With a dedicated High-Bandwidth Cache, the VEGA utilizes HBM2, the latest in graphic memory technology, to provide incredible levels of power efficiency and memory performance. The Next-Gen Pixel Engine found in the Vega GPU is designed to boost shading performance more efficiently to bring the latest VR and extreme resolution games to life.

AMD today released the Radeon Software Crimson ReLive 17.11.3 hotfix. The drivers specifically address an intermittent crash issue with Radeon RX Vega graphics cards. If you're an RX Vega owner, it is highly recommended that you update to this version. As with all driver releases, some known issues remain unfixed. These include game-related crashes in titles like Tom Clancy's Rainbow Six Siege, Rise of the Tomb Raider, and Overwatch.

After teasing us with a somewhat bold design for its custom RX Vega graphics cards, XFX has officially taken the lid off its finalized design. These have been a long time coming, for sure; and the design is definitely bold enough to be divisive, promising to be a "hate it or love it" affair. XFX has taken its brand-recognition-fueled X and applied that design to the graphics cards' shroud, with a recess in the middle of the card separating the two air-cooling fans and giving the card an X-shaped design. This design quirk has been put to uses other than just aesthetic considerations, though: the card's two 8-pin power connectors are slotted smack in the middle of the graphics card, which might be good (or bad) depending on your case's routing ability, though it should, in theory, allow for a somewhat shorter graphics card. The backplate on the XFX custom cards also looks great (black, gray and red are almost impossible to get wrong).

AMD has started a new offer on its RX Vega 56 and RX Vega 64 graphics cards, which brings users two of this year's most interesting FPS titles: Arkane's Prey, and Machine Games' Wolfenstein: The New Colossus. The offer is good from November 24th through December 31st, or until the stock for game codes is gone, so that should give users plenty of time to take advantage of the offer. Keep in mind this is retailer-dependent, with not every sales point partaking in the offer, so be sure to check first whether or not your purchase spot of choice is offering this promotion.

The AMD Unique ID which grants you access to both game codes must be redeemed within two (2) months of the end of the Campaign Period (February 28, 2018) to obtain Application downloads. After this deadline, the AMD Unique ID is void, so users won't be able to redeem their games anymore. The offer is valid for RX Vega 64 Liquid and Air cooled graphics cards, and the RX Vega 56. AMD AIB partner cards (such as Gigabyte, Sapphire, XFX, and so on) should be eligible, but you should take some time to confirm this. Best Buy, for example, seems to only be applying this dual game code promotion to XFX Vega graphics cards. For now, this promotion seems to only apply to reference-design graphics cards, though this might change depending on the retailer.

Benchmarking company Futuremark has recently introduced a new benchmark to its VRMark suite, the Cyan Room, which brings the latest in rendering technologies to the VR world. Futuremark expects this test to leverage the latest hardware and software developments in DX12 to better utilize today's GPUs' still somewhat untapped power. In something of a plot twist, AMD's Radeon architectures (in the form of the Polaris 20-based RX 580 and the Vega-based RX Vega 56 and RX Vega 64) trump NVIDIA's equivalent offerings in pure performance numbers.

Testing was performed by pairing a Ryzen 7 1800X CPU with a selection of graphics cards from both AMD and NVIDIA, supported by 16 GB of DDR4-2933 system memory, and Windows 10 x64. In a post on Radeon gaming, Scott Wasson said that "The Cyan Room (...) highlights AMD's continued performance leadership on this (VR) front," adding that "the Radeon GPUs we tested have clear leads over their direct competition. What's more, all the Radeon GPUs are meeting the key requirement for today's VR headsets by delivering at least 90 frames per second in this test."