THE Personal Computer Thread | I Love You 3000

Welcome to the thread representing the best way to game in 2019. Free from the barriers of console gaming, you choose how to approach your gaming experience. Build your own or go the babyz route like Laboured. Deck it out with RGBs or be a bore like our guy Kibner, who prefers nothing and has 120Hz hearing. Suffer through a console-like experience with a GTX or AMD GPU at 1080p, or ascend to a higher place with the RTX line. The choice is yours!

Things to expect in 2019:

Zen 2
UW HRF monitors
The continuation of RGB
AMD hopefully being competitive in the GPU space
New Intel procs
Further ray tracing exposure on PC
...and more

Cross-posting from the previous thread, as I am looking for some opinions.

So my laptop died and I am going to have to buy a new one. Currently looking at an HP Omen 15: Intel Core i7-8750H 2.2 GHz / 16 GB DDR4-2666 / Nvidia GTX 1050 4 GB / 256 GB PCIe NVMe M.2 SSD + 1 TB 7200 RPM SATA, for 1100€. Will only be doing some occasional gaming; I need it more for audio/video editing and DJ software, but that doesn't require much horsepower. Doesn't seem like a bad deal to me, and it's 15% off right now until Sunday. Haven't seen many cheaper options for something similar.

By LFMartins86Go To PostThe 1060 ones are 300€ more; for the amount of gaming I am going to do, it's not worth it. It is actually a GTX 1050 Ti 4 GB after all, which is a bit better.

That's the only standout to me, but you know your use case better than I do, ofc. I'm assuming it's a 1080p screen, and the 1060 is a perfect match for that. But if you don't think you'll use it that much, then save the $$. That CPU is 6 cores too? Yeah, you'll be good to go.

I think im gonna hold off purchasing the 9900K build until AMD announces Ryzen 3000 at CES later this month. If single-thread performance is close enough to the 9900K in the apps I use, then I might go that direction. The only thing holding me back is that render times are slower on Intel.
Pulled the trigger on the laptop, and I am glad I did, as it was the last one and a guy who entered the store 20 seconds after me was also going for it. Currently doing a fresh install, but damn, this is a huge upgrade from my previous one. It's so fast, and the screen is quite nice.

By DominicanPowerGo To PostHoly 144Hz, brothers. I've been playing Fortnite in slow motion.

Bought my nephew a Dell S2716DG a couple months ago. 1440p, 144Hz, 1ms, G-Sync. I was a bit hesitant at first because it's a TN panel, but when he started playing Overwatch on it, I was really astonished at how gorgeous it looked.

I haven't watched his video yet, but I still have a question about 1151 v2. It's clear that the socket layout can handle the extra wattage going from 4 cores to 8, but what about the VRMs? A VRM that might have been fine for a 7700K might not last all that long with a 9900K, and 1151 v2 might have been more of a correction for that.

There are some other things too. If you have more pins for power delivery, you can reduce EMI, but I kinda doubt der8auer is going to care about that stuff. Of course, Intel could have just seen it coming and used those extra pins from the start.

NVIDIA will reportedly be using Samsung's new 7nm extreme ultraviolet lithography (EUV) process that uses a plasma laser to drive silicon material into 7nm transistor structures. The new GeForce RTX and Quadro RTX graphics cards are both made on the 12nm process thanks to TSMC, with NVIDIA no stranger to working with Samsung for GPU production. AMD will have the world's first GPU on 7nm beating NVIDIA to the punch, with its upcoming Navi GPU launching in July 2019 according to our sources. NVIDIA's new GPU architecture on 7nm should end up being Ampere, which will succeed the current Turing GPU architecture inside of the new RTX-powered cards.

Damn. They actually did it, or are starting to. That at least gives them a way to support VRR while continuing G-Sync as its own, premium VRR solution. So now I'm guessing the OLEDs and other TVs that support VRR are in play for GeForce users.

I'm seeing a lot of people excited that NVIDIA is supporting "FreeSync" now. No, they're not supporting AMD's take on the Adaptive-Sync standard, but AS itself. Why would NVIDIA of all people support a rival's implementation when they have their own, superior version of VRR?

Am I misunderstanding something here?

Because it hits AMD in the pocket. They no longer have the value proposition of a Radeon card and FreeSync monitor vs. an NVIDIA card and a G-Sync monitor.

By SmokeyGo To PostI'm seeing a lot of people excited that NVIDIA is supporting "FreeSync" now. No, they're not supporting AMD's take on the Adaptive-Sync standard, but AS itself. Why would NVIDIA of all people support a rival's implementation when they have their own, superior version of VRR?

Am I misunderstanding something here?

FreeSync is just what AMD calls VESA Adaptive-Sync; it's actually the same thing. I believe it's also what they have been doing with G-Sync on laptops for a while now.