New system is here. Currently transferring files from a 2TB HDD over the LAN, so this is going to take a very long time despite it being a Gigabit connection lol

But I needed to do this first before being able to start installing all the required software.

Anyway, all I can say is: the thing is beautiful. And fast. Haven't performed any tests yet, but Windows 10 on this machine is super responsive (thanks, no doubt, to the Intel Optane - the 8 CPU cores might help a bit too).

Gbit LAN has a maximum throughput of 125 MB/s, but the source HDD is old and slower than that. I saw around 80 MB/s on average - it took a few hours.
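For anyone curious where "a few hours" comes from, here's the back-of-the-envelope arithmetic (the figures are just the ones mentioned above, not measured logs):

```python
# Rough transfer-time estimate for moving a disk's worth of data over a network.

def transfer_hours(size_gb: float, rate_mb_s: float) -> float:
    """Hours needed to move `size_gb` gigabytes at `rate_mb_s` MB/s."""
    seconds = (size_gb * 1000) / rate_mb_s  # GB -> MB, then divide by the rate
    return seconds / 3600

# Theoretical Gigabit ceiling: 1000 Mbit/s / 8 bits per byte = 125 MB/s
print(f"{transfer_hours(2000, 125):.1f} h at line rate")      # ~4.4 h
print(f"{transfer_hours(2000, 80):.1f} h at what the HDD sustains")  # ~6.9 h
```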

Anyway, just a few hours ago I was cursing the guys who put this together. The Optane is in the PCIe #3 slot, which is connected to the motherboard chipset and not directly to the CPU (which is how I wanted it, since I didn't want the GPU to drop from x16 to x8).

However, for PCIe #3 to work at x4 speed, SATA #5 and SATA #6 need to be disabled - the motherboard's user guide says so. I had specifically warned them about this.

So, I start benchmarking the Optane and it's running at half the speed of the NVMe 970 EVO. OK, looks like they forgot to disable SATA #5 and #6 in the BIOS - shouldn't be a problem... until I discover that instead of connecting the 4TB WD hard disk to SATA #1 they had connected it to SATA #6.

To disable SATA #5 and #6 I now had to move the WD SATA cable (and another one I'm going to connect my Crucial MX500 SSD to later on) to the other SATA ports... which sat completely under the tail end of the monster 2080 Ti graphics card, so I couldn't see them directly. To make matters worse, the cable management had left the cables with almost zero slack.

I considered removing the GPU, but, as you know, PCIe slots have that little latch that locks cards in place, which must be pressed down before removing them (many people have ripped a PCIe slot clean off the motherboard by forgetting about this). Alas, the monstrous Noctua NH-D15 cooler is so freaking huge there was no way for me to reach that little lock either.

So there I was, half an hour later still blindly trying to connect the cables to the other SATA ports, very little sleep, not in the best of moods and muttering every bad word I could think of under my breath.

I only succeeded when I remembered I could take off the case's front cover and top-to-bottom integral fan filter. This enabled me to shine a light directly at the connectors under the GPU card and see what I was doing by looking through the front fans.

So, I go to the BIOS, disable SATA #5 and #6, reboot, run the benchmark and... same speed.

Turns out there is a setting buried in the BIOS for the speed of PCIe #3 (either x2 or x4), and it's setting it to x4 that automatically disables SATA #5 and #6. Once I did that, the speed of the Optane finally doubled.

Still re-installing applications, transferring files via LAN, etc... I must make sure everything is properly set up and working as it should before making the final switch.

Worst part is the physical logistics: currently my old Cosmos II case is on top of the desk, and it needs to be swapped with the new C700M, which is temporarily under the desk. Unfortunately - unlike the C700M - the Cosmos II is too tall to fit under the desk, so I'll need to place it next to the desk once everything has been transferred over. This means I won't be able to open the office door completely, as the Cosmos II will then be blocking it. Sigh.

Once everything has been transferred I also need to dust off the Cosmos II case and clean the filters (it was long overdue for a clean-up), remove the side fans as I won't be needing them to keep the case cool when gaming anymore, remove the Crucial MX500 1TB SATA SSD and install it in the C700M, etc...

Currently I'm using two monitors: a 30" 2560x1600 HP monitor and a 43" 4K LG monitor. I have another LG 30" 2560x1600 monitor which I couldn't use because Windows 7's DWM only supports a maximum total desktop resolution of 8192x8192 - anything larger than that would turn Aero off, because that is the largest texture size supported by DX10.

Windows 10 increased the limit to 16384x16384, so in theory I should be able to use all three monitors at once. But here is the problem: the 30" LG was one of the first 30" monitors ever built and only supports a single DVI-D connection (no HDMI or DP).
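A quick sketch of the arithmetic behind "in theory I should be able to use all three". This assumes a simple side-by-side arrangement (the actual virtual-desktop bounding box depends on how the monitors are positioned):

```python
# Check whether a multi-monitor layout fits inside the DWM desktop-texture
# limit: 8192x8192 under Windows 7 (DX10-era), 16384x16384 under Windows 10.
# Monitors are assumed to be placed left to right on one row.

def fits(monitors, limit):
    width = sum(w for w, h in monitors)    # total width of the row
    height = max(h for w, h in monitors)   # tallest monitor
    return width <= limit and height <= limit

setup = [(2560, 1600), (3840, 2160), (2560, 1600)]  # HP 30", LG 43" 4K, LG 30"
print(fits(setup, 8192))    # False: 8960 px wide, over the Windows 7 limit
print(fits(setup, 16384))   # True: fine under Windows 10
```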

I do have an active DVI to DP adapter which I could use with the 30" LG, but the nVidia 2080 Ti only has two DP connectors; the rest are HDMI. For this to work I would need three DP connectors.

The 43" LG 4K does have multiple inputs, including HDMI, but the monitor is finicky about the cables used (they warn you about that in the user guide) and they didn't even include an HDMI cable with the monitor, only a DP cable. I can try connecting the 43" via HDMI, and the 30" HP and 30" LG via DP, but there is no guarantee the 43" LG will work properly at 4K 60Hz over an HDMI cable.

Guess I will have to wait and see.

Once everything is set up properly, the next task is to overclock the i9 9900K to 5GHz, or as close to it as possible.

In the meantime, here are the benchmarks for the Intel Optane 905p and the Samsung 970 EVO. Not sure why I got a "gay" drive though.

Notice how much faster the Optane is than the Samsung NVMe in terms of 4K transfers.

Unfortunately, moving the old Cosmos II case seems to have killed the nVidia 980 Ti in it. I connected it to a 24" HP monitor I have here via DVI-D, and when I turned the system on, not only did I get an electric shock when I touched the Cosmos II chassis, but I also couldn't get any image on the monitor (it displayed "no signal" on all connectors).

At first I thought it was a static discharge. I did manage to get it to display an image when I connected via HDMI (and got another electric shock in the process) but when I looked back at the system it had rebooted and I couldn't get it to display an image again.

It was booting fine (I could hear Windows 7's startup sounds) but... no image at all.

At one point I heard the fans speed up really quickly for a moment (in retrospect, probably the GPU fans because something in the card was overheating, most likely the GPU itself). At another point the system failed to boot and kept beeping at me (turning the PSU off, waiting a little bit and turning it on again solved this).

For a while I suspected something wrong with the electrical connection to Ground, so I removed the system from the UPS and connected it directly to the wall using a different power cable. Still got the electric shocks and still no image.

Replacing the 980 Ti with my old GTX 690 solved both problems. I suspect the graphics card somehow got dislodged when I moved the case and short-circuited, sending current directly to the chassis.

I haven't tried the 980 Ti in another PC yet, but after that sudden fan speed-up incident I suspect the card fried itself. Grrr.

Bummer about the previous system - but lucky it didn't blow anything else. Funny things can and do happen sometimes. Cards can come just a tiny bit loose, plugs can come loose, all sorts. It's always a good plan (a) never to move anything while plugged in, and (b) to check all connections after moving, if possible. External connections especially. I once had a minor fire in a case when a floppy drive's power connection came slightly loose and shorted out, causing the whole power cable to burst into flame! The stench was unbelievable for a few days, but no damage done other than a f***ed floppy drive.

So, otherwise, it's all fun and games now I take it?

Starting to 'assemble' parts for my own new system already. Cleaned out a good full-size old ATX server tower, but need to get a beefier PSU for it (it's got a 500W unit, but with the 8-core Threadripper, 64GB of RAM and all the rest, it would be pushing things, at the very least). Also got a mate's GFX card coming next week when he gets his upgrade - it's only 6 months old. Still on the list: a 500GB-1TB SSD, a 2TB HDD or two, a DVD-RW, and an ace sound card. Maybe I'll have everything together towards the end of the year.

Going to set it up as a dual boot - Win 10 Enterprise and a Linux distro. Not decided which yet; torn between Fedora, Debian, openSUSE and possibly Mint. Fedora is the favourite though. Runs like a dream in Hyper-V. But so does Debian. That's one of the problems with Linux - far too many distros, all far too similar. It would be far better to have all those developer resources working on just a couple or so distros and doing something new instead of just copying Windows/macOS. Not going to happen though.

nexter wrote:

Bummer about the previous system - but lucky it didn't blow anything else. Funny things can and do happen sometimes. Cards can come just a tiny bit loose, plugs can come loose, all sorts. It's always a good plan a) to never move anything while plugged in, and b), to check all connections after moving if possible.

I didn't move the old system while plugged in. What threw me off and made me think it was static electricity and/or a grounding problem instead of a short circuit somewhere was that the office has a carpet, and I would only get shocked *once* after touching a metal part of the case. Grabbing it again a second time would have no effect.

I would have to sit down on the chair and only then grab the case again to get shocked a second time.

Of course, hindsight is 20/20. I think it would have been OK (i.e., no permanent damage done) if I had managed to diagnose the issue in time and firmly reseat the GPU before anything else. I think the GPU chip itself got fried at the point when I heard the fan inside the case ramp up furiously while I was trying to determine whether the DVI-D connector was properly inserted - the connector must have made contact with the case, injecting lots of volts directly into the GPU.

nexter wrote:

So, otherwise, it's all fun and games now I take it?

You think? LOL

It's a lot of work and very time consuming moving everything to a new system and getting everything set up correctly.

I already overclocked it to 5GHz with 1.3V and level 7 Load Line Calibration to make it AVX-stable at that speed. Temperatures running Prime95 (with AVX enabled) stay in the low-to-mid 80s (°C), so that's fine.
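The reason a voltage bump like that hurts temperatures so much is the classic CMOS dynamic-power relation, P ∝ f·V². Here's a rough illustration of that scaling; the stock figures (roughly 4.7GHz all-core at about 1.2V for the i9-9900K) are my assumptions, not numbers from this build:

```python
# Dynamic CMOS power scales roughly with frequency times voltage squared.
# The switched-capacitance term cancels out when comparing two settings,
# so the ratio alone shows the relative power increase.

def power_ratio(f_new: float, v_new: float, f_old: float, v_old: float) -> float:
    """Relative dynamic power: (f_new/f_old) * (v_new/v_old)^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

r = power_ratio(5.0, 1.30, 4.7, 1.20)  # assumed stock: ~4.7 GHz at ~1.2 V
print(f"~{(r - 1) * 100:.0f}% more dynamic power than stock")
```

Static leakage also rises with voltage and temperature, so the real-world increase under an AVX load is, if anything, worse than this ratio suggests.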

Overclocked the 2080 Ti a bit too (ALL the new cards are extremely limited in how far you can go due to a low power-limit ceiling). With my old 980 Ti I would get about 40-45 FPS at 4K in games like The Witcher 3 and Assassin's Creed Origins, but I needed to set some graphics settings to High or Medium.

With the 2080 Ti I get 60-70 FPS with literally everything set to Ultra-High (even nVidia's HairWorks).

Currently I have the Intel Optane connected to the x16 PCIe slot that is in turn connected to the motherboard chipset and not directly to the CPU (so the graphics card runs in x16 mode), but I am considering changing that. As it is, the Optane has to share x4 bandwidth with everything else on the chipset, plus I imagine there is some added latency.

If I put it in the remaining x16 slot that is connected directly to the CPU, I could have it working at full speed (but, of course, that would make the graphics card run at x8, which might not be a biggie).
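To put numbers on the x16/x8/x4 trade-off, here are the theoretical PCIe 3.0 link bandwidths (illustrative only; protocol overhead eats a bit more in practice):

```python
# Theoretical PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b line encoding.

def pcie3_gbytes_per_s(lanes: int) -> float:
    gt_per_s = 8.0                          # PCIe 3.0 transfer rate per lane
    efficiency = 128 / 130                  # 128b/130b encoding overhead
    return lanes * gt_per_s * efficiency / 8  # bits -> bytes

print(round(pcie3_gbytes_per_s(16), 2))  # ~15.75 GB/s: GPU at x16
print(round(pcie3_gbytes_per_s(8), 2))   # ~7.88 GB/s: GPU at x8, still plenty
print(round(pcie3_gbytes_per_s(4), 2))   # ~3.94 GB/s: the chipset x4 uplink
```

Since even a 905p tops out well under 3 GB/s sequential, the chipset x4 link isn't the bottleneck on raw bandwidth - sharing it with everything else on the chipset, and the extra hop's latency, are the real concerns.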

On the other hand, the 2080 Ti is a 3-slot monster, and doing so would put the Optane flush against the video card with minimal gap - probably not a good idea temperature-wise for BOTH cards, as the Optane already runs pretty hot as it is (45°C when idle) and it would also be choking the GPU fans.

But - aha! - the C700M case comes with a graphics card mount and riser cable, originally intended to mount the graphics card vertically instead of horizontally (the classic setup), i.e., in a configuration like this (NOT my setup, just a picture from the net):

This is not something I would do since it typically increases the temperatures the GPU runs at over the standard configuration, especially in a case like this that has no vents on the side panel (solid tempered glass).

But I could use the riser cable with the Optane instead, which would allow me to connect it to the remaining CPU x16 PCIe slot *and* keep it away from the graphics card and all the heat around it. It would probably look a lot nicer than it does now too - the main drawback being that the graphics card would then be running at x8 instead of x16. Some say this only leads to a ~1% loss in most games, though, especially at 4K - might be worth it.

Had to reduce the i9 9900K overclock from 5GHz to 4.9GHz. At that speed it was stable with pretty acceptable temperatures running Prime95, Cinebench and RealBench, and I thought I had won the silicon lottery - *until* I decided to test with Prime95 Small FFT.

Temperatures immediately shot to nearly 100°C (TjMax) and Windows crashed a couple of minutes later. Try as I might with different BIOS settings, I could not get it to run Prime95 Small FFT stable *and* with acceptable temps (this despite the Noctua NH-D15, THE best air cooler money can currently buy).

The reason Prime95 Small FFT is so hard on the CPU is that 8K/12K FFTs are small enough to fit in the CPU cache - the CPU thus never needs to wait for RAM access, which means it's running internally at 100% full speed. Power consumption shoots through the roof (200W+), and so do temps, eventually leading to a crash.
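A quick sanity check on the "fits in cache" claim. Assuming roughly one double per FFT point (Prime95's actual memory layout may differ), the working sets compare to the i9-9900K's published cache sizes like this:

```python
# Working-set size of a small FFT vs. the i9-9900K's cache hierarchy.
# bytes_per_point = 8 (one double) is an assumption; Prime95's real layout
# may use somewhat more per point, but the conclusion is the same.

def fft_working_set_kb(n_points: int, bytes_per_point: int = 8) -> float:
    return n_points * bytes_per_point / 1024

L2_PER_CORE_KB = 256       # i9-9900K: 256 KB L2 per core
L3_SHARED_KB = 16 * 1024   # 16 MB shared L3

for n in (8 * 1024, 12 * 1024):
    ws = fft_working_set_kb(n)
    print(f"{n // 1024}K FFT: {ws:.0f} KB "
          f"(fits in L3: {ws <= L3_SHARED_KB})")
```

Either size sits entirely in cache, so the execution units never stall on DRAM - which is exactly why Small FFT draws so much more power than blend-style tests that touch main memory.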

Some claim that Prime95 Small FFT is an unrealistic CPU load that will never happen in the 'real world', but I prefer my PC to be 100% stable at all times. At 4.9GHz it can run Prime95 Small FFT all day long with temperatures below 95°C. High, sure, but most real-world applications won't push them anywhere near that, and, more importantly, it won't crash even if one does.

I also replaced all four original case fans with five Cooler Master MasterFan MF140R ARGB fans (adding an extra fan as exhaust at the top of the case). I couldn't believe my eyes when they got here: both the power and ARGB cords were a measly 30cm long (no wonder NONE of the photos on their site ever show how long they really are)!

Now, for *case* fans that are supposed to go into tower and big-tower cases, that is simply an unacceptable length - the cords were too short to reach ANYTHING (neither the motherboard nor the case controller), not even in Cooler Master's own C700M case. So I had to order some Noctua 4-pin fan extension cables via next-day delivery from Amazon and wait for them to arrive before I could finish installing the new fans.

Funny how I get ARGB fans and then end up making all the colors monochromatic (white). But I like it like this, and my system doesn't look like a disco from the '80s.

As for putting the Optane on the CPU PCIe slot using the riser cable, I ended up deciding against it. First, I couldn't get a definite statement on the performance improvements - not even from an Intel tech on the Intel forums - and second, I read that riser cables can lead to signal degradation, or worse if the wires inside the cable break from being bent.

It's one thing to have signal degradation when communicating with your GPU, quite another to have the same thing happen with your main system drive, especially one as fast as the Optane. Since I don't want to risk silent data corruption, I decided to leave it as it is.

Another problem I had was with the monitors - the new Turing cards no longer have DVI-D connectors, only 2x HDMI and 2x DP outputs (and one USB Type-C). My old LG 30" only has a DVI-D connection, but I did have an active DVI-D to DP adapter here (which I had never actually tested) that I could use.

Problem is, my 30" HP ZR30w only has a DP connection, and my 4K LG 43UD79, despite having both HDMI and DP inputs, can only refresh at 4K 60Hz when using the DP cable - I tried HDMI but could only get it to refresh at 30Hz.

This meant these two monitors exhausted all the DP ports on the 2080 Ti, leaving me with only the two HDMI outputs. Fortunately I also had a 24" HP LP2475w monitor here which does portrait mode (at 1200x1920) and has an HDMI port, and that's what I ended up using, as you can see in one of the photos above. Good enough to display all the system stats I want visible at all times.

Wow! That's a lot of investigation and trial and error. I guess that's part of the world of PCs when customizing at an uber level.

Just think what could have happened if you weren't as computer savvy as you are!

I did quite a bit of reading and watched reviews of the mousepad. The reviews were all over the place: some saying it's cool, some saying it's cool but too pricey, and some saying it isn't worth it.

The one negative they all pretty much had in common is the location of the USB cord (the mouse cord rubs on the pad's cord shield - I can see it in the pic too LOL) and the fact that it isn't detachable.

Do you have any issues with the pad being USB? I searched and found that they have a wireless mouse that's compatible with the pad/software. It's the Razer Mamba HyperFlux Wireless Optical Gaming Mouse.

My problem with wireless is that it relies on batteries, and eventually you forget to charge them.

The only wireless devices I use are Sennheiser headphones, but only because I like to walk around the house while listening to stuff *and* because when they're not in use you naturally put them on the headstand - which is also the charging station.

There are some new wireless mice that actually use the mousepad as a charging station (or even their power source); those are probably cool, but I've never tried them.

The durability of Razer gear, despite it being a premium brand, is also in question - it's pretty expensive, but my previous experience has not been good: at one point I was going through a Razer Naga mouse roughly once a year due to crappy button switches. Let's see how long this Mamba RGB one lasts.

Same expectation goes for the keyboard - it's a mechanical keyboard, so it's supposed to last a long time... remains to be seen whether that holds true.

And in other news, after flashing the BIOS to the most recent version I managed to get the new system stable at 5GHz (non-AVX) and 4.9GHz (AVX), using 1.33V for the CPU core, Load Line Calibration level 6 and an AVX offset of 1.

Prime95 Small FFT at 5GHz occasionally peaks cores #2 and #4 (the hottest CPU cores) all the way up to 100°C (core #1 maxes at 91°C), but on average both stay around 96-98°C and the system neither throttles nor crashes, so that's good enough for me. It's not like you run Prime95 Small FFT loads every day.

Going to focus on the RAM next - see if I can overclock it from 3200MHz to 3600MHz.
