Tech|Inferno News, Reviews & Guides

Even as Microsoft's new CEO, Satya Nadella, tries to steer the company back into relevance by unifying its notebook, desktop, tablet and smartphone platforms under the "One Microsoft" structure devised by his predecessor Steve Ballmer, most consumers and developers don't seem to care, and the company appears destined to fade into obscurity.

While Microsoft currently dominates the desktop market, its newest and greatest hope for achieving the "One Microsoft" vision is Windows 10, and at the end of 2015 it had barely climbed to a 10% share despite a strong marketing push that included giving it away for free. In fact, Windows 7 still retains 55% of the desktop OS market, with consumers and developers alike perfectly content to stay where they are.

According to Forbes, Microsoft is now changing tactics by attempting to scare consumers into upgrading to Windows 10, telling them Windows 7 has potentially serious security risks and hardware compatibility issues. Speaking on Windows Weekly, Microsoft marketing chief Chris Capossela said that users who continue with Windows 7 do so "at your own risk, at your own peril". Forbes' Gordon Kelly notes that Microsoft's statements about Windows 7 amount to "complete rubbish": Windows 7 will be supported until 2020, and with its greater market share versus Windows 10 it is guaranteed to receive more developer attention, including security patches and driver updates. Microsoft's motives are fairly transparent: it has a stated goal of one billion devices running Windows 10 within 2-3 years of its release, and with Windows 10 adoption seemingly faltering, it is getting desperate.

To make things worse, Vox has an article with a very interesting graph created by Joshua Kunst that illustrates how Microsoft has lost significant ground among developers since 2008. Kunst built the graph by tracking popular tags on Stack Overflow, a popular Q&A site where many developers hang out and answer programming questions.

Looking at this chart, which goes back to Stack Overflow's founding in 2008, it becomes clear that Microsoft-backed programming languages and technologies have declined: C#, .NET, ASP.NET and SQL all lost ground, while competing alternatives such as PHP, MySQL and JavaScript gained significantly. Java, for its part, owes some of its staying power to Android, which uses it as the main language for Android apps.
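Kunst's approach can be sketched in a few lines: count questions per tag for each year, then compare each tag's share of that year's total. A minimal illustration (the tag names are real Stack Overflow tags, but the counts below are invented for demonstration):

```python
# Hypothetical yearly question counts per tag -- illustrative numbers,
# NOT the actual Stack Overflow data behind Kunst's chart.
counts = {
    2008: {"c#": 12000, "javascript": 6000, "php": 5000},
    2015: {"c#": 90000, "javascript": 160000, "php": 95000},
}

def tag_share(year_counts):
    """Return each tag's share of that year's total question count."""
    total = sum(year_counts.values())
    return {tag: n / total for tag, n in year_counts.items()}

share_2008 = tag_share(counts[2008])
share_2015 = tag_share(counts[2015])

# A shrinking share over time signals a platform losing developer mindshare.
csharp_decline = share_2008["c#"] - share_2015["c#"]
```

With real data, the per-tag counts would come from Stack Overflow's data dump or API rather than a hand-written dictionary.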

In the gaming market, the Xbox One started off with a flawed strategy of forcing users to purchase the Kinect sensor bundled with the console, while competitor Sony produced a more powerful system without similar bundling restrictions. Microsoft eventually backed off, but it seems to be too late: Sony's PS4 now holds a dominant lead over the Xbox One despite Microsoft's best efforts.

(image credit: Ars Technica)

So will Microsoft be able to turn its misfortunes around? If Windows 10 sees a drastic turnaround in 2016 with more developers getting on board, it's possible, but as of right now the future doesn't seem too bright.

If you're into e-sports then you've probably heard of MLG (Major League Gaming), as they were pretty big in the scene and held tournaments for StarCraft 2, Call of Duty and others. It turns out that MLG had run up substantial debt and needed a way out, so on Dec. 21st MLG's Board of Directors approved an Asset Purchase Agreement with Activision Blizzard for $46 million.

Of course the stockholders had no idea this was happening, and it seems most of them will be left with next to nothing after MLG's debts are paid off. Esports Observer spoke with one of the affected stockholders, who said, "I got fucked on stock". In fact, stockholders were not informed of the sale until the day after the purchase agreement was approved.

Nobody knows what Activision Blizzard intends to do with the purchase; since the company is already close with ESL, starting a competing eSports tournament organization may not make much sense.

Although internet gaming addiction isn't officially a disorder in the DSM-5, it is mentioned in Section III of the DSM-5 as follows:

The subject of gaming addiction is nothing new; studies on the NIH's website go back several years.

Popular Science has an interesting article about a recent internet gaming addiction study published in the journal Addiction Biology. The study took MRI (magnetic resonance imaging) scans of 78 teenage boys who were diagnosed with Internet Gaming Disorder (as noted above, not an official diagnosis in the DSM-5) and compared them against 73 control subjects who did not have the disorder.

What they found is that the teens with the gaming addiction had formed stronger connections between certain parts of the brain, such as the dorsal anterior cingulate cortex and bilateral insulae, which allow them to react more quickly to certain events (useful in twitch shooters like CS:GO).

Conversely, however, the researchers also found diminished impulse control in the subjects, attributed to an unusually strong connection between the dorsolateral prefrontal cortex and the temporoparietal junction, a pattern also found in patients with Down syndrome, schizophrenia and autism.

With the prevalence of game streaming on Twitch and the draw of tournament prize pools worth millions of dollars, the number of teens and adults drawn to long hours of gaming will almost certainly grow, leaving more people susceptible to these possibly deleterious changes to their physiology.

(photo credit: extremetech)

However, before any definite conclusions can be drawn about internet gaming addiction, many other causal or contributing factors would need to be accounted for, such as people with preexisting disorders being naturally drawn to video games. Additionally, female teens and adults addicted to gaming warrant further study for comparison against their male counterparts.

Ian Murdock, founder of the Debian project and former CTO of the Linux Foundation, has passed away at the young age of 42. Although no official cause of death has been given, it is speculated that he committed suicide. On his now-deleted Twitter account, he spoke of an altercation with police who he claimed had assaulted him. In one particular tweet, he threatened to commit suicide to bring to light what he viewed as unchecked police brutality. Unfortunately, he used some unsavory language to convey his anger and feelings of mistreatment:

A Reddit thread of concerned users discussed his Twitter posts, and although his account was deleted, one user managed to preserve all of his tweets via Pastebin:
7:12pm: I am a white male, make a lot money, pay a lot of money in taxes, and yet their abuse is equally doned out. DO NOT CROSS THEM!
7:08pm: This was right after the female officer ripped off my underwear.. I guess that's not considered rape if you're not a woman being raped.
7:03pm: "We're the police, we can do whatever the fuck we want.."
6:49pm: What does one have to get education wise to become a police officer.. asking for a friend.
6:42pm: The rest of my life is to fight against the police.. they are NOT friends, so don't ever ever believe otherwise.
6:41pm: The police are uneducated, evil, and sadistic. Do not trust them.
6:33pm: (2/2) They are uneducated, bitter, and and only interested in power for its own sake. Contact me imurdock@imurdock.com if you can help. -ian
6:31pm: (1/2) The rest of my life will be devoted to fighting against police abuse.. I'm white, I made $1.4 million last year,
6:07pm: i'm hoping coming from a successful white guy it will help everyone
6:06pm: i'm going to post my case on my blog.. if anyone can post it on hacker news or wherever i would apprieciate it
6:00pm: @jacksormwriter wants me dead
5:48pm: Writing up my experience for others to hopefully prevent others from police abuse then you won't hear from me again
5:45pm: where they put you in a cell with absolutely no instructions whatever aside from the spell on the floor in piss?
5:45pm: shall i post pictures for all my bruises from my against the police officers?
5:38pm: they said no
5:38pm: i asked if they had cameras
5:37pm: then followed my home from there
5:37pm: i had to have swtitches
5:36pm: then they pulled me out of my house and did it again
5:36pm: they followed me home
5:36pm: i had to go to the hospital
5:35pm: they beat the shit out of me twice, then charged me $25,000 to get out of jail for battery against THEM
5:34pm: if anyone wants to come over and see what the police did to me i would be more than happy for that
5:30pm: I'm not committing suicide today. I'll write this all up first, so the police brutality ENDEMIC in this so call free country will be known.
5:27pm: Maybe my suicide at this, you now, a successful business man, not a NIGGER, will finally bring some attention to this very serious issue.
5:25pm: My career is over now, so I'll be gone soon.
5:23pm: Quote: "We're the police, we always win."
5:22pm: I'll write more much later. They still don't have cameras on all police so I'm going to use my somewhat celebrity to hopefully stop this.
5:21pm: My bail for "assault against a police officer" are all that: $25,000.
5:20pm: Then beat me up some more.
5:20pm: I'll write more on my blog later. But the police here beat me up for knowing on my neighbor's door.. they sent me to the hospital.
5:17pm: https://t.co/I1CSCJErWf
5:14pm: watch my blog later http://ianmurdock.com
5:13pm: i'm committing suicide tonight.. do not intervene as i have many stories to tell and do not want them to die with me #debian #runnerkristy67
Whatever the reason or cause, his passing is a loss to the Linux and Debian communities.

Every avid PC FPS player has dreamed of being in the thick of the action in a more immersive way than a keyboard and mouse provides. With the promise of VR headsets like the Oculus Rift and HTC Vive, we're supposed to be taking the first steps toward that reality, although both still leave a lot to be desired for those who want to physically hold an item and manipulate it in real time.

Hexus has an article about a small group of avid FPS players and developers called UzBrainnet who want to fill that gap with their RAIL GUN controller via a Kickstarter campaign. According to UzBrainnet's biography, they're a team of engineers obsessed with popular FPS games like Counter-Strike who want to bring the feeling of shooting a real gun to the genre.

They do this by offering four separate attachments, or "units", for an airsoft gun that connect wirelessly (via a USB receiver) and can be used on nearly every platform, including PC, PS3, PS4 and Xbox One, and even in conjunction with the Oculus Rift. UzBrainnet claims that the RAIL GUN employs special "Fast Rotation" algorithms that allow you to turn 180 or 360 degrees naturally, without the awkward feeling typically associated with other motion-sensing technologies. In addition to motion sensing, because the RAIL GUN kit attaches to an actual airsoft gun that typically has its own recoil and sound, you end up with a fully immersive experience.
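UzBrainnet hasn't published how "Fast Rotation" actually works, but any controller of this kind broadly has to integrate gyroscope yaw-rate samples into an in-game view angle. A toy sketch of one plausible heuristic is below; the gain curve and threshold are pure guesses for illustration, not UzBrainnet's algorithm:

```python
def integrate_yaw(yaw_rates_dps, dt, fast_gain=1.5, threshold_dps=120.0):
    """Integrate gyro yaw-rate samples (deg/s) into a view angle (deg),
    boosting fast flicks so that a comfortable real-world wrist turn
    can cover a full 180-degree in-game rotation.  fast_gain and
    threshold_dps are hypothetical tuning parameters."""
    angle = 0.0
    for rate in yaw_rates_dps:
        gain = fast_gain if abs(rate) > threshold_dps else 1.0
        angle += rate * gain * dt
    return angle

# 0.5 s of a fast 200 deg/s flick, sampled at 100 Hz:
# boosted by fast_gain, it yields a 150-degree turn from a 100-degree motion.
angle = integrate_yaw([200.0] * 50, dt=0.01)
```

A slow pan below the threshold passes through 1:1, which is presumably what keeps fine aiming from feeling artificial.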

As for game compatibility, it purports to work with a wide assortment of FPS games, including Call of Duty, Star Wars: Battlefront, Destiny, Battlefield Hardline and many others, and can be used by nearly anyone, from FPS beginner to enthusiast.

The following is their official Kickstarter campaign video that shows off the device in action and another gameplay video of a stormtrooper playing Battlefront:

What the Kickstarter campaign doesn't mention is what kind of latency to expect from the RAIL GUN, but in the Battlefront video above it is easy to pick up on the slight lag between the person moving the gun and the on-screen action. If the campaign is successful, perhaps the development team will continue to improve on this and minimize latency, since that is a key factor in nearly any FPS game.

With 21 days left in the campaign and nearly $27,000 raised out of the $100,000 goal, it has a decent chance of succeeding. The cost of entry for one of the finished RAIL GUN units is $165.
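Whether that's actually a "decent chance" comes down to simple arithmetic on the remaining funding rate:

```python
goal = 100_000       # campaign goal in USD
raised = 27_000      # raised so far
days_left = 21

# The campaign needs roughly $3,476 per day from here on out to fund.
needed_per_day = (goal - raised) / days_left
```

Kickstarter pledges tend to cluster at the start and end of a campaign rather than arriving at a steady daily rate, so this is only a rough yardstick.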

So I didn't like that the memory on my 980m only clocked to 6.4 GHz after raising the voltage to 1.48V from 1.35V, and I wanted my memory to run even faster. I knew someone with a spare 970, so we made a deal: I'd buy the card, and if it still worked after I switched all the memory chips, he'd buy it back (for a reduced amount if it could no longer do 7 GHz, but at least 6 GHz). Long story short, he bought the card back and I got faster memory.

Both cards use GM204 chips. The 980m has one fewer CUDA core block enabled than the 970, but it has the full 256-bit memory interface and L2 cache with no 3.5GB issues, while the 970 is effectively 224-bit with 1/8th of its L2 cache disabled. Both cards have 4GB across 8 memory chips.
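The 970's well-known 3.5GB issue follows directly from that disabled L2 slice: with one of the eight L2/ROP partitions cut, only seven of the eight 512MB chips can be accessed at full speed. A quick sanity check of the arithmetic:

```python
chips = 8
chip_mb = 512  # each GDDR5 chip is 512 MB (4 Gb)

total_gb = chips * chip_mb / 1024         # 4.0 GB on both cards
fast_gb = (chips - 1) * chip_mb / 1024    # 3.5 GB full-speed segment on the 970
slow_gb = total_gb - fast_gb              # the infamous 0.5 GB slow segment
```

On the 980m all eight L2 slices are intact, so the full 4 GB sits in one uniform-speed pool.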

I strongly suspected this memory swap would work because video cards read literally nothing from a memory chip: there is no query for what the chip is or even its capacity. They write data to it and hope they can read it back. The memory manufacturer information read by programs like GPU-Z isn't even read from the memory; it's set by an on-board resistor. I had also changed multiple memory chips in the past, so I was fairly confident I could physically do the job.

I started by switching just one chip between the two cards. This meant both cards were running a mix of memory from different manufacturers with different speed ratings but the same internal DRAM array configuration. Both cards worked. Here is a picture of the 980m with one chip switched over:

Now how did the cards react? The 980m behaved no differently. No change in max overclock. The 970 though... I expected it to be slower... but...

I didn't try 3 GHz or 4 GHz, but yeah, a HUGE clock decrease. I shrugged and kept switching memory, figuring that as long as it worked at any speed, I could sort out the issue later. Through 7/8 chips switched there was no further change in max memory clocks.

What was really fun was when I had 7/8 chips done: my GDDR5 stencil got stuck and ripped 3 pads off the final Samsung chip. Needless to say, a very long swearing spree followed. Looking at the datasheet, I found that 2 of the pads were GND and the 3rd was an active-low reset. Hoping the reset was unused, I checked the 970's side of the pad and found it was hardwired to GND, meaning the signal was unused. I also got a solder ball onto a sliver of one of the GND pads that was left, so I was effectively only missing a single GND connection.

I put the mangled 8th chip in the 980m and it worked. Net gain after all of this... a 25 MHz higher max overclock. Something was obviously missing. I figured I'd switch the memory manufacturer resistor, hoping that would do something. Clyde had found this resistor on a k5000m, and switching it from the Samsung value to the Hynix value had no effect for him. On the k5000m, the Hynix value was 35k ohms and the Samsung value 45k ohms. I searched the ENTIRE card and never found a single 35k ohm resistor. Meanwhile, the 970 also worked with all 8 chips swapped, at a paltry 2.1 GHz.

Then I got lucky. Someone with a Clevo 980m had killed his card trying to change resistor values to raise his memory voltage. His card had Samsung memory. He sent his card to me to fix, and after doing so I spent hours comparing every single resistor on our boards looking for a variation. Outside of the VRM resistors, there was just a single difference:

On his card (shown here) the boxed resistor was 20k ohms; on mine it was 15k. I scraped my resistor with a straight-edge razor (I could not find a single unused 20k resistor on any of my dead boards), raising it to 19.2k and hoping that was close enough.

And it was! Prior to this I had also raised the memory voltage a little more, from 1.48V to 1.53V. My max stable clocks before the ID resistor change were 6552 MHz; they are now 6930 MHz, a 378 MHz improvement.
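In terms of peak bandwidth over the 980m's 256-bit bus, that clock bump works out to roughly 12 GB/s (treating the quoted clocks as effective GDDR5 data rates in MT/s):

```python
def gddr5_bandwidth_gbps(effective_mtps, bus_bits=256):
    """Peak GDDR5 bandwidth in GB/s: effective data rate (MT/s)
    multiplied by the bus width in bytes."""
    return effective_mtps * (bus_bits / 8) / 1000

before = gddr5_bandwidth_gbps(6552)  # ~209.7 GB/s
after = gddr5_bandwidth_gbps(6930)   # ~221.8 GB/s
gain = after - before                # ~12.1 GB/s from the resistor mod
```

This is peak theoretical bandwidth only; realized throughput also depends on the timings the vBIOS applies.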

Here's a 3DMark 11 run at 7.5 GHz (not stable, but it still ran): http://www.3dmark.com/3dm11/10673982

Now what about the poor 2GHz 970? I found its memory ID resistor too:

Memory improved from 2.1 GHz to 6.264 GHz. Surprisingly, the memory was slower than it was on the 980m; I expected the 970's vBIOS to have looser timings built in to run the memory faster. As for why the memory was over 100 MHz slower than on the 980m: the 980m actually has better memory cooling than the 970. With the core at 61C, I read the 970's backside memory at 86C with an IR thermometer. Meanwhile, the 980m has active cooling on all memory chips, so they run cooler than the core. In addition, the 980m's memory traces are slightly shorter, which may also help.

The 980m at 6.93 GHz is still slower than the 8 GHz the 970 was capable of with the same memory. I'm not sure why. Maybe memory timings are still an issue. Maybe, since MSI never released a Hynix version of the 970, leftover timings from an older card like a 680 were used instead of the looser timings that should have been (I know system BIOSes carry tons of old, unused code forward generation after generation). I don't know, just guessing. Talking to someone who knows how this stuff works would be great. I still want 8 GHz.

Some more pics. Here's one with the 970 about to get its 3rd and 4th Hynix chips:

Here's my 980m with all memory switched to Samsung. Sorry for the blurriness:

So in summary:

1. It is possible to mix Samsung and Hynix memory, or switch entirely from one manufacturer to another, with some limitations.

2. There is a resistor on the PCB responsible for telling the GPU which memory manufacturer is connected. This affects memory timings and possibly termination, and it has a large impact on memory speed, especially for Hynix memory. The resistor value can be changed to another manufacturer's, but it is not guaranteed that the vBIOS will contain the other manufacturer's timings, and if it does, they may not be 100% correct for your replacement memory.

3. If you take a card meant for Hynix memory, you can mix in Samsung memory of the same size if it is rated faster. If the memory is the same speed, the penalty for running Samsung with Hynix timings may hurt memory clocks.

4. If you take a card meant for Samsung memory, you cannot mix in any Hynix memory without MAJOR clock speed reductions unless you also change the memory manufacturer resistor. Even then, it is not guaranteed that the vBIOS will contain the other manufacturer's timings, or that they will be 100% proper for your specific memory.

5. For Kepler cards the Samsung resistor value is 45k, and for Hynix 35k. For Maxwell cards the Samsung resistor value is 20k, and Hynix 15k.
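Those values can be collected into a small lookup table. Note that these are just the values observed in this post and in Clyde's k5000m findings, not anything published by Nvidia:

```python
# Memory-manufacturer ID ("strap") resistor values, in ohms, as
# measured in this mod and on Clyde's k5000m.
MEM_ID_RESISTOR = {
    ("kepler", "samsung"): 45_000,
    ("kepler", "hynix"): 35_000,
    ("maxwell", "samsung"): 20_000,
    ("maxwell", "hynix"): 15_000,
}

def id_resistor(arch, vendor):
    """Look up the observed ID resistor value for a GPU architecture
    and memory vendor."""
    return MEM_ID_RESISTOR[(arch.lower(), vendor.lower())]
```

The scraped 19.2k resistor in this mod was within 4% of the 20k Samsung target, which was evidently close enough for the GPU's ID sensing.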

Next up is changing the hardware ID to that of a notebook 980. Clyde also found the HWID to have an impact on the number of CUDA core blocks enabled. In about a month I can get hold of a 970m that someone is willing to let me measure resistor values on; it has the same PCB as the 980m. Does Nvidia still laser-cut the GPU core package? We will find out.

Full thread can be found here: https://www.techinferno.com/index.php?/forums/topic/9021-hardware-mod-gtx980m-hynix-to-samsung-memory-swap/#comment-134361

The first processor that can use light to communicate with the external world is the result of a combined effort between the University of California, Berkeley, MIT, and the University of Colorado, Boulder.

The new chip, described in a paper published Dec. 24 in the print issue of the journal Nature, marks the next step in the evolution of fiber optic communication technology by integrating into a microprocessor the photonic interconnects, or inputs and outputs (I/O), needed to talk to other chips.

Advantages of the new chip:
Greater bandwidth with less power (10x-50x the bandwidth of current electrical microprocessor I/O, while consuming only 1.3 watts to transmit a terabit of data per second).
Signals can be transmitted much farther without a repeater (roughly 1 m is the limit for high-speed electrical links).
Multiple wavelengths can be used at the same time to increase data transfer.
These adaptations all worked within the parameters of existing microprocessor manufacturing processes, and the researchers say it will not be difficult to optimize the components to further improve the chip's performance.
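That power figure is striking when expressed per bit: 1.3 W spread over a terabit per second comes out to 1.3 picojoules per bit.

```python
power_w = 1.3     # reported transmit power
rate_bps = 1e12   # one terabit per second

joules_per_bit = power_w / rate_bps
picojoules_per_bit = joules_per_bit * 1e12  # ~1.3 pJ/bit
```

For comparison, conventional electrical off-chip links typically land in the tens of pJ/bit range, which is where the claimed 10x-50x advantage comes from.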

The research has led to the creation of two new startup companies: Ayar Labs (formerly OptiBit), where researchers are focusing on photonic interconnects, and SiFive, which is commercializing RISC-V processors.

We got word that Green Man Gaming is offering 25% off some of the most anticipated titles of 2016 like The Division, Deus Ex: Mankind Divided, Hitman, Street Fighter V and many others. You can check out the full list of the games and the coupon code here: http://www.greenmangaming.com/most-anticipated-2016/

Personally I'll be picking up Deus Ex: Mankind Divided and Hitman from them using this coupon. This is a nice discount for the holidays so pass it on to your friends and family and start 2016 off with some of the best games at discounted prices. Now if only Steam would offer the same kind of deals on new games.

Google's technology combined with Ford's know-how in building cars could result in a self-driving vehicle available to the masses sooner than we thought.

Google has 53 test vehicles on the road in California and Texas, logging millions of miles of autonomous driving, with the goal of reducing the 33,000 annual deaths on U.S. roads.

Nevertheless, Ford is not the only automaker Google has been talking to, and meanwhile Nissan, Volvo and Mercedes are developing their own self-driving technology, with equipped vehicles expected by 2020.

Applications of this technology could find their way into taxi and car-sharing services in urban areas, as well as self-driving trucks.

One interesting open question is liability in the case of an accident, which will be hard to determine.

Samsung's new Galaxy A9, a premium midrange phone with a glass-and-metal design, will rock a 6-inch Full HD (1080p) AMOLED panel with a minimal 2.74mm bezel, as well as a 4000 mAh battery with fast charging.
The package will include:

A Snapdragon 652 (quad-core 1.8GHz ARM Cortex-A72 + quad-core 1.2GHz Cortex-A53, Adreno 510 GPU) with 3GB of RAM and 32GB of internal storage, expandable via microSD card.
It will come with a 13MP primary camera with a wide f/1.9 aperture and optical image stabilization. On the front there's an 8MP snapper behind an equally bright lens.
A fingerprint sensor inside the home button, complete with Samsung Pay support.
Wi-Fi a/b/g/n (no ac, though), Bluetooth v4.1, GPS/BeiDou positioning, and NFC.
Unfortunately, it will ship with Lollipop 5.1; pricing and availability remain to be detailed.

A little more than a year ago, NVIDIA, one of the largest graphics processing unit (GPU) companies in the world, claimed Samsung infringed on three of its core patents and asked the ITC to ban Samsung smartphones and tablets using Samsung's Exynos SoC (system on chip) and Qualcomm's Snapdragon SoC.

However, an ITC administrative law judge ruled that Samsung and Qualcomm did not infringe two of NVIDIA's patents and declared the third, which they did infringe, to be invalid. When the case went to the full ITC commission, it upheld the administrative law judge's ruling in favor of Samsung.

In turn, Samsung counter-sued NVIDIA, claiming it had violated three of Samsung's patents, specifically 6,147,385, 6,173,349 and 7,804,734, which date back to the 1990s and cover implementations of SRAM. Now an ITC administrative law judge (ALJ) has found that NVIDIA did violate those patents, and the case is set to go before the full ITC commission. NVIDIA argues that the patents Samsung used in its countersuit are outdated and no longer used in modern designs: "We look forward to seeking review by the full ITC which will decide this case several months from now." One of the three patents is set to expire in 2016.

NVIDIA, despite being the world leader in visual computing on the desktop, has not had much success replicating that dominance in mobile designs with its Tegra SoC, and has since moved on to using the technology in other products and applications, such as the Drive PX self-driving platform and its consumer SHIELD Android-based gaming box.

AMD has allegedly delayed its upcoming flagship dual-GPU card based on the Fiji silicon, dubbed "Gemini", to Q2 2016. Earlier this year, during an E3 livestream, AMD CEO Lisa Su had committed to a release around Christmas 2015. When questioned by hardware.fr about the delay, AMD claims the card has been pushed back because the HMD (head-mounted display) ecosystem isn't quite ready yet, so it opted to hold off Gemini's release until Q2 2016.

However, this does raise the question of whether Gemini will even be relevant in Q2 2016, as AMD is also scheduled to begin releasing its much-anticipated next-generation Greenland GPU during that time frame.

NVIDIA earlier today released the 361.43 WHQL driver, which comes with some long-awaited fixes covering power usage when idle at 144 Hz as well as SLI issues with Star Wars Battlefront. In addition, the driver adds support for GameWorks VR 1.1 and for Oculus VR's latest SDK.

South Korean site etnews reports that AMD's next-generation Greenland GPU, scheduled for release in Q2 2016, will be produced by both Samsung and GlobalFoundries on the 14nm FinFET LPP process. Since Samsung and GlobalFoundries share common IP for 14nm LPP, AMD will be in a position to leverage both for maximum production capacity.

TSMC, which traditionally produces GPUs for AMD and its rival NVIDIA, lost AMD's contract due to its inability to keep up with yield and supply demands.

Greenland is expected to offer 2x the energy efficiency of the current GCN architecture and is AMD's direct competitor to NVIDIA's Pascal.

Source: WCCFTech

This is yet another win for Samsung, which has managed to win back Apple's business from TSMC and will also be producing chips for Qualcomm. It will be interesting to see whether being on 14nm LPP will give AMD any advantage over NVIDIA, which reportedly will be using TSMC's 16nm FinFET+ for Pascal.

Ubisoft was supposed to hold an open beta for The Division this year but pushed it back to early 2016. However, a few Xbox One owners who pre-ordered the game, along with others who registered on Ubisoft's website, were invited to test a small section of the game. It goes without saying that Ubisoft strictly forbade any leaking of gameplay footage since it's an alpha, but one of the closed testers ignored that.

Ubisoft has been busy filing DMCA notices and getting the footage pulled from YouTube, but NeoGAF still has a few backups available for those who are interested.

I've been sitting on this build for a while, meaning to make a build log, but it's been so long I've forgotten most of it!
So here's the short and sweet version!

My previous build with this card was a water-cooled, wall-mounted Windows gaming machine. It was a great rig for a former student who had just started a new job and still had a fair bit of time on his hands... Then things got busy, and the water beast became stagnant. Long story short, I converted to Mac, mostly because I needed portability but still wanted a powerful machine that wasn't a total doorstop to carry around.

So it came to be that a GTX 970 soon found its way onto my bookshelf in a nice compact case. Just for kicks, here's my old rig!

Firstly, why did I drop the water cooling? The original plan was to make a similar wall-mounted eGPU. After some pondering, I came to the conclusion that this particular card wasn't much of an overclocker. I hadn't actually bothered unlocking the card or overclocking it a whole lot; I had planned to, but it never happened. For 5 months I had run the card on stock power and it was still maxing out games on my 2560x1080 ultrawide.

I went ahead and bought the Akitio Thunder2 off a German website, delivered to the UK within two days for a very good price!

I initially used a 120W 12V power brick to run the card. Clearly this didn't cut it, and the card would die instantly under load. I resorted to grabbing a Dell DA-2 18A power brick and things started working great!
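The numbers explain why: a brick's continuous capacity is just volts times amps, and a reference GTX 970 is rated around 145 W board power (with transient spikes above that), which a 120 W supply can't cover. A quick check, assuming the original unit was 12 V at 10 A:

```python
def brick_watts(volts, amps):
    """Maximum continuous output of a DC power brick."""
    return volts * amps

small_brick = brick_watts(12, 10)  # 120 W -- below the 970's ~145 W board power
dell_da2 = brick_watts(12, 18)     # 216 W -- comfortable headroom for load spikes
```

The ~70 W of headroom on the DA-2 is what absorbs the card's transient draw under load.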

This post by dschjin inspired me to try Noctua fans with the stock heatsink. To my surprise they worked very well and I was getting great temps under load. I could even hold my previous water-cooled overclock and it would hang around 75 degrees C.

I then proceeded to create a funky case cooling design, and two days of drilling later I ended up with this!

It looked great! But it was an awful cooling solution... absolutely useless; it wouldn't even hold stock settings before throttling...

I then decided to cut out the entire side and top panel, with the idea of finding a grill/mesh material to put in their place. I ended up going with a desktop wire magazine holder like this one:

Here it is cut out

I then cut it to size and slid it between the fans and the edge of the aluminium case. It's all a very tight fit and required a lot of effort to close while keeping everything in place.

I used some PCIe power extenders that plug into the top of the card as two six-pin power connectors. They required trimming the plastic and adding heat shrink to get the clearance:

As you can see, the sharp inside of the aluminium enclosure has already mangled the nice new heat shrink!

The fans are also just about held in place with some bits of plastic. Due to the design of the heatsink, the fans couldn't sit flush without cutting some metal tabs and bending things. The way it is now lets the fans sit tight between the mesh and the heatsink. Once the case is closed, nothing can move.

I then added a power switch with an LED (the switch contacts go to the Dell PSU and the LED goes to the existing LED pins on the Akitio motherboard).

A chopped-up 24-pin ATX connector is in there as a total bodge job. The wires are braided in pairs and simply pass through the vent holes of the card. Too easy! The wires are stiff enough that it doesn't really matter anyway.

Finally, here are a few of my favourite things!

Electrical tape to cover up sharp edges of steel.
3M VHB tape can stick anything to anything like foam tape, then come off like it was never there. I swear by this stuff!
A Sharpie to cover up dings and dents.
Wago wire-to-wire clamps; these are quicker and much more reliable than terminal blocks if you're too lazy to solder wires together. Like me!
Stick-on foam to space out bits of floating mesh grill and make a snug fit.
A mains-powered Dremel with an EZ Lock metal cutting disc. This thing makes short work of thick aluminium.

And don't forget boys and girls, always wear protection!

Software woes

Let's just say the hardware was the easy bit... I started out with a Boot Camp install of Windows 10, and the card would just about start. It seemed very unreliable; some days it would work every time, then I'd get home one day and the thing just didn't want to start...

Optimus made everything worse, though it was great when it worked.

I ended up going through several installs of Windows 10 and 8.1, even a UEFI rebuild...

Finally I gave in and resorted to OS X drivers. The automate-eGPU script is fantastic and it just works. I've been very surprised how well most of my Steam library works on OS X. I had a nice surprise the other day when I found out Thief was available for OS X, and that sold it to me. I got rid of my Windows partition and all my gaming is done in OS X now.

Overall this seems to be a great solution for portable computing while keeping the ability to run desktop graphics. I've been very pleased, and I look forward to Thunderbolt 3, where this should be natively supported!

For those interested, I did manage to get a fair bit of overclocking done within windows when I had it working, here are the results:

CPU temps:

Card info:

Over Thunderbolt:

Running this card on a Z77 desktop motherboard with an i5 3570K @ 4.2GHz gave:

If you're visiting this website regularly then chances are you're an avid PC gamer. And if there's one thing PC gamers love, it's discounted new games--something the console crowd rarely gets. Green Man Gaming has a 23% off November voucher active right now that can be used on select new titles such as Just Cause 3. I have used Green Man Gaming extensively and recommend them as a good alternative to Steam sales, and the best part is that most of their games come with a Steam key.

Mr. Fox is holding a brainstorm session in the forums, asking: what's missing in current notebooks that you would like to have added? Not your typical 4K display, but something that takes hold of the imagination and breaks the status quo. Read the rest of his post below for details and the original post link.

ORIGINAL POST BY @Mr. Fox: https://forum.techinferno.com/index.php?/forums/topic/8869-2016-and-beyond-what-are-we-missing-in-laptopsnotebooks/ (now merged with @Prema's thread into one).

After visiting with our friends at Eurocom about their quest to drive the future of computing technology, I decided to pose the question to the community to brainstorm as many ideas as possible.

I could not identify a perfect sub-forum for it, so I chose this one. It is not for one company, but all of them. Hopefully, they will be paying attention. I know that the Eurocom Team will be, and hopefully at least a couple of others that care enough to be disruptive.

As the thread title suggests, this thread is not a place to list out the features you want in a notebook/laptop if those are already available. Rather, it's a twist on your typical wish list, posing the question of what we want to see that is not yet available.

In other words, no need to mention 4K. That's already here and starting to become popular. No need to mention an eGPU contraption. We already have a couple of those available. No need to mention thin and light... we already have a plethora of options like that.

What do you want that IS NOT available?
Yes, WILD and CRAZY ideas and concepts more than features are what we want here.
Let imagination be your guide and assume all things are possible in the future.

Dare to hate the status quo.
Innovation and the future go together.
Innovation is capability to create the future.
The only stupid ideas are those you keep to yourself. After posting your ideas here, see if the company you do business with has a suggestion box email, and use it. Point them here as well. If they do not, or even if they do, feel free to use the email suggestion box that Eurocom has made available: future@eurocom.com if you would like to.

The more ideas we share, the brighter the future can be.
To set an example, I will go first...
First, I want:

X99 laptop with 5960X CPU
Delid from the OEM/Reseller
Three-way SLI
1KW AC adapter
Internal liquid cooling for CPU and GPU (please, no desk-bound colostomy bag for the GPU a la ASUS)
No more plastic laptop bodies... metal, metal and more metal in the chassis
Second, I want AMD to make a comeback and kill the monopoly that NVIDIA and Intel have on the industry.
Third, I want a real, full-fledged, fully functional OS replacement for Micro$oft Windows. Since they have lost their way and are trying to do a one-size-fits-all OS loaded with spyware, the Redmond Mafia is no longer adding value to being a PC owner. I want a new OS that can run anything made for Windows; not a half-baked Linux that only does some things well... and not something from Apple. Something fitting for the hardware mentioned under my first want.

Posts like the one highlighted below are what make up the heart and soul of Tech|Inferno. We've got enthusiasts like HotPantsHenry who take a perfectly good MSI GT60-0NC notebook with an 880m GPU and decide to take it to the next level by strapping a waterblock onto it using electrical tape...that's right, electrical tape!

Check out the rest of his thread here: https://forum.techinferno.com/index.php?/forums/topic/8923-msi-880m-overclock-help/

Original Post Content:

Here is an awful picture, but it will give you an idea of what is going on.
CPU cooling is stock. Even with a minor OC to the CPU, the temps stay pretty cool.

About Us

Tech|Inferno was formed in 2011 by a small group of enthusiasts who wanted to create a platform that would empower others to share in their passion of all things related to modern technology. To achieve this vision, Tech|Inferno invites developers, modders, enthusiasts and dreamers to come on board and share their knowledge with its large and ever growing community.