The supreme overlord of Windows operating systems, Microsoft, has filed a patent with the U.S. Patent and Trademark Office proposing platform-level cheat detection handled by artificial intelligence, or AI for short.
The founder of the Blue Screen™ is obviously laying the foundations for proper anti-cheat detection on its console, without having to rely on game developers. Unfortunately for humans, the amount of data required is far more than mere mortals could chew through.

Microsoft's main concern is that platforms that "host third-party games may not be able to detect cheating that occurs". As a result, platforms can, and often do, reward cheating behaviour simply because they aren't aware of it.

Enter machine learning. Microsoft proposes a compartmentalised model of AI anti-cheat detection, where the first module is called the goals management module. This is just your run-of-the-mill achievements system, which establishes the general framework.

The second module is where the magic happens: the cheating detection module would track your in-game progress and use machine learning to closely study your gaming patterns. Games would feed the module gameplay information and player progress, letting the AI crunch the numbers.

As you can imagine, the amount of data involved is huge and would take an average human hours, perhaps even days, to decipher. AI, on the other hand, can pull projections of your normal gameplay, data comparisons, or just about anything else in mere seconds, which makes it the perfect candidate in Microsoft's plan.

If the AI finds discrepancies in these patterns, it flags the user as a cheater and sends them to the "enforcement module", which may sound a bit too Orwellian for our taste, but never mind. It's basically customer support, where only first offences will be handled by actual people. Repeat offenders will be killed. Just kidding, they'll either be suspended or permanently banned.
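The patent doesn't spell out the statistics, but the general idea of "finding discrepancies in patterns" can be sketched with a crude z-score test: compare one player's numbers against the population baseline and flag anything wildly outside it. This is purely illustrative; the function name, threshold, and data are made up, not from Microsoft's filing.

```python
import statistics

def flag_suspicious(player_times, population_times, z_threshold=4.0):
    """Flag a player whose average level-completion time deviates
    wildly from the population baseline (a crude z-score test)."""
    mu = statistics.mean(population_times)
    sigma = statistics.stdev(population_times)
    player_avg = statistics.mean(player_times)
    z = (player_avg - mu) / sigma
    # Impossibly fast completions (huge negative z) look like cheating.
    return z < -z_threshold

# Population baseline: most players clear the level in ~300 seconds.
population = [290, 310, 305, 295, 320, 300, 315, 285, 330, 298]
print(flag_suspicious([15, 12, 18], population))    # True: impossibly fast
print(flag_suspicious([300, 310, 295], population)) # False: normal play
```

A real system would of course use far richer features than completion time, but the principle of comparing a player against a learned baseline is the same.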

Disregarding the paranoia about AI taking over the world, Microsoft may be onto something here. After all, statistical analyses are invaluable tools when properly used, although it'll probably be a while before we see this in commercial use. I mean, the juice you'd need for this would probably reduce the Xbox One to an Xbox 360 in terms of power under the hood, so you may wanna put off building that anti-AI shelter. For now.

That's interesting, but at the same time it may be decades before it comes to fruition. Not to mention that it'd only affect games played through whatever platform is set up to support it, most likely a console or online gaming platform.

That said, statistical analysis is already the backbone of cyber security and the like, so streamlining the system with machine learning is not a new idea, even though the execution has yet to be proven. For us gamers, not much is going to change for a long, long time.

Another issue I can think of is how an AI would tell the difference between a 'cheat', or external influence on a game system, and, say, a proper mod. The two are practically the same from a coding perspective. Even using external software like CE that makes changes in real time can be considered modding. Any game that allows third-party software/coding adjustments will cause issues for a machine learning anti-cheat system.

I don't really know how software patents work, but for regular stuff you have to have a working model to file the patent (or at least it used to be that way, I'm not really sure). But I agree it'll be a while before we see a working version of this; I don't think current hardware can really handle this kind of thing yet, and publishers tend to want to make their games accessible to as many people as possible.

It was a conversation with STN that caused me to find this. I was actually looking for an article about DARPA funding some kind of hardware-level security, due to fear of China making hardware trojans, with Microsoft supposedly jumping on it. Nothing really said anti-cheat, but a hardware-level monitor would be impossible to bypass without modifying the hardware itself, so it would be one hell of an anti-cheat. It seems like DARPA might see using it that way as a security risk, though (it feels more like a military/government thing anyway, but of course a person in the military/government without it might compromise the whole thing, so it's hard to say). (EDIT: Microsoft doesn't make much hardware, so I'm not too sure how this would work out; but that may be a price thing, and with DARPA paying, its end price may not matter.)

Yeah, I have no idea how hardware-level security systems would work. Even the BIOS, which is written directly into hardware, is really tiny compared to an OS. That's because contemporary hardware is all built around software doing the work. Trying to write active software-level systems into hardware would make that hardware massive by comparison, probably with much higher power requirements as well.

The other problem with using hardware to run active systems is that no software is perfect; it needs patching and other forms of manipulation, which defeats the whole purpose of a security system. Allowing any sort of access to the hardware, especially regular access, would be like leaving an unlocked door for anyone who wanted in. That, and manipulating already-set hardware would be a nightmare.

And admittedly I am not a hardware or software kind of guy. Lol. So all of this is just off the top of my head.

Kinda my thoughts exactly, even down to how a hardware trojan would work; I mean, does it just get hardcoded onto a chip like the BIOS, or would it be hardwired like old rope programming (Wikipedia.org - Core rope memory)? And with hardware security, when an exploit is found, what do they do then, send out new chips or boards? It'd probably be more elegant than that. I just know MIT has a long history with DARPA and the US government (oh, and NASA too), and if they are working on it, well, it might actually happen.

I honestly think computer technology will eventually reach the point where the distinction between 'hardware' and 'software' becomes rather blurred. Whether it be proposed systems like 'Organic Hardware' or 'Crystal Memory', which at this point are still mostly the purview of science fiction, the end goal is to make hardware mutable in a way it isn't today.

The big fear is that silicon-based chip technology won't be able to keep advancing as it has over the last several decades. Eventually it'll reach a point where we simply cannot etch any more circuits onto a chip without atomic-level manipulation, basically atom by atom. 3D printing is being studied as a way to make such a thing possible, but it isn't a reachable goal with modern technology or techniques either.

That, and quantum computing, which is seen as the next big breakthrough in computer tech, doesn't use the strict binary system that computers around the world do. Instead of a circuit being definitely 1 or 0, a qubit can sit in a superposition of both at once, only collapsing to a definite 0 or 1 when measured. Lol. I'd consider that 'fuzzy logic', because it isn't a concrete selection between two solid states. So you'd need dedicated hardware and software to translate it properly, given how varied the input or output could be.
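For what it's worth, a qubit isn't a continuous value like 0.1243; it's a pair of amplitudes whose squared magnitudes give the odds of reading 0 or 1. Here's a toy classical model of that (nothing like real quantum hardware, and the function name is just my own):

```python
import math

def measurement_probabilities(alpha, beta):
    """Toy qubit model: amplitudes (alpha, beta) with
    |alpha|^2 + |beta|^2 == 1. Measuring collapses the qubit to
    0 or 1 with those probabilities."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalised"
    return p0, p1

# An equal superposition: a 50/50 chance of reading 0 or 1.
p0, p1 = measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
print(round(p0, 2), round(p1, 2))  # 0.5 0.5
```

The 'fuzziness' lives in the amplitudes before measurement; the readout you actually get is still a plain 0 or 1, which is why translating between quantum and classical hardware is its own problem.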

Hmm, I may have gotten off track there a bit and started to ramble. I'm definitely looking forward to what the future will bring though.

I think I went off-topic first, but it's general discussions so it should be fine.

Yeah, some kind of "adaptive hardware" seems like an inevitable conclusion of current technology, and the real way to get true learning machines. And with software having started as hardware, the next step may be mind-blowing; easily reworkable logic was mind-blowing once. I actually saw a documentary where a physicist was talking about how to grow some kind of microchips; he had funding and was actually working on it. I didn't see any grown chips, but if chips can keep growing and adapting, that would be game-changing.

While we can't make the chips much smaller at this point, they are getting a lot cheaper. So the idea of using chips in everything becomes more of a reality, which I think might have effects on the advancement of the technology that really can't be foreseen.

And Google was claiming to have a working quantum computer, but it's supposed to only do very basic arithmetic (I think it only added two small numbers).
But from what I understand (which isn't much) of quantum computing, the quantum particles can also be in all states at once. Which just sounds like a fun thing to figure out; but we (humans) figured out how to use the power of the stars, so who knows.

But one thing I really think is neat is using quantum entanglement to communicate across any distance in real time. Maybe someday this could be used to end lag, not to mention the impact on space travel and digital communications in general.

First off, the thing about quantum entanglement is that it is very energy-taxing to set up. From what I recall of experiments, you could power a modern home in the United States for a year with the electricity required to force two paired atoms into an entangled state. And even then, they only hold that state for a limited amount of time. Last I heard, they'd only managed it with a pair of captured photons.

Now, instant transmission of information over unlimited distance is a great idea, but there are several issues there. The first is that the entangled atoms have to be created together and then separated and moved, under very strict conditions, to wherever they're needed. The second issue is bandwidth. Anyone who uses the Internet can tell you that dial-up is hellishly slow today, and yet the bandwidth available from a single pair of entangled atoms is far, far worse. So it'd take hundreds if not thousands of entangled atoms, all grouped together, to achieve any sort of usable bandwidth for anything longer than a couple of numbers or letters at a time. And that's in binary coding, which is much more efficient than, say, Morse code.

Simply put, with the technology we have today, quantum entanglement is nearly impossible to get into a usable state. At the same time, though, the Mass Effect games have shown how it could possibly be used later on. The idea behind the technology is sound; we just can't produce it yet. Lol.

Now, I mentioned 'Organic Hardware' in my last post, which is still theoretical today. The basis of it is using an organic culture 'programmed' via designer DNA to grow and develop like a living brain does. The mutable prospect is that it could be used to -grow- adaptive systems for machine learning, or even true artificial intelligence systems.

The human brain is already the most advanced biological system we know of, and yet we've only tapped the bare minimum of what it can do on both a conscious and subconscious level. A single brain cell can have hundreds or even thousands of connections to its neighboring cells, and the more expansive and dense those connections, the faster the cell transmits data from point to point. This sort of structure is great for connective bandwidth, because you can run many concurrent tasks along a single set of connections, but the complexity of it is mind-blowing. We haven't been able to fully map a living brain yet, much less figure out how DNA allows such a thing to exist based on known bio-mechanical principles.

That doesn't even get into consciousness, which is basically a giant black box in modern science. This is the core concept that's given rise to puzzles such as 'souls' and whether or not 'intelligence' translates into 'sentience'. After all, a machine can be intelligent and yet not sentient on its own; machine learning can and already does work to prove that point. But the prospect of using biological means as hardware is another path forward anyway. Lol.

Crystal Memory technology, on the other hand, is similar to the crystalline-hardware growth you were mentioning. I know for a fact that there is research going on to grow microchips using silicon and crystal-like minerals such as salts. But that still comes down to how fine a level of control and manipulation we can currently achieve. It's like growing sugar crystals in candy, except on a much finer and more controlled scale. Even an atom out of place could totally wreck a growing chip.

In fact, glass manufacturing is already at this stage, counting contamination in parts per billion for use in satellites and space mirrors. The Hubble Space Telescope is a great example of the perfection required, but for computer chips it would be far worse and harder to do. A single flaw could destroy an entire chip once power is actually run through it. Semiconductors get really finicky, to boot. I think we'd need a major breakthrough in semiconductor/superconductor technology before any real progress is made on that end.

Just saw a piece ("The Real Story" -> "Star Trek" episode) where they are making the transporter. But what they're doing is isolating a single atom, then using a laser to force a different atom to take on the state and properties of the first. And they say the "copied" atom will change in exactly the same way the original does if it's changed after copying. So this may not be a true transporter but a copier; but they say it can work across any distance (power and line of sight for the laser being the only limits), and I instantly thought about communications again (that, and "The Prestige" and all those cats).