02:21 pm

Welp, the day is fast approaching. Tomorrow I'll look my program counter in the register, fetch, and execute an increment, sending the current accumulator contents to the ALU, and restoring them after execution of the instruction.

Of course, with pipelining, some of this has already happened.

Nothing special -- as far as I know -- is planned, though our weekly game night happens to fall on the date of my usual increment. (Actually, that's a lie -- I think I know one thing that's happening, but I'm trying really hard to forget it. REALLY hard). I tossed up a post in scgamenight about possible ideas for tomorrow evening.

It just so happens that today may well prove to be a red letter day in the history of computing. At the Computer History Museum, a company called D-Wave demonstrated a 16-qubit quantum processor -- the first public demonstration of a quantum processor in a commercial application, and the first public demonstration of a quantum processor of that size. They apparently maintain a blog too.

For those suffering from technobabble overload, I'll explain. A quantum processor works very differently from a normal one. A normal processor, like the one in your computer, runs a series of instructions. Some programmer wrote something that tells the computer what to do, and it blindly follows it. As a result, solving a problem means writing a series of instructions that come around to a solution. How the programmer writes them directly affects how quickly the solution is computed, and so on.

By analogy, a normal processor is like a truck driver. You give him directions and tell him that his cargo's expected in Manhattan in a week's time. He gets to Manhattan and your job's done.

A quantum processor specializes in finding optimal solutions to things, but instead of writing algorithms to tell it how to do something, you give it a series of constraints and it settles on the solution satisfying those constraints that uses the least energy. There isn't a meaningful road analogy for it -- and this only pretends to be a good analogy -- but picture a glass surface. You want to find the lowest point on it, so you put a drop of water on the surface, and it runs along by gravity until it forms a pool at the lowest point.

Very neat and very strange. It doesn't really have an application in desktop PCs yet, and the company believes it won't *replace* modern day processors, but rather work in concert with them. At the moment, with minimal modification, they can take normal programs and execute them using their quantum processor -- and did so: one matching molecules in a drug database, another solving a Sudoku puzzle, and a third handling seating arrangements. It was all pretty small scale stuff, and they mentioned a number of times that their present method isn't nearly as fast as the conventional algorithms and chips, but for a proof of concept, it's serious stuff.

What captured my mind was the future of software though. To give a (very) brief history of programming languages, the first real ones were all high level, pie-in-the-sky type stuff that forced you to express problems in a given context. LISP, for example, had you do everything in terms of lists of data. FORTRAN had you express things in math. COBOL had you express things in terms of database tables and manipulations.

What really changed things was C -- a language which accurately reflected how the hardware worked. Eventually, people used it to define their own paradigms -- C++ and Java are excellent examples of this.

I've been trying to think of the equivalent quantum programming language -- a topic no one really seems to want to tackle at the moment. The designers showed a proclivity towards SQL, which made me think of a different declarative language, arguably SQL's ancestor: Prolog.

Prolog does strange things. Essentially, you define all of your data and you define a series of rules. For example, you could have a database of flights -- departure times, arrival times, and locations. If you wanted to find how to get from LA to Portland, you'd write a rule to handle the trivial case of a direct flight:

go(X,Y) :- flight(X,Y).
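To round that out (a sketch, with a hypothetical flight/2 database of direct connections and times omitted for simplicity), the interesting part is the second rule, which recursively chains connecting flights:

```prolog
% Hypothetical facts: direct flights between cities.
flight(la, sf).
flight(sf, portland).

% Base case: a direct flight gets you there.
go(X, Y) :- flight(X, Y).

% Recursive case: take any direct flight out of X, then keep going.
go(X, Z) :- flight(X, Y), go(Y, Z).
```

Ask the interpreter `?- go(la, portland).` and it searches the database itself, succeeding via the SF connection -- you never told it *how* to find the route, only what counts as one. That's the declarative flavor that feels like a natural fit here.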

Comments:

Tomorrow I'll look my program counter in the register, fetch, and execute an increment, sending the current accumulator contents to the ALU, and restoring them after execution of the instruction.

You won't do any of that, because you are not a processor. You just can't even talk about waking up in the morning without ramming your dull second-hand technical knowledge into everybody's faces. What a shitstain.

The original poster is suggesting that you make vain references to shallow technical concepts and terms in order to impress people. I'm sure that it works well amongst your peers (who could only be common garden sheep types if they're associating with somebody with your characteristics), but it doesn't fool the rest of us.

Many people can absorb knowledge, studying over a broad subject range or focusing on a particular field, and manage not to force terminology into their sentences unnaturally. "...on a web page that they explicitly have to make an HTTP request for" is the language of a confused child. It's not indicative of your natural thought patterns, despite the impression you want to give off. It's just masturbation.

As far as it appears, you only have a cursory understanding of many of the matters you would like to claim expertise in. Your "OS", for example - I looked at the binary, and could figure the whole thing out just from the arrangement of the symbols. There was nothing there. It was an exercise in futility. The only thing it achieved was giving you an excuse to post it and gain automatic acceptance from non-discriminating people. The misguided design babble blurb you gave beforehand certainly doesn't correlate to anything you've actually done.

You will never learn. You will always assume that you have dredged the deepest depths of every lake you stick your cunting-oar into. We realise that it's futile to even acknowledge you, but we're angry people, and maybe you're good therapy.

Now, I expect you've got something to write about our inferior minds and ignorant ways, or a hostname to look up so that you can call us "silly little Brits" again, or something similarly xenophobic and arrogant. No doubt you use DVORAK because you heard it was cool, so please, flap your fingers around in a statistically optimised fashion and astound us all with your... 'personality'.

Actually -- I don't use DVORAK. I've used QWERTY for around 25 years now and it's not a habit I see breaking easily.

What you don't realize is that it's my turn to test you. Since it's *so easy* to figure out from the symbol layout (and it ought to be -- I've no interest in obfuscating it), why don't you tell me exactly what that binary does?