"Just over two months ago, Chrome sponsored the Pwnium browser hacking competition. We had two fantastic submissions, and successfully blocked both exploits within 24 hours of their unveiling. Today, we'd like to offer an inside look into the exploit submitted by Pinkie Pie." A work of pure art, this. Also, this is not the same person as the other PinkiePie. Also also, you didn't think I'd let a story with a headline like this go by unnoticed, did you?

You say there is no known language where this calculation would return the right result? Obviously you don't know Python or Ruby. These languages have variable-length integers, which means that you never have an integer overflow/underflow.
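That claim is easy to demonstrate. A quick Python sketch of the difference (the 32-bit helper function is mine, written just to mimic what a C `int` does):

```python
# Python ints are arbitrary-precision, so this addition never wraps.
a = 2**31 - 1          # INT_MAX for a 32-bit signed int
python_result = a + 1  # 2147483648 -- no overflow

# A C int32, by contrast, wraps around. We can simulate that by
# masking to 32 bits and reinterpreting the sign bit:
def as_int32(n):
    """Reinterpret n as a 32-bit two's-complement signed integer."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

c_result = as_int32(a + 1)  # wraps around to -2147483648

print(python_result)  # 2147483648
print(c_result)       # -2147483648
```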

I've programmed in Python. I love Python. How do you suggest Python directly instruct the GPU? I'll give you a hint: you write the extension in C.

This is the cause of my earlier lament. People like you treat it as though languages like Python and Ruby magically spawn out of nowhere without having anything to do with C.

Yes, the result is then a negative number. But given the definition of the function and its parameters, the result is "correct".

But given the PURPOSE of the function, the "correct" answer is wrong. And you'll end up with the same problem of incorrectly addressing the buffer's contents.

So it is a LOGIC error. The formula is WRONG. Languages cannot fix wrong formulae, which is the heart of the problem with the function.
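To illustrate with a toy Python function (the offset formula here is invented for the example): Python computes the arithmetic exactly, no wraparound, yet a wrong formula still reads the wrong bytes, because a negative index silently wraps to the end of the buffer instead of raising an error.

```python
def byte_at_offset(buf, base, delta):
    # Hypothetical offset formula with a sign bug: delta should be
    # added to base, not subtracted from it.
    offset = base - delta  # goes negative when delta > base
    return buf[offset]

buf = b"ABCDEFGH"
# base=2, delta=5 -> offset=-3. Python computed -3 exactly -- no
# overflow anywhere -- but negative indexing silently reads the
# third byte from the END of the buffer. The formula is wrong, so
# the answer is wrong, in any language.
print(byte_at_offset(buf, 2, 5))  # 70, i.e. b"F"
```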

And in Ruby/Python you don't have any buffers through which you can access arbitrary memory anyway.

Unless you write an extension in C, which you pretty much have to do if you want it to talk to the GPU.

This is why one of the earlier commenters was right. This is about the GPU. Not the language.