Are there limitations to computer generated imagery?


Hollywood special effects have reached near-photorealism, and some real-time effects in modern video games are approaching it as well. Will there eventually be a stopping point, where a computer-generated image has the same level of detail as the real-life object it is attempting to mimic? For instance, if a human character is rendered with individual hairs and pores, why stop there? Why not take it a step further and model all the way down to the cellular level? And beyond that, why not also include individual molecules, atoms, sub-atomic particles, quarks, gluons and finally strings, if they even exist? It's difficult to imagine the practicality of having that level of detail, but the implication seems to be that

a) There is no reason to believe that, as long as processing power increases, we can't render the same level of detail in a simulation as we experience in real life, even at the microscopic level, and

b) With enough processing power, we may be able to render beyond what our most powerful equipment can display to us, such as, say, rendering all of the strings making up an object, or displaying the entire roughly 93 billion light-year diameter of the observable universe simultaneously.

Again, the practicality of such rendering is debatable, but this is more a question of whether we can, rather than why we should.

Yes and no. In theory there are none, unless you could demonstrate mathematically that a Turing machine is incapable of generating a particular type of image. In practice, there's the problem of speed: an image could take a computer network so long to compute that the heat death of the universe would arrive first.

The biggest limitation in increasing the level of detail, however, is not so much computational as it is a matter of storage. If you simulate an entire house at the atomic level, you're going to need an absurd quantity of storage space to hold the data needed to render it.
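To put a rough number on that (the figures here are illustrative assumptions, not measurements: a ~100-tonne house and an average atomic mass of 20 g/mol):

```python
AVOGADRO = 6.022e23  # atoms per mole

# Hypothetical figures: a ~100-tonne house, average atomic mass ~20 g/mol.
house_mass_g = 100_000 * 1000       # 100 tonnes expressed in grams
avg_molar_mass_g = 20.0             # rough average over common elements
atoms = house_mass_g / avg_molar_mass_g * AVOGADRO

# Minimal per-atom record: three single-precision coordinates + 1-byte element ID.
bytes_per_atom = 3 * 4 + 1
total_bytes = atoms * bytes_per_atom

print(f"atoms: {atoms:.1e}")                            # ~3e30 atoms
print(f"storage: {total_bytes / 1e21:.1e} zettabytes")  # ~4e10 ZB
```

Even with this bare-minimum encoding, you'd need tens of billions of zettabytes, which is far beyond all storage ever manufactured.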

There's a new company that has been working on graphics technology that renders clouds of points instead of polygons with straight edges, and it shows more detail as you zoom in, supposedly down to the atoms of something.

Like you see a grass texture... then you zoom in and see the blades of grass... then you zoom in and see the molecules of it.

It's pretty interesting stuff, but I'd hate to have to model for technology that detailed.
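For what it's worth, engines that promise that kind of zoom typically rely on hierarchical structures such as sparse octrees, which store detail only where geometry actually exists. A toy sketch of the idea (not any particular company's method):

```python
class OctreeNode:
    """Toy sparse voxel octree: children exist only where there is geometry."""

    def __init__(self, depth=0):
        self.depth = depth
        self.children = {}  # octant index 0-7 -> OctreeNode; absent = empty space

    def insert(self, x, y, z, max_depth):
        """Insert a point with coordinates in [0, 1), refining one level per call."""
        if self.depth == max_depth:
            return
        # Pick the octant the point falls into at this level.
        idx = int(x >= 0.5) | (int(y >= 0.5) << 1) | (int(z >= 0.5) << 2)
        child = self.children.setdefault(idx, OctreeNode(self.depth + 1))
        # Rescale coordinates into the child's sub-cube and recurse.
        child.insert(x * 2 % 1, y * 2 % 1, z * 2 % 1, max_depth)

    def count_nodes(self):
        return 1 + sum(c.count_nodes() for c in self.children.values())


# One point at depth 12 costs only 13 nodes; a dense grid would need 8**12 cells.
root = OctreeNode()
root.insert(0.3, 0.7, 0.9, max_depth=12)
print(root.count_nodes(), "nodes vs", 8 ** 12, "dense cells")
```

Sparsity is what makes the zoom demos plausible; the catch, as posts below point out, is what happens when the scene is not mostly empty.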

"If you look close enough you can't tell where my nose ends and space begins"

Thus far we have no evidence that there is any sort of base unit of space. With that in mind, images could get infinitely more detailed.

Hate to burst your bubble, but a candidate for the "basic unit of space" has been identified: the Planck length, often cited as the smallest measurable scale at which anything can meaningfully be said to exist. Anything smaller would dissociate itself from classical physics, and probably be composed of little more than so-called quantum fluctuations and uncertainties. It is not an arbitrary number, but is derived, according to Wikipedia, from "the speed of light in a vacuum, Planck's constant, and the gravitational constant." Light, which seems continuous, actually comes in discrete packets (photons), another example of nature being quantized at small scales. If there is a fundamental limit to size, one could say the universe has a limit to its "resolution," if you want to think of the universe as ultimately being composed of something akin to pixels.
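The derivation from those three constants is easy to check (constant values below are standard CODATA-style figures, not taken from this thread):

```python
import math

# Standard SI constants (assumed values).
hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
G = 6.674_30e-11           # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.997_924_58e8         # speed of light in vacuum, m/s

# Planck length: l_P = sqrt(hbar * G / c^3)
planck_length = math.sqrt(hbar * G / c ** 3)
print(f"Planck length ~ {planck_length:.3e} m")  # ~1.616e-35 m
```

For comparison, that is about 20 orders of magnitude smaller than a proton, so a "universe resolution" pitched at the Planck scale is vastly finer than any atomic-level rendering discussed above.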

In theory, any image of a given resolution and colour depth can be generated simply by randomizing pixel values, even if it takes an eternity to do so, and the result may represent a scene of arbitrary complexity, beauty, horror, wisdom or evil.

The tough part is getting the image that you want, when you want it and knowing how to do it consistently. Otherwise we would not have artists and image processing.
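The combinatorics behind that first point can be made concrete: the set of all possible images at a fixed resolution and colour depth is finite, just astronomically large.

```python
import random

# Every image at a fixed resolution and colour depth is one element of a
# finite (though astronomically large) set.
width, height = 640, 480
bits_per_pixel = 24
distinct_images = (2 ** bits_per_pixel) ** (width * height)
print(f"{distinct_images.bit_length() - 1} bits of entropy per image")

# Drawing uniformly at random would, given unbounded time, eventually
# produce any of them, masterpiece and noise alike.
random_image = bytes(random.getrandbits(8) for _ in range(width * height * 3))
```

At 640x480x24-bit that is over seven million bits of entropy per image, which is why random search finds nothing useful and we still need artists.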

CGI still has a long way to go to mimic real life completely. Ironically, with each large stride in rendering technology, I notice even more details that break the illusion for me.

The main thing that bothers me now is that CGI characters are too still and graceful. Real humans twitch and jerk in very subtle ways, mostly due to tiny muscle movements or blood circulating through the body. I can't even begin to imagine how you'd model that.
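One common trick for this kind of thing is to layer low-amplitude noise onto joint rotations so a character never holds perfectly still. A sketch of that idea, with invented amplitudes and frequencies rather than measured human data:

```python
import math
import random

def idle_jitter(t, base_angle, seed=0):
    """Hypothetical sketch: layer tiny oscillations onto a joint angle so a
    character never holds perfectly still. Amplitudes and frequencies are
    invented for illustration, not measured from real humans."""
    rng = random.Random(seed)  # fixed seed -> the same set of waves every call
    angle = base_angle
    # Sum a few low-amplitude sine waves as a cheap stand-in for noise.
    for _ in range(4):
        freq = rng.uniform(0.5, 8.0)     # Hz-ish: breathing up to muscle tremor
        amp = rng.uniform(0.001, 0.01)   # radians: small fractions of a degree
        phase = rng.uniform(0.0, 2.0 * math.pi)
        angle += amp * math.sin(2.0 * math.pi * freq * t + phase)
    return angle

# Sampled over time, the joint drifts almost imperceptibly around its pose.
elbow = [idle_jitter(frame / 30.0, base_angle=1.2) for frame in range(90)]
```

Production rigs use fancier noise (e.g. layered Perlin-style curves) and tie it to breathing and weight shifts, but the principle is the same: never let a value sit perfectly constant.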

As motion capture becomes more detailed, the movement will become more believable. Real-time rendering is catching up to pre-rendered animation. We're still a good ways away from having individually rendered leaves and grains of sand that realistically interact with the environment, but the level of detail in games like God of War 3 isn't a whole lot lower than what it would look like in a movie.

I've said it before and I'll say it again. The power of computers has long since surpassed humans' abilities to effectively develop for them. Even if we speculate on the possibility of rendering to the atomic level, which at this rate will never happen in any of our lifetimes, humans will still find a way to fuck it up every time and make it look like complete shit.

I've said it before and I'll say it again. The power of computers has long since surpassed humans' abilities to effectively develop for them. Even if we speculate on the possibility of rendering to the atomic level, which at this rate will never happen in any of our lifetimes, humans will still find a way to fuck it up every time and make it look like complete shit.

Don't be so quick to doubt. Quantum computing, which involves building computers from individual atoms (something we have actually accomplished, although current devices are less powerful than pocket calculators), will really take off in the next couple of decades. Whereas standard transistors use the familiar on/off states for the transmission of data, the spin of an electron allows for superpositions of on and off, which is potentially far more powerful than binary. Imagine having two employees, only one of whom can be working at any given time (standard bits), as opposed to dozens or hundreds of employees all working simultaneously (so-called qubits, or quantum bits). That would make it a lot easier to render environments down to molecular scales and beyond.
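One standard way to quantify that claim: describing n qubits classically takes 2^n complex amplitudes, which is where both the potential advantage and the difficulty of simulating such machines come from (the qubit count and amplitude size below are illustrative):

```python
import math

# Describing n qubits classically requires 2**n complex amplitudes.
n = 20
amplitudes = 2 ** n
bytes_needed = amplitudes * 16  # one complex128 per amplitude
print(f"{n} qubits -> {amplitudes:,} amplitudes ({bytes_needed / 1e6:.0f} MB)")

# In a uniform superposition every basis state has amplitude 1/sqrt(2**n),
# and the outcome probabilities still sum to 1.
amp = 1 / math.sqrt(amplitudes)
total_probability = amplitudes * amp ** 2
print(f"total probability: {total_probability:.6f}")
```

Note the growth rate: every added qubit doubles the classical storage needed, so 50 qubits would already be petabytes of amplitudes.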

Don't be so quick to doubt. Quantum computing, which involves building computers from individual atoms (something we have actually accomplished, although current devices are less powerful than pocket calculators), will really take off in the next couple of decades. Whereas standard transistors use the familiar on/off states for the transmission of data, the spin of an electron allows for superpositions of on and off, which is potentially far more powerful than binary. Imagine having two employees, only one of whom can be working at any given time (standard bits), as opposed to dozens or hundreds of employees all working simultaneously (so-called qubits, or quantum bits). That would make it a lot easier to render environments down to molecular scales and beyond.

Considering that researchers have yet to build functioning prototypes that maintain coherence above roughly 6 qubits, I'd say that useful quantum computers are far more than a couple of decades away. Even then, a quantum computer is limited to very specific applications where it's possible to devise an algorithm that can exploit quantum effects. The most commonly cited application for quantum computing is cryptography, for example, yet there are cryptographic ciphers with no known QC vulnerability.

Basically, don't assume that quantum computing is a magic bullet that will simply make everything faster. I expect we will develop functioning quantum computers eventually (if nothing else because certain organisations with three-letter names will be more than willing to fund their development), but I don't anticipate them being general-purpose hardware like you'd have in a commodity desktop computer. They'll probably be used for very specific purposes, possibly even designed and built as custom machines.

Sounds like your typical pump'n'dump scam. For all we know, the video could be 100% pre-rendered and made with polygons all the way down.

Their website is sketchy at best: no mentions of or links to scientific papers, and at no point in the video do they even claim to have a real-time rendering engine that actually works in a fundamentally different way from traditional polygons or voxels, or that the video itself was made with a different or innovative process.

Personally, they lost me when they said something to the effect of "normally, each point you process takes some more processor time, but believe us, we got past that." Yeah. Right.

The main issue is that they basically say "Yeah, you CAN model stuff with a bazillion tiny points." The problem is how the fuck you render it: with what hardware, what software, and how much memory?

Sadly, the ability to market and pre-capitalize on the promise of an upcoming technology (which may very well be snake oil or, at best, nothing exceptional or unique) is often equated with "success" by the general public.

The "Unlimited Detail" tech is such a scam; it's funny to see so many people here who believe in it. If the technology they're pushing were actually feasible or useful, don't you think the bigger players would have gone for it already?

Although I understand all the technical reasons why this nonsense would never work, I don't care to type it all up for you. Just read this page and its sequel; most of it is true.

Notch says it's old and a scam and whatever; Carmack says it's next gen.

That just tells me you don't know what you're talking about. Both Carmack and Notch know that voxel technology is the next big step. Carmack hasn't said anything about the "unlimited detail" scam. Notch pointed out why this particular engine is a scam. Coincidentally, this engine uses voxel tech, which Notch is NOT disparaging.

I don't see why so many people say it's a scam based on the sole belief that "it's too good to be true".

We know it's a scam based on two things:

1. They hide what's actually going on behind marketing terms and avoid giving any real explanation. These are the tactics of people who want to take the money and run.

2. They're claiming to be able to do something that is impossible. Notch gives a nice summary of the arithmetic involved. Unless they're willing to give us all datacentre-sized supercomputers just to store all the voxels, this is going nowhere.
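A back-of-envelope version of that arithmetic (the figures below are assumptions in the spirit of Notch's write-up, not his exact numbers):

```python
# Storing a unique voxel colour for a modest game island at fine resolution.
voxels_per_metre = 1000            # 1 mm resolution (assumption)
island_m = (1000, 1000, 100)       # 1 km x 1 km x 100 m of volume (assumption)
voxels = island_m[0] * island_m[1] * island_m[2] * voxels_per_metre ** 3
bytes_per_voxel = 4                # RGBA, no compression
total_pb = voxels * bytes_per_voxel / 1e15
print(f"{voxels:.2e} voxels ~ {total_pb:.0f} petabytes uncompressed")
```

Hundreds of petabytes for one small island, before animation or physics, is why "unlimited" detail with unique data per point doesn't survive contact with real hardware.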

Quantum computing, which involves building computers from individual atoms (something we have actually accomplished, although current devices are less powerful than pocket calculators), will really take off in the next couple of decades. Whereas standard transistors use the familiar on/off states for the transmission of data, the spin of an electron allows for superpositions of on and off, which is potentially far more powerful than binary.

Sounds like a form of analog computer; those can be fun to home-brew and play with, though very slow.