I want to learn about computer graphics. I want to understand the theory behind rasterization, rendering, and ray casting, how a graphics processing unit works, the render pipeline, and so on, so that I can work with these things. Where should I begin?

The math and the general explanation of how things in graphics theory work are good, but the GL code it provides is outdated, at least in the version I have (mine might be the 2nd edition). There is also no mention of shaders, since at the time it was written they weren't as common as they are now, so I honestly wouldn't recommend that book.

Computer Graphics: Principles and Practice is an old and stuffy textbook covering everything from GUI design and color spaces to page description languages for printing and Silicon Graphics' hardware design circa 1988. I would not recommend it, and absolutely not for a beginner; I haven't really gotten any use out of my copy.

Personally, I would start with something old, to seek out the origins of the field. There are some old books worth reading.
For example:
"Graphics Programming Black Book" by Abrash.
"Graphics Gems" series.
Glassner's book on ray-tracing.
The aforementioned Foley book (Computer Graphics: Principles and Practice).

Don't get me wrong, I love mathematics with a passion, but most CG books teach the concepts in formal mathematical notation, which I was never properly taught in school. There is a website called Scratchapixel that teaches these same concepts, and I found most of its lessons very good for beginners. There is no escaping the mathematics, and there should be no reason to, but sometimes it is the notation that confuses. Several university pages cover computer graphics in general too, particularly from the University of Utah Computer Science Department. As for books, I like Real-Time Rendering, 3rd Edition, and I also like Computer Graphics: Principles and Practice. You do not need to go buy a lot of books, but they are excellent references, contain a great deal of information, and are (usually) thoroughly edited.

Edited by MrJoshL, 25 November 2012 - 12:51 PM.

C dominates the world of linear, procedural computing, but that world won't advance much further. The future lies in MASSIVE parallelism.