
Very interesting read. I think he could be right: growing demands on software complexity (computer games are a great example) cause hardware to get more complex as well. Because CPUs rely more and more on multithreading and GPUs are used for GPGPU, software development becomes harder. This could reach (or perhaps has already reached) a point at which it isn't economical to achieve higher quality even though the computing power is available.

A combined CPU+GPU approach like Larrabee and AMD's Fusion (though I don't expect Fusion to be comparable to Larrabee) could be the solution on the hardware side. On the software side, I think OpenCL could provide a (partial) answer. If OpenCL becomes a widely used standard, that would be great for the Linux community, I think. Because it can run on any platform, it becomes more interesting and less complex to create portable software.

It could be fun to look back at these slides in 5-10 years' time and see whether Tim Sweeney was right.

Comment

That was sort of my thought too. In particular, I think it lends much more credence to the claims Intel has made about Larrabee and, to a lesser extent, dispels some of the random FUD people have been spreading about the CBE (Phrack had a really good write-up on that subject recently).

It might be a bit silly and idealistic, but given Intel's open-source ventures of the last couple of years, and the way they're starting to leverage them, it seems like Larrabee is almost made to get along swimmingly with Gallium. I'm hopeful that it gains some amount of mindshare, even though I rather dislike x86 (ARM is much neater (down with the CORE scum!)).

In terms of programming, it's interesting to note that while it's not all there yet, a lot of what he was talking about reminded me strongly of Walter Bright's D programming language. I'd love to see that take off.

Comment

Another point, whose validity should still be checked with GPU designers, is a suspicion that today's GPUs are actually fail-fast, and that's why things go so badly wrong: they can't handle unusual situations, they simply lock up. I'm not sure I'd want to use such a processor as a general-purpose one.