
16 Comments

"Photonic signaling is in many ways superior to electrical signaling as you can get much higher bandwidths out of an optical bus than you can out of an electrical bus, thanks to photons traveling much faster than electrical fields."

Electrical signals are photons too; all EM propagation is photons. You're just talking gigahertz vs. terahertz frequencies, which lets you build more ideal waveguides. But that's all. They are both apples. Reply

quote:Photonic signaling is in many ways superior to electrical signaling as you can get much higher bandwidths out of an optical bus than you can out of an electrical bus, thanks to photons traveling much faster than electrons.

An electrical bus doesn't signal with electrons; it signals with an electric field. Individual electrons can take several minutes to drift from one pin to another. An optical bus has a bit less latency thanks to photons traveling faster than the electric field (around 0.5 ns less over a distance of 30 cm/1 foot). It has a lot more bandwidth thanks to the lack of interference and the immense frequency of the carrier signal.

quote:however the idea of using light as an alternative to electrical signaling at the chip level is a bit more revolutionary.

I assume you mean signaling at the board level. It's not practical to use optical signaling at the chip level: the wavelength of the light used is too long for widespread use in chip-level interconnects. Reply
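To put a number on the wavelength objection above, here is a minimal sketch. The 193 THz carrier (the ~1550 nm telecom band) and the 65 nm process feature size are illustrative assumptions of mine, not figures from the comment:

```python
# Wavelength sketch: even telecom-band light has a wavelength over a
# micron, which is enormous next to nanometre-scale on-chip wiring.
# The 193 THz and 65 nm figures below are illustrative assumptions.

C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_nm(freq_hz):
    """Free-space wavelength in nanometres for a given carrier frequency."""
    return C / freq_hz * 1e9

telecom = wavelength_nm(193e12)  # ~193 THz carrier (1550 nm telecom band)
print(f"telecom-band light: {telecom:.0f} nm")
print(f"vs. a 65 nm process feature: {telecom / 65:.0f}x larger")
```

A waveguide sized for that light would dwarf the transistors it connects, which is the practical obstacle to optical chip-level interconnects.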

Actually, optical buses will have more latency than electrical buses. E&M signals propagate in copper wires at roughly 0.9c, while light in fibre-optic cable propagates at roughly 0.6c.

Furthermore, there will be inherent latency in converting from an optical signal back to an electrical one. What you get with optical interconnects is both increased bandwidth and increased latency.

This may be offset by the ability of serial interfaces to easily revert back to parallel interfaces when coupled with optical rather than electrical signaling (it is easy to multiplex optical signals and essentially impossible for electrical ones, which meant routing was a nightmare for parallel electrical buses as bandwidth needs increased, and will make routing simple for optical signals). Reply
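For a rough sense of scale, the velocity fractions quoted above (~0.9c for copper, ~0.6c for fibre) can be plugged into a quick one-way delay calculation. This is a sketch using those comment-supplied fractions, not measured link numbers:

```python
# Toy comparison of one-way propagation delay for electrical vs. optical
# links, using the velocity fractions quoted in the comment above
# (~0.9c for a copper trace, ~0.6c for fibre); real values vary by medium.

C = 299_792_458.0  # speed of light in vacuum, m/s

def propagation_delay_ns(length_m, velocity_fraction):
    """One-way delay in nanoseconds for a link of the given length."""
    return length_m / (velocity_fraction * C) * 1e9

for length in (0.3, 1.0):  # 30 cm board trace, 1 m cable
    copper = propagation_delay_ns(length, 0.9)
    fibre = propagation_delay_ns(length, 0.6)
    print(f"{length} m: copper {copper:.2f} ns, fibre {fibre:.2f} ns, "
          f"fibre slower by {fibre - copper:.2f} ns")
```

At board-scale distances the difference is a fraction of a nanosecond either way, which is why the argument for optical interconnects rests on bandwidth rather than latency.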

It's all the same: all that matters for the "speed" of the traveling signal is the characteristic impedance of the waveguide. Signaling bandwidth has to do with how many wiggles you can put into the EM field, which is just the frequency. The reason we jump to optical frequencies after GHz is simply that there is no easy way to generate the intermediate frequencies. Reply

The point of silicon photonics is to get light down to the chip level interconnects. It's obviously not there yet. Board level signaling should already be possible, though not generally practical, and of course system level signaling has been optical for a while (server to server fibre optic connections). Reply

I'm already looking forward to Core 2 Quad. The news that DDR3 support is also right around the corner makes me want to wait a little longer for DDR3, the added bandwidth, and a 1:1 FSB-to-memory clock ratio. Reply

While actually having a highly multi-threaded game will be nice, until much more specific performance info is available I see little reason why "gamers [will] start thinking about the move to dual/quad core if they haven't already."

Several problems stand in the way of quad core being ideal for gaming:

1) AnandTech itself reported how difficult it was for a game developer to make their games truly multithreaded, so it remains to be seen how many games will actually have the degree of threading present in Alan Wake.

2) The publisher only quoted actual performance use for the physics thread (80% of a stock-clocked Core 2 Duo), so it's VERY possible that, if the other threads use substantially less processing power, the four cores may not truly be needed.

3) Even in this game, the publisher itself admits that simply overclocking the dual-core machine allows for equal performance compared to the stock quad core, so raw power seems to be what truly matters, not a specific number of cores.

4) The most important performance increase will always be from one core to two, since the overhead of the OS, antivirus program, etc. can be moved off the primary game thread. After that, each additional core will suffer from diminishing returns.

5) Most importantly, going from dual to quad core costs money, and if the past is any indication, that money would be better spent on a GPU upgrade. In fact, with the near-theoretical doubling of performance from dual core over single, the GPU upgrade is probably a relatively better idea than it's ever been.

Until these issues are resolved, I think that while the game may be impressive, the idea that gamers will want to jump to quad core is mostly marketing-driven. Reply
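Point 4's diminishing-returns argument is essentially Amdahl's law. A minimal sketch, assuming a hypothetical 60% parallelizable workload (an illustrative figure, not a measurement from Alan Wake):

```python
# Amdahl's-law sketch of point 4: speedup from extra cores flattens out.
# The 60% parallel fraction is an assumed illustrative figure, not a
# measured game workload.

def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup when only part of the work can be parallelized."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (1, 2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(0.6, cores):.2f}x")
```

Under that assumption, going from one core to two gains about 43%, while going from four cores to eight gains only about 16% more, which is the shape of the diminishing returns the comment describes.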

I find it disturbing that there isn't much comparison of that same Alan Wake demo on a Core 2 Duo. I do realize that there is a push to highlight the benefits of a quad-core processor, but are they now narrowing the window for dual core being the sweet spot to, say, just 2007?

See, inadvertently, they are likely smashing the sales of Core 2 Duos. A lot of folks are wondering if a Core 2 Duo is going to be "enough", and for how long it will be enough. Intel has built a lot of "transition" chips over the years, and they have often had relatively short useful lives. Is that the reason the Core 2 Duo came out so reasonably priced? Is it going to be a small one-year chip that is outdated, and potentially badly so, by the end of 2007? If so, why buy one?

Some assurance would have helped a lot if we'd seen a comparison of how Alan Wake and the benchmark quad-processing programs ran on Core 2 Duos. As it is, a lot of folks are getting nervous that they have bought, or might be buying, into a dead technology of "just" a dual core.

I sure hope Intel addresses this market concern before IDF concludes. If not, the holidays could be a very rough season for Core 2 Duo sales.

It's IDF and Intel is pimping new technology. I would say it's pretty reasonable to assume that quad cores are not at all required. Stating that HyperThreading can run the game with lower detail is also sort of funny, as HT only gives about 10% more performance. Let's see... Athlon 64 single core 4000+ is about 20-30% faster than the best HyperThreading Pentium 4 chips, but HT is enough while a fast single core is not? I don't really buy it, although "enough" seems to be a stretch at best. I will wager that Remedy will work hard to make sure the game at least runs on single core setups, as that is still a very large market segment.

Another thought: audio often uses maybe 10-20% of the CPU time in a game. So physics + audio is one core. The streaming and terrain tessellation sound like maybe half a core at best, and the rendering would probably use the rest of the available power and then some. Remember that Xenon only has 3 cores available, all without out-of-order (OoO) execution, so it's reasonable to assume 3 OoO cores will be more than enough, and in fact 2 cores will probably be fine. Reply
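The back-of-the-envelope budget above can be tallied directly. The fractions below are the comment's own rough guesses plus an assumed full core for rendering; none of them are measured figures:

```python
# Rough core budget from the estimates above, in fractions of one core.
# These are the comment's guesses; "rendering" at 1.0 is my assumption.

budget = {
    "physics": 0.8,      # "80% of a standard clocked C2Duo"
    "audio": 0.15,       # "maybe 10-20% of the CPU time"
    "streaming": 0.5,    # streaming + tessellation, "half a core at best"
    "rendering": 1.0,    # assumed: roughly a full core ("and then some")
}

total = sum(budget.values())
print(f"total: {total:.2f} cores, headroom on 3 cores: {3 - total:.2f}")
```

Under these guesses the workload sums to well under three cores, which is consistent with the comment's conclusion that three OoO cores would be more than enough.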

"It's IDF and Intel is pimping new technology. I would say it's pretty reasonable to assume that quad cores are not at all required. Stating that HyperThreading can run the game with lower detail is also sort of funny, as HT only gives about 10% more performance. Let's see... Athlon 64 single core 4000+ is about 20-30% faster than the best HyperThreading Pentium 4 chips, but HT is enough while a fast single core is not? I don't really buy it, although "enough" seems to be a stretch at best. I will wager that Remedy will work hard to make sure the game at least runs on single core setups, as that is still a very large market segment."

This is what I think too; they would dig their own grave if they released a game that performed pitifully on a 2.6 GHz single-core system. A lot of people still game on those HP and Dell systems their parents bought them (sometimes with an upgraded GPU).

There's no way they can convince me that all 4 cores are needed for any game. I can see 2, because that's becoming mainstream, but to commit suicide by relegating users to abysmal performance is bad news. Reply

What say you when AMD's inverted hyper-threading is available, the one that runs one thread on multiple cores? The reason game developers have to start programming multithreaded games is that in the not-too-distant future all threads (including video) will run on multi-core CPUs. Reply

It's been stated numerous times that "inverse" or "reverse" hyper-threading is nothing more than a myth. No one even knows how it got started, but it may have been a misunderstanding of AMD's Dual Core Optimization utility.
Reply

"Reverse hyperthreading" is just not practical, nor would it give much of a boost in performance. It would take some major communication between cores during instruction fetch, instruction issue, and instruction retire.

Why wouldn't it give much of a boost? As it is, with 4-issue processors like the Core 2 Duo, it is very hard to find a window of instructions in a program that the processor can issue in parallel. Even now, some execution units sit empty during most clock cycles. So what would be the point of trying to do even more in parallel? You would just end up with more empty execution units, but now on 2 or more cores instead of just one.

This is why multithreading on processors was invented in the first place. It was to fill up empty execution units with work from other threads, which is inherently parallel and doesn't have dependencies. Reply
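A toy model of the point above: a 4-wide core can issue at most four instructions per cycle, but a single thread's dependency chains limit how many independent instructions are ready each cycle, while a second thread brings its own independent work. The 2-ready-instructions-per-cycle figure is an illustrative assumption, not a measured IPC:

```python
# Crude illustration of why SMT fills empty issue slots: a single thread
# with dependency chains can't keep a 4-wide core busy, but a second,
# independent thread adds ready work. "2 ready per cycle" is an assumption.

WIDTH = 4  # issue slots per cycle on a 4-issue core

def slots_filled(ready_per_cycle, cycles, width=WIDTH):
    """Total issue slots used when a fixed number of independent
    instructions are ready each cycle (capped at the issue width)."""
    return sum(min(ready_per_cycle, width) for _ in range(cycles))

cycles = 100
one_thread = slots_filled(2, cycles)       # one thread averages 2 ready ops
two_threads = slots_filled(2 + 2, cycles)  # a second thread adds its own 2
print(f"1 thread: {one_thread}/{cycles * WIDTH} slots used")
print(f"2 threads: {two_threads}/{cycles * WIDTH} slots used")
```

In this sketch one thread uses half the issue slots and two threads fill them all, which is the utilization gap SMT was invented to close.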

so like i was excited about the laser things. i cant wait till i can put those chips inside my mutated seabass (with attached laser also on their head) cybornetic brain. i will have frikin lasers galore! Reply