
You should never trust anything without authoritative references, and with any user generated content you should always be able to discriminate between biased information...but either way, my point was that the information on most straight forward technical subjects is always two clicks away.

-sp0nge

Did curiosity really kill the cat, or is that just what they want you to think?

Although it is correct, I never trust Wikipedia, mainly because anyone can get on there and write up information (true or false) on a subject.

Anyone can write a book and publish it. Does that make the written word more trustworthy? At least with Wikipedia there's some sort of peer review. And as sp0nge said, check and verify with other sources.

Oliver's Law:
Experience is something you don't get until just after you need it.


Yes, anyone can write and publish a book, and anyone can create a website and put false information on it. I was merely saying that with Wikipedia you don't have to go through the trouble of creating a website or getting a book published... it is probably one of the easiest ways to put false information on the internet.

OK, this is how I understand it. For example, a PS3 has seven 3.2 GHz processors. All of these processors have specific functions within certain parameters, e.g. physics calculations, allowing for more realistic actions such as a leaf blowing in the wind or a sniper bullet (Call of Duty 4) having a trajectory based on many variables.

Now, though the processors would total 22.4 GHz, they actually run faster than that because they each perform specialized functions. Now the tricky part: they do theoretically run faster than 22.4 GHz, but in actuality they will never operate at optimal speed. Because of slight variations in clock speed (by minute amounts), they don't all run exactly the same; therefore, each processor is always waiting for the others to finish calculations before it can move on to the next equation. So the total is in fact 22.4 GHz, but the specs are given per processor (seven 3.2 GHz processors). Hope this answers your question.

The fact is, no, it is not even close to effectively being 22+ GHz. The confidence of your assertion honestly makes me question your capacity for rational thought.

Simply having 2 cores on a normal computer doesn't give you the effective speed of both. If that were the case, why aren't 8-core servers more revolutionary? Why wasn't Itanium a bigger hit? Why doesn't everyone use Sun hardware? Having two cores in my 'relaxation' desktop doesn't give me the sum of their speeds; it lets me run two separate applications at almost the speed of one. Because of the bus and hardware systems involved, there is actually a slight performance decrease in some cases. Check out Tom's Hardware (tomshardware.com) to see this in action with AMD dual vs. single cores.
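A rough way to quantify why cores don't just add up (the thread doesn't name it, but this is the standard formalization) is Amdahl's law: if only a fraction p of a workload can run in parallel, N cores give at most a speedup of 1/((1-p) + p/N). A minimal sketch:

```python
# Amdahl's law: best-case speedup from N cores when only a
# fraction p of the workload can actually run in parallel.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a workload that is 90% parallel falls well short of 7x on 7 cores,
# and a 50% parallel workload barely improves at all.
print(amdahl_speedup(0.90, 7))
print(amdahl_speedup(0.50, 7))
```

This is exactly why seven 3.2 GHz cores are nowhere near "a 22.4 GHz processor": the serial fraction, plus the synchronization waits described above, caps the speedup regardless of how many cores you add.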

Especially in the case of the PS3 we see another phenomenon: coding practice not keeping up with hardware creation. Programmers simply cannot harness multiple CPUs well at present (as a general observation of average practice). This is even more the case with the Cell processor architecture. For Itanium or SPARC-based systems, developers use different methods entirely to take advantage of multiple cores or processors. Think back to the `prefork` MPM for Apache versus using threading with `worker` or any of the newer variants in Apache 2.2+.

In summary: no, you definitely do not get anything close to the effective sum of the clock speeds.