

AMD's Threadripper will bring a 16-core, 32-thread monster to the desktop

It's called Threadripper, and it will be the first consumer-focused chip with 16 cores and 32 threads of computing power. Yes, you read that right: 16 cores and 32 threads of computing power, confirming that the CPU wars will rage on at the high end.

AMD's Jim Anderson announced the high-end desktop chip during the company's financial analyst day Tuesday. Details were sparse, but the CPU has been making the rumor mills the last few weeks, though the company was expecting to announce at the upcoming Computex show.


Damn... and I remember the days when dual-core CPUs started shipping in personal computers.
I just hope, though, that it's a well-made processor and not like those mobile CPUs and SoCs, where Apple's 2-core A-series CPUs consistently manage to outperform some of their rivals' 8-core ones.

You seem to be much more experienced than me in this topic, so let me ask you this. I've always wondered: is it true that the vast majority of computing tasks and applications utilize just a single CPU core? For example, browsing the Internet. I suspect this might be the case, since the iPhone 7+ works quicker than the new Samsung Galaxy S8+, as all the comparison videos online show, even though it has less than half the CPU core count. Then again, if this were the case, why do manufacturers such as AMD, as per the topic posted by JohnC_21, keep upping the number of CPU cores, when you could have a much faster final product by focusing on better optimization instead? For example, the dual-core Intel i7 series processors.

Yes, there are a lot of tasks that utilize only a single core and a single thread, and that includes gaming (this is why the new Pentiums are so good at it), internet use, and common tasks such as browsing files.

The reason CPUs nowadays use more cores and threads is to deliver better performance, and for the most part more threads/cores does mean a better CPU: while an Intel Pentium G4560 is great at gaming, multitasking on it becomes harder, and production work is out of the question.
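To make the "more cores only help work that can be split up" point concrete, here's a quick toy sketch in Python (my own illustration, nothing from any benchmark): a CPU-bound job, summing squares, done in one go versus chunked across worker processes. The numbers and chunking are purely illustrative.

```python
# Toy example: a CPU-bound job run serially vs. split across
# worker processes. Extra cores only help because this particular
# job divides cleanly into independent chunks.
from multiprocessing import Pool
import os

def sum_squares(rng):
    """Sum i*i for i in [lo, hi) - one independent chunk of work."""
    lo, hi = rng
    return sum(i * i for i in range(lo, hi))

def serial(n):
    """Do the whole job on one core."""
    return sum_squares((0, n))

def parallel(n, workers=None):
    """Split the same job across several processes and combine."""
    workers = workers or os.cpu_count() or 1
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk takes any remainder
    with Pool(workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    n = 200_000
    assert serial(n) == parallel(n)  # same answer either way
```

A browser tab rendering a page or a game's main loop often can't be chunked like this, which is why single-thread speed still matters so much for those workloads.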

Use case is where things really matter, and you cannot base everything on benchmarks: even if the iPhone 7 works quicker than the Samsung S8, what you can do with an S8 will be far greater when it comes to production and multitasking.

Also keep in mind the S8 is still new, so optimization has not hit it yet, while the iPhone 7 has been out for a while.

There are cases where an octa-core CPU will be better than a quad- or dual-core, especially in production/workload scenarios.

But yes, there are times when it isn't as good, as seen with AMD's Ryzen 7 series. While I'd say the Ryzen 7 1700 is a great processor for content creation (which is why I got mine, as I plan on making videos here in the future), it doesn't do as well at gaming as, say, an i7 and is more on par with an i5 (which I have no issues with). Again, it goes back to optimization.

However, there is a good case for 6- and 8-core CPUs and more threads, as high workloads call for such power under the hood.

One thing I like about Ryzen 7, at least, is that I can run my games without stuttering or freezing while running stuff like Chrome or Office in the background, as the processor isn't being overloaded by just playing the game.

Dividing the workload is a big advantage over a dual-core or even a quad-core, and having more threads helps too.
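There's a classic rule of thumb for how far dividing the workload can take you: Amdahl's law, which caps the speedup from extra cores by whatever fraction of the work has to stay serial. A quick sketch (my own back-of-the-envelope helper, just to show the shape of the curve):

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores).
# Even a job that is 90% parallelizable tops out well below a
# 16x speedup on 16 cores, because the serial 10% never shrinks.
def amdahl_speedup(parallel_fraction, cores):
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A 90%-parallelizable job on 2, 4, 8, and 16 cores:
for cores in (2, 4, 8, 16):
    print(cores, "cores ->", round(amdahl_speedup(0.9, cores), 2), "x")
```

That's why doubling the core count never doubles real-world performance, and why single-thread speed still matters for the serial parts.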

Use case is something that has been debated, of course. If you're purely gaming, a quad-core should be your minimum, as games are starting to catch on to multi-core/multi-thread performance now that consoles are becoming more like PCs.

After all, both the PS4 and the Xbox One feature an 8-core AMD processor, and both do more than just game, as both offer live streaming and multitasking. That's why there is a good case for someone to consider a Ryzen 7 1700, and believe me, I will be doing some live streaming and perhaps even using multiple monitors so I can see the chatrooms while I game.

Dual-core, however, is indeed dying out; quad-core is starting to take over its market, and soon you may see quad-core Pentiums.

Got it! Thanks for the clear explanation. I guess, if I were to buy a computer/smartphone, I would go for optimisation, since I almost never multi-task - I am more of a bookmark type of guy and only open one app at a time.

Well, for buying a PC, get at the very least a dual-core, as it's still viable, even though I feel the end is near for dual-core, especially with Ryzen 3 coming out, possibly offering a quad-core for the price of an i3 and thus making dual-core far less compelling for the budget gamer.

Also keep in mind that optimizations happen all the time, so what doesn't work now may be able to use its full potential a year from now.

This is why AMD's Ryzen series is such a big deal: it has so much potential compared to anything Intel has at the same price point.

Don't take this the wrong way, Madman, but, as in so many other cases where I hear glowing reports of the Ryzens.....absolutely everybody is talking about them from the point of view of gaming... (*sigh*)

Why is it always assumed that high-end, high-power processors are only of interest to gamers....hmm? Why is it always assumed that everybody plays games? Personally, I haven't got the time of day for computer gaming; to me, it's a complete waste of time, when you could be doing something more productive. (But that's just me; I get my kicks doing other stuff..! I like to multi-task in real life, as well as in cyberspace...)

I will confess that moving from a single-core Athlon64 to a dual-core X2 made one hell of a difference to my 'workflow', such as it is; it's definitely the multi-tasking aspect that improved things out of all recognition. I'm a big believer in 'music while you work', so being able to listen to streaming audio, or my own stuff, while at the same time browsing, or programming, or packaging, or working on some of my graphics projects, has made life so much more pleasant. Especially with Puppy, since it's so lightweight in nature; it's easily as fast as my brother's rig with a quad-core, running Windows 10.

You may be right about dual-cores being on the way out; I won't argue with you on that score.....you're clearly more in touch with this stuff than most of us (including me!) All I can say is that, for my personal use-case (which proceeds at a fairly sedate pace!), the Athlon64 X2 will serve me well for a while to come yet.

I think the answer to why high-end CPUs are marketed to gamers is two-fold: lots of inveterate gamers are of the "I want the latest and greatest and I wanted it yesterday!" mindset, which makes them an easy target, and there are games that definitely take advantage of multi-core processing and threading.

Most folks who do web browsing, e-mailing, and run typical office software can get along just fine with what is now truly ancient hardware. Those activities do not require all that much computing power, while things like games or architectural design software that has to deal with 3D rendering in real time (sometimes without a separate GPU) do.

I mean, really, the highest-end Intel desktop chips are the Core i7-6950X and the Core i7-6900K; both are well over $1,000 and are not for gaming. Neither are the highest-priced Xeons, which are for servers (though I have seen people use Xeons for gaming PCs, and they are interesting, to say the least).

The same can be said about AMD's line.

For me, the most anyone should pay is the $349.99 for Intel's i7-7700K, which is still the best performer for gaming alone.

However, if someone got an i5-7600K, they would still have great gaming performance.

For gaming and production, Ryzen is certainly changing things. The Ryzen 5 1600 is actually doing really well at games, is cheaper than the i5-7600K, and for those wanting to get into production it's a nice choice.

Perhaps not as good as the Ryzen 7 1700, but a good option for anyone who wants to get into content creation without spending a lot of money.

I feel I made a good choice with the Ryzen 7 1700, as it's one heck of a powerhouse, and while pricey, it delivers on performance.

I'll go along with that summation. All I know is I upgraded to the X2 two years ago, for the grand total of £7.20p off eBay (obviously, it was several years old when I got it).....around US$9-10 at current rates of exchange. Socket 939 was one of the few that supported the single-to-dual upgrade, without needing the additional investment of a new mobo and RAM, too.

A sensible person realizes that all principles that can be expressed in a statement of finite length are oversimplified.

~ Robert Heppe

I agree with what you've said, but I'm thinking more about processors that are marketed to the consumer market (and I haven't seen the Xeon class ever fall there). My head now swims at the number of variants on the Intel i-series of processors, many of which seem virtually indistinguishable from one another from a practical standpoint for most "common" computing purposes.

I will say that with technology, like anything, there is a consumer market segment for whom a "new and improved" label, whether the thing in question is really either new or improved from a practical standpoint, is enough to get them to clamor for the latest shiny processing bauble. It actually saddens me how many people have been taken by salespeople who've grossly oversold them on hardware whose capabilities they will never even scratch the surface of for what they'll be using it for. In fact, in a lot of those cases, ancient hardware could meet their needs, but I get that people want to replace what they have every once in a while. What doesn't seem to be understood by many is that their "top end" system of 5 years ago is what's now being sold as the "student entry level" machine of today. So if they actually want or need more power, just buying something at the upper end of the lower range of current processors, or perhaps mid range at most, will blow what they had out of the water.

I always believe in fitting the tool to the task while allowing some growing room based on a realistic assessment of how much growth, if any, will be likely to occur for a given user. I'd starve if I were a computer hardware salesman because I couldn't unnecessarily up-sell and sleep at night.


The only moral argument I could think of for doing such a deal would be the 'no hassle' argument. By this I mean that for some potential buyers it might be in their best interests, whether they realise it or not, to spend more money and get one of those Intel quad-core i7 CPUs, if they are clearly the type of people who, at least in the context of the tech world, 'buy to last' or don't like technological stuff at all. They just want a fast computer they can get comfortable with and own for 5-15 years, without doing much more than reinstalling their OS from time to time and spring-cleaning their system every year.
As for a real life example, a friend's dad, who was technologically illiterate, asked me for help with choosing a new TV about 5 years ago. I deliberately pushed for them to max out their budget, knowing that they are the kind of people I talked about above, and got them a 32-inch (it's a smaller room) LED-backlit LCD Panasonic 1080p Full HD TV with an extended 15-year warranty. At the time it was pretty expensive for them at ~$500. Back then, all TV broadcasts were still 480p, so they had no realistic benefit from their new TV compared to the less costly options. Nowadays, however, they are content with their choice, as they know the ins and outs of their TV and its remote, and about half the TV broadcasts are 720p HD and half are 1080p Full HD. In their words, the image quality is 'crisp'. They should be good with this TV for another decade or so. The only downsides I can think of are that this set, compared to an average newer one, has somewhat bigger bezels at around 1 inch on each side, and that newer TVs of the same quality draw about 40-50 watts of power, whereas this one, despite also being an LED-backlit LCD, draws about 85.
And so, I think that you are entirely right, and I too would not be able to sell something I know has no real value to the buyer unless, for this type of person, it actually did.