
158 Comments

Intel has something impressive in the works with Broadwell (at least on paper). I can't wait to get a Broadwell-based Surface Pro. Assuming Microsoft improves on an already impressive hardware design from the SP3, the Broadwell iteration will likely be my next computer purchase.

I have a feeling SP4 will be fundamentally the same design as SP3, save for minor tweaks and improvements. SP3 was clearly designed for a processor with the kind of power profile Broadwell is set to deliver rather than the current Haswell profile. It will be interesting to see which set of SKUs Microsoft puts in the SP4, Core M or Broadwell ULT. Core M has a number of obvious benefits for power and area efficiency, but will it be powerful enough for their market, with some features reduced from Haswell and Broadwell ULT?

Pure speculation, but I think Intel might already be giving MS premium bins of Haswell for SP3, because SP3 is the only device to date to actually show off the ability to run premium Intel CPUs in a tablet format. Sure, MBA looks great, but SP3 took it to the next level.

That said, I doubt that MS will use Core M in SP4, for the same reason we don't have Haswell-Y in SP3 (at least at the high end). It will probably be a step back in processing power to use one.

Do we know what wattage the Broadwell ULT and Core M chips will be targeting? A 15W TDP is clearly too high for the SP3 to handle, so moving all SP4 chips to 11.5W like the current Haswell-Y looks quite plausible at the moment; it just seems to be a matter of which version of Broadwell will have the 11.5W TDP.

15W is only a problem in SP3 for people who use it like a high-performance computer (24x7 full-load applications), but for general-purpose use it barely warms up. We have people running Lightroom 8 hours a day on these things, and like the Surface 2s (which I still have), they never get "hot" or "loud".

That said, someone in the office infected their SP3 with some malware a few weeks ago (they literally owned the tablet not even 24 hours), and when they handed it to me, it was VERY hot with the fans whirring. Some 800kb task was using 100% of their CPU doing who knows what... at first I thought it was Cryptolocker, but it turned out to be retrying a network connection. This was an i5 model, however, and it didn't seem to be throttling. The i3 will presumably run cooler, even at the same TDP.

agreed, Broadwell and Skylake will be vast improvements to PCs in general. Intel's Broadwell-Y announcement is all about "small, cool, efficient" while the recent FX-9590 seems more about "big, hot, gluttonous." In the David vs. Goliath story, the interesting part was the small one besting the big one; ironically, Intel is the bigger company here. Hopefully AMD's new A1100 pans out, as I don't want another Comcast, Microsoft, De Beers or Luxottica.

well, if amd was as aggressive as intel in shrinking dies or what have you, then an AMD FX chip would probably be toe-to-toe with an intel i7-4930k or whatever the 6-core enthusiast intel chip is labeled. and not even just die shrinks, but also aggressive in producing a $500 cpu. imagine that. and you'd probably see a10-7850k performance in a laptop by now. but AMD seems content to sit back and let the other company do all the work, creating a path. as long as AMD doesn't completely die out, it's fine. we just need an alternative and AMD is the only one. so, go AMD. don't worry about broadwell. build it and we will come. be a niche. convert future x99 users to a future AMD product. and start from there.

For one, die shrinks cost money... for fab contracts, man-hours, research and possibly buying technology from other companies such as IBM.

AMD can't aggressively shrink dies anyway; they are at the mercy of fabrication companies like TSMC and Global Foundries, so what they can produce is limited to what those fabs provide. Intel has always been ahead of the industry in fabrication; the only way AMD can beat Intel is through something groundbreaking (like moving away from silicon?) or if Intel drops the ball, like they did with NetBurst. Or AMD buys a fab company that is far ahead of Intel, which simply isn't going to happen.

Otherwise they can only compete on price, and using an older, more mature fabrication process allows them to do just that, as the chips are much cheaper to produce. They just need to provide "good enough" performance to mostly stay relevant, which the FX doesn't really do.

well, an fx-8350 is toe-to-toe with an i7-2600k, which is no slouch even today. and comparing the fx-8350 with today's i7-4770k would be a little unfair since the 4770k is 22nm while the 8350 is at 32nm. and we're not even considering software optimizations from the OS and/or programs that are probably bent towards intel chips due to their ubiquity.

so, i think you're wrong that the fx-8350 doesn't provide good enough. i have both an i7-3770k oc'd to 4.1 ghz and an fx-8320 at stock, and the amd is fine. it's more than good enough. i've ripped movies using handbrake on both systems and to me, both systems are fast. am i counting milliseconds? no. does it matter to me if the fx-8320 w/, let's say, an amd r9-290 gets 85 fps in so-and-so game and an i7-4770k w/ the same gpu gets a higher 95 fps? i don't think so. that extra 10 fps cost that intel dude $100 more. and 10 extra frames at an average of 85-95 is imperceptible. it's only when the frames drop below 60 that one notices, since most monitors are at 60 hz.

so what makes the fx not good enough for you again? are you like a brag queen? a rich man?
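The fps figures being argued over here are easier to judge in frame-time terms. A quick sketch of the arithmetic (the 85 and 95 fps averages are the commenter's hypothetical numbers, not measurements):

```python
# Convert hypothetical average frame rates into per-frame render times,
# to see how small the gap between the two CPUs actually is.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given average frame rate."""
    return 1000.0 / fps

amd_fps, intel_fps = 85.0, 95.0  # hypothetical averages from the comment above

print(f"85 fps -> {frame_time_ms(amd_fps):.2f} ms/frame")    # 11.76 ms
print(f"95 fps -> {frame_time_ms(intel_fps):.2f} ms/frame")  # 10.53 ms
delta = frame_time_ms(amd_fps) - frame_time_ms(intel_fps)
print(f"difference: {delta:.2f} ms/frame")                   # ~1.24 ms
print(f"60 Hz refresh budget: {frame_time_ms(60.0):.2f} ms") # 16.67 ms
```

The roughly 1.2 ms per-frame gap is small next to the 16.7 ms budget of a 60 Hz monitor, which is the substance of the "you only notice below 60 fps" point.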

Not fair to compare against 22nm from Intel? Bogus; I can go to the store and buy a 22nm Intel, so it should be compared against AMD's greatest. An i5-4670K matches or exceeds the performance of even the FX-9590 in all but the most embarrassingly threaded tasks while costing $50 more. The cost to operate the machine through the power bill makes up for that price difference at a fairly standard 12¢ per kWh when used heavily 2 hours per day for 4 years, or idling 8 hours per day for the same 4 years.

Your argument for gaming with the 8350 being good enough is weak too when the $10 cheaper i3-4430 keeps up. Or spend $125 less to get a Pentium G3258 AE, then mildly overclock it to again have the same good-enough gaming performance, if >60FPS is all that matters. The i3 and Pentiums are ~$70 cheaper yet when power use is counted again.
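The operating-cost claim above is easy to sanity-check. A rough sketch, where the wattage deltas are illustrative assumptions (not measured figures from either system), using the comment's 12¢/kWh rate and usage patterns:

```python
# Rough extra-electricity-cost comparison between two CPUs over several years.
# The power-draw deltas below are assumed for illustration, not measurements.

RATE = 0.12  # $/kWh, the 12c figure from the comment above

def energy_cost(watts_delta: float, hours_per_day: float, years: float) -> float:
    """Extra dollars spent on electricity for `watts_delta` more power draw."""
    kwh = watts_delta * hours_per_day * 365 * years / 1000.0
    return kwh * RATE

# Suppose the FX system draws ~150 W more under load and ~30 W more at idle
# (assumed numbers; real deltas depend on the exact build).
print(f"load, 2 h/day for 4 yr: ${energy_cost(150, 2, 4):.2f}")  # $52.56
print(f"idle, 8 h/day for 4 yr: ${energy_cost(30, 8, 4):.2f}")   # $42.05
```

Under those assumed deltas, a few years of heavy use does indeed erase a $50 purchase-price gap, which is the shape of the argument being made.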

well, if a pentium g3258 is good enuff for gaming, then so is an fx-8350. whaaaaaat? omg we know intel is king. i acknowledge and understand that. intel rules. but amd is not bad. not bad at all, is all i'm trying to say.

first off, you are assuming a lot and not bothering to check any published benchmarks out there, so:

1. the 8350 isn't even equal to a 2500 i5, let alone a 2600 i7.
2. 32nm vs. 22nm means nothing at all when comparing raw performance in a desktop. it will limit the thermal ceiling, so in a laptop the higher-nm chip will run hotter and therefore be unable to hit higher clocks, but in a desktop it means nil.
3. handbrake ripping relies on the speed of the dvd/blu-ray drive; handbrake transcoding relies on cpu performance, and the 8350 gets spanked there by a dual-core i3, not by milliseconds but by tens of seconds. against an i5 it gets to the level of minutes, an i7 more so.
4. let's say you're pulling framerates for an r9-290 out of somewhere other than the ether... reality is an i5 is faster than the 8350 in almost any benchmark i've ever seen by roughly 15% overall. in certain games with lots of ai you get crazy framerate advantages with an i5 over the 8350, things like rome total war and starcraft 2 and diablo 3 etc...

i'll just say the fx-8350 isn't good enough for me, and i'm certainly not a rich man. system build cost for what i have vs. what the 8350 system would have run was a whopping $65 difference.

#3 is B.S. a dual-core i3 can't rip faster than an fx-8350 in handbrake.

#4 the r9-290 was an example to pair a fairly high-end gpu with an fx-8350. a fairly high-end gpu helps in games. thus, pairing it with an fx-8350 will give you a good combo that is more than good enough for gaming.

#2 22nm vs. 32nm does matter in desktops. the fx-8350 is 32nm. if it goes to 22nm, the die shrink would enable the chip to either go higher in clockspeed or lower its tdp.

oh and #1--i am not saying the fx-8350 is better than the i7-2600k. i said "toe-to-toe." the i5-2500k can also beat the fx-8350 b/c of intel's IPC advantage. but i think the reason for that is programs not being made multithreaded and using the fx-8350's 8 cores to their potential. since amd trails intel in IPC performance by a lot, a 4-core i5-2500k can match it or sometimes even beat it in games. in a multithreaded environment, the 8-core fx-8350 will always beat the i5-2500k, although it might still trail the 4-core + 4 fake cores i7-2600k. just kidding. lol.

i said toe-to-toe with the 2600k, which means it's "competitive" with an i7-2600k even though the AMD is handicapped by slower IPC and most programs/OSes not being optimized for multithreading. so, to be 10-20% behind in most benchmarks against an i7-2600k is not bad, considering how programs take advantage of intel's higher IPC performance.

i'm sorry, is your argument here that the FX-8350 is better because it's inferior? because that's all i'm getting out of this. Of course a benchmark is going to take advantage of higher IPC performance. That's the point of a benchmark: to distinguish higher performance. The way you talk about benchmarks, it's as if you think benchmarks only give higher numbers because they're biased. That's not how it works. The benchmarks give the i7-2600k higher scores because it is a higher-performance part in real life, which is what anyone buying a CPU actually cares about. Not to mention the significantly higher efficiency, which is just an added benefit. Also, it's really hard to take you seriously when your posts make me think they're written by a teenage girl.

also, if the fps disparity is so huge btwn the fx-8350 and, say, an i5-2500k in games you mention like starcraft 2, then something is wrong with that game, and not the fx-8350. i actually have sc2 and i have access to a pc w/ an fx-8320, so i am going to do a test later tonight. my own pc is an i7-3770k, so i could directly compare 2 different systems. the only thing is that the amd pc has an hd5850 gpu, which should be good enuff for sc2, and my pc has a gtx680, so it's not going to be a direct comparison. but it should still give a good idea, right?

i just played starcraft 2 on a pc with fx-8320 (stock clockspeed), 8GB 1600Mhz RAM, 7200rpm HDD and an old AMD HD5850 w/ 1GB VRAM. the experience was smooth. the settings were 1080P, all things at ultra or high and antialiasing set to ON. i wasn't looking at FPS since i don't know how to do it with starcraft 2, but, the gameplay was smooth. it didn't deter my experience.

i also play this game on my own pc, which is an i7-3770k OC'd to 4.1, 16GB 1600MHz RAM, 7200rpm HDD and an Nvidia GTX680 FTW w/ 2GB VRAM, and i couldn't tell the difference as far as the smoothness of the gameplay is concerned. there are some graphical differences between the AMD GPU and the Nvidia GPU, but that is another story. my point is that my experience was just as seamless on the FX-chip pc as on my own pc with the 3770k.

to make another point, i also have this game on my macbook pro and that is where the experience of playing this game goes down. even in low settings. the MBP just can't handle it. at least the one i have with the older gt330m dGpu and dual-core w/ hyperthreading i7 mobile cpu.

so.... there.... no numbers or stats. just the experience, which is what counts to me, did not change with the pc that had the amd fx cpu.

So you must be feeling pretty darn stupid now, realizing that you never had to buy the more expensive 3770K (plus the gtx680), since from your point of view it "feels" the same as an 8350 or an 8320 with an HD5850... eh? Here's an idea, sell your 3770K/GTX680 system, and buy an FX8320/HD5850... you would still get some of your money back - if you can't do that, then at least just shut the hell up and stop deliberately spreading misinformation, you unethical hypocrite.

Global Foundries is AMD's spun-off fab, and AMD still holds a share in Global Foundries. Global Foundries is working closely with Samsung's fabs right now on a better manufacturing process; when they reach their goal in the nm race, they can compete with Intel on die shrinks.

Actually StevoLincolnite and others, you are quite confused. Using larger node sizes is not "cheaper to produce"; larger node sizes are more expensive per chip. The reason AMD (Global Foundries) does not just jump down to the next node is that it requires a great deal of capital up front (they are relatively broke) and R&D time, which they are already behind on. Intel has proven more adept at shrinking process nodes and invests more in R&D ahead of time. This allows Intel to use the new node budget to partially increase performance, partially decrease power and partially decrease cost per chip. Cost per chip is the main driver for increasing density, and depending on the generation Intel has re-balanced the performance/power/cost relationship.

Not true, assuming Intel will integrate Iris Pro into its mobile CPUs. With the HD 5200 (iGPU) you can run just about any game at 768p with at least medium settings. Here are some benchmarks for that iGPU: http://www.anandtech.com/show/6993/intel-iris-pro-... I know the TDP there is 47W, but it includes a quad-core i7 clocked at 3.6GHz which itself draws some power. Intel could improve on that design and put it into its newest CPUs, and if they do, they sure will brag about it.

Half the problem with Intel's "decelerators" is the graphics drivers; AMD's and nVidia's drivers are gold-plated in comparison. Sure, they are better these days than they used to be, but still hardly ideal.

In a tablet form factor you are going to encounter severe thermal throttling while gaming; the AnandTech article you linked doesn't have those cooling constraints. Just look at AnandTech's article on the Surface Pro 3's thermal throttling. Despite having a slightly better i5, it does worse in most heavy-use scenarios.

I think this discussion is completely irrelevant. Very few are going for a graphical gaming tablet (unless it's poker or fruit slices, etc.). That field is dominated by PlayStation or Xbox on an HDTV. Tablets and convertibles are for Internet surfing or business documents, the low end for Internet surfing only.

No, we use them for gaming, too. If you get a high-end convertible tablet, it's because a regular tablet doesn't do it and you don't need a similarly-sized laptop. This means that you'll either have a proper big gaming rig or you'll be relying on your convertible to game. I'm doing all my gaming on my Surface Pro 1, and I can play most games with my CPU at 700 MHz. (I mean, games' needs span orders of magnitude. I'll sometimes switch it to full throttle when playing a game that needs more power.)

Sure, but there's an obvious difference between gaming on a mobile device and sitting down in front of a TV. So talking about Xboxes and PlayStations is missing the point by a pretty large margin.

Original prices. Haswell is now the "mass market" chip for Intel, which means high volume, and it can afford lower prices than initially, while IVB is the low-volume chip now, so it won't get much cheaper. Watch for when Broadwell comes out; it won't be just 10 percent more expensive than Haswell.

I believe Krysto was referring to the jump in prices from IB to HW, and not SB to IB. But either way, I'll have to disagree with Krysto: prices seem to have decreased over the years, if anything. Don't forget, because of IPC gains you are paying the same for more performance. 3570 = $210, 4590 = $200.

All depends on form factor. If you want a tiny tablet, this is the only way to do it. Or do you like the Tegra K1, which draws 12W under full GPU load??? Come on, put that in a slim convertible if you can.

Gartner regularly tracks PC unit sales, and tablets are regularly cited as a factor. On a more micro level, I know several people who have reduced their PC usage over the last couple of years (and delayed PC replacements) due to their tablet use.

Actually, the tablet market as a cause of dropping PC sales is only half or less of the story. Logically speaking, most consumers find their 2-4 year old machines sufficient for their "productive" needs. Unlike previous years, PCs are living far beyond their intended years of service, and consumers are in no dire need of newer, faster components. Mind you, hardware isn't getting that much faster with each iteration (relatively speaking), and a simple SSD upgrade and/or more RAM would improve performance significantly enough that a whole hardware upgrade becomes less appealing (same for other components). It's mostly about convenience, connectivity and mobility for consumers these days (cheap mobile media consumption); that's why the tablet market "appears" to have affected PC sales. Those who find a tablet replacing their fully fledged PC didn't need a PC in the first place.

How on earth is Turbo Boost cheating? Is it cheating to include a feature that allows a CPU to do short sprints in the moments where it is needed most, in order to make a device "feel" fast (i.e. respond quickly to events generated by the user)?

I can't speak for BDW, but both HSW and Silvermont usually stay at or very close to max turbo for a _very_ long time; a Z3770 sitting inside a 10" tablet can run at 2.3-2.4GHz on all 4 cores for several minutes.

TDP isn't a marketing term; it's the wattage of heat dissipated by the processor. Wattage is a bit of a weird word for it, but watts are joules per second, and a joule is a unit of energy (roughly, the amount needed to heat a quantity of water a certain amount).

TDP is "Total Dissipated Power" and, as used in describing processors, refers to the maximum instantaneous power that the processor will dissipate. The watt, being a unit of power, is the only SI unit suitable for specifying TDP. Since nearly all of the energy expended by a processor ends up as heat, a watt is in this case essentially a measure of the rate of heat transfer, and TDP is the rate of heat transfer required when operating the processor at or near its maximum operating temperature.

@Jaybus Nope. Please don't correct people with false info. TDP is Thermal Design Power, and it is NOT "peak" power. It is used for marketing as mkozakewich said above as the definition of TDP measurement varies from company to company.

Oops! I read mkozakewich's comment backwards. It is a marketing term in a sense... It is used to design cooling systems, but it gets thrown around by marketing groups all the time since TDP limits how small a device can be.
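The watts-are-joules-per-second point in this sub-thread can be made concrete with a tiny sketch. The 15 W figure matches the ULT TDP discussed earlier in the thread; the one-hour duration is arbitrary:

```python
# A watt is a joule per second, so a CPU dissipating at a steady power
# level for some time converts a known amount of energy into heat.

def heat_joules(power_watts: float, seconds: float) -> float:
    """Energy dissipated as heat at a steady power draw."""
    return power_watts * seconds

# 15 W sustained for one hour:
j = heat_joules(15, 3600)
print(f"{j:.0f} J")            # 54000 J
print(f"{j / 3.6e6:.3f} kWh")  # 0.015 kWh
```

This is why TDP constrains device size: the cooling system has to move that heat out continuously, regardless of how the number is marketed.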

With all these reductions in power use on a per-core basis and a stagnation of clock speeds, we could very well see a quad-core i7-5770k with a 65W TDP. I hope Intel plans on bumping their mainstream high-end SKUs up to six cores; the desktop market doesn't need maximum power use that low.

Yes this worries me too. I think we will see yet another pointless desktop release, with hardly any improvements (once you consider the bad overclocking).

I still don't see a desktop Broadwell replacing my 2500k, which runs low(ish) heat at 4.3GHz. What I would love to see is a 5.5GHz Broadwell monster; otherwise I will be skipping Broadwell, just like Haswell and IB before it.

And someone in your position also of course has the double irony of still having more performance headroom if they're willing to explore better cooling, etc. An H80i would let your 2500K run at 5GHz no problem, assuming your particular sample isn't naturally limited in some way (actually, even an old TRUE and two typical fans would be fine, something I can confirm as I've done it with several different 2700Ks, never mind a 2500K, so I know the heat isn't an issue). Thus, except for threaded workloads, an upgrade would have to be a lot better than what you already have in order to be worth bothering with, unless you wanted something new due to newer tech such as PCI Express, M.2, etc.

Anyway, this is why IB was so bad with its default metal cap setup, ie. Intel made SB too good, which resulted in lots of 2500K owners just not seeing the point in upgrading. Thing is, they've still not boosted performance enough to make upgrading worthwhile for most SB users (the exception of course being non-K owners).

My 2700K runs at 5.0. Takes me 3 mins to set this up on an ASUS M4E, ludicrously easy to do. HW at 4.5 is about the same or a bit quicker, but much more complex to do, given how many samples won't go that high, and for gaming it makes no difference at all. I suppose the 4790K can help on the oc front a bit, but the cost has put me off so far, used 2700Ks are so much cheaper. I want to build a 4K gaming system next, but that'll be X99 instead I think (don't see the point of mainstream chipsets for 4K gaming, surely with multiple GPUs the limited PCIe lanes is going to hold back performance eventually).

I wish I had your luck with SB. My 2500k will only do 4.5 GHz stable with 1.365 vcore (in reality, monitoring software reports 1.391~ vcore with my motherboard's way of handling vdroop). If I could go any higher, I'd have to dump a load of voltage into it that I'm just afraid of doing. Though this may be a motherboard problem as it's given me problems before, but that still wouldn't account for a 500 MHz variance. I guess I just got a bad SB. :/

They want you to go up to LGA2011 for the solder-based TIM. And only because of die-cracking issues (I found the papers covering that) on smaller dies that force them to use boring old polymer TIM.

People are cool with high TSkins on their devices. I know my phone passes 35°C easily if I load it up, and I'm fine with that. Then again, I'm completely fine with a 60°C idle, because that's where ICs like to live...

I'm surprised they didn't move the PCH to 22nm. Relatively low power consumption or not, they pushed everything else to the wall to get Core M's TDP as low as possible, and between doing custom designs for it anyway and soft sales leaving spare 22nm capacity available, I don't see any obvious reason why they couldn't have done so.

The process nodes are very expensive to produce, so they need to get as much life out of them as possible. Also, a new(er) process isn't going to have a high enough yield. 22 Might have worked, but I bet the older process gave them a better bang for their buck.Reply

i think AMD is your hope. if you don't have a sandy bridge cpu or are in phenom land, an FX-series cpu is a great cpu that will hold one over until AMD updates their desktop FX line. i mean, a 990FX mobo has all you need. i think pcie 2.0 is still adequate for today's video cards, so that doesn't really matter. even though on paper intel has pcie 3.0 and usb 5.0 (just kidding) and thunderbolt 9.9, they're superfluous and don't make you game better. i mean, an fx-8350 with a decent gpu should give one smooth framerates. who cares if in the benchmark an fx-8350 with so-and-so gpu runs so-and-so game at, say, 120fps while an intel chip runs it at 190fps? 120 fps is very good, and that person with the intel chip running 190 fps probably paid hundreds of dollars more for their system, and i bet they wouldn't even be able to perceive those 70 or so extra frames. i mean, it's not peak fps that matters but the average frame rate, and i think an fx-8350 will deliver that. it's a beast of a cpu. and broadwell? who cares. 90% of the words in the article above are buzzy tech fancy words to get gadget heads salivating and stock wigs greasing their palms.

Yeah, sure, and that's exactly what everyone was saying back when we were waiting for the follow-on to Phenom II; just wait, their next chip will be great! Intel killer! Hmph. I recall even many diehard AMD fans were pretty angry when BD finally came out.

Benchmarks show again and again that AMD's CPUs hold back performance in numerous scenarios. I'd rather get a used 2700K than an 8350; it leaves the latter in the dust for all CPU tasks and is far better for gaming.

Btw, you've answered your own point: if an 8350 is overkill for a game, giving 120fps, then surely one would be better off with an i3, G3258 or somesuch, more than enough for most gaming if the game is such that one's GPU setup & screen res, etc. is giving that sort of frame rate, in which case power consumption is less, etc.

I really hope AMD can get back in the game, but I don't see it happening any time soon. They don't have the design talent or the resources to come up with something genuinely new and better.

an fx-8350 isn't holding anything back. come on, man. are you like a stat-paper-queen obsessor or something? oh, please. an fx-8350 and an amd r9-290 gpu will give you "happy" frame rates. i say happy because i know the frame rates will be high enough. more than good enough, even. will it be lower than an i7-4770k and an r9-290? maybe. maybe the fx-8350 will avg 85 fps in so-and-so game while the i7-4770k will avg 90 fps. boohoo. who cares about 5 more frames.

also, while you mention the i3 as a sufficient viable alternative to an fx-8350, remember that the cost will probably be about the same. an fx-8350 is like $190. maybe the i3 is 20 dollars less. but, here's the big but: an i3 is not as good as an fx-8350 at video editing stuff and photo editing stuff, if one would like to use their pc for more than just games. an fx-8350, while not as power efficient as an i3 (but who cares, since we are talking about a desktop), literally has more bang for the buck. it has more cores and is faster.

amd will get back in the game. it is just a question of when. an fx-8350 is already toe-to-toe with an i7-2600k, which is no slouch by today's standards. so amd just needs to refine their cpus.

as for talent? amd came up with x64, or amd64 before intel. intel developed their own x86-64 later.

the resource that intel has over amd is just die shrinking. that's it. architecturally, an fx chip or the phenom chip before it seems like a more elegant design to me than intel chips. but that's subjective, and i don't really know that much about cpus. but i have been around since the days of the 286, so maybe i just see intel as those guys who made the 286, which was ubiquitous and plain. i also remember cyrix. and i remember g4 chips. and to me, the fx chip is like a great chip. it's full of compromises and promises at the same time.

the games they have in that comparison are starcraft 2 and dragon age. 47 fps at 768p for the 8350 looks suspect in starcraft 2. what gpu did they use?

it's not way worse as i say. omg.

i have an i7-3770k oc'd to 4.1GHz and an FX-8320 at stock. both can run cod: ghosts and bf3. haven't tested my other games. might do the starcraft 2 test tomorrow. i don't have the numbers, nor do i care. what ppl need to realize is the actual game experience while playing, and not the number. is the game smooth? a cpu that can't handle a game will be very evident; that means it's time to upgrade. and there are no fx cpus from amd that can't handle modern games. again, they will trail intel, but that is like one car going 220mph and the other going 190mph: the first car wins, but the experience of going 190mph will still be fast. the good thing is that cpus don't race each other unless you care about benchmarks. if you look past the benchmarks and just focus on the experience itself, an fx-series cpu from amd is plenty fast enuff.

well, they don't have to get the fx-9590, which has a 2008-server-like (or gpu-like) tdp of 220 watts. there is a more modest 125w tdp with the fx-8350. all overclockable. seems like a good cpu for tinkerers, pc enthusiasts, gamers and video editors. i don't even think it's a budget cpu. there are the 6-core and 4-core variants, which are cheaper. i am also not saying that the fx-8350 is like the best cpu, since it's not and falls way down in the benchmark charts. but it's not a bad cpu at all. it gets work done (video editing) and lets you play games (it's a modern cpu after all) even though it's sort of 2 yrs old already. the 990FX chipset is even older. there's something to be said about that, and i think i'm trying to say it. in light of all the news about intel, which we are guaranteed to get every year with each tick and tock... there is little AMD sitting in the corner with a chipset that hasn't been updated for yrs and an 8-core cpu that's remarkably affordable. the performance is not that low at all. i mean, video editing with it or playing games with it doesn't hamper one's experience. so maybe one will have to wait a couple more minutes for a video to render in a video editing program versus, say, an i7-4790k. but one can simply get up from one's chair and return, instead of staring at how fast their cpu renders a video on the screen.

know what i'm saying?

so, yeah. an fx-8350 with an old 990fx mobo versus intel's upcoming broadwell cpus with z97 chipsets and all the bells and whistles: productivity on either one will probably be similar. also, most video editing programs now will leverage the gpu, so an old fx-8350 w/ a compatible gpu will have help rendering those videos....

i guess it's like new doesn't mean anything now. or something. like M.2 sata and pcie 3.0, which intel chipsets have over amd, are kinda superfluous and don't really help or do much.

Oh yes, Skylake. Intel has given 5% IPC improvements every generation since Nehalem, but now Skylake is going to change everything? If you're one of the ten people on the planet who can actually get value out of AVX-512 then, sure, great leap forward. For everyone else, if you were pissed off at IB, HSW and BDW, you're going to be just as pissed off with Skylake.

No, the interest in Skylake is for all the non-CPU speed things promised with it. PCIe 4.0 and a bump from 16 to 20 CPU lanes (for PCIe storage) are at the top of the list. Other expected, but AFAIK not confirmed, benefits include USB3.1 and more USB3.x on the chipset than the current generation. We should have consumer DDR4 with Skylake too; but that's not expected to be a big bump in the real world.

Actually, apart from power-users I fail to see any tangible improvements in performance of modern CPUs that matter to desktop/notebook usage, Intel or otherwise.

In the mobile space, it is improvements in GPU which mattered, but even that will eventually flatten once some peak is reached since graphics improvements on 4" / 5" screen can only matter to wide audiences up to some point.

However, there are surely enough customers that do look forward to more power - this is workstation and server market. Skylake and its AVX512 will matter to scientists and its enormous core count in EP (Xeon) version will matter to companies (virtualization, etc.).

Standard desktop, not so much. But, then again, ever since Core 2 Quad 6600 this was the case. If anything, large-scale adoption of SSDs is probably the single most important jump in desktop performance since the days of Conroe.

I find the reduction in die thickness to be a big deal. Maybe this will prevent temperatures from getting out of control when the cpu core area gets cut in half for 14nm. High-power 22nm cpus already easily hit a 30°C temperature difference between the cpu and heatsink.

PC sales are down mostly because people can keep their systems longer due to the lack of innovation coming from Intel on desktop chips and the lack of utilizing the current CPU technology by software developers. They could be so much more, if only developers would actually make use of the desktop CPU capabilities for things such as a voice command OS that doesn't need to be trained. Intel would then have a reason to produce more powerful chips that would trigger more PC sales.

As it is, the current processor generation is less than 10% faster clock for clock compared to three generations ago. A great many things aren't any faster at all. Know what? It doesn't even matter, because nothing needs that much power these days.

Tablets and smartphones can't take the place of full PCs for most people. Their screens are just too small. Perhaps the younger generations prefer the small form factors right now, but give them a little time, and their eyes won't let them use such things. I can see the move to laptops, especially with 14-15" screens, but trying to show the same content on a 10" screen is just near unusable, and a 5" smartphone screen is just downright impossible. However, desktop PCs still have their place, and that's never going to change.

This push by "investors" for the tablet and smartphone market is just asinine. Broadwell isn't going to help sales all that much. Perhaps, they might sell some more Intel based tablets, but it won't be all that much of an improvement. Tablets have a niche, but it really isn't that much of one. Reply

Tablets are a niche and not much of one? lol yea ok... well, while you were asleep in a cave, over 195 million tablets were sold in 2013 between Android/Apple/Microsoft, which is just shy of 80 million more than the previous year. Worldwide PC sales totaled 316M units, so we are talking nearly 2 tablets for every 3 PCs sold. Eh... small niche...Reply
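A quick sanity check of the figures quoted in that comment (these are the commenter's unit counts, not official numbers; this is just a sketch of the ratio):

```python
# Back-of-the-envelope check of the 2013 sales figures quoted above.
tablets_2013 = 195_000_000            # "over 195 million tablets"
pcs_2013 = 316_000_000                # "World wide PC sales totaled 316M units"
tablets_2012 = tablets_2013 - 80_000_000  # "just shy of 80 million more" than prior year

ratio = tablets_2013 / pcs_2013
growth = tablets_2013 / tablets_2012 - 1

print(f"Tablets per PC sold: {ratio:.2f}")       # ~0.62, close to 2 tablets per 3 PCs
print(f"Year-over-year tablet growth: {growth:.0%}")  # ~70%
```

So "nearly 2 tablets for every 3 PCs sold" checks out, and the implied year-over-year growth is around 70%.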

Yeah, lots of people have them, but how much do they really use them? I have two, one Android and one Windows RT, and I only use them for reading books or reading web news while away from home. The Windows unit showed promise, since I could use it to run Office and terminal programs, but I ended up not using it at work anymore because it couldn't use a USB-to-serial adapter for talking to switches and RAID arrays. It ended up being only half useful. They're nice to have for certain things, but they aren't as versatile as a PC. My parents own two, and two PCs, and they use the PCs far more. My older sister has one, and she barely uses it. Her 7-year-old uses it to play games most of the time. My nephew has one, and he's only ever used it to read Facebook. It's a telling tale that everyone I've known who has one has only limited use for it. Reply

Point taken, but if people are *buying* them, irrespective of whether they use them, then it doesn't really matter.

Besides, this whole field of mobile computing, smart phones, tablets, now phablets, etc., it's too soon to be sure where we're heading long-term.

Many people say the copout porting of console games to PCs with little enhancement is one thing that's harmed PC gaming sales. This may well be true. Now that the newer consoles use PC tech more directly, perhaps this will be less of an issue, but it's always down to the developer whether they choose to make a PC release capable of exploiting what a PC can do re high res, better detail, etc. Wouldn't surprise me if this issue causes internal pressures, eg. make the PC version too much better and it might harm console version sales - with devs no doubt eager to maximise returns, that's something they'd likely want to avoid.

BDW-Y is 82 mm^2. The PCH looks like it's about a third of that, so the total is maybe 115 mm^2 or so. In comparison, Apple A7 is about 100 mm^2. A7 includes some stuff BDW-Y doesn't, and vice versa, so let's call it a wash in terms of non-CPU functionality. BDW-Y obviously can perform a LOT better (if it's given enough power; it probably performs about the same at the same power budget). On the other hand, it probably costs about 10x what an A7 costs. Reply
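The arithmetic behind that die-size comparison, using only the commenter's own rough estimates (none of these are official Intel or Apple figures):

```python
# Rough die-area arithmetic from the comment above; all inputs are the
# commenter's estimates, not measured or official numbers.
bdw_y_cpu = 82.0            # mm^2, Broadwell-Y CPU die as stated
bdw_y_pch = bdw_y_cpu / 3   # PCH "looks like it's about a third of that"
total = bdw_y_cpu + bdw_y_pch
apple_a7 = 100.0            # mm^2, approximate A7 die size as stated

print(f"BDW-Y CPU + PCH: ~{total:.0f} mm^2")  # ~109 mm^2, in the "115 or so" ballpark
print(f"Apple A7:        ~{apple_a7:.0f} mm^2")
```

So on the commenter's numbers the two packages land within roughly 10% of each other in total silicon area.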

Sure, and let's conveniently forget that Broadwell-Y benefits not only from 3D transistors but also from a two-generation node shrink compared to A7. Now put A7 on 14nm with 3D transistors... and let's see which does better.

This is the issue nobody seems to understand, not even Anand, or he just conveniently ignored it when he declared that the "x86 myth is busted". At the time we were talking about a 22nm Tri-Gate Atom vs a 28nm planar ARM chip, with Atom barely competing on performance (while costing 2x more and having half the GPU performance). Yet Anand said the x86 bloat myth is busted... How exactly?! Put them on the same process technology... and then we'll see if the x86 myth is indeed busted, or if x86 is still bloated as a pig.Reply

The 'myth' was that x86 was a horrible power-sucking pig. It was shown that it was possible to at least get close to ARM processors.

Meanwhile, Intel's chips are NOT as low-performance as Apple's. The chip in the Surface Pro 3 is about 2x-3x faster, and these should be about the same. With this year's Apple chip, imagine it to be 2x faster or so. Meanwhile, they'll probably both take the same amount of power. SoCs these days are running at multiple watts. Even Intel Atoms used to take over 10W, but now a full Intel Core computer can run under 6W. It would be really interesting to run all these numbers and give a complete report on the state of mobile vs. notebook processing. They seem matched on power usage, but ARM chips are far cheaper and run at far lower performance.Reply

Finally, a long-awaited fanless design. I don't care about thickness so much as about energy wasted on heat and on the fan to dissipate that heat. But... I have a 2008 MacBook Pro with a 2.4GHz Core 2 Duo. If they achieved the fanless design by bringing frequency down to, say, 1.8GHz, I am not interested (given that IPCs are not that different for real-world applications). For me to upgrade, I want it to reach 3GHz, even if only for a second and in a single thread, when starting applications for example. Anything below that is not a noticeable upgrade, and below 2.2GHz or so would be a downgrade in practice. And the biggest problem for Intel is not how thick their processors are, it is Microsoft - with Windows 8 (and 8.1) being so unbelievably awful (yes, I have owned it for a while). Krzanich should call Nadella immediately and tell him to fire Larson-Green, or they are both going down.Reply

You do realize this is for tablets and low power laptops (Chromebook/netbook style) only, right?

It's not coming to the MBP any time soon, unless you're talking about something else entirely, like the MBA, which also is not going to get Core M either, because it would be too big a regression in performance.

You forget about IPC. Last I checked, compared to a Core 2 CPU at equal clock speeds, a Sandy Bridge CPU is 50+% faster on average, and Haswell is a further 15+% faster on top of that, all the while using less power.Reply

Since this is the FIRST Core chip to be "fanless", they're probably squeezing a lot of stuff to make that work, and it probably still overheats. I wouldn't be too excited about it until we see how it does in actual devices, both from a power consumption point of view and a performance one (because if performance didn't matter, then we'd all just use ARM chips, no?).

It would be laughable if Denver, which already beats mainstream Haswell Celeron, would be in range of Broadwell Y in performance, but still more energy efficient and with much better GPU performance.Reply

OK, I'm confused, so any help appreciated. I want to replace my old desktop replacement laptop with a Broadwell equivalent. For example, the Broadwell version of an HP Envy 17t. What flavor of Broadwell am I waiting for? The Y flavor? U? H? Something else? thanks.Reply

I would ask the opposite: why do you need a replacement at all? What is it your current device cannot do, or does poorly? Once the newer products are out, find out which one solves those issues at the lowest cost. This focus on the jargon side of computing tech is the absolute worst aspect of how consumer computing has evolved since the 1980s.

I appreciate the thoughts, but I'm actually an ASIC designer, so I have no problem with lingo - I just am not informed regarding my specific question. And my current laptop is a Penryn 2008 laptop, so forgive me if we ignore the need question and stay focused on what version(s) of Broadwell are intended for mainstream (non-gaming) desktop replacement laptops. thanks!Reply

You want the full HQ/MQ series for your next laptop. Those are the full quad-core-enabled machines, which is something you probably want for a high-performance machine. Since this is a gaming laptop, though, you may want to look into building a small mini-ITX desktop instead; they're comparable if your primary definition of "mobile gaming" is "drag computer to a LAN party/friend's place" rather than actually gaming on a train or similar.Reply

If you can afford it, why are you dicking around with trying to save a few hundred dollars? If you want a serious machine, buy an rMBP with a quad-core i7. If you want a light machine, buy an MBA. Then either stick Parallels on it (if you use OSX) or just run Boot Camp and never touch OSX, except maybe once every two months to update the firmware.

If you can afford it, life's too short to waste time trying to figure out which of fifty slightly different laptop PCs will suck less in their different hardware, different pre-installed crap-ware, different drivers. Just pay a little more and get something that most people consider works well. Reply

As an ASIC designer, OSX is just a non-option. Hell, even Linux is rather hairy compared to Windows for ASIC stuff.

I personally like the Dell Precision line, but Lenovo ThinkPad W and HP EliteBook would also do. In a pinch, a ridiculously-specced Compal (in its various Sager rebrands) would also do, but IMO they're just not built anywhere near as well.Reply

Thanks, Kjella. Upon further googling after reading your post, I learned that there apparently will be mobile and desktop versions of the H flavor, and it does look like the mobile H is the most likely for me. But that's not rumored to come out until May or June 2015, which is disappointing.

Even weirder, since the first desktop version of Broadwell will be available in PCs in May/June 2015, and since the first version of Skylake (rumored to be a desktop version) is rumored to be "available" 2H 2015, it seems Broadwell H desktop is slated for a very, very short product life. Similarly, is Broadwell H mobile also slated for an extremely short product life?

Perhaps I missed it, but it would be great if there were an AnandTech chart comparing Intel's definitions of "announced", "launched", "samples", "volume shipments", "available", and similar Intelese, to figure out roughly how many months from each it is until Joe Consumer can buy the chip in question. I suspect these definitions and even the lingo vary with each tick and tock, but some kind of cheat-sheet guesstimate would be great (revised as better info arrives, of course).Reply

To further clarify the need for a cheat sheet, I'm familiar with the timing of tick and tock for the last few years, but it seems that 2014/2015 at a minimum will diverge so much from the last few years that previous expectations add confusion rather than clarity.Reply

Judging by the delay for BDW, Skylake will probably be pushed back at least 6 months, if only to recoup the R&D costs of BDW. Then again, Intel wants that tablet market, so they might not.Reply

"Similarly, is Broadwell H mobile also slated for an extremely short product life?"

No, it's not. Companies like Intel care about 5% profit differences, so having a short product life would make absolutely no sense. Broadwell isn't coming to "mainstream" desktops, only high-end enthusiast ones like the K series.

So they will all happily be a family like this:
- Skylake mainstream desktop
- Broadwell H/U/Y/K
Reply

Semiaccurate says that quad-cores (which I take to mean H-series) will not be out until 11 months from now. (Makes you wonder WTF has happened to the Skylake timeline. They haven't yet admitted that it has even slipped, let alone by how much.)Reply

I also fail to see how Intel will be able to keep their cadence with Skylake without either skipping the entire generation (obsoleting it in 6 months does not sound reasonable from a financial point of view) or delaying the Skylake introduction.

Also, the fact that Intel decided to enable all 18 cores in Haswell-EP is telling, IMHO. Initially, this was to happen only with BDW-EP, so it's not impossible that Intel will just skip Broadwell for some segments and go with Skylake.Reply

Those who know ain't talking, but I can observe the following:
- Delaying Skylake for financial reasons related to Broadwell is braindead. Broadwell development costs are sunk costs, and Intel is (or should be) trying to maximize overall profits, not just a particular program's profits. Intel should release Skylake as quickly as possible once its yields hit target values, regardless of Broadwell delays, with two caveats:
- If the Broadwell yield difficulties also slowed down Skylake, then Skylake will likely be inherently delayed to some degree.
- If Intel screws up product planning such that they flood the market with Broadwell, then their customers might be very angry if they are stuck with that inventory upon a Skylake release.

My bet at this point? Broadwell H mobile will be a very short-lived product (about 6 months).Reply

For the GPU, it is noteworthy that unlike nvidia and amd, the subslice block (at least before Gen8) doesn't really have an inherent minimal size which cannot be changed without significantly altering the architecture. E.g. Gen7 (which is just about the same as Gen7.5) had subslice sizes of 6 (IvyBridge GT1), 8 (IvyBridge GT2) and even 4 (BayTrail). It is also quite interesting that everybody (nvidia since gk2xx and Maxwell, amd since even before GCN, notably their Northern Islands VLIW4 designs, intel since Gen8) has now ended up with the exact same ALU:TEX ratio (one quad TMU per 64 ALU lanes), though of course the capabilities of these TMUs vary a bit (e.g. nvidia can do full-speed fp16 filtering, amd only half speed, etc.)Reply

In the fourth-to-last paragraph, an Intel driver dev says that Broadwell graphics "dwarf any other silicon iteration during my tenure, and certainly can compete with the likes of the gen3->gen4 changes." I'm going to go with the guy who's actually developing the drivers on this.Reply

Because the people who obsess about discrete graphics are a RIDICULOUSLY small fraction of the purchasing public, a fraction which is utterly unwilling to accept this fact and the fact that CPU design is not targeted at the needs of high-end gamers. Reply

Compared to the total Intel CPU market, and compared to the cost of creating an IGP-less CPU die for the mainstream socket, it's entirely on the mark. If you want an IGP-less design, your only choice is to wait a year for the Xeon-derived LGA2011 model; and the fact that LGA1366 launched as an enthusiast design well before the Xeons did, but that Intel hasn't done the same for any other models, shows that it didn't sell well enough to justify rushing one for us.Reply

Small fraction is right. Projected worldwide PC hardware sales for 2015 are ~$385B (Source: eTForecasts). Projected PC gaming sales (both hardware and software) are ~$20B (Source: Statista), roughly 5% of total PC hardware sales alone. A ~5% market niche is very, very small in the overall scheme of the PC market.Reply
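Running the numbers from that comment (the dollar figures are the commenter's quoted projections, not independently verified):

```python
# Market-share arithmetic using the projections quoted above
# (attributed by the commenter to eTForecasts and Statista).
pc_hardware_2015 = 385e9  # USD, projected worldwide PC hardware sales
pc_gaming_2015 = 20e9     # USD, projected PC gaming hardware + software sales

share = pc_gaming_2015 / pc_hardware_2015
print(f"PC gaming as a share of PC hardware sales: {share:.1%}")  # ~5.2%
```

On those figures the gaming slice comes out to about 5%, comfortably under the 10% ceiling the comment mentions.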

"Maybe not obsess, but to characterise the PC gaming market as ridiculously small, is pretty far off the mark...."

I think the original comment was fairly accurate; even in the PC gaming market there's a large proportion of people using Intel graphics. Looking at the current Steam survey results, 75% are using Intel processors and 20% overall are using Intel graphics, which means roughly 1 in 4 people with Intel processors on Steam are using the onboard graphics. That means even among the gaming market there are a lot of integrated GPUs in use, and that's just one small portion, as I'd expect most other areas to mainly be using integrated graphics.

There are workstation graphics cards but professionals using those are unlikely to be using consumer processors and the enthusiast/workstation processors do not have an integrated graphics card.Reply
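The Steam-survey arithmetic in the comment above can be sketched like this (the percentages are as quoted there, and the calculation assumes everyone reporting Intel graphics also has an Intel CPU, which the survey itself doesn't guarantee):

```python
# Proportion of Intel-CPU Steam users on integrated graphics,
# from the survey percentages quoted above.
intel_cpu_share = 0.75  # share of surveyed users with Intel CPUs
intel_gpu_share = 0.20  # share of surveyed users on Intel integrated graphics

# Assumption: Intel-graphics users are a subset of Intel-CPU users.
igpu_among_intel = intel_gpu_share / intel_cpu_share
print(f"Intel-CPU users on integrated graphics: {igpu_among_intel:.0%}")  # ~27%
```

That works out to roughly 27%, i.e. somewhere between 1 in 4 and 1 in 3.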

I have had Steam on my company laptop with just the internal GPU, just to take part in the sales campaigns etc. This makes my contribution 50:50 in terms of dGPU/iGPU, even though 100% of my gaming happens on the dGPU.Reply

So....how do NVIDIA and ATI stay in business? Obviously many people use discrete cards. The fact you say "obsess" tells me you probably don't realize the massive performance difference, and it's not limited to gaming. CAD uses 3D.Reply

Because Intel is using an anti-competitive tactic to drive out the discrete competition. They force OEMs to buy the GPU bundled with the CPU, so more and more people say "why should I pay twice for the GPU... I'll just get the Intel one".

It's a nasty tactic Intel has been employing for years, and unfortunately it's working. But it's terribly uncompetitive. Reply

It's akin to Microsoft bundling IE with Windows: "Why would I need to get another browser... I'll just use IE". That tactic WORKED for Microsoft. It only stopped working when they became lazy. But they could've held IE's 90 percent market share for a lot longer if they hadn't gotten lazy.Reply

"we’ll still have to wait to see just how good the resulting retail products are, but there shouldn’t be any technical reason for why it can’t be put into a mobile device comparable to today’s 10”+ tablets. "

There may not be TECHNICAL reasons, but there are very definite economic reasons. People think of tablets as cheap devices - iPad at the high end, but the mass market at $350 or so. This CPU alone will probably cost around $300. MS is willing to pay that for Surface Pro 4; no one else is, not for a product where x86 compatibility is not essential. We'll see it in ultrabooks (and various ultrabook perversions that bend or slide or pop into some sort of tablet), but we're not going to see a wave of sub-$1000 products using this.Reply

Nothing was said about cheap tablets in that quote, so I'm not sure why you're bringing up the price.

Not that I disagree with your point. Of course, by continuing to focus on premium-priced parts, Intel is never going to gain a profitable foothold in the mobile market. Core M needs to be cheaper, not just lower power, to be interesting. Otherwise there's no reason to care. If you're paying for a $1000 device, why would you want something that's obviously going to be so performance-gimped compared to U-series Broadwells?Reply

I would like to see ordinary 13" ultrabooks with Broadwell-Y. Don't make it too slim, and see what performance and battery life are like with a 4.5W CPU. If performance is high enough for everyday tasks, it would really be nice to have slim notebooks approach 20 hours of battery life in light usage scenarios.

But I guess companies will just use the low-power CPUs as an excuse to implement smaller batteries and 4K displays, and we still won't get much more than 10h in best-case scenarios...Reply

I'd argue that it may well be too late for Intel to enter this market, unless they can deliver a major step change in performance compared to ARM. Right now the ARM-Android & ARM-iOS ecosystems are well established and humming along nicely. On top of which tablet sales in the developed world are slowing down. The developing world is still growing but in those regions, cost will be a key factor.

That leaves Intel targeting a market with an entrenched competitor, a set of software ecosystems with no benefit from migrating to a new architecture (what do Apple, Google, Samsung, HTC, LG, etc. gain from this?), and slowing hardware sales.

If Core M can deliver double the performance for the same power draw AND price, then sure, I can see a rush to migrate to it; otherwise, what's the point?Reply

"Core" chips will never EVER compete with ARM in its main market. The best Intel can do is try to compete in $700+ devices. Core is just not competitive on price. Not even close. Period.

Intel's only competition against ARM in the MOBILE market is Atom, and Nvidia's Denver is already TWICE as fast as that. Also Atom is twice as expensive as Denver, but Intel keeps subsidizing half the cost...for as long as they can, which probably won't be much longer.Reply

This sounds like a distraction meant to divert attention from the fact that Intel is about a year late with 14nm. It seems that Moore's law will stop working quite soon. I would still bet on Intel being able to implement 10nm, but that will likely be the end of the road for the current technology. And there is no clearly visible path beyond that. Not that progress will stop, but it may slow down to the pace we observe in, e.g., car design and production, rather than what we are used to expecting in chip design and production.Reply

"Intel’s solution to this problem is both a bit brute force and a bit genius, and is definitely unlike anything else we’ve seen on PC GPUs thus far. Since Intel can’t reduce their idle voltage they are going to start outright turning off the GPU instead;"

That qualifies as genius these days? The most obvious method to reduce a GPU's power consumption is to turn it off.Reply

I liked the comments in this thread but was disturbed by the fan boys. Just because there was an amd marketer here doesn't mean that we should resort to intel worship. At least for me, those discussions are totally moot in this thread.Reply

Broadwell Core M is supposed to support fanless designs. Will Broadwell-U support fanless as well? Meaning will high end laptops with Broadwell-U be fanless? If not, will Skylake be able to have high end machines be fanless?Reply