
MojoKid writes "When the Federal Trade Commission settled their investigation of Intel, one of the stipulations of the agreement was that Intel would continue to support the PCI Express standard for the next six years. Intel agreed to all the FTC's demands, but Intel's upcoming Oak Trail Atom platform presented something of a conundrum. Oak Trail was finalized long before the FTC and Intel began negotiating, which means Santa Clara could've been banned from shipping the platform. However, the FTC and Intel have recently jointly announced an agreement covering Oak Trail that allows Intel to sell the platform without adding PCIe support — for now."

The FTC filed its complaints against Intel on Dec. 16, 2009. It charged the chip maker with illegally using its dominant position to stifle competition for decades. The complaint was filed just a month after Intel had settled antitrust and patent disputes with Advanced Micro Devices for US$1.25 billion.

Section 5 of the FTC Act prohibits "unfair methods of competition," and was amended in 1938 also to prohibit "unfair or deceptive acts or practices."

Seems to have been part of a broader move against Intel at the time. I admit I don't remember it very clearly, but Reuters adds [reuters.com]:

A wide range of antitrust enforcers have gone up against Intel for its controversial pricing incentives. New York Attorney General Andrew Cuomo accused Intel in November of threatening computer makers and paying billions of dollars of kickbacks to maintain market supremacy.
The European Commission has fined Intel 1.06 billion euros ($1.44 billion) for illegally shutting out AMD.
In June 2008, South Korea fined Intel some $26 million, finding it offered rebates to PC makers in return for not buying microprocessors made by AMD.
Japan's trade commission concluded in 2005 that Intel had violated the country's anti-monopoly act.
The case before the FTC is "In the Matter of Intel Corporation," docket number 9341.

Instead of telling Intel how to make their product, I consider it much better to confiscate the relevant patents and copyrights and put them into the public domain. That way AMD, nvidia, etc. will have all the access they need. They use asset forfeiture on us all the time. Time to use it here. Fair is fair.

The mediocre solution (GMA HD) they are gluing to the CPU is a derivative of the solution they shipped 140,000,000 of last year (GMA* in more than 90% of laptops manufactured). That's pretty hilarious. It will be downright hysterical when, integrated into the CPUs, another 100,000,000 displace most of the discrete desktop graphics cards.

Here is a good article [arstechnica.com] about the original antitrust settlement.

Basically, Intel refuses to license its new DMI or QPI bus protocols to NVIDIA, so NVIDIA can no longer make chipsets for Intel processors (like nForce). Furthermore, it has been feared that with the push toward systems on chip, Intel would eliminate the PCIe bus as well, leaving no way for any graphics company to supply a discrete graphics chip for netbook or notebook computers.

AMD also has HyperTransport. Maybe this was why there were rumours about Nvidia making a CPU.

If Intel & AMD decided to offer GPUs linked by QPI & HT, it would give their GPUs a big advantage, with Nvidia unable to compete.

I think non-portable computers will end up a lot more modular in this way. Memory, CPUs, GPUs, Northbridge all connected to each other on a future generation of a switched HT/QPI bus. It would make the computers much more scalable, futureproof, adaptable and efficient. It might also

High-end graphics and discrete cards are making up a smaller and smaller percentage of the market. It is quickly getting to the point that the only people who are buying discrete GPUs are gamers and graphics professionals. Most people just don't see the need for the added expense.

The "mid to high-end gaming market" is fairly small on the PC, relative to the entire PC market.

Did I say that they didn't? No... I said that high-end consumer gaming machines are not where Intel is making its money.

And those cards don't have to use PCIe; there are other ways of getting the controllers in. Gigabit network cards, for example, already offload a lot onto the processor itself and are usually built into the chipset.

So basically, if you're on a budget, you'd do AMD. And if you're looking to make a decent gaming rig, you'd do AMD. And if you were looking to do high end, you'd do Intel, except with no PCIe, no, you wouldn't -- you'd do AMD.

Who in the hell is gonna buy a core i7 laptop and stick with integrated graphics, again?

I like and buy AMD CPUs, but I've always preferred nVidia for graphics cards, mostly because I run Linux, for which ATI/AMD cards have notoriously poor support compared to nVidia. Vendor lock-in, no matter the company, is a terrible thing. I and everyone else should be able to get GPUs independent of CPUs, or any other hardware for that matter.

Furthermore, it has been feared that with the push toward systems on chip, Intel would eliminate the PCIe bus as well, leaving no way for any graphics company to supply a discrete graphics chip for netbook or notebook computers.

If they did that, every manufacturer of even moderately high-end laptops would drop their CPUs faster than an LSD addict drops acid.

Even if Intel's GPUs were the best in the industry, there are too many other critical things you wouldn't be able to properly support without PCIe-

Intel doesn't want nVidia making chipsets, true enough, because Intel makes chipsets. However, they want expansion slots on their boards because they want people using their boards. I'm quite sure they are plenty happy with nVidia and ATi graphics cards. Heck, they've included ATi's CrossFire on their boards for a long time (they didn't have SLI because nVidia wouldn't license it to them). Intel has nothing that competes in that arena, and they recently revised their plan so they aren't even going to try. They want people to get those high-end GPUs because people who get high-end GPUs often get high-end CPUs, since they are gamers. Not only that, they STILL sell the integrated GPU, since it is on chip.

I just can't see them not wanting PCIe in their regular desktop boards. They know expansion is popular, and they also know that the people who expand the most also want the biggest CPUs.

Now on an Atom platform? Sure makes sense. These are extremely low end systems. PCIe logic is really nothing but wasted silicon. You don't have room for PCIe expansions in there, never mind the desire for it. Those are integrated, all-in-one, low end platforms.

However desktop and laptop? I can't see them wanting to eliminate it there.

Depends on the market. How many laptop users actually care about ExpressCard? I've had one on this machine for 4 years, but never plugged anything in to it. FireWire and SATA controllers are small enough that they could be on die on a low-power SoC. Things like USB and Ethernet / 802.11 are already commonly provided on die by ARM SoCs, so I'd imagine that they would be with most Intel single-chip solutions too.

Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not. Considering this is the same company that bribed OEMs [pwn3d.com], rigged their compiler [digg.com], and paid 1.25 billion to AMD [nytimes.com] just to keep them from digging all the skeletons in their closet? It really shouldn't be surprising.

I was a lifelong Intel man, going back to the 486DX, but after all the dirty underhanded shit they've pulled recently I've gone full AMD for my customers and myself. If you win a market because you are faster/cheaper/better? No problem with me. But rigging the market is a BIG no-no in my book, and makes it worse for all of us. Just look at how many power-hogging P4s are still in use, thanks partially to the fact that Intel paid off OEMs not to run the better-at-the-time AMD chips. The regulators in the USA may not have any teeth anymore, but I can't wait to see what the EU does to them. Intel has been so nasty lately they make MSFT look like the Care Bears.

Just look at how many power-hogging P4s are still in use, thanks partially to the fact that Intel paid off OEMs not to run the better-at-the-time AMD chips.

Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions. When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.

It was only in the final P4 space-heater era that choosing AMD became a no-brainer.

You really do have to consider that the performance per watt in the era since the P4 has been stellar from Intel, while AMD hasn't quite been up to par in that department. The roles really have reversed in the past few years in regards to wattage, with Intel also keeping the raw performance crown on the high end.

You really do have to consider that the performance per watt in the era since the P4 has been stellar from Intel, while AMD hasn't quite been up to par in that department. The roles really have reversed in the past few years in regards to wattage, with Intel also keeping the raw performance crown on the high end.

On the other hand, for the price of the highest-end Intel chip [newegg.com] (and a motherboard [newegg.com] to run it on; also note: board and chip ONLY, no OS, no drives, no case, no nothing), I can practically build two high-end AMD systems [newegg.com]. (If Newegg will sell me a pre-built system for just under a grand, I'm willing to bet I can build it myself for $800 or less -- especially without the MSFT tax.)

Yeah, it is truly crazy how much "bang for the buck" you get from AMD, and let's be honest here: VERY few of us are gonna have the kind of day-to-day work lined up that is gonna pound the dog snot out of the CPU hard enough to make the price difference worth it. For less than $530 after MIR I got an AMD 925 quad with 8MB of L2 cache, 8GB of DDR2 800MHz RAM, 2 500GB HDDs, a HD4650 1GB GPU, a 20x DVD burner, and a nice case to put it all in. You really can't beat that.

That's not a good comparison. Intel has an obscenely high priced chip. Fine, they always have for those people who have more money than sense. They also have reasonably priced chips. Try instead looking at, say, a Core i5-760. 2.8GHz quad core chip for $210. Look up some performance numbers and then compare to AMD chips. It isn't very favorable. More or less they need their 6 core chips to compete, and then it is only competitive if you happen to have an app that can use all 6 cores (which is very rare stil

My list: 1) emerge, 2) ffmpeg, 3) Wow.exe, 4) Chrome while on careerbuilder.com (something is dumb there and loops using 10% CPU). ffmpeg can use 2-4 cores, emerge can as well, WoW uses 2, and Chrome could use a bunch as well. Hmm, just about everything I run is threaded; 6-12 cores sure sounds nice.
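The scaling argument above can be sketched with a toy example. This is an illustrative Python sketch, not a real encoder: the `crunch` worker is just a stand-in for a CPU-bound chunk of work (one segment of a video, one package build), and a pool sized to the machine's core count runs independent chunks in parallel, which is exactly why threaded workloads benefit from 6-12 cores.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    # Stand-in for a CPU-bound task, e.g. encoding one segment of video.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [50_000] * 8  # eight independent pieces of work
    # A pool sized to the core count lets the chunks run in parallel;
    # more cores -> more chunks in flight at once.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(crunch, chunks))
    print(f"{os.cpu_count()} cores available, {len(results)} chunks done")
```

The catch, as noted elsewhere in the thread, is that the workload has to split into independent chunks; an app that only ever spins up two threads tops out at two cores no matter how many are available.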

See here's the problem: Even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot. Any time you see compiler benchmarks it is consistently the best thing on the market. Intel has a really, really good compiler team. So if that is a problem for AMD, well then they should be writing their own compiler. Like the ICC, they should make it plug in to Visual Studio so that VS developers can use it as a drop-in replacement to speed up

Even if Intel's compiler does better for Intel's own chips, which I'm sure it does, it is still the best compiler out there by a long shot.

This isn't about not putting effort into optimizing for non-Intel. This is about intentionally putting effort into sabotaging non-Intel performance. They have been convicted of this act, and so far have not honored the court's ruling on the matter.
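The dispatch complaint at issue can be sketched abstractly. This is an illustrative toy in Python, not ICC's actual code: it contrasts picking a code path based on the CPU features a processor advertises versus gating the fast path on the vendor ID string (the x86 CPUID vendor strings "GenuineIntel" and "AuthenticAMD" are real; everything else here is hypothetical).

```python
# Toy model of runtime code-path dispatch. Feature-based dispatch gives
# any CPU that advertises SSE2 the fast path; vendor-gated dispatch sends
# non-Intel CPUs to the slow path even when they support the same
# instructions -- the behavior AMD complained about.

def pick_path_by_feature(cpu):
    return "sse2_fast_path" if "sse2" in cpu["features"] else "generic_path"

def pick_path_by_vendor(cpu):
    if cpu["vendor"] == "GenuineIntel" and "sse2" in cpu["features"]:
        return "sse2_fast_path"
    return "generic_path"

amd_with_sse2 = {"vendor": "AuthenticAMD", "features": {"sse2"}}
print(pick_path_by_feature(amd_with_sse2))  # sse2_fast_path
print(pick_path_by_vendor(amd_with_sse2))   # generic_path
```

Same silicon capabilities, different binary behavior, purely on the vendor string -- which is why it reads as sabotage rather than mere lack of optimization effort.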

My "contrived situation" was simply to go to newegg.com, look up the highest-rated Intel chip, then get the highest-rated motherboard for that chip, then compare that price to a good gaming system sold as a single, pre-built unit.

The funny thing is, you managed to rant about how I used a six-core chip against the Intel four-core, while completely ignoring that for the money, I would actually get twelve cores with the AMD-based solution. Unless you can argue that it takes 3 AMD cores to equal the perform

Because I don't care that AMD gives the same performance (sometimes, under specific circumstances) for 3/4 the cost (according to you). If I can't sit in the same room as the thing, I'm not going to be able to use its performance anyway. A real comparison for the AMD Phenom II X6 is the i7-860 (4 cores with HT), which is $4 more and beats the pants off of it with far less energy consumption, so I don't know how you arrived at your pricing figures unless you went with sale prices, TBH.

You're picking Intel's most expensive chip to try and prove a point, and failing horribly. Intel has a $279.99 offering on Newegg [newegg.com] that beats [tomshardware.com] the [tomshardware.com] living [tomshardware.com] shit [tomshardware.com] out [tomshardware.com] of the AMD processor for things normal people do on their home computers, and is damn close in the rest. Oh, and it uses far less power both at idle and at load [anandtech.com]. (Tom's didn't have power numbers for the i7-860).

Now, you might have a point about code "not being optimized for AMD blahblahblah", but here's a newsflash: Not only do the testing suites us

Actual measurements of power consumed are very close, with upper-end Intel chips often consuming more, and with performance differences that can't be noticed anyway (TDP is not consumption, and Intel uses a more optimistic method anyway).

Why are you thinking that you need a $290 motherboard again? AMD may be better bang for your buck but you have only yourself to blame if you want a budget system and then sink that much into a Mobo.

See my reply further back in the same thread [slashdot.org]. The discussion was about Intel chips being more expensive, without delivering enough additional power to justify the additional expense. Indeed, with the number of AMD chips you could buy for the price of a single Intel chip, you could outperform any multi-threaded process with the AMD chips; If you're building a cluster, AMD wins hands down.

I'm not trying to tweak your nose, here, but I just don't see any reason to buy Intel anymore - they might be the king of

Intel isn't dropping PCIe support except on the Atom chip line. I don't give a shit what they do to Atom's PCIe bus, since I'm not going to put a GTX480 or any other expansion card in one of those netbook systems anyway.

Prior to the Athlon-64, P4s _were_ better than Athlons unless you wanted to run x87 floating point instructions.

Clock for clock the Athlons beat the shit out of the P4. Only by getting a fast and thus very power-hungry and expensive processor could you build a faster machine with a P4. Does that mean they were "better"? Also, at the time floating point had just become massively important to gaming since we were fully into 3d land. fp math was one of the most important differentiators and competition over specfp benchmarks was intense.

When I bought my last Windows PC I expected to go AMD but when I actually looked at the benchmarks the P4 was up to twice as fast as a similarly-priced Athlon at rendering in the 3D and video editing packages I was using at the time.

Only if you bought your AMD from someone whose pricing was designed to remove their

From 2000-2005 I bought a few AMD systems because they were a bit cheaper, but I also had quite a few CPUs fail and even one melt despite a heatsink, fan, two case fans, plus another PCI-slot fan. Maybe it was just my luck of the draw, but since 2005 everything I've bought except my PowerMac G5 tower has had Intel CPUs. And I haven't had any problems with the Intel CPUs.

Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.

They did, but the DMI license is mostly a diversion. The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia manage to claw their way back into the motherboard game there's nothing for them there, since graphics used to be the main differentiator. By turning it into a license/contract issue it seems a lot cleaner than "oh, you can still produce boards but we moved the essential functionality into the CPU". Though honestly AMD has been talking about the m

. The real story is that with the Core i3/i5s you already have integrated graphics on the CPU, so even if nVidia manage to claw their way back into the motherboard game there's nothing for them there since graphics used to be the main differentiator.

They still have shit graphics, so there is still a need for fancier graphics for laptops. If I was buying a laptop for my girlfriend I would have used intel integrated video before and I would still use it. If I am buying one for me then I wouldn't use it before, and I won't use it now. Nothing has changed except where the hardware I don't want lives. Before it was soldered to the motherboard where I couldn't (reasonably) remove it. Now it's built into the CPU and I still can't remove it.

Nothing has changed.

What has changed is that with Core 2 stuff you had the option of an nVidia chipset with integrated graphics that was better than Intel integrated graphics while being physically smaller and lower in power consumption than a discrete solution with its own memory.

With current-gen Intel stuff that option is gone (though admittedly, from a user's point of view, the fact that Intel integrated graphics are better than they used to be somewhat makes up for it).

Basically Intel locked down all I/O on many of their chips to specifically lock out Nvidia and force their lousy GPUs onto you, whether you like it or not.

Do you understand what this chip is? It's a system on a chip. The whole point is a small, integrated, specialized, low-power chip for things like tablets. There's absolutely no point in allowing for an NVIDIA chip on it because 1) the integrated graphics are ALL you need, 2) if you added another GPU chip you would hurt power consumption and increase overall costs, and 3) why the hell increase the complexity of the chip to support something that is fundamentally contrary to the design goals?

Hi Mr AC! I sell MSFT because Apple is worse with their walled garden and crazy high prices? And don't even say Linux, because Linux on the desktop is like a bad joke. Crazy 6 month upgrade cycle, no stable hardware ABI so half the time the upgrade breaks drivers, frankly it is just a total mess. I predict that Canonical is gonna quickly get tired of Linus and all the other kernel devs (who are all being paid by SERVER vendors BTW) and simply fork the kernel away from him.

Wow, I can't believe it, I'm actually agreeing with you. Whether Con was a dick or not there are some serious issues, like X and the lack of stable ABI, which I have been bitching about forever. I simply hope that since Canonical is breaking away with Unity and the new OpenGL X replacement, they might just go that last step and fork the kernel so they can stick in a hardware ABI.

Because if Canonical can make it to where I install Ubuntu 11 on a machine and when 12 rolls around ALL the hardware still works w

The issue is that Atom, except on Intel STB boards with their media acceleration processor, is deliberately crippled in terms of video performance. As a result the entire system ends up being crippled wholesale, giving consumers the perception that the computer is slow when it really isn't.

Nvidia has demonstrated this: when paired with decent video chippery, Atom makes a perfectly adequate desktop and notebook. As a result Intel has gone as far as damaging its own media STB roadmap to lock Nvidia out so that Atom does

Intel has been bitten by its non-cannibalisation strategies in the past.

One of the main reasons the Athlon had its time in the sun was not its performance. It was _NOT_ that much better initially. It was Intel deliberately limiting and crippling various product lines on non-cannibalisation grounds. The i810, i840 and, most importantly, the crippled i815e, which had 2GB memory-addressing capacity by design but a strict 512MB marketing requirement (sounds familiar, doesn't it?), had more to do with it. As a result Ath

I went and looked up the specs for the chip in question. It's an SoC; a plain PCI bus is all I could find. There's no market reason for PCIe, and it really wouldn't even offer much of a benefit, since the single-core CPU is barely pushing a gigahertz. The FTC behaved pretty much reasonably in this case.

Maybe they can just glue on dummy PCIe slots, kind of like the Chinese used to hand-paint barcodes on boxes.

Dang, I'm no good with Google today. I can't find the reference. Years ago, when barcodes were just starting to become popular on boxes/cases used for shipping, I recall a story where some American company had specified that their Chinese supplier had to begin bar-coding boxes of goods sent to the US to make warehousing here easier, and proceeded to have fits when none of the barcodes scanned. They

I don't even see the relevance of the Atom platform anymore. It used to be about power efficiency, and they really got there with the Z series maxing out at 2.4W. This was, of course, at the expense of processing power, addressable memory and such. But then came the SU7300, which maxes out at 10W and doesn't have the limitations of the Atom. I get that there are some power savings in there with all the integration Intel is planning, but I'm skeptical how much that bears out. My wife was r

Off-topic, but FYI I think Oak Trail basically is a PC-compatible chipset for Moorestown (the other chipset was not PC-compatible). It includes all the legacy stuff that is needed to maintain compatibility back to the original IBM PC in 1981, and extensions such as ACPI, so most normal x86 OSes will run.

I thought that Intel wanted to break into the embedded market, which contains a lot of ARM and PowerPC cores, with Atom? The FPGA + embedded processor combination is pretty common, and PCIe is the way to interface them. Hence your low-power/low-performance chip is bundled together with another chip (FPGA or ASIC) that does the heavy lifting for a specific task. Every application that requires some serious, but fixed, number crunching is appropriate for this. I do broadcast-related stuff, so the things that spring to mind are video compressors, deinterlacers, etc. Why spend lots of dollars and lots of watts on a powerful CPU when you can combine a small core and an ASIC/FPGA and get the same result? Without PCIe no one is going to consider the Atom for these applications.

With no PCI Express support, I can just skip anything from Intel, since I won't be able to use any decent video card in their rig.

Thanks, Intel, for throwing away any chance you had at selling stuff to the gaming market.

Wait... does this mean Intel is going to be the next big corporation screaming about piracy hurting their profits? I mean, obviously, if no one is buying their crap anymore, it's the fault of the pirates...

Because Intel's cards don't do CUDA. And the 5450 is rather slow if you want to push a game to 3 monitors. Does the Intel card do 3 monitors? How about 6? How about decent h264 decoding in mainline MPlayer? VLC? Xine?

You mean besides the fact that a $40 video card is only good for entry level gaming?

Even the HD4650 blows the doors off of the HD5450. That HD5450 ranks right up there with 2 year old mid range (at the time) graphics cards.

That the Intel integrated solution is a "tiny bit slower" than something no serious gamer would even consider buying today. The Far Cry 2 benchmark for the HD5450 puts it at an average 20 FPS on Medium settings, Low shadows, no AA. Simply horrible. It's probably fine if you want to play

Secondly, Intel doesn't need to be bastards, they can just continue with the bog-standard half-speed PCIe 2.0 link that they have on their Atoms. This doesn't provide enough bandwidth to run a retired analog cigarette vending machine let alone a modern GPU. If Intel doesn't want a GPU on their platforms, it is trivial to abide by the letter of the law and still s

1) Nobody gives a shit about PCIe speed on the Atom. It is a low-end platform, for netbooks. You are not putting discrete GPUs on it at all, never mind fast ones. You do not want that kind of battery drain, or cost, for that platform. Speed is really not relevant.

2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_pe

And neither of those things at all matters. So CUDA stuff needs more bandwidth? I can believe it (though I'd like to see evidence) but you don't go and run CUDA stuff on an Atom. Intel's desktop boards have plenty of bandwidth for PCIe. All the desktop chipsets and boards support full PCIe 2.0 16x on their slot. Their x58 and 5520 chipsets support multiple 16x slots. You can pack on the CUDA cards and have plenty of bandwidth, no problem. We've got a Supermicro system at work that uses an Intel chipset that

I wasn't saying a media center needed a pile of bus speed; simply, if I want Netflix I have to run Windows, which means x86, low-powered and cheap, and that means Atom (yes, I know about VIA, and they are lower power than Atom, but more expensive, cap-ex wise).

So if I want my Atom to do 1080p high-bitrate High Profile Level 5 h264 @ 24fps I need either the mini-PCIe Broadcom card or an nVidia GPU... Maybe the HD4500 can do otherwise, but not the last time I looked, nor could the low-powered AMD GPUs. So no p

OpenCL is made for something like an Atom. When you start talking about number crunching, serious numeric computation, an Atom along with a couple of GPUs makes a hell of a lot more sense than almost anything else. Especially when you are talking about thousands of these machines.

You seem to have an obsession about the Atom and its inadequacy for most tasks. Guess what? They sell reams and shitloads of Atom boards to the server market, and I know of several big ass rooms that

We have 2 desktops and one HTPC storing most data on an Atom 330, just using its internal SATA and IDE connectors, and it never even hiccups. ZFS is a beautiful thing.

I also have seen these computers deployed in helicopters and fixed wing craft, in remote (read: tent and tiny generator) applications, in cars, and other places you wouldn't want to put a screaming server into mainly for power consumption issues.

2) PCIe is way, WAY faster than it needs to be. 8x, which is half speed, is still more than you need, as HardOCP found (http://hardocp.com/article/2010/08/16/sli_cfx_pcie_bandwidth_perf_x16x16_vs_x16x8/6), even for extremely high-end cards in multi-card setups. For that matter, on the forums Kyle said that 4x (quarter speed) is still more than enough for cards at 1920x1200. The highest-end discrete cards don't need it; you are fine.
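The headroom claim above checks out on the back of an envelope. PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding (8 data bits per 10 bits on the wire), which works out to roughly 500 MB/s of usable bandwidth per lane, per direction:

```python
# PCIe 2.0 back-of-the-envelope: 5 GT/s raw per lane, 8b/10b encoding,
# divided by 8 bits per byte -> ~500 MB/s usable per lane, per direction.
PER_LANE_MB_S = 5_000 * 8 // 10 // 8  # = 500 MB/s

for lanes in (16, 8, 4, 1):
    gb_s = lanes * PER_LANE_MB_S / 1000
    print(f"x{lanes:<2} link: ~{gb_s:.1f} GB/s per direction")
```

So a full x16 slot carries about 8 GB/s each way, x8 about 4 GB/s, and x4 about 2 GB/s, which is consistent with the finding that even half- and quarter-speed links rarely bottleneck a single card.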

Imagine a world without AMD, Cyrix, Nvidia or other chip manufacturers. There would be no market or competition to face Intel, and the company could force you to run whatever they wanted. I mean, a lot like it is now, but more so. As a consumer, figure out how to support the competition equally or there won't be any.

Also, Light Peak only works with Intel video, and if you want to use your USB keyboard or mouse, it's a $30 cable or a hub that needs a wall wart, as Light Peak may not pass power.

Not really; Light Peak complements USB, as they share the same physical port. You can still use your cheap USB cables at USB speeds. If you want Light Peak speeds then you need a more expensive cable, one with 4 fiber-optic lines in addition to the 4 regular USB electrical lines. Passing power will work just like with USB.

Want Ethernet? $30 cable.

Possibly, but only in those rare situations where your device is so small that there is no room for a regular ethernet port.

Want to use an ATI or nVidia video chip? You may need a piggyback cable to make it tie into the Light Peak network.

The original Apple laptops with DisplayPort also did not route audio through the DisplayPort. This is why there were no DisplayPort-to-HDMI converters available from Apple (other people offered them, but they lacked audio). So there might be some hardware design limitations to think about, or it might just be software.

But the new Apple computers do support audio through the DisplayPort, so they have obviously thought about the problem. And it is not some hack like what you see with your ATI card. So