
Dputiger writes "Given the recent emphasis on mobile computing and the difficulty of scaling large cores, it's easy to think that enthusiast computing is dead. Easy — but not necessarily true. There are multiple ways to attack the problem of continued scaling, including new semiconductor materials, specialized co-processor units that implement software applications in silicon, and enhanced cooling techniques to reduce on-die hot spots."

Of all the next-generation technologies that we’ve discussed at ET, including carbon nanotubes and graphene, III-V semiconductors that use materials like indium, gallium, and arsenide are by far the most likely to make an a mass market appearance within the next ten years.

Arsenides are indeed a class of materials containing the element arsenic that includes In-Ga-As semiconductors. But let me try to fix the original sentence; it's not as bad as you imply, though it is definitely incorrect.

Of all the next-generation technologies that we’ve discussed at ET, including carbon nanotubes and graphene, III-V semiconductors that use elements like indium, gallium, and arsenic are by far the most likely to make an a mass market appearance within the next ten years.

Changes in bold. Indium and gallium are the group III elements and arsenic the group V element that make up III-V semiconductors. Poorly edited, yes, but not enough to disqualify the whole article, at least in my humble anonymous opinion...

The writer, having done the research, would be unlikely to make a mistake like that. It's more likely a 'correction' performed by the editor, who mistakenly interpreted the sentence as a grammatical error. It's easy for someone to see 'gallium arsenide' and misinterpret it as the list 'gallium, arsenide' with a missing comma.

Take a second look. I made a post in their Disqus comments pointing out this error. The author of the article replied in less than ten minutes, acknowledging the error with a promise to fix it. The error was fixed by the time I hit refresh. Instead of being all high and mighty, perhaps next time you should help out. I did, and it worked. Consequently, your entire post is now moot.

Disqus is only there to make you look at ads, not add your $.02 to a forum discussion. They aren't looking for $.02, obviously; it is censored to exclude anything remotely controversial to popular thought. That leaves us with the purpose being ad revenue. It keeps you on whatever page for longer, and through multiple refreshes, so they can show that you saw ads. They make money, you get shit for your time and contribution. When I see a Disqus box, I know it's just a sucker trap.

GaAs semiconductors have been around for years. The issue is that GaAs is poor at growing a native oxide, which makes it expensive to fab.

You can get around this by adding aluminum to GaAs, creating a heterojunction transistor. Other materials, like indium, can be used as well.

The beauty of these materials is that you can get different bandgaps, making it possible to create a true multijunction solar cell and bumping the conversion efficiency up to around 40%, which is almost unheard of in normal silicon solar cells. The devices a

Well, without the commas that's a material which people in labs have been using to get diodes as thin as a single atomic layer for nearly a couple of decades, so the tech journalist is nearly there and better than some. The stuff I saw in about 2000 was put on by chemical vapour deposition, which is a fairly cheap way to do things, but of course a very thin diode junction is small in one dimension but needs improved masking technology to be small in 3D.

So I don't see any reason that this would be limited to "enthusiast computing" (I read this as stuff made at home), but I don't see any problem with the statement you quoted.

I'm assuming you're saying the error is that arsenide should have been quoted as part of the entire compound, i.e. "gallium arsenide". It's not exactly an egregious error, since it can be solved by adding an s to the end. If they had said "arsenides" it would have been correct - not far off.

I say arsenides would have been correct since IIRC they use several arse

I'd say the bigger question is why we need to continue to scale anywhere but the HPC and server spaces. Let's face it, folks: the reason PC sales are down is that when AMD and Intel switched from MHz wars to core wars, PCs went from "good enough" to "insane top fuel dragster" and the software just hasn't kept up, not even close. My main system is nearly 5 years old yet has more cycles than I know what to do with, and even my customers on the first-gen C2Ds and Athlon X2s frankly spend more time with their

In theory games will always be able to use your spare cycles so long as those cycles are added in series. Unfortunately not all problems are embarrassingly parallel. The only reason games don't already make use of the state of the art in PCs is because they are mostly being made for consoles with PCs as an afterthought and so can't move forward until the console makers decide to do an update and that doesn't happen very often.
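The "not embarrassingly parallel" point is just Amdahl's law: the serial fraction of the work caps the speedup no matter how many cores you add. A quick sketch (the 80% parallel figure below is purely illustrative, not measured from any real engine):

```python
# Amdahl's law: overall speedup is limited by the serial fraction of the work.
# The 0.8 parallel fraction is a made-up illustrative number.

def amdahl_speedup(parallel_fraction, n_cores):
    """Speedup when only `parallel_fraction` of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even with 80% of a frame parallelizable, piling on cores tops out below 5x:
for cores in (2, 4, 8, 64):
    print(cores, round(amdahl_speedup(0.8, cores), 2))
```

With 80% parallel work, 4 cores give 2.5x and even 64 cores give under 5x, which is why adding cores alone doesn't help a mostly-serial game loop.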

Even for our own programs certain problems cannot move forward anymore because the

Reality is that "enthusiast" computing today depends on what companies care to provide as "slightly ahead of the current state-of-art" at exorbitant prices. Intel's not going to launch a new CPU for enthusiasts. AMD isn't going to launch a new CPU for enthusiasts. If they do it's just because they can cherry pick some CPUs from their server process (Intel) or that can perform exceptionally well for equally high power consumption (AMD). It is so insignificant to the overall market that progress would happen the same with or without them. We're just not a significant enough portion of the market to really warrant a new process or capacity or whatever.

The funny thing is that when lightning-bolt-like breakthroughs hit, people almost never know from whence they came. Somehow I get a picture of a kid with a handful of Raspberry Pi units somehow feeding in and out of a multicore processor, with a smartphone somehow involved, crunching magical equations that leave my jaw hanging down. It is almost like the mathematicians at Cambridge getting mail from an unknown person in India with solutions to equations that nobody had ever been able to do before. Genius is a sneaky quality. It lives where it likes and resides in unlikely meat bodies.

Intel *does* make CPUs for enthusiasts - the i7 range, which gives the best performance current technology can offer at the top end, at £1000+ prices. They don't sell in enough volume to make a ton of money - the cash cow is the midrange stuff, the i3 and i5. But they are important for company reputation, keeping Intel firmly established as the King of Semiconductors: they can make the fastest chips around.

For people who don't make lots of looong journeys, running a horse can be cheaper, less impactful, etc.

Indeed, I only stopped riding horses around here because there were too many cars.

The whole "horses are outdated" thing is like the "everyone rushed to the cities for a better job" Industrial Revolution myth: a lot of it was huge landowners making it untenable to continue renting their land, because they wanted to push people into the cities for more profitable work.

Right. Motherboard companies should start churning out Raspberry Pi-sized motherboards, cases and accessories. Better still: why can't we have DIY tablets with upgradeable SoCs, touch screens, RAM and flash memory? I find it perverse that no major tablet brand even has a user-replaceable battery. Why can't an OEM produce a tablet that uses two or even three off-the-shelf cellphone batteries to match the capacity of the larger tablet batteries?

An experimental-electronic-development oriented port, backed by three fuses on the mainboard.
Digital and analog I/O and DC power connector, 37-pin connector on the ISA bus.
Two independent, bidirectional 8-bit ports
Four A/D pins routing to a 12-bit A/D converter
Four D/A pins conne

Oh right, I meant to say tablet. Tablets support serial USB devices, and serial USB ports can be pretty fast too. You still need a PC to program the thing unless you can get openocd to work on your tablet.

I don't think I would say "Enthusiast Computing" is limited to people who upgrade their processor to the latest and greatest every 6 months. I would rather call those folks "PC Game Enthusiasts". I would call Enthusiast Computing things more like building BeagleBone/Raspberry Pi clusters, or people doing more interesting things than just installing new motherboards constantly.

There is no such thing as post-PC for the same reason there is no such thing as "post-doorknob" or "post-handle."

The PC is the correct form factor for getting work done by humans. Mobile devices are not. This will only change if human physiology changes, which is unlikely in any time frame measured in intervals shorter than 100,000 years.

The "post-PC era" is a marketing slogan designed to make you buy things. It is designed to get you back on the upgrade treadmill starting from the beginning again. It is not technologically accurate.

The PC is the correct form factor for getting work done by humans. Mobile devices are not.

I'd say, instead, that the desktop and laptop PC are the correct form factors for getting done the sort of work that you do when seated for a long time. There are probably people whose work is sometimes done while on the move and for which a desktop PC is obviously not going to work and for whom a laptop PC might not work very well; consider, for example, somebody managing a construction project who might need to look things up, enter data, do some calculations, etc. while on site. I suspect that a mobile phone would be the wrong form factor for them, but a tablet might be the right form factor.

I'll actually give you a primary source, real life example.

I'm a paramedic. Every single patient for whom I have responsibility of care, I have to generate documentation for. Up until about 2008, that meant actual paperwork; about then, the industry as a whole began phasing in electronic medical records. To the business office, they're great, because billing the patients and keeping the records is much easier, and for me, the end user of the system, it's great because, especially when you're using a

(My example was also actually also a real-world example, not a hypothetical example, but was a case of somebody doing that sort of white-collar construction work who asked for my advice on machines to buy, rather than somebody who had that machine already; he already has PCs at home and, I think, at work, but needed something for when he's actually at the construction site.)

First, let's get rid of the notion that laptops are an inferior species, spec-wise. Compared to a server, they are; compared to a high-end desktop, they are. But my primary computer is a laptop, because I spend 48 to 96 hours straight at work, so it just makes sense for me to have a computer I can take with me. In fact, I would go so far as to say that my laptop, which is a higher-end model, but certainly not the highest end, is superior, spec-wise,

In the UK, portable computing devices suited for data collection have been available since the late '80s from Psion. And I got my first tablet PC over a decade ago, with a proper stylus - more usable for non-trivial work than thumbing an Android.

The "post-PC era" is a marketing slogan designed to make you buy things.

And to read columns blathering on about the "post-PC era". It's all about the CPM [wikipedia.org], err, umm, the CPI [wikipedia.org].

Not that TFA has that much to do with consequences of the "post-PC era"; they say "Is the PC enthusiast market dead, a casualty of the push into mobile?" (and answer the question in the negative), but that's all I could find. They probably slapped it onto the title just to get people's attention.

Yes and no. Tablets, phones and everything in between are replacing an aspect of computing, and that's strict consumption. Grandma doesn't need a PC to read email or look at the grandkids' photos; she can use a tablet, have a much better user experience, and gain the benefits of portability and reduced power use. People will use them to read books, watch movies or browse the internet. In general they won't be using them to create anything.

Actually, Grandma does better with a PC, with a real keyboard and mouse and a large monitor with large, easy-to-read fonts. Agreed, she doesn't need a tower case with dual water-cooled CrossFire GPUs like her grandson. A simple little cube thing with some USB ports, HDMI/DVI and an audio output is sufficient. Grandma's eyes aren't good enough to see a tablet screen, her hands aren't steady enough to manipulate small touchscreens, and she can't hold a tablet and a small dog/cat in her lap at the same time.

The PC is the correct form factor for getting work done by humans. Mobile devices are not.

Oh, I think there are better form factors. Take a look at a traditional workspace: a huge desktop/drafting table, dozens of documents/pages, and walls. Now imagine the desktop, the documents, and the walls all turning into smart, active displays. That's the correct form factor for humans. A 27" HD monitor and a noisy metal box on the floor are not.

Rubbish. You have obviously not been paying attention to the advances in HCI tech. So unless you are saying that a "very VERY long time" is in the 15-30 year ballpark, you are quite simply wrong. Sure, 30 years IS a long time considering how long computers have been around but I certainly hope I'll still be around then. Just like fixed-line phones are quickly becoming a thing of the past, so too will other devices that don't move easily so a person can quickly be fully productive in any quiet, semi-private

I'd say, instead, that a decent-sized monitor, full-sized keyboard, and mouse is the current dominant form factor for getting work done. Whether they are connected to a desktop or a mobile device is irrelevant to our physiology.

That said, performance of the device connected to the monitor, keyboard, and mouse is what should be considered for productivity.

Source: see computing history from mainframes to minicomputers to microcomputers to mobile devices for their form-factor relevance.

The new-semiconductor-technology angle in the article seems highly fishy to me. Apart from the fact that the statement read like it may as well have said "In 10 years we will all be living in colonies on the moon", III-V materials have been losing market share to silicon for decades.

The article mentions the great electron mobility of the III-V materials, which is true, but forgets to mention that they have poor hole mobility. Now, I am not a process expert, so maybe there are new techniques to address this. However, over the past 20 years or so this has meant that you couldn't make very good CMOS logic and had to use NMOS-only architectures. This, and the poor scaling, has kept the III-Vs away from large-scale integrated logic chips.
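The hole-mobility point can be made concrete with textbook ballpark numbers (approximate room-temperature values; exact figures vary with doping and the source you consult):

```python
# Ballpark room-temperature carrier mobilities in cm^2/(V·s).
# These are approximate textbook values, not authoritative process data.
mobility = {
    "Si":   {"electron": 1400, "hole": 450},
    "GaAs": {"electron": 8500, "hole": 400},
}

# CMOS needs both good NMOS (electrons) and good PMOS (holes).
# GaAs electrons are very fast, but its holes are no better than silicon's,
# so the electron/hole imbalance is far worse:
for mat, m in mobility.items():
    print(mat, "electron/hole ratio ~", round(m["electron"] / m["hole"], 1))
```

The ratio is roughly 3:1 for silicon but over 20:1 for GaAs, which is why the PMOS side of a GaAs CMOS process would be so weak that NMOS-only logic was used instead.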

The III-V devices were used in RF circuits, but they were replaced by Si-Ge and now many RF circuits use regular silicon processes. The III-Vs are still useful for optics.

The truth is that silicon has many problems that may prevent the industry from continuing to scale circuits to smaller geometries and the available workarounds are generally painful. But, the other options are worse.

Maybe in 10 years we will all be using cell phones that use carbon nanotubes... in our colonies on the moon.

At this point, any semiconductor is fair game if it helps decrease feature size and power consumption (the main factor limiting performance nowadays). The substrate is likely to remain Si because of its low cost, large wafer sizes and low defect density.

But what does advanced semiconductor technology have to do with enthusiasts? I have not the slightest clue.

An enthusiast wants to own his hardware; he doesn't care about 5.1 GHz uber-core machines. What the enthusiast wants is open specs, common interfaces, accessible GPIO, non-DRM memory or hardware, and open source code. Someone who buys the latest stuff from Intel and slaps Win 8.1 or Ubuntu on it so that they can run WoW is not an enthusiast; they're just a rich consumer.

What the enthusiast wants is open specs, common interfaces, accessible GPIO, non-DRM memory or hardware, and open source code.

Unfortunately, enthusiasts like you and me are in the minority. The fact that people buy locked-down video game consoles for ease of use [slashdot.org] is evidence that the majority don't care about owning their devices. It's unclear whether there are still enough enthusiasts to sustain a market for such owner-respecting computing devices.

What the enthusiast wants is open specs, common interfaces, accessible GPIO, non-DRM memory or hardware, and open source code.

Unfortunately, enthusiasts like you and me are in the minority. The fact that people buy locked-down video game consoles for ease of use is evidence that the majority don't care about owning their devices. It's unclear whether there are still enough enthusiasts to sustain a market for such owner-respecting computing devices.

But you don't earn your profits based on the proportion of the market that you control (you'd rather 1% of a market of one billion, than 100% of a market of one)

As the population of the industrialized world increases, you end up competing with more other producers. One metric that matters for economies of scale is the number of consumers per producer, and I don't see how this ratio changes. In addition, one firm or cartel of firms that dominates a market has the power to create a network effect toward the cartel's products, and if you want, I can explain how the mainstream console makers have been such a cartel for decades.

Both those with vintage, restored, spit-polished classic cars and the ones with souped-up race-track cars are enthusiasts, just in completely different fashions. In your world only the tinkerers are "real" enthusiasts and the people who want a car that can handle 150 mph well are just rich customers. Nobody but an enthusiast would ever start tweaking DRAM timings or the BCLK or look at anything considered "exotic cooling", even if it's to squeeze the last FPS out of their closed-source game with DRM on closed-sourc

Can't we just use an SSD? I think a cooler hack would be to have programmable FPGAs directly attached to your computer, acting as fast custom hardware which you can reshape on the fly to be anything you want.

The field is becoming mature, and the point of assembling your own computer, and getting it to work is just not what it used to be.

Yes it is. The only difference is that now the prefabbed computers are a lot closer in price (frequently cheaper!) to what it would cost for you to build it yourself with equal components.

You still get to mix and match components that cannot be found in mainstream prefabbed computers, and in those cases you are still significantly better off money-wise building it yourself. As an example, any sort of silent-PC setup isn't mainstream, so you pay a significant premium having someone else build it for you.

Gallium arsenide has been just about to replace silicon for 25 years, now. And Transputers were invented in the 80s. Sure, maybe it's finally time for these to hit the mass market, but one would be ill-advised to hold one's breath waiting for it.

If it is indeed time to change (why change to GaAs? If you are changing, why not carbon?), you can expect that change to happen over a 15-to-25-year journey, as no fab is prepared for it, and no process works well on the next substrate yet.

Fab issues. There are no economical-at-scale ways to manufacture graphene processors, even if certain engineering issues (the poor band gap) are solved. But GaAs is a well-established technology that has been around for decades - all it needs is a few incremental improvements; no need for revolutionary new science to support it.

Programs like "Mail" or "Messages" could be implemented in reprogrammable silicon.

You need how much compute power to read mail?

Most users just don't need that much power. Once everybody could play streaming HDTV, the couch potato market was covered. Rendering in gaming could still improve, and NPC behavior could get smarter, but really, GTA V pretty much has that nailed and it runs on last-generation consoles.

There are people who need more power, but they're running fluid dynamics simulations or rendering movies or simulating new ICs or something like that. I've run Autodesk Inventor on 24-CPU workstations. That's one of the few interactive programs that can usefully use a 24-CPU workstation. It's not a mass market product.

The applications that need vast amounts of additional compute power are there, but they're not high-volume applications. Nor are they "enthusiast" applications. There's not enough volume there to justify heavy investment in faster CPUs.

This may change as we have better robots or something like that. But speeding up existing desktop apps, no.
(Program load times are still ridiculously long, but mostly because of stupidity like phoning home for updates, waiting for the license server, fetching ads, or using virtual memory in a world where memory is cheap.)

There are people who need more power, but they're running fluid dynamics simulations or rendering movies or simulating new ICs or something like that. I've run Autodesk Inventor on 24-CPU workstations. That's one of the few interactive programs that can usefully use a 24-CPU workstation. It's not a mass market product.

The gaming market is only rudimentarily separating the workload into X number of threads -- anything else is a complexity nightmare for them. Sure, many are now separating the physics stuff into one thread and the A.I. stuff into another, but they are not breaking up the A.I. into multiple threads, nor are they breaking up the physics stuff into multiple threads.

Instead they are relying on the middleware framework's ability to be more granular without their intervention, and the middleware just isn't designed f

The gaming market is only rudimentarily separating the workload into X number of threads

They did that with the Xbox because they had 3 x 3GHz cores to work with. They did stupid things like pipelining whole engines (huge input lag) to bump up fps. This time around they will get 8 x 1GHz, and there is no other way than to design engines with data parallelism in mind.

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core.

Game developers don't give a fuck about the CPU anymore. It is all GPU where hundreds to thousands of "cores" are in play.

Yes they do and no it isn't.
CPU cores are much faster than GPU cores so for things that can't be parallelized it is much faster doing the calculations on a CPU. There are no games that do the main physics and AI calculations on the GPU because most of that stuff can't be parallelized enough.
The only time something will perform faster on the GPU is when it can be parallelized into hundreds or thousands of calculations.

There are a few games that use the GPU for physics. Collision detection can be parallelized nicely. Not many though, simply because few games would see any benefit from it: You rarely have more than a handful of moving objects at a time, easily enough for the CPU to handle alone.

I wrote a mod for ut2k4 that uses CPU to calculate volumetric explosion simulations - due to the limited CPU time available it has to use a very crude model, but it's still better than the standard line-of-sight approach games move

Game engines use their own thread handling so it doesn't matter which OS the game is on. And they do often parallelize things like AI, but it is still all done on the CPU because it is still faster than using the GPU unless there are thousands of things. And often things that could benefit from mass parallelization can be simplified so that they don't need to be.
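The coarse per-subsystem split discussed here (physics on one thread, A.I. on another, both on the CPU) can be sketched as follows. The `physics_step` and `ai_step` functions are hypothetical stand-ins, not a real engine API, and in CPython the GIL means threads illustrate the structure rather than a true speedup:

```python
# One thread per subsystem, per frame: the coarse split many engines use.
# Both step functions read the same immutable frame state, so they can
# safely run concurrently and their results are merged at the end.
from concurrent.futures import ThreadPoolExecutor

def physics_step(state):
    # stand-in: integrate positions one tick forward
    return {"positions": [p + v for p, v in zip(state["positions"], state["velocities"])]}

def ai_step(state):
    # stand-in: trivial per-entity decision based on the old positions
    return {"targets": [p > 0 for p in state["positions"]]}

def run_frame(state):
    with ThreadPoolExecutor(max_workers=2) as pool:
        phys = pool.submit(physics_step, state)
        ai = pool.submit(ai_step, state)
        return {**phys.result(), **ai.result()}

state = {"positions": [0.0, -1.0], "velocities": [1.0, 0.5]}
print(run_frame(state))
```

Note that neither subsystem is itself parallelized; that is exactly the "rudimentary" threading the comments above describe.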

You could actually speed a lot of calculations up using the GPU, but the GPU has limited power and you're taking cycles away from it that coul

Doubt it. Most game developers have not even figured out how to use more than 2GB of main memory or more than one core. I can't even think of a game that currently uses four cores. The next gen consoles have four, and thus that will be the norm for PC games as well for the next six to nine years.

Total War series has been using four cores for a number of years and I'm sure it's not the only game developed for the PC that does so.

We're hitting a wall on single threaded performance due to clock speed limitations, but CPU cores keep getting smaller and more power efficient. In a few years, we'll have the ability to put 32 or more cores in consumer CPUs, and it wouldn't surprise me if we have 8 core CPUs in smartphones and tablets. The key to continued performance improvements is better multi-threaded code, to allow us to effectively split up the workload across more cores.
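Splitting a workload across many cores is straightforward when the work divides into independent chunks; a minimal sketch with Python's standard `multiprocessing` module (the `heavy` function is an illustrative stand-in for real per-chunk work):

```python
# Fan an embarrassingly parallel workload out across all available cores.
# Real-world gains still depend on the serial fraction (Amdahl's law).
from multiprocessing import Pool, cpu_count

def heavy(n):
    # stand-in for an expensive per-chunk computation
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [100_000] * 8          # eight independent pieces of work
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(heavy, chunks)   # one chunk per worker process
    print(len(results), "chunks done")
```

Process-based parallelism like this sidesteps shared-state headaches entirely, which is why it scales so much more easily than the tightly coupled state inside a game loop.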

There is a night-and-day difference in size requirements between a desktop-class x86 core and a smartphone-class ARM core. Desktop CPUs with 32 full-featured cores are still quite far away, in my opinion. And to be honest, I'm not even sure there is any advantage to that. At least for the next decade I can safely expect APU-like processors with some big general-purpose cores and tons of smaller task-optimized cores (GPU cores, for example). Even the Exynos you posted is not a true 8-core ARM A15 processor b

As an "enthusiast", for me, it's almost all about latency. I want a system that responds as close to instantaneously as possible, especially for the stuff that really should be nearly instantaneous on modern hardware. These days, that means plenty of ram and a fast storage subsystem: SSD is the best upgrade I've done in years. I wait less. A 2 hour render is still a 2 hour render, but when I start up a heavy application I only wait 3 seconds instead of 10, or even 20. It just makes everything less frus

Let's face it: the uber-duper turbine-sounding high-end desktop doesn't get much use if you don't have some kind of time-management disorder or addiction. If you work or study, you couldn't get much time on your precious anyway. If you work, you get a console: turn it on and play, and don't worry about the price of games; you don't have much time anyway.
If you run CFD simulations or something like it, it's your employer's problem to get you the tools you need. From my experience you just use another computer to d

> The uber-duper turbine-sounding high-end desktop doesn't get much use if you don't have some kind of time-management disorder or addiction

Not everyone is a couch potato.

There will always be people that need to get something done or are interested in something a little better. The fashionista mentality that tries to insult anyone with more than half a brain cell will help ensure that average consumer computing devices don't fit that description.

To add another data point: I can quite happily confirm that the enthusiast market is not dead; quite the opposite, it is thriving. The latest generation of games consoles has created a surge in interest from people interested in upgrading existing systems or purchasing new ones.

I build plenty of PC gaming machines and my build queue is full for the next 18 months. I've even had people interested in getting the best possible sound delivered from their machines for their HTPC setup. No, the enthusiast market is definitely ali

Look up Piratebox. There are a bunch of people working on exactly that. With limited success - the tech is just about there, but the numbers aren't. You can't build a mesh without more people coming together locally.

My digital audio workstation runs Logic Pro X, Pro Tools 11, and Cubase 6.5. No tablet or phone can replace the desktop, which has not only several hard disks and lots of RAM, but an operating system capable of running plugins from a variety of 3rd-party sources. I'm in no position to junk this thing for whatever might happen to be "hot" in the next couple of years, because I enjoy working with older versions of software which are no longer supported. iOS comes close to OS X and Windows 7 as far as being able to run basic audio and MIDI recording, but the musical instrument industry still hasn't completely cracked the nut on integrating hardware and software instruments and providing a comfortable recording, mixing, and mastering workflow. To my knowledge, enthusiasts like myself will still be needing enthusiast computer hardware for the foreseeable future.

Real enthusiasts have always been the ones that wanted to really work with the hardware, whether the object was a car engine, vacuum-tube TV, or a computer. Fewer kids/adults developed the interest after the rise of "disposable" consumer culture, but from what I've been reading, that trend has slowly reversed as the weak economy started pushing more and more people to fix or improve whatever they can rather than blowing a bunch of cash on a replacement.