So, it looks like the Mac has crested

ZnU, fantastic misrepresentation of the position that EH2 holds, and that I hold with somewhat more reservation, namely that there is a lack of:

Current use cases for the average consumer. Your (and others'!) examples really don't fit that criterion.

It is his MO: misrepresent your position and prove it wrong. That is called a strawman. He is the king of strawmen.

And really, it isn't even current use cases. He is having a heck of a time coming up with anything for near-future use cases. He can only come up with things that are decades away, and often aren't a performance issue but rather a programming issue.

ZnU, fantastic misrepresentation of the position that EH2 holds, and that I hold with somewhat more reservation, namely that there is a lack of:

Current use cases for the average consumer. Your (and others'!) examples really don't fit that criterion.

It's kind of silly to argue over the positions of third parties, but EH2's argument goes far beyond what you're implying here. He claims that in the past it was easy to see consumer use cases that required lots of additional computing power, but today it's not. He has rejected examples of such use cases that have been offered up as being too vague. So I detailed one.

The criticism of your use case stands. What EH2 described were actual use cases; even I remember people complaining about those. What you are describing is an imagined use case, a case that is rare even in university circles.

You can see the change even in adverts. In the '90s the PC specs were spelled out exactly, in big font: what processor, memory size and type, hard disk, display adapter, OS, etc., even power supply size. These days, ads for the general consumer? Often just the general processor family, memory size, and OS. Anything else is in tiny print at the bottom of the page.

In a year or two, probably even memory size will be dropped, and all specs but processor family will be at the bottom of the ad in tiny print.

I know. I describe things people were actually DOING and wanting done better. He picks something nothing like that.

Heck, remember the big deal when JPEGs got progressive encoding (is that the name?), where you didn't have to download the whole file to see the image but could see parts of it as they became available? He confuses an implementation with an idea (YouTube for video).

And the things I mentioned were OBVIOUS and NEEDED and WANTED. He can only come up with fanciful maybes and nice-to-haves (maybe), or stupid shit that people wouldn't care about. The ONLY things he has come up with that are worthy are multiple simultaneous streams (which wouldn't require more than about 30-ish Mbps) and 4K video (which is questionable for its usefulness and success).

ZnU, if the need for computing power is so obvious, why the shift to mobile? Why is ARM so much more popular than the much more powerful i5 tablets? Are you now, with a straight face, trying to fit your use case to current market directions?

Nobody here will agree with you on this one. Computing power takes a back seat to mobility.

I have brought this up multiple times and he ignores the clearly obvious meaning. Hell, he ignores the whole point. I ask this over and over. I mean, people are talking about how users can use a tablet to REPLACE a desktop or laptop. Surely if more computing power were needed, this option would be impossible!

The criticism of your use case stands. What EH2 described were actual use cases; even I remember people complaining about those. What you are describing is an imagined use case, a case that is rare even in university circles.

Consumer digital video wasn't much of an actual use case in the early '90s, and all you have to do is step back a couple more years for it to be every bit as speculative as augmented reality now is.

We weren't capturing it, but we were watching digital video.

Quote:

But it's trivial to provide examples of technologies presently at other stages of development/adoption.

Trivial, but you have continually failed to do so.

Quote:

Speech recognition is a little further along, not nearly as good as we'd want it to be, but a credible solution to some problems.

And it runs on decade-old hardware.

Quote:

3D is still further along, again not yet in its fully mature form (entirely convincing photorealistic real-time rendering), but nonetheless in a fairly sophisticated state, and the basis for a pretty large industry.

And your examples are games and industry, not non-hardcore-gaming consumers.

Quote:

For instance, machine vision has very early, primitive commercial implementations, in the form of things like the After Effects rotobrush, the facial recognition features in photo editing software, etc.

And the issue, as noted multiple times, is software, not hardware.

Quote:

In the early '90s, PC users were often tech enthusiasts. Now personal computers are entirely mainstream. You're simply seeing marketing respond to that shift.

No. This doesn't explain it. If consumers needed that power, then tablets wouldn't be able to do the job.

Quote:

This simply doesn't have the implications you and EH2 want to claim it does.

Of course it does. Why didn't tablets take off a decade ago? Because the performance hit was too high. Now it isn't. We all know that phones/tablets have only a small fraction of the computational power of desktops. And yet people find them useful and are talking of them replacing PCs.

Quote:

As I have noted at least a dozen times during this discussion, trading off one thing (such as performance) for another thing (such as mobility) within a given generation of devices simply does not indicate a permanent lack of demand for more of the thing that has been traded off.

What it absolutely DOES imply is that there was an excess of the thing traded off. Users have a surplus of performance on desktops, and they are trading that surplus for other things: mobility in notebooks, and even more so in tablets/phones. You ignore this over and over, and the implication is there and undeniable. The ONLY reason that phones/tablets can work as they are is the huge surplus consumers have in their desktops.

Quote:

Augmented reality wouldn't be of much interest without mobile computing,

And you use it as an example of desktop needs.

Quote:

Again, this is like saying that if you buy an SUV rather than a compact car, thus trading off gas mileage for cargo space, this demonstrates you will be entirely indifferent to the future development of SUVs with better gas mileage.

Not at all. It would be like everyone driving 18-wheelers and then moving down to SUVs. They had such a surplus in storage that they could give up a huge amount and still have a useful vehicle. You are trying to argue that everyone needs 18-wheelers, and that future use cases will justify them. When the whole market has noticed that they don't, in fact, need that 18-wheeler.

Because the resolution was low, bandwidth was slow even then for low-res images. We all wanted higher res images and video.

And this is distinct from the present use cases I've offered for additional computational capacity... how, exactly?

Echohead2 wrote:

I understood it perfectly. You are the one who failed to understand it. Do you think that each person would have to have that power to get the results? Or that they run it that way to DEVELOP the algorithm and then simply code it for CPUs?

The line between development and use is not quite as clear as you're making it out to be with these kinds of systems. Ideally you want a system which can learn and adapt on end-user hardware. Even if you are just interested in fixed function, I'm also not sure how you imagine you take a trained neural network with (I'd assume) at least hundreds of thousands of neurons and reduce its behavior to a trivial algorithm.

And again, unless you are actually arguing that human-level image recognition is possible on a six year old mid-range consumer PC, this entire line of argument on your part is a red herring. Are you actually arguing that?
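To put rough numbers on that last point, here's a toy Python sketch (the layer sizes are invented, and far smaller than any real image-recognition network) showing that merely running a trained network, with no learning at all, costs roughly one multiply-add per weight per input. You don't get to "simply code it for CPUs" and make that cost vanish.

    import numpy as np

    # Toy network: the layer sizes are made up for illustration only.
    layer_sizes = [10_000, 2_000, 500, 10]
    rng = np.random.default_rng(0)
    weights = [rng.standard_normal((m, n))
               for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

    def forward(x):
        # One inference pass: a chain of matrix multiplies plus ReLU nonlinearities.
        for w in weights:
            x = np.maximum(x @ w, 0.0)
        return x

    x = rng.standard_normal(layer_sizes[0])
    print(forward(x).shape)  # (10,)
    # Multiply-adds needed for a single input, i.e. roughly the number of weights:
    print(sum(m * n for m, n in zip(layer_sizes[:-1], layer_sizes[1:])))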

Echohead2 wrote:

This is a nice diatribe that ignores the hardware in question, pretends that augmented reality is a must-have killer feature, and also acts like it will happen anytime in the near future the way he is describing.

None of this is less plausible now than streaming HD video to a 1.5 pound tablet was a couple of decades ago.

Echohead2 wrote:

And it runs on decade-old hardware.

Again, you are simply playing games here. If there's any implementation of something at all, you say it doesn't require more computational capacity because it "runs on current hardware". If there's no implementation, you say it doesn't count because it's too speculative or too far away. You've created a framework that allows you to dismiss any possible use case in any state of development or adoption — and then you argue that the present is different from the past by not applying this same standard when looking at past examples of (then) future use cases.

This is all pure handwaving.

Echohead2 wrote:

Of course it does. Why didn't tablets take off a decade ago? Because the performance hit was too high.

Yeah, I'm sure it had nothing to do with improvements in screen and battery technology, wireless networking, display technology, UI innovations, etc. It was all just CPU performance. Sure.

The hilarious thing is that even if we were to grant that, it still doesn't have the implications you want it to. Tablets crossing some performance threshold and becoming useful for some particular set of end user computing tasks does not imply a lack of future demand for more capacity unless the set of tasks end users are interested in performing remains fixed for all time.

Echohead2 wrote:

What it absolutely DOES imply is that there was an excess of the thing traded off.

If by 'excess' you mean that the device does not become utterly useless with one unit less of the thing traded off, this is technically true. In, again, precisely the same sense that if someone trades off gas mileage for cargo space, clearly more efficient cars offered 'excess' gas mileage.

Echohead2 wrote:

And you use it as an example of desktop needs.

This is, and always has been, a discussion of consumer demand for computational resources, not "desktop needs".

Echohead2 wrote:

Not at all. It would be like everyone driving 18-wheelers and then moving down to SUVs. They had such a surplus in storage that they could give up a huge amount and still have a useful vehicle. You are trying to argue that everyone needs 18-wheelers, and that future use cases will justify them. When the whole market has noticed that they don't, in fact, need that 18-wheeler.

That would be a great analogy if there were some historical trend in the motor vehicle market that had resulted in vehicles gaining massive amounts of additional cargo capacity at the same price over the last couple of decades, and an entire industry that had come up with all sorts of useful things to do with that cargo capacity. Here in the real world, your analogy doesn't map.

ZnU, fantastic misrepresentation of the position that EH2 holds, and that I hold with somewhat more reservation, namely that there is a lack of:

Current use cases for the average consumer. Your (and others'!) examples really don't fit that criterion.

The problem with EH2's position (and where I believe it differs from yours) is that he made a prediction about the next five to ten years. It is convenient for him to hide behind your defense, because you have moved the context of the discussion to the firmer ground of the present.

EH2, by his original argumentative statements, was asking for predictions. He then proceeded to dismiss them all as fantasy while trying to bolster his position with a distorted representation of the "obviousness" of past advancements. He is the perfect Epimetheus.


In EH2's defense, this discussion is old and comes from another thread. Though I'm less outspoken in my position, in essence EH2's forecast of approximately a year ago (I don't remember the original thread) is holding up. It depends on metrics, semantics and use cases, but the uptake of even faster broadband speeds is not going as fast as people expected. His premise, that 10 Mbps is plenty for the average user, is debatable... but the use cases offered are indeed not that "average" (again, we enter here into the realm of semantics).

EH2's posting style can be a tad rigid. Sometimes he's dead wrong (aren't we all), or at least often enough that I fully disagree with him. But it's never as ridiculous as some people here make it out to be. He's on the cautious/cynical end of the progress spectrum. There's got to be one.

But that's just my take on EH2; he's more than capable of riling everyone up without me trying to analyze his posts.

In EH2's defense, this discussion is old and comes from another thread. Though I'm less outspoken in my position, in essence EH2's forecast of approximately a year ago (I don't remember the original thread) is holding up. It depends on metrics, semantics and use cases, but the uptake of even faster broadband speeds is not going as fast as people expected. His premise, that 10 Mbps is plenty for the average user, is debatable... but the use cases offered are indeed not that "average" (again, we enter here into the realm of semantics).

None of the predictions made in this discussion have really had a chance to play out sufficiently since it began. One major early topic, for instance, was my prediction that 100 Mbps Internet would be at least as widespread in 2021 (in the US) as 10 Mbps Internet was in 2011. Given that I, personally, have moved from 10 Mbps Internet to 150 Mbps Internet since that discussion took place, I'm feeling pretty good about that one. I could certainly make the case that developments over the last year and a half support my position better than EH2's. But nothing is yet definitive enough that I'd convince anyone who was on the other side of this argument.

But it's trivial to provide examples of technologies presently at other stages of development/adoption. For instance, machine vision has very early, primitive commercial implementations, in the form of things like the After Effects rotobrush, the facial recognition features in photo editing software, etc. Speech recognition is a little further along, not nearly as good as we'd want it to be, but a credible solution to some problems. 3D is still further along, again not yet in its fully mature form (entirely convincing photorealistic real-time rendering), but nonetheless in a fairly sophisticated state, and the basis for a pretty large industry.

There are many technologies which could benefit from additional computational resources, in various stages of development and adoption. Just like it's always been.

Both machine vision and speech recognition aren't compute-intensive problems where added power would somehow make a breakthrough. You can't get much better solutions out of either even if you let a computer process for days. They lack algorithms, not computing power.

But the fact that you can imagine a use case where increased computing power would help doesn't mean it is a real use case. Using sound to manipulate dropped objects would greatly benefit from increased computing power, but using computers to arrange sand particles, when sanding icy sidewalks, into the collected works of Shakespeare isn't a real use case.

ZnU wrote:

Redo from start wrote:

You can see the change even in adverts. In the '90s the PC specs were spelled out exactly, in big font: what processor, memory size and type, hard disk, display adapter, OS, etc., even power supply size. These days, ads for the general consumer? Often just the general processor family, memory size, and OS. Anything else is in tiny print at the bottom of the page.

In the early '90s, PC users were often tech enthusiasts. Now personal computers are entirely mainstream. You're simply seeing marketing respond to that shift.

Nope. Computers were sold to the mainstream in the '90s. Same as today. Only the ads have changed. Seen ads of computers for computer enthusiasts? There really aren't any. The industry where a shop designed and built a computer out of the best parts available has practically died since the '90s.

These days even computer enthusiasts are hard put to name how fast their DRAM is, something completely unheard of in the '90s. Same goes for the processor. Know how much cache your computer's processor has? Probably 9 out of 10 computer enthusiasts these days have no idea. In the '90s they would have known.

Both machine vision and speech recognition aren't compute-intensive problems where added power would somehow make a breakthrough. You can't get much better solutions out of either even if you let a computer process for days. They lack algorithms, not computing power.

It's a lack of both. But this, again, does not distinguish today's proposed future use cases from past (then) future use cases. Modern Internet video streaming wasn't just a matter of taking 1993's digital imaging technology and adding more bandwidth and CPU power — YouTube isn't serving up video clips as a series of TIFF images, after all. You know why H.264 is so patent-encumbered? Because a lot of the techniques it uses are new enough that the patents haven't expired yet, i.e. they didn't exist 20 years ago. See also my discussion here of why we sort of expect algorithms (or at least known-useful implementations of them) to generally develop hand in hand with advancing computational capabilities.

Referring to the rapid deep-learning advances made possible by greater computing power, and especially the rise of graphics processors, he added:

“The point about this approach is that it scales beautifully. Basically you just need to keep making it bigger and faster, and it will get better. There’s no looking back now.”

If you look at the advances over the last couple of years, it really does seem like at least with neural networks, we've had algorithms for decades that simply couldn't be used for non-trivial problems because the hardware wasn't nearly capable enough. It's all very well to say you can leave things processing for a few days, but I suspect if you'd started doing that Google image recognition research I linked to on a supercomputer circa 1980, your code might literally still be executing today.

Redo from start wrote:

But the fact that you can imagine a use case where increased computing power would help doesn't mean it is a real use case. Using sound to manipulate dropped objects would greatly benefit from increased computing power, but using computers to arrange sand particles, when sanding icy sidewalks, into the collected works of Shakespeare isn't a real use case.

Seamlessly integrating virtual objects into physical space, recognizing images, or taking dictation as well as a competent human all seem like legitimately useful things.

Redo from start wrote:

Nope. Computers were sold to the mainstream in the '90s. Same as today.

This just isn't true. In the early '90s the median US household still didn't own a PC. It's inevitable that the market was much more heavily skewed toward business purchases and enthusiasts.

You also have to take into account that there was a fair bit of lag between when the PC went mainstream, and when PC vendors meaningfully responded to this. I'd argue that only now, with the rise of iOS and Metro, are we really seeing operating systems designed firmly for the non-enthusiast user. Inertia kept the market focused on enthusiast priorities well after enthusiasts became a minority.

But the fact that you can imagine a use case where increased computing power would help doesn't mean it is a real use case. Using sound to manipulate dropped objects would greatly benefit from increased computing power, but using computers to arrange sand particles, when sanding icy sidewalks, into the collected works of Shakespeare isn't a real use case.

Seamlessly integrating virtual objects into physical space, recognizing images, or taking dictation as well as a competent human all seem like legitimately useful things.

Who exactly wants those?

The dictation problem, as I've said, isn't limited by computing power. Throwing more processors at it doesn't better the end product. Current processors have more than enough power to do it in real time. It is the lack of algorithms that is limiting it.

As for image recognition and augmented reality, both are pipe dreams for consumer use. Google has more than enough processing power idling to do image recognition, but it is again the algorithms that are lacking. If the problem could be solved with processing power, Google would already have an image search from image descriptions.

And again, where are the customers who want those? Who wants to peer at augmented reality through a 4" phone screen? Where are the consumers who are willing to pay for automated image recognition? I wouldn't, and neither would anybody I know.

Redo from start wrote:

Nope. Computers were sold to the mainstream in the '90s. Same as today.

This just isn't true. In the early '90s the median US household still didn't own a PC. It's inevitable that the market was much more heavily skewed toward business purchases and enthusiasts.

You also have to take into account that there was a fair bit of lag between when the PC went mainstream, and when PC vendors meaningfully responded to this. I'd argue that only now, with the rise of iOS and Metro, are we really seeing operating systems designed firmly for the non-enthusiast user. Inertia kept the market focused on enthusiast priorities well after enthusiasts became a minority.

So, in the '90s computer purchases were dominated by enthusiast and business purchases, with consumers being a minority? Hardly. And for some reason computer enthusiasts have gone the way of the dodo now? Since nobody does ads for them. It would seem wise to do ads for them, as they are the people who spend the most on computers. But alas, no.

Also, anecdotally, I know a lot of computer enthusiasts; many of them work as server managers, designers, etc. What computer gear do they talk about and buy? Keyboards and mice, mostly. When it comes to computers, cheap under-$200 machines.

Another proof that computer performance isn't a selling point anymore is the lack of benchmark usage. It used to be quite common to quote a system's SPECmarks on product flyers; not anymore. Performance isn't used in marketing. So if people wanted better performance, why aren't they interested in it now?


Where the heck can you find an "under $200 cheap computer" that can do any meaningful work for a net/sysadmin? You can't be serious.


I'm serious, but what they use them for isn't sysadmin work. For that they have work computers, some mid-priced workstations.

For gaming they buy/build a computer, but few and far between. As the performance doesn't bring much to the table, they aren't that interested in it, and don't upgrade/buy that often, so it isn't much of a talking point.

So those roughly $200 computers, or even cheaper? That is what computer enthusiasts talk about and buy these days. They buy one, configure it to do some task, and that is it. Home automation, media stations, mini file servers. That is what computer enthusiasts do these days. Not liquid-nitrogen-cooled computers (one of them did that), but putting cheap computers to tasks where computers aren't usually used.


...and you have this information from where, exactly? There are still plenty of enthusiast OC/gaming/high-end computing websites and communities.


I think RfS is making the point that the enthusiasts are talking about Arduino/Raspberry Pi computers, not x86. Anecdotally, the programmers I know (which is a handful) are indeed very much into that. For day-to-day needs (NAS, PC, etc.) they prefer what one of them calls "appliances" => plug-and-play devices.

The other kind of enthusiast, namely the gamer, is still interested in hardware, but even there the lifecycle is lengthening, as is shown by my home network. An old i7, an old E8400 and an ancient E6600 are more than enough to handle games like WoW, SC2, The Witcher, Dragon Age and Mass Effect in 1080p. The only time I upgrade something, it's because it dies (video cards die, probably due to our cats). If I were playing FPS games I would be harder pressed for sure, but it's amazing that 8-year-old machines are handling mainstream gaming without a hitch.

Another anecdotal point... I am not sure which video cards I'm using at the moment. Probably something like two/three-year-old upper-midrange ATI cards on the E8400/i7 and a 4-5-year-old upper-midrange ATI or Nvidia card on the E6600. I would have to boot them up and check the system hardware to know for sure. Anecdote is not proof, but I can't remember talking about this stuff or having to help friends pick parts. For my brother (a WoW player) I just ordered a Dell. I hadn't heard about his PC needs in years until he wanted a small laptop.

And this is echoed by many people here. Though we might be an outlier, one would think we of all people have a truckload of hardware enthusiasts among us. I'm willing to bet that RfS is dead on with his description of market direction.

Besides, the 800-pound gorilla here is Apple... they don't really care about performance, more about user experience. And they convinced the consumer they were right (I, for example, agree...).


There is some real data for evidence: the increasing popularity of Apple despite its lesser showing on the performance-per-price metric; the popularity of laptops despite their lesser specs; Intel graphics growing in the Steam hardware survey despite their generally accepted worse performance; the long game console generation where the low-end box (Wii) dominated based on input method, not specs, as well as the surge of the Xbox for the same reason (Kinect).

And what exactly are the poster children for computing? What are the 'concept cars', a.k.a. dream machines, that geeks salivate over? As stated above, the enthusiasts are interested in strange tablets and portable gaming systems and oddball Raspberry Pi creations, not high-end supercomputer monsters, for the most part. There have been entire systems released just for the purpose of playing pirated, 15-year-old 2D games.

Also notice that the last two versions of Windows run on lesser hardware. The need for better hardware to run interesting features/killer apps is largely gone from client PCs. It's exactly analogous to the push for compact cars during the oil crisis in the '70s: energy efficiency is prized over raw horsepower. The market has matured, and most of the applications for pure power have been addressed thoroughly by a commoditized market.

I think RfS is making the point that the enthusiasts are talking about Arduino/Raspberry Pi computers, not x86. Anecdotally, the programmers I know (which is a handful) are indeed very much into that. For day-to-day needs (NAS, PC, etc.) they prefer what one of them calls "appliances" => plug-and-play devices.

This is definitely where I am going. I used to build systems, but I stopped because there wasn't any significant advantage anymore to rolling my own. And, I still buy desktops for goodness' sake. For lappies, there's nothing to talk about.

Meanwhile, over the next year or two, I can easily foresee the need to build an old-fashioned tinkering machine. The sort of thing we did in the '70s. My need is so impossibly geeky, I won't bother to describe it even for this group, but it basically amounts to a situation where I'm going to be controlling a series of switches. I mean mechanical ones; relay-type switches.

It will be fairly sophisticated for all of that, perhaps the most sophisticated thing I've ever done. But, today's PCs, even desktops, aren't really suited for something with a large number of inputs and outputs, even trivial ones. All I really need to do is figure out how to translate USB to a bunch of discrete control lines (which hardly seems impossible and probably off the shelf if I look hard enough -- USB to RS232 might be all I need with a little serial-to-parallel).

And, I don't need overwhelming horsepower. I need little more than I would have had in the '70s for this. Raspberry Pi and its many cousins, therefore, is the PERFECT machine for this sort of thing. It's going to be a "read this input, change these switches" kind of application.

Moreover, as a kind of side benefit, running it on a lower-power platform means that I can probably back it up with a battery or a UPS or something. Since there may be a remote control aspect to this, that would be yet another advantage over a PC. The CPU horsepower is nothing, but battery consumption is something here. I want it to wake up after a power failure (this would be deployed where they do happen now and then) and either stay awake or easily and quickly reboot after power is restored to tend to everything else.
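For what it's worth, the "read this input, change these switches" loop really is about that simple on a Pi. Here's a rough Python sketch using the RPi.GPIO library; the pin numbers and the input-to-relay mapping are invented for illustration, and a real relay board would want proper driver circuitry and isolation:

    import time
    import RPi.GPIO as GPIO

    INPUT_PIN = 17              # hypothetical sensor/button input (BCM numbering)
    RELAY_PINS = [22, 23, 24]   # hypothetical relay driver outputs

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(INPUT_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    for pin in RELAY_PINS:
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

    try:
        while True:
            # "Read this input, change these switches": input pulled low closes
            # the relays, otherwise they stay open.
            closed = (GPIO.input(INPUT_PIN) == GPIO.LOW)
            for pin in RELAY_PINS:
                GPIO.output(pin, GPIO.HIGH if closed else GPIO.LOW)
            time.sleep(0.1)     # 10 Hz polling is plenty for mechanical relays
    finally:
        GPIO.cleanup()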

Who knows where this will go or whether it will really happen.

But it is a heck of a lot more likely than me studying the latest Intel spec sheets and building my own PC. That is off the shelf these days.

And, with most CPUs self-administering clock slow downs and other things of that sort, the benefits of overclocking are less than ever; Intel is telling you you don't really need it.

Well, it's obvious that I haven't spent a lot of time with this one, but look here:

Because the resolution was low, bandwidth was slow even then for low-res images. We all wanted higher res images and video.

And this is distinct from the present use cases I've offered for additional computational capacity... how, exactly?

Because people aren't doing what you think they are. The things you are listing aren't even on their computers in rudimentary form, or are hardly being used at all. You are taking seriously edge cases today that in 5-10 years will finally get to that level. And so far a lot of the signs point to programming, not CPU performance.

Quote:

Again, you are simply playing games here. If there's any implementation of something at all, you say it doesn't require more computational capacity because it "runs on current hardware". If there's no implementation, you say it doesn't count because it's too speculative or too far away. You've created a framework that allows you to dismiss any possible use case in any state of development or adoption — and then you argue that the present is different from the past by not applying this same standard when looking at past examples of (then) future use cases.

No. The point is that you brought up a bad example (speech recognition). It doesn't need more cycles. It runs very well on current PCs, or even ones a few years old. The advances needed are in software, not CPU performance.

Quote:

Yeah, I'm sure it had nothing to do with improvements in screen and battery technology, wireless networking, display technology, UI innovations, etc. It was all just CPU performance. Sure.

Not so much, really. Screens were pretty good, wireless was not great but adequate, and battery tech isn't that much better (I mean it is better, but the bigger deal is less power-hungry components). Yes, the big thing was CPU/GPU performance. To get adequate performance you had to have chips so power-hungry that you needed bigger batteries, battery life was horrible, and the devices were heavy. As CPU/GPU performance per watt went up, it allowed those things to drop.

Quote:

Tablets crossing some performance threshold and becoming useful for some particular set of end user computing tasks does not imply a lack of future demand for more capacity unless the set of tasks end users are interested in performing remains fixed for all time.

It certainly points STRONGLY to a lack of future demand for more capacity in desktops. If they are already vast overkill for people... then it is unlikely people will need way more any time soon. Certainly they might want more from their tablets... but that isn't the discussion.

Quote:

This is, and always has been, a discussion of consumer demand for computational resources, not "desktop needs".

No, it hasn't. I have specifically and continuously said I am talking about desktops and, to a lesser degree, laptops. YOU are the one who has constantly tried to change it to consumer demand in general, which is never, ever, what I was talking about. You do this to create strawmen. I have corrected you EVERY TIME: I have been talking about consumers' (non-hardcore gamers') desktops, and laptops to a lesser extent.

Quote:

That would be a great analogy if there were some historical trend in the motor vehicle market that had resulted in vehicles gaining massive amounts of additional cargo capacity at the same price over the last couple of decades, and an entire industry that had come up with all sorts of useful things to do with that cargo capacity. Here in the real world, your analogy doesn't map.

In EH2's defense, this discussion is old and comes from another thread. Though I'm less outspoken in my position, in essence EH2's forecast of approximately a year ago (I don't remember the original thread) is holding up. It depends on metrics, semantics and use cases, but the uptake of even faster broadband speeds is not going as fast as people expected. His premise, that 10 Mbps is plenty for the average user, is debatable... but the use cases offered are indeed not that "average" (again, we enter here into the realm of semantics).

EH2's posting style can be a tad rigid. Sometimes he's dead wrong (aren't we all), or at least often enough that I fully disagree with him. But it's never as ridiculous as some people here make it out to be. He's on the cautious/cynical end of the progress spectrum. There's got to be one.

But that's just my take on EH2; he's more than capable of riling everyone up without me trying to analyze his posts.

I love this post. Slams on me, but truthful.

To be fair, I did adjust it from 10 Mbps to higher, considering multiple streams at once. However, I still think it will be very slow to happen. Also, remember my discussion was about broadband speeds in the US, not any other country. A big part of it is the cheapness of consumers, and also bad business situations (lack of competition), etc. Consumers would choose to have 10 Mbps for, say, $20/month rather than 100 Mbps for $30, I think. And part of it would be them not seeing a very tangible benefit.

In EH2's defense, this discussion is old and comes from another thread. Though I'm less outspoken in my position, in essence EH2's forecast of approximately a year ago (I don't remember the original thread) is holding up. It depends on metrics, semantics and use cases, but the uptake of even faster broadband speeds is not going as fast as people expected. His premise, that 10 Mbps is plenty for the average user, is debatable... but the use cases offered are indeed not that "average" (again, we enter here into the realm of semantics).

None of the predictions made in this discussion have really had a chance to play out sufficiently since it began. One major early topic, for instance, was my prediction that 100 Mbps Internet would be at least as widespread in 2021 (in the US) as 10 Mbps Internet was in 2011. Given that I, personally, have moved from 10 Mbps Internet to 150 Mbps Internet since that discussion took place, I'm feeling pretty good about that one. I could certainly make the case that developments over the last year and a half support my position better than EH2's. But nothing is yet definitive enough that I'd convince anyone who was on the other side of this argument.

Actually, I believe your original claim was the "majority" of people. But I could be misremembering it. I never remember this claim as "same percentage as 10 Mbps".


Nope. Computers were sold to the mainstream in the '90s. Same as today. Only the ads have changed. Seen ads of computers for computer enthusiasts? There really aren't any. The industry where a shop designed and built a computer out of the best parts available has practically died since the '90s.

These days even computer enthusiasts are hard put to name how fast their DRAM is, something completely unheard of in the '90s. Same goes for the processor. Know how much cache your computer's processor has? Probably 9 out of 10 computer enthusiasts these days have no idea. In the '90s they would have known.

EXCELLENT points. You are absolutely right. I'm a computer enthusiast and I have no idea what speed RAM I have at home (or work). I can tell you the amount of RAM, but that is it. The only time I care is when I want to buy more... which I haven't done in years.


You know what is funny: my computer is 4.5 years old and the video card died. I went out to replace it. I looked at the exact same card and decided that I would downgrade to an 8400GS. I had a 9500GS.

And no one has noticed a difference. BTW, I had to look up which card I had and which I got.

Remember back when integrated video was the biggest joke around and how horrible it was? And now? It works, and you don't hear average consumers complaining anymore.


There is some real data for evidence: the increasing popularity of Apple despite its lesser showing on the performance-per-price metric; the popularity of laptops despite their lesser specs; Intel graphics growing in the Steam hardware survey despite their generally accepted worse performance; the long game console generation where the low-end box (Wii) dominated based on input method, not specs, as well as the surge of the Xbox for the same reason (Kinect).

And what exactly are the poster children for computing? What are the 'concept cars', a.k.a. dream machines, that geeks salivate over? As stated above, the enthusiasts are interested in strange tablets and portable gaming systems and oddball Raspberry Pi creations, not high-end supercomputer monsters, for the most part. There have been entire systems released just for the purpose of playing pirated, 15-year-old 2D games.

Also notice that the last two versions of Windows run on lesser hardware. The need for better hardware to run interesting features/killer apps is largely gone from client PCs. It's exactly analogous to the push for compact cars during the oil crisis in the '70s: energy efficiency is prized over raw horsepower. The market has matured, and most of the applications for pure power have been addressed thoroughly by a commoditized market.

True, especially the bolded part. These are the things and trends I see, and I comment on them, and ZnU acts like I am crazy.

And again, where are the customers who want those? Who wants to peer at augmented reality through a 4" phone screen? Where are the consumers who are willing to pay for automated image recognition? I wouldn't, and neither would anybody I know.

Who's talking about four inch screens?

How many consumers were sitting around talking about capacitive touch screens before the iPhone? How many were talking about link-based page ranking algorithms before Google? The fact that consumers aren't presently getting excited about machine learning techniques or augmented reality or whatever does not provide even a hint of support for the notion that there are no compelling products that will be built with these technologies.

And seriously, dude, if you can't think of any ways to build useful consumer products involving, say, image recognition, you haven't got much imagination. There are already features in current consumer software built around (still fairly primitive) image recognition.

Redo from start wrote:

So, in the '90s computer purchases were dominated by enthusiast and business purchases, with consumers being a minority?

I said the early '90s. Household PC penetration was under 25% in, say, 1993.

Redo from start wrote:

Another proof that computer performance isn't a selling point anymore is the lack of benchmark usage. It used to be quite common to quote a system's SPECmarks on product flyers; not anymore.

Do you actually believe most people buying computers today used to care about benchmarks, and have subsequently stopped? That's absurd. Today's median personal computer user has never had any interest in benchmarks, and if asked a question like "Describe what a computer benchmark does and when you might be interested in benchmark data" would likely be unable to give a coherent answer.

Echohead2 wrote:

The things you are listing aren't even on their computers in rudimentary form, or are hardly being used at all.

Right, except the ones that don't count because they are on their computers in rudimentary form.

Look, for all of this elaborate back and forth, this is very simple. You literally believe that every consumer computing task that can be invented either has been invented, or, for some reason, will necessarily run just fine on 2007's hardware when it is invented. You also believe that existing use cases, some of which now exist in implementations that require hundreds of times more computational resources than their initial implementations, will cease evolving in ways that require more resources.

This is a prediction on the level of the famous "I think there is a world market for maybe five computers" and "This 'telephone' has too many shortcomings to be seriously considered as a means of communication". There's really not much else to say.

And again, where are the customers who want those? Who wants to peer at augmented reality through a 4" phone screen? Where are the consumers who are willing to pay for automated image recognition? I wouldn't, and neither would anybody I know.

Who's talking about four inch screens?

You, when you keep bringing up augmented reality. There isn't much choice for consumers in how to use augmented reality other than phone screens.

ZnU wrote:

How many consumers were sitting around talking about capacitive touch screens before the iPhone? How many were talking about link-based page ranking algorithms before Google? The fact that consumers aren't presently getting excited about machine learning techniques or augmented reality or whatever does not provide even a hint of support for the notion that there are no compelling products that will be built with these technologies.

Check your history. Google wasn't the only one using such algorithms, not even the first.

So people not talking about capacitive screens is proof that they would be flocking to buy sidewalk-sanding equipment that writes the collected works of Shakespeare on them from a boombox? Your reasoning can be used to support any dumb project. People were buying phones before the iPhone, and touchscreen phones too. So are people buying machine vision software? I've got quite a bunch of odd friends, but none of them have bought one.

ZnU wrote:

And seriously, dude, if you can't think of any ways to build useful consumer products involving, say, image recognition, you haven't got much imagination. There are already features in current consumer software built around (still fairly primitive) image recognition.

I'm guessing you are referring to facial recognition software, which people aren't buying.

But that is beside the point. Your example is exactly what The Simpsons depicted when Homer Simpson designed a car. Let's completely forget that there isn't any consumer market for image recognition software. First we would need the algorithms to recognize images. Let's say we start from A and are up to "anvil". Are all anvils the same shape? No. One anvil manufacturer had 20 different anvils. So we 3D scan all of those, write some algorithm that says "this is an anvil", and make a program that projects that 3D model to 2D and compares it to the image. Are we done? No. That same damn algorithm will say that a bunch of different items are anvils too, like shoemakers' metal lasts. So we need to tweak the algorithm to discount those. So now we have an algorithm that will reliably tell when there's an anvil in an image? No. It will throw up a bunch of false positives, like a Mickey Mouse hat in front of a steel plate making the steel plate behind it look like an anvil.
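
(For what it's worth, here is a minimal sketch of the kind of hand-tuned template matching described above, using OpenCV. The template and photo file names are made up and the 0.8 threshold is arbitrary; this is exactly the brittle approach at issue, since it matches one view of one anvil and misfires on anything vaguely anvil-shaped.)

    import cv2

    # Hypothetical input files; purely illustrative, not a product.
    template = cv2.imread("anvil_template.png", cv2.IMREAD_GRAYSCALE)
    scene = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)

    # Slide the template over the photo and score each position.
    scores = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)

    # A fixed threshold gives exactly the failure modes described above:
    # a shoemaker's last can score high, a differently shaped anvil can score low.
    if best_score > 0.8:
        print("Possible anvil at", best_loc, "score", round(best_score, 2))
    else:
        print("No anvil found (or not one this template covers)")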

And now we come to why this use case is exactly like Homer Simpson's car. We have one person who says this is what consumers want, when they don't, and especially aren't willing to pay what it will cost even if it is divided across hundreds of millions of people. We are talking about the biggest programming and database-creation effort in human history. Even Google's search database will be second to this in size and complexity. Everyday home objects come in such a huge variety of shapes and materials that the algorithm complexity and database size will be staggering. So this isn't a use case where the end product is just on the horizon. It is over the horizon on the seventh continent and costs an arm, a leg, your firstborn, your secondborn, the rest of the family and several vital organs. And for this consumers are yearning for faster computers?

Redo from start wrote:

So, in the 90s computer purchases were dominated by enthusiast and business purchases, consumers being a minority?

I said the early '90s. Household PC penetration was under 25% in, say, 1993.

And I was talking about the '90s, which includes the later part too.

ZnU wrote:

Redo from start wrote:

Another proof that computer performance isn't a selling point anymore is the lack of benchmark usage. It used to be quite common to quote a system's SPECmarks on product flyers; not anymore.

Do you actually believe most people buying computers today used to care about benchmarks, and have subsequently stopped? That's absurd. Today's median personal computer user has never had any interest in benchmarks, and if asked a question like "Describe what a computer benchmark does and when you might be interested in benchmark data" would likely be unable to give a coherent answer.

They did. But not anymore. People don't have to understand what the benchmark measures, just which number is better, bigger or smaller.

People use benchmarks for the qualities they care about when buying an item. Cars? Gas mileage and power. Cameras? Pixel count. And cameras really prove the point. Most people are willing to admit that pixel count is a bad benchmark for image quality, but they still use it, for lack of a better one. Even BBQ grills have a benchmark: BTU.

Computers? Windows comes with its own benchmark app, but even that isn't used in marketing, as most people no longer care about performance. They have noticed that any computer they buy will do just fine.

You, when you keep bringing up augmented reality. There isn't much choice for consumers in how to use augmented reality except phone screens.

Honestly, this kind of reply just drives home the extent to which folks on the other side of this debate really just can't wrap their heads around the fact that the future could be significantly different from the present. The smartphone form factor you're talking about barely existed six years ago, yet you seem to be of the opinion that it's the form mobile computing will take for the rest of history.

Redo from start wrote:

Check your history. Google wasn't the only one using such algorithms, not even the first.

That has no relevance to my point whatsoever.

Redo from start wrote:

So people not talking about capacitive screens is proof that they would be flocking to buy sidewalk-sanding equipment that writes the collected works of Shakespeare on them from a boombox?

No, that's some incoherent nonsense you made up. My point was that your argument, effectively "Consumers don't currently care about these technologies, so they won't ever form the basis of products that drive consumer demand", is nonsense.

Redo from start wrote:

So are people buying machine vision software? I've got quite a bunch of odd friends, but none of them have bought one.

None of your friends use Picasa, iPhoto or Aperture? None of them use OCR software (including OCR features built into apps like Evernote), or digital cameras that spot faces to figure out where to set focus? Not a single one of them owns a Kinect? That is odd.
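
(To make that concrete: the face detection that consumer cameras and photo apps have been shipping is roughly this kind of thing. A sketch using OpenCV's stock Haar cascade; the input file name is made up.)

    import cv2

    # OpenCV ships a pre-trained frontal-face Haar cascade with the library.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    img = cv2.imread("family.jpg")            # hypothetical input photo
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Returns (x, y, w, h) rectangles: the same data a camera uses to decide
    # where to focus, or a photo app uses to suggest name tags.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print("Found", len(faces), "face(s)")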

You seem to have this silly idea that a technology like machine vision gets turned into an app called "Machine Vision 1.0" for all the consumers who are excited about machine vision to buy. That's not how that works.

Redo from start wrote:

But that is beside the point. Your example is exactly what The Simpsons depicted when Homer Simpson designed a car.

Err... no. That's more of a send up of how you are imagining the market should work — consumers themselves designing products.

Redo from start wrote:

Let's completely forget that there isn't any consumer market for image recognition software. First we would need the algorithms to recognize images. Let's say we start from A and are up to "anvil". Are all anvils the same shape? No. One anvil manufacturer had 20 different anvils. So we 3D scan all of those, write some algorithm that says "this is an anvil", and make a program that projects that 3D model to 2D and compares it to the image. Are we done? No. That same damn algorithm will say that a bunch of different items are anvils too, like shoemakers' metal lasts. So we need to tweak the algorithm to discount those. So now we have an algorithm that will reliably tell when there's an anvil in an image? No. It will throw up a bunch of false positives, like a Mickey Mouse hat in front of a steel plate making the steel plate behind it look like an anvil.

And this is where 'deep learning' techniques, large neural networks, and the need for more computational power come into play.
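
(A minimal sketch of what that looks like in practice, assuming a toy PyTorch classifier rather than any real product: instead of hand-coding rules for each anvil shape, a small convolutional network is trained on labelled examples, and the computational cost grows with the size of the network and the training set, which is where the demand for more horsepower comes from.)

    import torch
    import torch.nn as nn

    class TinyClassifier(nn.Module):
        """Toy two-class ("anvil" vs. "not anvil") convolutional network."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, 2)

        def forward(self, x):                  # x: (batch, 3, 64, 64)
            return self.classifier(self.features(x).flatten(1))

    model = TinyClassifier()
    dummy_batch = torch.randn(8, 3, 64, 64)    # stand-in for labelled photos
    print(model(dummy_batch).shape)            # torch.Size([8, 2])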

Redo from start wrote:

And now we come to why this use case is exactly like Homer Simpson's car. We have one person who says this is what consumers want, when they don't, and especially aren't willing to pay what it will cost even if it is divided across hundreds of millions of people. We are talking about the biggest programming and database-creation effort in human history. Even Google's search database will be second to this in size and complexity. Everyday home objects come in such a huge variety of shapes and materials that the algorithm complexity and database size will be staggering. So this isn't a use case where the end product is just on the horizon. It is over the horizon on the seventh continent and costs an arm, a leg, your firstborn, your secondborn, the rest of the family and several vital organs. And for this consumers are yearning for faster computers?

This is precisely analogous to dismissing digital video in 1993 because streaming 1080p over the Internet was a couple of decades away. As with digital video, the technologies that have been mentioned here typically have possible useful implementations short of their eventual mature implementations.

Redo from start wrote:

They did. But not anymore. People don't have to understand what the benchmark measures, just which number is better, bigger or smaller.

Fantasy. This isn't exactly scientific research, but my go-to illustration of the level the average user is working at is this video. Arguing that this user population cared about benchmarks and tech specs is silly. The reason those things were used in marketing is that a) generic PC vendors had little besides price/performance to distinguish their products, and b) vendors took a fair bit of time to shift their thinking in response to the PC going mainstream.

="Redo from start"]You when you keep bringing up augmented reality. There aren't much choice for consumer how to use augmented reality but the phone screens.[/quote]

Honestly, this kind of reply just drives home the extent to which folks on the other side of this debate really just can't wrap their heads around the fact that the future could be significantly different from the present. The smartphone form factor you're talking about barely existed six years ago, yet you seem to be of the opinion that it's the form mobile computing will take for the rest of history.

Sorry, again you show no knowledge of history and a US-centric view. Touchscreen phones are well over 6 years old; it is just that 6 years ago they became a fad. Anything else you are suggesting would need a dedicated device. But this is beside the point, as we are talking about PCs. Augmented reality with a laptop or desktop wouldn't make much sense to the general public.

ZnU wrote:

Redo from start wrote:

Check your history. Google wasn't the only one using such algorithms, not even the first.

That has no relevance to my point whatsoever.

It is relevant when you say: "How many consumers were sitting around talking about capacitive touch screens before the iPhone? How many were talking about link-based page ranking algorithms before Google?" People were actually speaking about web search engines and their algorithms. Probably even more than these days.

ZnU wrote:

Redo from start wrote:

So people not talking about capacitive screens is proof that they would be flocking to buy sidewalk-sanding equipment that writes the collected works of Shakespeare on them from a boombox?

No, that's some incoherent nonsense you made up. My point was that your argument, effectively "Consumers don't currently care about these technologies, so they won't ever form the basis of products that drive consumer demand", is nonsense.

In view of your argument, machine vision and the boombox Shakespeare speller are of equal worth. You just say one of them is nonsense for not being yours. Come up with an argument that wouldn't apply to the boombox sander.

ZnU wrote:

Redo from start wrote:

So are people buying machine vision software? I've got quite a bunch of odd friends, but none of them have bought one.

None of your friends use Picasa, iPhoto or Aperture? None of them use OCR software (including OCR features built into apps like Evernote), or digital cameras that spot faces to figure out where to set focus? Not a single one of them owns a Kinect? That is odd.

You seem to have this silly idea that a technology like machine vision gets turned into an app called "Machine Vision 1.0" for all the consumers who are excited about machine vision to buy. That's not how that works.

You have some silly idea that people want machine vision.

ZnU wrote:

Redo from start wrote:

But that is beside the point. Your example is exactly what The Simpsons depicted when Homer Simpson designed a car.

Err... no. That's more of a send up of how you are imagining the market should work — consumers themselves designing products.

Are you a product designer or a consumer?

ZnU wrote:

Redo from start wrote:

Let's completely forget that there isn't any consumer market for image recognition software. First we would need the algorithms to recognize images. Let's say we start from A and are up to "anvil". Are all anvils the same shape? No. One anvil manufacturer had 20 different anvils. So we 3D scan all of those, write some algorithm that says "this is an anvil", and make a program that projects that 3D model to 2D and compares it to the image. Are we done? No. That same damn algorithm will say that a bunch of different items are anvils too, like shoemakers' metal lasts. So we need to tweak the algorithm to discount those. So now we have an algorithm that will reliably tell when there's an anvil in an image? No. It will throw up a bunch of false positives, like a Mickey Mouse hat in front of a steel plate making the steel plate behind it look like an anvil.

And this is where 'deep learning' techniques, large neural networks, and the need for more computational power come into play.

Buzzwords don't make a product. Heard of IBM Watson? That is a problem several orders of magnitude easier than what you are suggesting, and it needs, I'll quote Wikipedia: "Watson is made up of a cluster of ninety IBM Power 750 servers (plus additional I/O, network and cluster controller nodes in 10 racks) with a total of 2880 POWER7 processor cores and 16 Terabytes of RAM. Each Power 750 server uses a 3.5 GHz POWER7 eight core processor, with four threads per core. The POWER7 processor's massively parallel processing capability is an ideal match for Watson's IBM DeepQA software which is embarrassingly parallel (that is a workload that is easily split up into multiple parallel tasks)"

What Watson had was a ready-made database, and a much smaller one compared to what you are proposing as the next home killer app. And even Watson needed a machine consuming about the same amount of electricity as a midsized village.

ZnU wrote:

Redo from start wrote:

And now we come to why this use case is exactly like Homer Simpson's car. We have one person who says this is what consumers want, when they don't, and especially aren't willing to pay what it will cost even if it is divided across hundreds of millions of people. We are talking about the biggest programming and database-creation effort in human history. Even Google's search database will be second to this in size and complexity. Everyday home objects come in such a huge variety of shapes and materials that the algorithm complexity and database size will be staggering. So this isn't a use case where the end product is just on the horizon. It is over the horizon on the seventh continent and costs an arm, a leg, your firstborn, your secondborn, the rest of the family and several vital organs. And for this consumers are yearning for faster computers?

This is precisely analogous to dismissing digital video in 1993 because streaming 1080p over the Internet was a couple of decades away. As with digital video, the technologies that have been mentioned here typically have possible useful implementations short of their eventual mature implementations.

Again you pick an example out of the past trying to make this sound inevitable. You could as well claim that this is precisely analogous to dismissing flying cars in the '60s. Where did you park yours? Your vision is Homer Simpson's car.

ZnU wrote:

Redo from start wrote:

They did. But not anymore. People don't have to understand what the benchmark measures, just which number is better, bigger or smaller.

Fantasy. This isn't exactly scientific research, but my go-to illustration of the level the average user is working at is this video. Arguing that this user population cared about benchmarks and tech specs is silly. The reason those things were used in marketing is that a) generic PC vendors had little besides price/performance to distinguish their products, and b) vendors took a fair bit of time to shift their thinking in response to the PC going mainstream.

You are actually proving my point that people don't care about getting more powerful computers. If they did, they would be educating themselves on how to compare computers. These are the same people who, in a car shop, know what mileage means, or acceleration. And cars are marketed to them with these numbers.

When these folks walk into a computer shop, do they ask for a more powerful computer than their old one? Do they even know what their old computer was? You claim there is high demand for more powerful computers. If there is, shouldn't they know what a more powerful computer is? Or whether they are buying a more or less powerful computer than their old one? Or do they just walk into a shop and buy this year's model, like in a ski shop, not knowing if what they bought is any better than what they have at home? How do these folks compare computers in the shop if they aren't using benchmarks?

"I'll buy the pink one as that clearly is the most powerful" by your account.

Sorry, again you show no knowledge of history and a US-centric view. Touchscreen phones are well over 6 years old; it is just that 6 years ago they became a fad. Anything else you are suggesting would need a dedicated device.

If you really don't understand that the iPhone made mobile computing vastly more relevant to consumers, I'm really not sure what to say. But in any event, the precise timing of the introduction of touchscreen phones is entirely irrelevant to this argument.

Redo from start wrote:

But this is beside the point, as we are talking about PCs. Augmented reality with a laptop or desktop wouldn't make much sense to the general public.

EH2 has tried several times to redefine the scope of this discussion like this as well. It's not about "PCs". It's about consumer computing.

Redo from start wrote:

It is relevant when you say: "How many consumers were sitting around talking about capacitive touch screens before the iPhone? How many were talking about link-based page ranking algorithms before Google?" People were actually speaking about web search engines and their algorithms. Probably even more than these days.

Consumers were speaking about these things? You don't live on this planet.

Redo from start wrote:

In view of your argument, machine vision and the boombox Shakespeare speller are of equal worth.

You're pretty much literally posting gibberish at this point.

Redo from start wrote:

You have some silly idea that people want machine vision.

Look, if you really can't imagine how something like computers being able to recognize images could possibly be useful, I understand why you hold the position you do on the subject of future demand for increased computational capacity. But this is, again, on the level of "This 'telephone' has too many shortcomings to be seriously considered as a means of communication". There's no sensible response to such a position except mockery.

Redo from start wrote:

Buzzwords don't make a product. Heard of IBM Watson? That is a problem several orders of magnitude easier than what you are suggesting, and it needs, I'll quote Wikipedia: "Watson is made up of a cluster of ninety IBM Power 750 servers (plus additional I/O, network and cluster controller nodes in 10 racks) with a total of 2880 POWER7 processor cores and 16 Terabytes of RAM. Each Power 750 server uses a 3.5 GHz POWER7 eight core processor, with four threads per core. The POWER7 processor's massively parallel processing capability is an ideal match for Watson's IBM DeepQA software which is embarrassingly parallel (that is a workload that is easily split up into multiple parallel tasks)"

Do you not understand that you're undermining the very argument you (and EH2) are attempting to make? Here we have a use case that already has useful rudimentary implementations (I've pointed to several), but that will likely require substantially more computational resources as it progresses into more mature implementations. That's precisely what's required to drive demand for additional computational resources over the next 20 years, the way use cases like digital video have driven it over the last 20.

Redo from start wrote:

Again you pick an example out of the past trying to make this sound inevitable. You could as well claim that this is precisely analogous to dismissing flying cars in the '60s. Where did you park yours? Your vision is Homer Simpson's car.

They did. But not anymore. People don't have to understand what the benchmark measures, just which number is better, bigger or smaller.

Redo from start wrote:

You are actually proving my point that people don't care about getting more powerful computers. If they did, they would be educating themselves on how to compare computers.

According to your theory of how the market has evolved, the very same consumers who you seem to now admit don't understand how to compare computers used to understand this.

It's far more reasonable to assume that the consumers who don't understand this now never understood it, and that when computer ads targeted people who did understand it, it was some combination of these folks not being part of the market yet, or of vendors simply not quite understanding how the market had evolved.

Redo from start wrote:

You claim there is a high demand for more powerful computers

No, I claim that the same dynamic that has led to much more powerful computers (and use cases which exploit them) over the last couple of decades has not, in fact, broken down. That's not quite the same thing.

EH2 has tried several times to redefine the scope of this discussion like this as well. It's not about "PCs". It's about consumer computing.

This is a flat-out lie. I have always and consistently said I was talking about desktops, and laptops to a lesser extent. YOU are the one who is trying to redefine it as "consumer computing". I have said OVER AND OVER that this doesn't describe my point.

Nope. Try again. Maybe pay a little more attention. I have consistently been talking about a specific set of users using a specific set of hardware. Now if you want, I could have a disclaimer that I copy and paste onto every sentence I write so that you might start to understand that comments on this topic carry that basic understanding. For instance, if I say "for consumers", I specifically am speaking of the group of consumers I have always been talking about. Not prosumers, not hard-core gamers, etc. You could probably throw "basic office worker" in with "consumer" when I use the term. Hey, care to guess which hardware I am referencing? I know if you try real hard you might be able to figure it out after 2 years of me saying the same thing and you trying your hardest to create strawmen, but I haven't changed.

You have tried to transform it into cameras or phones or tablets... and that isn't what I am talking about. In fact, they are supportive of my point, not part of it.

Quote:

Consumers were speaking about these things? You don't live on this planet.

Says the man who believes consumers are demanding more and more desktop performance. Who thinks the improvements never seem to end. Who denies the last decade of proof.

Think of a $1000 computer in 1993. What was its useful life? About 2-3 years, and you would either get a new computer or up the RAM, graphics, HDD, and maybe even CPU.

I bought one in 1993; with upgrades it lasted 5 years.

In 1998, how long would a $1000 computer last? About 2-3 years. Same thing: upgrade it and make it last 5.

Which is what I did: 5 years with upgrades.

2003... same thing again.

2008... I bought a $1,000 computer which actually had about $200 worth of TV tuner cards, so really about $800, but let's go with $1,000. Now it is 4.5 years later and... nothing. I'm not even considering upgrading it. Perhaps in another 1.5-2 years I will swap out the main HDD for an SSD and get another couple of years out of it. I fully expect to get at least 8 years out of it, if not 10.

And now today. Imagine a $1,000 computer of today.

I am not going to do a big shopping comparison, just a quick trip to hp.com. For $1,000 I can get:

You are actually proving my point that people don't care about getting more powerful computers. If they did, they would be educating themselves on how to compare computers. These are the same people who, in a car shop, know what mileage means, or acceleration. And cars are marketed to them with these numbers.

Not trying to be nasty, but have you ever seen retail sales for anything? It's rare to see someone, in any environment that sells goods to a consumer, who has any clue as to what they are talking about.

Also, have you ever seen a computer in a store? There are always specs listed for the computer. Often it's a sticker right on the floor model (I have a floor-model HP with a big white sticker on the black box that explains what's in it). It's how much RAM, the processor, how big the hard drive is, screen size, etc. Pretty much just like the sticker that's put on a car.

Quote:

"I'll buy the pink one as that clearly is the most powerful" by your account.

Which is exactly how I would estimate 90% of the people out there buy something. It's newer and a few numbers are bigger. This is how cars are sold, how TVs are sold, how appliances are sold, etc. Many devices are made more flashy just to entice people into buying them. It's why a Mac notebook has a brushed-metal appearance, why this HP desktop I'm using has a glowing blue bevel, etc. Make it flashy enough and people will buy it because it looks pretty.

That computer would easily, imo, be viable for at least 10 years for the consumers we are talking about. If not longer.

From 2-3 years to a decade. How could that be unless hardware is outstripping needs?

Let's go back 10 years from today. Any Intel desktop sold then cannot run Windows 8 due to lacking the NX bit. So let's go back 8 years. You could theoretically run Windows 8, but only after multiple hardware upgrades, and even then it would be a pretty bad experience.