The star of the show, however, will be Google's new 10" tablet that was developed in conjunction with Samsung. This tablet will come bearing Android 4.2 (still operating under the Jelly Bean codename) and a Retina-surpassing resolution of 2560x1600 (300 ppi). Apple's "New iPad" features a screen resolution of 2048x1536 (264 ppi).
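For reference, pixel density is just the diagonal pixel count divided by the diagonal screen size in inches. A quick back-of-the-envelope check (assuming the rumored 10" diagonal for the Nexus tablet and the iPad's 9.7" screen):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(2560, 1600, 10.0))  # rumored Nexus tablet: ~302 ppi
print(ppi(2048, 1536, 9.7))   # "New iPad": ~264 ppi
```

Both results line up with the figures quoted above.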

The device will likely be called the Nexus 10. We don't have any specs to report at this time other than the screen, but we can only assume that it'll be packing a quad-core processor and at least 64GB of storage at the high end.

I don't think it's a matter of milking customers. It's more a matter of "mass producibility".

It's like with processors... IBM produced transistors working at frequencies above 10 GHz, what, 15 years ago? That was for single, standalone transistors. Yet, 15 years later, the fastest processors run at what? 3.5 GHz, give or take.

Even though they can build a single transistor that operates at 10+ GHz, they can't cram a billion of them into a chip smaller than your thumbnail and have them stay stable at 10 GHz without overheating.

I'm not saying planned obsolescence is not a reality. It does exist. But the main reason for slow progress in many areas is usually the manufacturing process lagging behind.

Yup... It's one thing to make a breakthrough prototype or an extremely high-end product. It's another thing entirely to mass produce it and have it be affordable for the average consumer. The company that can do the latter is the one that makes money.

Not in this case. Mass-produced high-res LCDs have been available for some years now. There just hasn't been any market demand for them. But there hasn't been any market demand because consumers are uneducated about the product. I remember reading the CEO of ViewSonic answering some dude's question about high-res screens a few years ago, and his answer was that there just wasn't any demand. This is a market where most people buy the cheapest 22" LCD they can get; 22" is still the most commonly sold LCD size.

It takes a luxury goods maker like Apple (which has a strong brand, strong marketing, and a large herd of deep-pocketed sheep) introducing an advanced feature as a premium feature on a high-price-point product for said feature to become popularized. The computer market is becoming like the car market, where high-end manufacturers spearhead the adoption of innovations, as they did with disc brakes and fuel injection long ago.

Where did this insane conspiracy theory come from anyway? Why are people on the PPI bandwagon these days?

Try actually running GPU-demanding software (or even a web browser) on something like the T220 back when it was released. Hell, if 4K became the standard on desktop PCs right at this moment, I'm sure we'd see endless complaints from people trying to play video games at 4K, who would inevitably point the finger at developers for not "optimizing" enough for an 8-million-pixel screen.
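To put "8 million pixels" in perspective, here are the raw pixel counts (a rough proxy for GPU fill cost, which doesn't scale perfectly linearly in practice):

```python
# Pixel counts relative to 1080p; GPU fill cost grows roughly with these.
resolutions = {
    "1080p":  (1920, 1080),
    "WQXGA":  (2560, 1600),  # the rumored Nexus 10 panel
    "4K UHD": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.1f}x 1080p)")
```

4K carries four times the pixels of 1080p, so a game that just manages 60 fps at 1080p is in rough shape at 4K on the same hardware.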

Higher PPI means more clarity (to a point, as Apple's "retina" marketing makes clear). Yes, current commodity hardware falls short for realtime 3D rendering or video decoding at resolutions that high, but the vast majority of what people stare at on their phones, tablets, laptops, and desktops is text, not movies or games. We have plenty of computing power to drive extremely high resolution text.
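That "to a point" is roughly quantifiable. A common rule of thumb puts the eye's resolving limit near one arcminute, so the density past which extra pixels stop mattering depends on viewing distance. A sketch of that rule (the distances here are assumptions for illustration):

```python
import math

def retina_ppi(viewing_distance_in, arcminutes=1.0):
    """PPI beyond which a pixel subtends less than ~1 arcminute at the eye."""
    return 1.0 / (viewing_distance_in * math.tan(math.radians(arcminutes / 60.0)))

print(retina_ppi(12))  # phone held ~12" away: ~286 ppi
print(retina_ppi(18))  # laptop at ~18": ~191 ppi
```

That's roughly why a ~300 ppi phone and a much lower-density laptop can both plausibly be called "retina": the laptop sits farther from your eyes.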

Significantly higher density means more clarity and better text rendering, but the difference is greatly exaggerated, with the exception of the jump from the very low densities phones used to ship with to the current standard of around 200 ppi or so. At that point pixel edges become significantly less noticeable and you get decent text rendering.

Having a relatively low PPI standard is not arbitrary. Try running Apple's Retina MacBook and watch as it struggles to even browse YouTube, frequently dropping to 15-30 fps. That MacBook actually has superior hardware to the "average" computer that spends its life displaying text and charts. It's going to be a long time before your average computer has something comparable to an i7. The response times would also likely be worse overall, given the panel has more pixels to handle. We started seeing HD adoption for broadcast in the late '90s, and nowadays hardware is still mostly just good enough for 1080p.
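For a sense of the raw throughput involved in just pushing pixels to the panel, a rough bandwidth calculation (uncompressed 24-bit color, ignoring blanking intervals, so real link rates run somewhat higher):

```python
def panel_bandwidth_gbps(w, h, refresh_hz, bits_per_px=24):
    """Uncompressed video bandwidth in Gbit/s (ignores blanking overhead)."""
    return w * h * refresh_hz * bits_per_px / 1e9

print(panel_bandwidth_gbps(1920, 1080, 60))  # 1080p60: ~3.0 Gbit/s
print(panel_bandwidth_gbps(2880, 1800, 60))  # 15" Retina MacBook Pro: ~7.5 Gbit/s
print(panel_bandwidth_gbps(3840, 2160, 60))  # 4K60: ~11.9 Gbit/s
```

And that's before the GPU does any actual rendering work.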