The star of the show, however, will be Google's new 10" tablet that was developed in conjunction with Samsung. This tablet will come bearing Android 4.2 (still operating under the Jelly Bean codename) and a Retina-surpassing resolution of 2560x1600 (300 ppi). Apple's "New iPad" features a screen resolution of 2048x1536 (264 ppi).
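Those PPI figures follow directly from each panel's resolution and diagonal size. A quick sketch (assuming a 10" diagonal for the new tablet and the iPad's 9.7" diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Nexus 10 (assumed 10" panel): 2560x1600 -> ~302 ppi, marketed as 300
print(round(ppi(2560, 1600, 10.0)))  # 302
# New iPad (9.7" panel): 2048x1536 -> ~264 ppi
print(round(ppi(2048, 1536, 9.7)))   # 264
```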

The device will likely be called the Nexus 10. We don't have any specs to report at this time other than the screen, but we can only assume that it'll be packing a quad-core processor and at least 64GB of storage at the high end.

Where did this insane conspiracy come from anyway? Why are people on a PPI bandwagon these days?

Try actually running GPU-demanding software (or even a web browser) on something like a T220 back when it was released. Hell, if 4K became the standard on desktop PCs right at this moment, I'm sure we'd see endless complaints from people trying to play video games at 4K, who would inevitably point the finger at developers for not "optimizing" enough for an 8-million-pixel screen.
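The "8 million pixel" point checks out: GPU fill-rate work scales roughly linearly with pixel count, and 4K is four times 1080p.

```python
# Rendering cost scales roughly with the number of pixels pushed per frame.
res_1080p = 1920 * 1080  # 2,073,600 pixels
res_4k    = 3840 * 2160  # 8,294,400 pixels -- the "8 million pixel screen"

print(res_4k)               # 8294400
print(res_4k / res_1080p)   # 4.0 -> four times the per-frame fill work
```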

Higher PPI means more clarity (to a point, as Apple's "retina" marketing makes clear). Yes, current commodity hardware struggles to do realtime 3D rendering or video decoding at resolutions that high, but the vast majority of what people stare at on their phones, tablets, laptops, and desktops is text, not movies or games. We have plenty of computing power to drive extremely high-resolution text.

Significantly higher density means more clarity and better text rendering, but the difference is greatly exaggerated, with the exception of the jump from very low-density phone screens to the current standard of around 200ppi or so. At that point pixel edges become significantly less visible and you get decent text rendering.

Having a relatively low PPI standard is not arbitrary. Try running Apple's Retina MacBook and watch as it struggles to even browse YouTube, frequently dropping to 15~30fps. That MacBook actually has superior hardware to the "average" computer people spend all day staring at text and charts on. It's going to be a long time before your average computer has something comparable to an i7. Response times would also likely be worse overall, given the extra pixels the panel has to handle. We started seeing HD adoption in the late 90s for broadcast, and nowadays hardware is still mostly just good enough for 1080p.