When I read the press release yesterday, I thought the PaperTab was developed by Plastic Logic, when in fact it would be more accurate to say they funded a research lab's latest project. The PaperTab is not a commercial prototype so much as a two-month-old research project from the Human Media Lab at Queen's University in Ontario.

The screens themselves are rather cool. They use Plastic Logic's 10.7" screen as part of a design developed by the HML. This research group has been working on bendable and flexible input since 2004, and they were more than capable of adapting their past work (here) for use in the PaperTab.

Each PaperTab prototype is built around a PL screen; behind it sit a pair of bend sensors and a 3D location sensor, and on top is a flexible touchscreen layer from Zytronic. The prototypes are, of course, wired into the base unit for both power and data, a design decision driven more by the project's early stage than by deliberate choice.

I was lucky enough to see the PaperTabs being put through their paces, but rather than repeat the description I wrote yesterday, I will share this video again. I believe it shows everything I saw this morning.

Both this video and the presentation I saw today were the work of the HML. The folks giving the presentation in Plastic Logic's suite don't work for PL; they're from the HML.

Now, I never expect to see the PaperTab on a store shelf, but that does not mean I wasn't glad to see it in person. It's an example of what could be done, and it hints at how we'll be using our gadgets a few years from now.

A Plastic Logic screen is too expensive to be practical, but that doesn't mean some of the ideas can't be applied to tablets with LCD screens. LCDs are getting thinner, and it's not such a stretch to think your next tablet will know where your smartphone is and let you quickly transfer content from one to the other (file size and connectivity permitting).

This is the type of concept the Human Media Lab has been exploring since 2004, I was told, though the earlier projects didn't have any screen tech to work with. This video, for example, dates to 2007 and required a lot of equipment to simulate how one might use a digital interface.

That project is more impressive, IMO, because at that time they didn't have real tech to base their ideas on. They had to invent it out of whole cloth.