I've already started exercising my eye rolling in preparation for some posters crying foul on Apple for offering a stylus accessory, despite the fact that the primary input for the tablet will be finger-based, with the stylus reserved for select app features for particular users, like drawing diagrams in class.

Start workout: superior rectus muscle, intorsion with left, extorsion with right, and 1 and 2 and 3 and... superior rectus muscle, extorsion with right, intorsion with left, and 1 and 2 and 3 and... Also, don't forget to drink plenty of fluids on Wednesday.

There's a very good reason we stopped using fingers to paint with and invented brushes, pencils and pens...

Disagree. For some people - including myself - the multi-touch on the Magic Mouse is a one-way trip to the RSI clinic. While I like the ability to scroll without having to hit a wheel or ball, using two fingers in a sideways motion while keeping the mouse from moving with the thumb or other fingers induces instant wrist pain.

If you have any kind of wrist pain already when mousing, I'd strongly suggest avoiding the Magic Mouse - or at least avoiding any of the multitouch features.

I don't think that is a good rationale, though, because we can't change the shape or physical attributes of our fingers for different tasks in the real world. In the digital world, we can.

If, in the real world, we could use our finger to create a precise line of any pattern, shape or level of precision, we would have no need for different tools.

Only by zooming in to an insane amount of precision, which robs you of the "big picture" of whatever you're working on. While you could get away with that on a Microsoft Surface-style "big screen" interface, on something that's 10in or less it will be unworkably bad.

Imagine if you are using the tablet to draw on and your primary PC monitor to view the overall picture? I think this could work very well. I also think you would get used to using your finger to draw with if you chose to do that. Though it would probably be easier for most people to just use a pen stylus, as we are used to doing. We are creatures of habit.

Well in a machine that's primarily portable, having to have a monitor attached would be pretty fatal

It's interesting, too, that people are talking about it as if it's a choice between pen OR touch. It doesn't have to be. The current generation of Wacom tablets, for example, let you use either by sensing when a pen is close and "switching off" touch sensitivity at that point. No pen, and your fingers will work.

This solved the problem that was a major issue with the stylus on the Newton: the "lean-on" problem. Newton's touch screen was passive: that is, it responded to anything which touched it, including your hand rather than the pen. Most people who write tend to lean on the surface they're writing on to a lesser or greater degree. So, with Newton, you had to train yourself to not lean on it, which disrupted your normal handwriting - and so made it more "scrawly" and less easy to recognise.
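For what it's worth, that pen-priority trick can be sketched in a few lines. This is purely an illustrative toy - the event names and the `filter_events` function are my invention, not any real Wacom API - but it shows the core idea: while the pen is in proximity, touch events (including a resting palm) are simply dropped.

```python
# Toy sketch of pen-priority input filtering (hypothetical event names).
# While the pen is in proximity, touch events - e.g. a leaning palm -
# are ignored, so resting your hand on the screen can't scribble.

def filter_events(events):
    """events: list of (kind, payload) pairs, where kind is one of
    'pen_near', 'pen_away', 'touch', or 'pen'. Returns accepted events."""
    pen_near = False
    accepted = []
    for kind, payload in events:
        if kind == "pen_near":
            pen_near = True          # pen entered proximity: mute touch
        elif kind == "pen_away":
            pen_near = False         # pen left: fingers work again
        elif kind == "touch" and pen_near:
            continue                 # palm/finger ignored while pen is close
        else:
            accepted.append((kind, payload))
    return accepted
```

With the pen near, a palm touch is filtered out while pen strokes pass through; once the pen moves away, ordinary finger taps are accepted again.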

Microsoft's TabletPC solved this initially by simply dictating that all tablets had to have what's called "active" pen digitisers. That is, they didn't respond to touch, but to the proximity of electronics in the pen. This was great, because it meant you could lean on the screen - and as screens get bigger, you find people doing this more. And it massively improved the legibility of what you were writing. Coupled with a seriously good handwriting recognition engine by the time of XP SP2, and you had a very good system.

Two big problems, though. First, active digitisers were expensive, and accounted for a good proportion of the $200-500 more you'd pay for a TabletPC compared to an equivalent conventional laptop. Second, no touch with the fingers at all locked you out of using gestural stuff in the interface, and meant that if you lost your pen you were screwed.

(Microsoft later relented, and started allowing passive, resistive touch in TabletPC. This meant you could touch the screen - but in classic PC-world fashion, manufacturers all used cheap resistive touch screens to drive down the cost, which gave a really shitty user experience.)

Now, though, we have the technology to do touch which is capacitive, giving a good slick experience with fingers, AND active, which gives a good experience with active pens. In graphics tablet form it's not expensive - but I don't know how expensive it would be to integrate it into a screen.

Will Apple use it? I don't know. They could deliver something which allows you to use an active pen for great quality drawing, handwriting or diagrams but also your fingers for a virtual keyboard and that slick iPhone-style user interface.

My gut feeling is that it comes down to cost. If they can make something that combines "the world's greatest touch interface" with "the world's greatest pen interface" they'll do it - but if it's a choice between the two for cost reasons, touch will win out.

As the de facto input method, absolutely not. As an included accessory, I doubt that, too.

But there is a need for a stylus if this tablet is going to be marketed across the board, like I think it is. A stylus for signatures, drawing in various situations, and even replicating the annotations we make in textbooks, with a simple stylus that can change from a highlighter to an underliner to a strikethrough-er(?), etc.

Well in a machine that's primarily portable, having to have a monitor attached would be pretty fatal

What I mean by that is to use the tablet as a peripheral input device when you aren't using it as a portable. So it has a dual use. When you are using it as a portable, it works as a more rudimentary input device for different applications, to jot down notes or cut and paste things from documents and do small art projects. Then you use it for more detailed work in conjunction with your primary PC.

A couple weeks ago AppleInsider reported that some who had seen the tablet were claiming that we would be surprised by the input method. Others have suggested that the tablet will make extraordinary use of multi-touch. A week ago an AI post suggested that iPhones may come with touch-sensitive panels on the back cover. This article quotes Steve Jobs emphasizing that one of Apple's keys to success is introducing intuitive new interfaces.

Speculating and crossing fingers, I suggest that the key innovation in the iPad will be multi-touch back navigation, in particular "back-typing." The key problem with tablet computers up to this point is input method. You have to hold the tablet, and input data, at the same time. A stylus (while great for some purposes) is lousy for text input. Thumb typing is slow and awkward at best. Back-typing could solve the input problem, and completely change the character and usefulness of a tablet. If Apple can make it feel intuitive and easy, they will have a monumental success.

Pick up a smallish, thin hardback book (whatever size you think the iPad will be). Hold it between your palms. Amazingly all of your fingers are free to tap on the back of the book, while your thumbs are free to tap on the front. This does not work with an iPhone because it's too small. All we need now is a typing method that makes use of these freely tapping fingers. It will need to be a new method, probably one that does not require tapping in 26 different spots to choose letters, but there is no principled reason this can't be done. The thumbs could operate space and shift and choose between sets of letters, the fingers would merely tap. I'm convinced this could be at least as fast, and as natural, as standard typing.
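Just to show the idea is workable, here's a toy sketch of such a scheme. It's entirely hypothetical - the bank/finger encoding is my own invention for illustration, not anything Apple has shown - but it captures the proposal: a thumb picks one of four letter banks, and the eight rear fingers just tap a position.

```python
# Hypothetical "back-typing" sketch: 8 rear fingers tap positions 0-7,
# while a thumb selects which bank of letters those taps draw from.
# 4 banks x 8 fingers = 32 slots, enough for the alphabet with room to spare.

import string

BANKS = [string.ascii_lowercase[i:i + 8] for i in range(0, 32, 8)]
# BANKS is ['abcdefgh', 'ijklmnop', 'qrstuvwx', 'yz']

def back_type(strokes):
    """strokes: list of (bank, finger) pairs; returns the typed text."""
    out = []
    for bank, finger in strokes:
        letters = BANKS[bank]
        if finger < len(letters):    # last bank only has 2 letters
            out.append(letters[finger])
    return "".join(out)
```

So tapping finger 7 in bank 0 and finger 0 in bank 1 types "hi" - no tapping in 26 different spots required, just a bank choice plus a finger position.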

I don't believe that Apple would introduce a device that didn't provide some truly unique hardware related property. This is the one I'm voting for.

Err, not really. Ireland rehashed tired old data. We've been looking for a tablet since 1996 or earlier.

The dream of a Newton never died among millions of the Mac faithful, and many also felt the tablet was supposed to come before the iPhone.

So posting a topic 9 years later means zippo. Sorry.

Back in 1999 a Mac rumours site did a series of articles about a tablet very much like the one that may come, except it was for children and the finger interface was for painting. Yet a children's tablet product would be nice. Apple sued them, by the way.

Evilmole, the big things I wonder about are parallax and the ability of the capacitive screen to capture detail when drawing. From what I have gathered, there are limitations with capacitive touch screens around size, detail and accuracy.

Also, is parallax a problem on the iPhone?

I love how the iPhone has no ridges or edges on the surface, so the drawing/touching surface is completely flat, unlike most tablets out there. But does this make the screen thicker, increasing the parallax issue on the iPhone compared to other tablets?

I don't know the answer to that (and if I did, I'd probably be working for Apple).

The trade-off that Apple makes may well be simply "this is not a product designed for detailed work". And I'd rather they did that than push the limits of what can be done at a reasonable price with current technology. I'm excited either way.

I am hoping it can at least do work comparable to the Modbook, with a slightly smaller screen. Though the president of Axiotron seems to be very confident Apple isn't going that direction. I really hope he is wrong!

What difference does it make whether a stylus is included or not? There are capacitive styli that exist, and the iPhone touch screen doesn't even know the difference. Haven't any of you bought something from Apple recently with their new iPod Touch checkout system? You sign on the iPod!

No, the actual stylus isn't what matters. It's whether Apple has taken the Newton handwriting recognition tech from the 90s and developed it even further, such that it's really useful today. Plus, you'll want third-party developers to be able to tap into that technology.

It's always important to think about the big picture for these kinds of things. A capacitive stylus is no different than your finger.

Now, if only someone could come up with a nail-based touch system! It's still tactile enough and yet more precise than a fingertip....

I am hoping it can at least do work comparable to the Modbook, with a slightly smaller screen. Though the president of Axiotron seems to be very confident Apple isn't going that direction. I really hope he is wrong!

Don't count on it. This is going to be based on the iPhone OS. We may not even get the simple ability to download a file through Safari. The Modbook, OTOH, is an actual computer, though it's a monstrosity.

What if the Tablet is used as a peripheral that works in conjunction with your primary PC, though? This way it wouldn't matter that it runs the iPhone OS, because you are just using it as an input device. So it may not be an all-in-one solution like the Modbook, but you get the same results with your primary PC. IMO it would be a waste of a tablet if you can't use it for dual duty like this -- media player and drawing pad -- especially for 1000 bucks.

I don't understand why Apple hasn't yet introduced a multi-touch keyboard/mouse surface like the ones FingerWorks made. For a company that prides itself on innovation, they're being surprisingly conservative. I was surprised that they came out with the Magic Mouse at all, instead of an external trackpad, which would have been more useful considering the usefulness of the MacBook (Pro) trackpad.

As a geek, I find that the movement from keyboard to mouse/trackpad and back again wastes precious seconds from a day of heavy coding. That's why I've memorized as many keyboard shortcuts as possible, and why IDEs such as Eclipse have introduced so many of them, even for such mouse-intensive tasks as switching from the editor to the project view. In my close to 10-year career as a developer, I've probably spent over a solid month switching from keyboard to mouse.

I'm hoping Apple is working on this, and is just doing its due diligence in resolving the design, ergonomic, and QA issues inherent in such a device.

Yet everyone and his dog seems able to copy Apple, as per usual. So were the patents lame? Or are the copies very limited compared with Apple's? I have never tried a phone from a copycat company yet, so I simply don't know, but their ads certainly imply they are as good as an iPhone.

I was perhaps expecting a touch much of Mr Jobs in demanding it within 3 years, but this post on MDN was long before any iPhone announcement, let alone iSlate...

Mar 20, 06 - 05:52 pm Comment from: Macaday

I just read Stuart's post and a thought "ping" crossed my mind. I reckon Apple do need the phone technology, because the genuine, ultimate, ALL-IN-ONE product will one day be here.

So the parts of that gizmo they need will be the phone, HDTV output, GPS, WiFi and Bluetooth. Put all that together on an OSX operating system all working beautifully together and with a touchscreen and you have one very cool product that will shake the world. No other manufacturer could come close to making that work, but Apple definitely could.

I don't understand why Apple hasn't yet introduced a multi-touch keyboard/mouse surface as FingerWorks has. For a company that prides itself on innovation, they're being surprisingly conservative. ...

You make two contradictory statements. What is innovative about releasing a product that has already been released by another company? You also betray an astonishing misunderstanding of Apple's approach to technology. Apple has never sold technology for technology's sake. It sells technology that "just works" for its customers. There are notable exceptions, but they are just that--exceptions. Apple is indeed an innovative company. However, its singular goal is to bring intuitive technology to the masses. If the technology exists, then Apple uses it. If it does not exist, then Apple invents it. Then there are those cases where Apple takes an available technology and makes it usable.

A couple weeks ago AppleInsider reported that some who had seen the tablet were claiming that we would be surprised by the input method. Others have suggested that the tablet will make extraordinary use of multi-touch. A week ago an AI post suggested that iPhones may come with touch-sensitive panels on the back cover. This article quotes Steve Jobs emphasizing that one of Apple's keys to success is introducing intuitive new interfaces.

Speculating and crossing fingers, I suggest that the key innovation in the iPad will be multi-touch back navigation, in particular "back-typing." The key problem with tablet computers up to this point is input method. You have to hold the tablet, and input data, at the same time. A stylus (while great for some purposes) is lousy for text input. Thumb typing is slow and awkward at best. Back-typing could solve the input problem, and completely change the character and usefulness of a tablet. If Apple can make it feel intuitive and easy, they will have a monumental success.

Pick up a smallish, thin hardback book (whatever size you think the iPad will be). Hold it between your palms. Amazingly all of your fingers are free to tap on the back of the book, while your thumbs are free to tap on the front. This does not work with an iPhone because it's too small. All we need now is a typing method that makes use of these freely tapping fingers. It will need to be a new method, probably one that does not require tapping in 26 different spots to choose letters, but there is no principled reason this can't be done. The thumbs could operate space and shift and choose between sets of letters, the fingers would merely tap. I'm convinced this could be at least as fast, and as natural, as standard typing.

I don't believe that Apple would introduce a device that didn't provide some truly unique hardware related property. This is the one I'm voting for.

This is what I've been saying for a long time now. I'm glad someone else sees it.

You can hold your notebook's display panel while the base is still resting on a surface. Surely this is larger than a tablet, but you can get the idea of how your fingers are free to move on the back, the thumbs are free on the front, perhaps for typing on a separated arch keyboard, and yet you can still firmly grip the device.

They have patents for it, Notion Ink is doing it, and frankly, it's the only thing that makes sense.

Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"

No, Ireland has simply relied on "resting" his hand on his mouse for too long and is no longer able to hold his hand up like the rest of us can due to deterioration of his muscles. So the Magic Mouse doesn't work for him.

Unfortunately, he doesn't qualify his blanket statements. What he says is, "The Magic Mouse sucks" when what he should be saying is, "The Magic Mouse sucks for me".

What if the Tablet is used as a peripheral that works in conjunction with your primary PC, though? This way it wouldn't matter that it runs the iPhone OS, because you are just using it as an input device. So it may not be an all-in-one solution like the Modbook, but you get the same results with your primary PC. IMO it would be a waste of a tablet if you can't use it for dual duty like this -- media player and drawing pad -- especially for 1000 bucks.

It's not a bad idea, but it's not very useful. The problem with using this device as some kind of mouse or alternative input device for a desktop is that it makes for a sort of screwy user experience. You will have two screens in two very different positions, and you lose that sense of touch-based intuitiveness that the iPhone OS gives you, because the primary machine is the desktop. It's convoluted, and that seems very un-Apple. This is not even considering that if it has the iPhone's concave design, it can't even rest flat on a surface.

This tablet may just be an e-reader of sorts or may eventually become a full-blown computer 5-10 years from now. Apple may have another product in the pipeline that is meant as a replacement for a laptop. However, just going by rumors, this is not meant to replace your computer.

I don't understand why Apple hasn't yet introduced a multi-touch keyboard/mouse surface like the ones FingerWorks made. For a company that prides itself on innovation, they're being surprisingly conservative. I was surprised that they came out with the Magic Mouse at all, instead of an external trackpad, which would have been more useful considering the usefulness of the MacBook (Pro) trackpad.

As a geek, I find that the movement from keyboard to mouse/trackpad and back again wastes precious seconds from a day of heavy coding. That's why I've memorized as many keyboard shortcuts as possible, and why IDEs such as Eclipse have introduced so many of them, even for such mouse-intensive tasks as switching from the editor to the project view. In my close to 10-year career as a developer, I've probably spent over a solid month switching from keyboard to mouse.

I'm hoping Apple is working on this, and is just doing its due diligence in resolving the design, ergonomic, and QA issues inherent in such a device.

I hope so too. I have my own grievances with the new highly reflective screens, but Apple has proven it places the design of its hardware above ergonomics. Luckily there was a massive outpouring of hostility concerning that, and Apple has offered a few of the older options.

Isn't there a third party keyboard that makes your job a lot easier?

Apple has found a need for a stylus on the iPhone/iPod Touch. Note the point-of-sale iPod Touches used in their retail stores. They now use a stylus so customers can sign their name. There are two uses for a stylus: signatures and detailed drawing. For writing, not so much.

Luckily all you need for a stylus on these devices is an electrically conductive stick with no electronics. I predict an updated OS for the tablet and phone will support this type of stylus by including the Ink technology that is already in the Mac OS. It supports drawing and handwriting.

It's not a bad idea, but it's not very useful. The problem with using this device as some kind of mouse or alternative input device for a desktop is that it makes for a sort of screwy user experience. You will have two screens in two very different positions, and you lose that sense of touch-based intuitiveness that the iPhone OS gives you, because the primary machine is the desktop. It's convoluted, and that seems very un-Apple. This is not even considering that if it has the iPhone's concave design, it can't even rest flat on a surface.

This tablet may just be an e-reader of sorts or may eventually become a full-blown computer 5-10 years from now. Apple may have another product in the pipeline that is meant as a replacement for a laptop. However, just going by rumors, this is not meant to replace your computer.

I see things the opposite way. I actually think the way we have been doing things is what is convoluted. We have just been conditioned not to see that. Think about it: what I am really describing is a modern version of the paper and the book. Your tablet is the paper and your primary PC is the book, only they can both interact with each other. And the paper can even function as a keyboard.

I believe the hurdle is understanding how to use these things most efficiently. The way I think of it, your tablet is used for manipulating things hands-on, and the primary PC is used for reference images or a wider view, etc. It's really just like dual monitors, but on top of each other rather than side by side.

Also, keep in mind that Apple has patents for a laptop with 2 screens, so it would be like taking an Apple laptop and putting another screen where the keyboard is, and you would use this screen as an input device/keyboard. This is the same idea really.

Although some cannot for physical reasons, we never "stopped using fingers to paint with". Brushes, pencils and pens are just an extension of our fingers.

True, but the point was that we incorporate tools to facilitate more accuracy and options. We don't call eating with utensils finger food, despite our fingers being a key part of the procedure.

Quote:

Originally Posted by JavaCowboy

I don't understand why Apple hasn't yet introduced a multi-touch keyboard/mouse surface like the ones FingerWorks made. For a company that prides itself on innovation, they're being surprisingly conservative. I was surprised that they came out with the Magic Mouse at all, instead of an external trackpad, which would have been more useful considering the usefulness of the MacBook (Pro) trackpad.

As a geek, I find that the movement from keyboard to mouse/trackpad and back again wastes precious seconds from a day of heavy coding. That's why I've memorized as many keyboard shortcuts as possible, and why IDEs such as Eclipse have introduced so many of them, even for such mouse-intensive tasks as switching from the editor to the project view. In my close to 10-year career as a developer, I've probably spent over a solid month switching from keyboard to mouse.

I'm hoping Apple is working on this, and is just doing its due diligence in resolving the design, ergonomic, and QA issues inherent in such a device.

As you are aware, real innovation involves making something more useful than before, not simply making it cool. Sci-fi movies are full of visually appealing concepts that are ultimately dumb. Most recently Avatar, with its see-through displays. And people think MBP glare is bad.

I think a flat touchpad keyboard is a horrible idea with today's tech. I want something my digits can register against when pressed. I want to be able to feel where the keys are when pressed. One day this will likely be resolved but it won't be anytime soon.

I think Jobs hates having to ship different keyboards for different countries, and probably hates that they can't fit everyone's needs in one design. We all remember the failed Optimus keyboard. A great idea using OLED keys, but being "ahead of your time" often isn't profitable. You have to strike when the moment is right; in this case that means when the tech is viable. Historically, Apple hasn't been the first of the major HW vendors to market, but they're usually first to make the change across the line.

I think we're no more than 5 years off from having OLED keys that change as needed. You hold down Option and you see the rest of the character palette. But that isn't the whole palette, and it doesn't show you all the various Unicode characters, not to mention user and programming mappings for a wide variety of uses. So I predict that when this change comes it will be accompanied by a shift in the keyboard layout. Meaning, for instance, the Option key may be a sticky toggle key that instantly changes the entire keyboard layout each time you press it, instead of just changing to alternate characters while holding it down.

(Typed from iPhone, please excuse errors)
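As a toy illustration of that sticky-Option idea (the layout names here are invented purely for the example, not anything Apple ships), each press of the key cycles the whole keyboard to the next layout instead of acting as a held modifier:

```python
# Hypothetical sticky-Option sketch: each press of Option advances the
# whole (OLED) keyboard to the next full layout, rather than showing
# alternate characters only while the key is held down.

LAYOUTS = ["letters", "symbols", "unicode"]  # illustrative layout names

class StickyOptionKeyboard:
    def __init__(self):
        self.index = 0  # start on the base layout

    @property
    def layout(self):
        return LAYOUTS[self.index]

    def press_option(self):
        """One tap of the sticky Option key: cycle to the next layout."""
        self.index = (self.index + 1) % len(LAYOUTS)
        return self.layout
```

Pressing the key walks letters -> symbols -> unicode and back to letters, which is exactly the toggle-rather-than-hold behaviour described above.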

It's not. Not my opinion. I'll explain why it's terrible. Tracking: lame-ass terrible tracking compared to decent mice. Though to be fair that's more of an OS X problem, but it further demonstrates that Apple do not understand the mouse, even if they introduced it to the masses.

You should not have to explain right-click to anyone, ever. And I don't mean explaining why it exists or what it does, but how to click it. Apple call this mouse "the world's first multi-touch mouse". If it has multi-touch, then they should have allowed the user to right-click without the need to lift their index finger. Another fundamental flaw.

The Magic Mouse is a simple demonstration of how clueless Apple can actually be sometimes. They think the mouse is "clever", but it's actually stupid. They make the mouse this way because 1) they are over-thinking the mouse, over-engineering it, and 2) for "looks".

Apple understands software--for the most part--and they make the best keyboards in the industry. But Apple simply do not get what a mouse should be, and the fact that it should be designed for the human hand. Not some tiny-baby-flat-index-lifting-over-explaining-lame-ass-tracking alien hand.

I see things the opposite way. I actually think the way we have been doing things is what is convoluted. We have just been conditioned not to see that. Think about it: what I am really describing is a modern version of the paper and the book. Your tablet is the paper and your primary PC is the book, only they can both interact with each other. And the paper can even function as a keyboard.

I believe the hurdle is understanding how to use these things most efficiently. The way I think of it, your tablet is used for manipulating things hands-on, and the primary PC is used for reference images or a wider view, etc. It's really just like dual monitors, but on top of each other rather than side by side.

Also, keep in mind that Apple has patents for a laptop with 2 screens, so it would be like taking an Apple laptop and putting another screen where the keyboard is, and you would use this screen as an input device/keyboard. This is the same idea really.

Good points. Now that you mention it, I can see it being beneficial to a headless machine like a Mac Mini or Mac Pro, but I wonder how useful it would be on an iMac or the MacBook line.

The only problem I have is that I would rather have it be an actual computer instead of being reliant on a more powerful machine. My MacBook is my primary machine. I really don't need two devices, and I'm sure most feel that way. I'm sure Apple realizes this is the future, but they may not view the technology as being "there" yet, or maybe they are taking baby steps with the platform.

Thanks str1f3. I know what you mean about wanting a more powerful machine. I would like that too. But I think this may be just stage 2 of the touch revolution. The next stage is to put touch interfaces on everything, including high-power laptops and monitors.

My hope is that even though the tablet won't be the most powerful thing, they can make it so versatile that it's worthwhile. I see it as a keyboard from the future, drawing pad, media player, second monitor, remote control, custom keyboard for any application, or a cheap laptop for people that don't really need a high-power laptop.

If you already have a laptop and a desktop and an iPhone, I could see how this would seem redundant even with all the capabilities. There are a lot of people that just have a desktop, though. This would be perfect for them.