After years of complacency – and falling sales – Apple has transformed the iPad into something it should have been from the start: a proper computer.
iOS 11, which dropped on Tuesday evening, does relatively little for the iPhone but a great deal for the iPad. It isn't too much of an exaggeration to say that a mature computer …

For the vast, vast majority of users (including me) it's a consumption device with a bigger screen than a phone for old (or young) eyes. And it works perfectly for that. Wouldn't be without it, use it multiple times a day and have since the second generation was launched.

And when I travel for work, work laptop and personal iPad keep things nicely compartmented.

'work laptop and personal iPad keep things nicely compartmented'

This, totally this.

It's the mental equivalent of taking the suit off and popping your jeans on, plus I can browse and use apps on my iPad I wouldn't wish to do on my laptop (no not that, but really embarrassing stuff like El Reg)

Genuinely curious. My wife's iPad 2 basically stopped working (well, most of the apps did) after one of Apple's "forced obsolescence" updates. We took it to the Apple store, who suggested we buy a new iPad.*

I assume this was a year or so back; Tesco announced in October 2015 that there would not be a successor to the Hudl2 - released in October 2014.

My iPad 2 (16GB, iOS 9.3.5) is still going okay, although given the huge performance difference between it and the Air 2, its shortening battery life and limited memory, it has been relegated to 'leisure' usage.

I learnt that the best way to avoid failed apps after an iOS update is to ensure all apps are updated before installing the new edition of iOS. Also, iOS, like other OSes, doesn't perform well with constrained storage (less than a couple of GB free). Perhaps you should do a factory reset and reinstall only selected applications, rather than fill the storage.

It is always a good practice to wait for the .1 (not 0.1 but .1) update if you have hardware older than two years. Many suggest that Apple doesn't optimize for older hardware on the initial release, concentrating more on getting the features in and making sure it runs well on the new iPhone that comes out at the same time.

People with older stuff are unhappy that it is slower, then the .1 comes out next month and those issues are mitigated. Apple obviously targets performance regressions in older hardware in the .1 version.

The unanswered question is how long did it take with this same iPad running on iOS 10? If it went from 9 to 12 seconds, with all of the background activity still going on, I'm not too worried about performance (but I still will wait for a few updates to come out). 2 seconds to 12, yeah, that's an issue. 20 seconds to 12 (doubtful from the context) and the Bootnote is misdirecting.

Is this a step backwards?

The whole point of the iPad product (and its OS) was the bonkers sand-boxing that made it almost impossible for one app to muck about with another. It was a significant impediment to malware and in combination with not letting *users* fiddle at the file-system level it made the iPad pretty safe for Joe User or indeed Joe User's offspring.

If they are now relaxing all that with a proper files app and letting folks use the thing more like a real computer, perhaps that is a retrograde step. Worse, perhaps it is not a big enough retrograde step, since anyone who actually wanted a "proper computer in a tablet format" already has quite a few options that have gone the rest of the way.

Re: Is this a step backwards?

No, it's fine, because it's not a "proper files app" in the sense of the Finder. It primarily lets you navigate your iCloud, Dropbox, etc. storage and maintains local caches of it, providing applications with pipes into and out of that conceptually remote storage. It doesn't expose the native file system. You can't investigate /Applications or /Library or /usr/bin or anything else. iOS-level sandboxing still applies.

Well, it took the rest of the world long enough to (nearly) catch up with proper font scaffolding and sub-pixel hinting, so something as incredibly arcane as app-to-app, app-to-file and file-to-app drag and drop was bound to take longer.

When your userbase was as UI-obsessed as Apple's was during the late 90s/early 00s, yes. Given that anyone who wasn't entirely UI-obsessed wouldn't have touched the crap they were churning out then with a barge pole.

Remember, when we're discussing early Mac OS X, we're not talking about the sleek, shiny, high-end post-2006 Intel-powered Apple, but the pre-2005 PowerPC Apple, whose hardware was a shoddy joke (though still with an eye-watering price tag). The interface was the only thing they had going for them, so interface screw-ups were a big deal in Appleland. People were genuinely shocked that the 2007 MacBook also turned out to be a good computer underneath the shiny interface, because we'd not seen a competitive bit of hardware from Apple since about 1983.

Re: Innovation? We've heard of it

Re: "network lag"

Not sure why you were downvoted, as that seems like one perfectly reasonable explanation.

I also suspect a lot (most?) of it has to do with the VM bloat of Java Dalvik/ART, and the fact that Android in general is a convoluted mess under the hood. That wouldn't explain why iOS has roughly the same performance issues, though, since AFAIK it's fully native. I presume Windows Mobile is using some .NET/CLI garbage.

Or maybe it's that relic known as Secure Digital storage, or the pitiful speed of ARM main buses, or some other bottleneck that isn't obvious.

Whatever it is, it's truly shocking that the technology is moving so slowly that it's still outperformed by thirty year-old systems from the 16-bit era.

Re: Innovation? We've heard of it

I hope the iOS 11 multitasking will let me start loading a webpage, go back to my phone/inbox/messages or whatever, and then return to a fully loaded web page, all fluidly, i.e. without apps deciding that they can't listen to or act on user input while waiting for comms, so that you have to wait until the device either loads the page or times out...

Re: Innovation? We've heard of it

I'd bet if you wrote an OS for the phone in machine language customized to the exact CPU set & hardware in the unit & took out most of the GUI stuff and had your applications written in assembly with no extra linked libraries of stuff they don't need, you would see your phone work faster than you could imagine.

However, you'd have almost no apps as it's a bit hard to copy/paste your way to working programs in assembly or machine language.

There was an article a few years back about a company building some car sensors / displays based on a very stripped down & hardware customized version of Linux, getting sub second power on till fully working & accepting input / displaying results times.

Then for extra points you could always try for the compressed self relocating machine code type programming.

Re: Innovation? We've heard of it

You're not talking about Amigas and their applications then, as those were written in C with linked libraries and functional GUI elements. Yes, games would dump the OS and multitasking and hit the hardware, with probably a fair bit of assembly.

Modern compilers will generally produce quicker code than hand assembly, unless there are some major CPU instructions the compiler can't use, like SIMD. Most of the time you'll be slower.

The problem is more likely (in no particular order):

Bling GUI elements like transparency, fade-in and zoom effects, or wobble.

Lots of code which isn't compiled into native CPU instructions: things like JavaScript and bytecode languages.

Text-based protocols and formats like HTML and XML, which have to be scanned byte by byte.

You can't even tell how many bytes a character is without checking each one, though mostly it doesn't matter.

Use of frameworks where 5% of the features are used, but you have to take the hit of the complexity of the other 95%.
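The variable-width point above is easy to see with UTF-8, where a single character occupies anywhere from one to four bytes, and a scanner can only find character boundaries by inspecting each lead byte. A minimal Python sketch (the helper name is my own, just for illustration):

```python
# UTF-8 encodes each code point in 1-4 bytes; the lead byte's high bits
# give the width, so a parser must look at every byte to find character
# boundaries -- exactly the per-byte scanning cost described above.

def utf8_width(lead: int) -> int:
    """Return the byte length of the UTF-8 sequence starting at `lead`."""
    if lead < 0x80:
        return 1                 # 0xxxxxxx: plain ASCII
    if lead >> 5 == 0b110:
        return 2                 # 110xxxxx
    if lead >> 4 == 0b1110:
        return 3                 # 1110xxxx
    if lead >> 3 == 0b11110:
        return 4                 # 11110xxx
    raise ValueError("continuation byte is not a valid sequence start")

for ch in "a£€𝄞":                # 1-, 2-, 3- and 4-byte characters
    encoded = ch.encode("utf-8")
    assert utf8_width(encoded[0]) == len(encoded)
    print(ch, len(encoded))
```

Binary formats sidestep this by carrying explicit lengths, which is part of why parsing them is cheaper than scanning text.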

Re: "you'd have almost no apps"

This modern idea that software has to be written in some bloated framework with a bloated VM or else there will be "no apps" is in blatant contradiction to the fact that the likes of AmigaOS was written in BCPL and its games and apps were mostly written in low-level languages, and yet there was no shortage of software available; in fact, there was a veritable explosion of it.

The inevitable conclusion is that today's programmers, or at least the millennials, are lazy, uneducated and incompetent, and the centrepiece of their workflow is something that would look more at home in Toys R Us than at a software engineer's desk.

Re: "you'd have almost no apps"

> The inevitable conclusion is that today's programmers, or at least the millennials, are lazy, uneducated and incompetent, and the centrepiece of their workflow is something that would look more at home in Toys R Us than at a software engineer's desk.

Okay... why single out the "millennials" when we've experienced stuttery computers for decades? Might the answer instead be that the ambitions for people's home computer systems have always exceeded their grasp?

It's the Red Queen effect. As soon as one feature is perfected someone (be it the user, the seller, the marketeer, the enthusiast) thinks of adding another, and at ever higher bitrates, pixel count and density, adverts per page, frames per second, milliseconds saved in latency, always ever faster faster faster. The coder might not have time to tie his shoelaces up if he's always running!

Re: Innovation? We've heard of it

> There was an article a few years back about a company building some car sensors / displays based on a very stripped down & hardware customized version of Linux, getting sub second power on till fully working & accepting input / displaying results times.

Whilst it was an interesting exercise, the same results could have been achieved with much less effort and cost by using an off-the-shelf RTOS...

Re: Innovation? We've heard of it

Seriously, it never ceases to amaze me how multi-core, multi-gigahertz, multi-gigabyte mobile systems can be so excruciatingly unresponsive compared with my 16-bit, 7MHz, 512KB Amiga from the 1980s.

How can this even be possible?

Well, for starters, your Amiga is doing all the bounds checking and type safety that my old Speccy used to do...

Probably more importantly - and more seriously - it's a smaller system. Fewer interdependencies. People talk about wanting back the time they spent watching a crap film. Screw that. I want back the time I've spent watching Java and .NET programs start up. I'm not sure how much time I'd get back exactly, but I suspect that your grandkids will know me as "that guy who's functionally immortal".

Your Amiga could print text to screen with mere kilobytes of dependent libraries. These days we have to wait for megabytes of dependencies to load. Usually because some idiot developer thinks that maybe, some day, they'll need to parse JSON or make a raw TCP socket connection or whatever - so they should definitely have that in their project.
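One mitigation for the "maybe some day they'll need it" dependency problem is to defer the load until first use. A hedged sketch of this using Python's standard lazy-loader recipe (the `json` choice is just an illustrative stand-in for any heavyweight dependency):

```python
import importlib.util
import sys
import time

def lazy_import(name: str):
    """Return a module whose real loading is deferred until first
    attribute access (the importlib.util.LazyLoader recipe)."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)       # registers, but does not execute, the module
    return module

t0 = time.perf_counter()
json = lazy_import("json")           # near-instant: no module code has run yet
setup_cost = time.perf_counter() - t0

json.dumps({"paid": "on first use"})  # the real import happens here
print(f"deferred setup took {setup_cost * 1000:.3f} ms")
```

Start-up only pays for what it actually touches; the JSON parser (or raw socket support, or whatever) is loaded when, and only if, it is first used.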

Re: "Hazy watercolor memories... Of..."

If you profile the start-up of a "modern" application, you find that three things generally dominate the profile:

1) Loading from disk. The application has got bigger by orders of magnitude, partly because of code bloat, but mainly because of graphics and other shinies. Your Amiga had a tiny resolution screen in comparison to people running stuff on 4k monitors that have to scale.

2) Memory copies. Oh my god. Don't get me started here. I write high performance applications for a living, and think carefully about every memory copy. But if you follow the memory copies that occur in a "modern" application they are depressing. Pull something off disk into a temporary buffer; copy it into another location whilst you figure out how to use it; clone it 3 times into new locations, most of which die. And on top of that you have garbage collectors that are designed to run during the stable state of the application where everything is neatly divided into long-lived objects and short-life transient objects. But at program start-up you don't know which is which, so the GC often ends up copying objects that will become long lived multiple times until it figures out that the object is going to last through the lifetime of the app.

3) Pointless UI hooks. This one isn't quite as big, but a modern GUI has a shed tonne of UI hooks. Not just your basic "OnClick" but also UI hooks for when something is loaded, when it is rendered, when something is rendered on top of it, when the mouse moves over it, etc. A modern OS might easily have 50-100 UI hooks for every UI element. All these get called multiple times during start-up. Even if no code is implemented in the hook, due to the virtual implementation of these hooks, they often have to be called to figure that out. Especially in dynamic or JIT compiled languages.

It could all be improved. I've seen number 3 fixed by disabling all UI hooks during start-up, but that isn't always possible in every GUI framework. Number 1 has improved more recently by throwing SSDs at it. There are some system-level things you can do for number 2 (improved GC, for example). But in general, it is down to the app writers. Lazy loading of objects from disk that you don't need for initial start-up, or reduced memory copies in your app, are both things the app writer has to figure out. Unfortunately, like installers, it isn't irritating enough to the app developer to actually bother fixing. (If you have just waited half an hour for a compile or a regression test suite to run, how is 10s to start the app going to bother you?)

Re: "Hazy watercolor memories... Of..."

> Your Amiga had a tiny resolution screen in comparison to people running stuff on 4k monitors that have to scale.

That should actually point to vector-based visual elements that can genuinely scale, rather than something that is merely oversized to the point where enlarging it won't pixelate it too much. The former should take up less space rather than more.

The iPad is still what it has always been, a limited functionality device perfect for sofa-surfing and web-browsing, not worth considering for anything serious.

The latest upgrade further attempts to blur the distinction between the iPad and a proper computer for actually doing work on, and as usual with Apple makes a complete between-two-stools dog's dinner of it.

For God's sake, Apple, leave the iPad alone. Since well before Jobs died, Apple have been focused on being a social status signifier and a leisure computing brand.

If sections of the organisation really still want to pretend they're touting a serious OS for doing actual productive work on, Apple should concentrate on forcing people to spend well over the odds for the MacBook UltraThin SuperShiny (or whatever the laptops are called these days) rather than trying to make the iPad an all-singing all-dancing device.


Or produce a 'business' version.

Surely one of the key advantages of the iPad (and iOS 4) was its simplicity of operation and portability, which made IT accessible to a much wider audience.

You could (and I know people who did) give an iPad to a child with learning difficulties and they could use it: play their favourite music, videos, games etc. Something you can't do with a Windows tablet - unless you want the kid to use it as a frisbee...

Hence perhaps what is needed is an iOS for everyday users and an iOS for business; given what Doro and Samsung have done to make Android more accessible, I suspect these could be the same version, just with different UI skins.

"You could (and I know people who did) give an iPad to a child with learning difficulties"

I bought myself an iPod touch around 2008 and within days my autistic, non-verbal 10-year-old son had managed to get his hands on it. It was a revelation! Before, he had a big keyboard with a two-line LCD panel which spoke the words he typed in, which naturally ended up not being used because it was too unwieldy. My son took to the iPod like a fish to water. He could play his music and, more importantly, type into it to tell people what he wanted, all in a lightweight, easy-to-carry handheld device. I think he is on his third or fourth iPod touch now, and when this one dies I don't know what we'll do, now that Apple have discontinued them.

I use an iPad on iOS 11 and it works really well. It's an excellent tool for me to do ad-hoc remote support and access, email, web browsing, updating documentation on site, and watching films in bed.

iOS used to have a USB host mode that you could use to mount external storage - does it still? Also (again, this was some time ago) I seem to recall that you could move stuff to and fro using iTunes (and not just media - I used to move PDFs across).

And as someone previously mentioned - there are lots of OTA methods of moving files.

DFC/IP - new to iOS 11

Hidden deep in the iOS 11 developer notes is the intriguing DFC/IP. This has involved significant enhancements to the iOS networking stack, which now has deep and full access to the GPU and Apple's new Neural Net Engine, and was quietly built with the support of a number of industry-leading personal entertainment providers. IP we all know about; but what of DFC? Well, it stands for Dynamic Fleshtone Compression, and it allows deep packet inspection during the video streaming process to fully optimise the compression of the full gamut of flesh tones (as well as a few additional colours to cover the various fluids found in such productions). Apple is being somewhat coy about the whole thing, and has not yet sent the spec out for ratification, but I'm sure this will happen.

There's an interesting discussion on all of this from a rather happy looking Distinguished Apple Engineer here (SFW): goo.gl/SsAhv

I've been testing DFC/IP on my own in the end cubicle between 3-4pm at work most week days since the original iOS beta, and, apart from a bit of rawness (me, not the streamed video images) it all works very well. I am deeply satisfied.

Re: Optional

School attended by my offspring has signed them all up for Google Drive (etc.) without even asking. Now they are expected to use it to complete assignments from home.

What a shame their main computer is a Pi which doesn't quite manage to make Drive work...

Then there's selling (presumably) my mobile phone number (again without asking - my number is on record purely for use as an emergency contact) to a commercial company which wants me to sign up to some kind of proprietary twitter-like thing so that the school can "keep in touch", something it could do just as effectively by other methods but has a pretty poor track record of doing so.

Re: Optional

So long as you don't blame the IT guys, your criticism of school admin processes is pretty accurate.

Currently I have parents asking why I have four different systems for booking different things. It would be rude to respond "Do you know how many I have internally?". Someone goes on a course, sees a thing, another school says they have it, suddenly we HAVE to have it, we get it, realise it's the same as a module in the thing we have that we already pay a fortune for, half-ass an implementation to get it to talk to the same databases (just don't even go there), roll it out, give the parents YAFP (yet another flapping password), and then have to deal with all the differences, implementation, servers, licences, ways of working, data differences ("Oh, you want to opt-out from everything... let me just remove you manually from 20+ databases and hope the teachers didn't save your details").

We've heard of database sync. Shame most of the vendors we're forced to use haven't. I currently have... 1, 2, 3, 4... at least 5 copies of our primary kids+adults database information in various services (everything from Google Education to alumni software), not to mention all the little bits, assessment programs, website logins for outside services, etc. etc. etc. Of course, they all sync seamlessly and never have a difference of opinion on what's an acceptable password, email, address field (just address, or housename as a separate field, or house number, or is postcode included, does it need a town or not?), etc. and with the exception of Google, no decent import/conversion/sync routines to match them all up whatsoever. Oh, and sometimes data-import/conversion charges every time you want to actually suck in automatically more than the handful of data you could do manually.

Don't even get me started on the people who "opt-out" of communications and then complain they aren't getting the newsletters any more...

When is a toggle switch not a toggle switch?

When it is on an iPhone. Bluetooth and Wifi toggles in the control centre no longer turn off the respective radios, they merely disconnect. You have to go into settings to properly turn them off. I don't like this.

It amazes me that an iOS UPGRADE can take 1.8GB of space. Unless it comes with lots of HD videos included, I can only assume it's sloppy coding that makes everything so huge. It's not like they have to include loads of drivers for lots of different hardware like Windows or Linux, since they only need to support a few different specs of hardware.

As I mentioned on the BB QNX topic, they were able to put an OS, desktop environment and browser onto one floppy disk, how is it that iOS 11 needs 1000 x more space?

I am not singling out Apple here either as Android, Windows and Linux are all just as bad for growing in size every year.

Aww

"So the iPad remained essentially a picture frame that ran apps. And perhaps unsurprisingly, people weren't in a rush to upgrade their Apple picture frame on a regular basis... Because it was still a picture frame with apps.

...I've had just two iPads in more than seven years. They spend their time gathering dust. Given the lack of use, I wouldn't have one at all if the school didn't insist that children do assignments on them."

Andrew, did you draw the short straw in the "who's gonna review iOS 11" office sweepstake? Hardly off to an objective start, eh?

"I wanted to discover how it ran on just about the oldest iPad hardware compatible: the first Air, which is now four years old."

For comparison, how does Android Marshmallow run on a 4-year-old tablet (if at all)?

DDE, OLE, COM, CORBA...

I'm afraid Orlowski has confused DDE and COM: the former was often used to show updates between applications, while the latter was often used to show how documents could be embedded in another application (e.g. an Excel sheet in a Word document).

Despite their complexity, SOM/COM and DCOM/CORBA were far better designed than the web stack used today - they took security into the design directly and didn't try to bolt it on top of layers and layers of ill-designed protocols. They also let you design a clear, clean API (despite IDL being a little complex to use) with strong data typing to catch errors early, instead of spending most of the time parsing textual representations of data - because the web programmer is totally unable to understand binary data.

The only downside is that they weren't designed to work on networks where only a few TCP/IP ports are available.

I find it really ugly that, for an application to manage a service on the same PC, it has to run a web browser and then go through a whole stack starting from HTTP... using very little of the host OS's security.

Computer?

A computer is exactly what a device like iPad should not be. Computers are for computer people in white coats to tinker with. Apple changed the paradigm with the Macintosh introducing an appliance. These devices and appliances happen to be implemented with electronic computer technology. But users should not be exposed to the fact of the implementation. Apple even dropped the word computer from its name.

I'm a bit concerned about exposing things like file systems, although as an old-time computer person, file systems are ingrained in the way that I think. But files expose the memory hierarchy which itself is just implementation. Users should never have to think in these terms - only in terms of what they need to do.

I hope this is not a backward step for iPad, but I have only had iOS 11 for 24 hours and mostly it seems pretty good.

The OS X Dock came in for criticism from Bruce Tognazzini, a former Apple designer and UI expert.

But why remap the frikkin keyboard?

I'm not sure I buy the "computer replacement" thing

For me, iPad (and tablets generally) is primarily a content consumption device which doubles as a communication device and data entry device. For consuming video, it is first-rate. For games, clearly it has a lot of processing/graphical grunt these days for developers who can figure out a touch-centric UI. For people like me, it is ideal for displaying lyrics and guitar chords - and I assume the Pro is great for displaying musical score. Obviously it's fine for browsing the web too. I don't find it good for reading on but my iPad is old and clunky and heavy.

As a "star trek" pad it offers a great way to enter data, an electronic form/clip-board.

For skype et al it's a great solution.

Trying to innovate on the iPad is difficult largely because they pretty much nailed it ages ago, and there's not much left to change that regular users want. I don't upgrade, because a faster CPU doesn't make it better for consuming content.

Er....

"It isn't too much of an exaggeration to say that a mature computer platform has just fallen out of a clear blue sky."

Speaking as an enthusiastic iPad Pro owner (with keyboard) who has just updated to iOS 11 and thinks it's pretty damn good..... no, the above statement still isn't true. Witness One for the prosecution: the Files app — nice, an improvement, but still missing the most essential feature (like, actually opening files...)

Power guzzling iOS11

The biggest problem with iOS 11 is the power guzzling. My iPad Mini 2 used to last 8-10 hours, but since updating it only lasts 4-6 hours. Annoyingly, my iPad 2 is incompatible with any iOS beyond version 9.3.5, yet it still works perfectly. As usual with Apple, they build obsolescence into their products to force you to upgrade to a newer phone and iPad. The good thing with Apple is they don't charge you for it, so that's a bonus over other systems.

As far as the comments saying they're too expensive go, so are their rivals. Samsung phones are on a par with Apple's but spend more on advertising how marvellous they are, and yet they still use awful Android; I say awful because it is, unless you're used to using a Spectrum 48 and don't want to use anything more complicated.