Apple job listing tips new API, framework for next version of iOS

A job listing on Apple's own website indicates that the company will be rolling out brand new APIs and frameworks for the next version of its iOS mobile operating system.

The listing, discovered by AppleInsider, seeks an iOS Frameworks QA Engineer and entices potential applicants with the promise of developing "the very first iPhone/iPad app that uses a new API/framework in the next version of iOS." That engineer would join the iOS team, working with different engineering teams within iOS and Apple to ensure the quality of APIs and frameworks.

Major iOS releases are typically revealed at Apple's WWDC in June.

While the listing gives no indication of what specific new APIs could appear in the next version of iOS, it is an early confirmation that the next version of Apple's mobile OS will feature new and expanded APIs and frameworks. This, in itself, is no surprise, as Apple is continually adding features to its offerings; it is, though, a rare public acknowledgement from the notoriously secretive company.

The listing is one of a number on Apple's jobs page making reference to future versions of iOS. Another listing calls for an iOS Software Engineer working in App Compatibility, who would be charged with "analysis of issues found in existing and future releases of iOS software."

The listings give no indication as to when the next major version of iOS will debut. But if history is any indication, Apple will unveil what will presumably be known as "iOS 7" at its annual Worldwide Developers Conference in June.

Last year's WWDC was held in early June, where the company offered its first public glimpse of iOS 6 with an all-new Maps application, as well as the new Passbook feature. The software for iPhone, iPad and iPod touch launched three months later, in September.

A job listing on Apple's own website indicates that the company will be rolling out brand new APIs and frameworks for the next version of its iOS mobile operating system.

Ya think?

--

What do you all think? Will they completely revamp the current concept? Will they create 'Live Tiles' or alter the Springboard? Or will they keep enhancing every little aspect of iOS, building on a solid foundation and expanding on proven tech?

edit: Sol has given us a couple of ideas which have been 'warmly received'. One was this, I believe:

I like this, but that's already taken up by Notification Center. They can either move Notification Center back to the very top of the lock screen (so then it would be consistent through the whole OS) or get rid of it entirely.

I don't understand this. Is it "swipe down with one finger, starting on the app in question, which shows notifications for said app"? What's wrong with better organization in Notification Center?

I've never understood these; what's the point? It seems like a waste of RAM/temporary storage, and is anything these previews could show really important or large enough to be meaningful to see like this?

New frameworks and APIs? How about a biometrics framework? As in face recognition through the FaceTime camera, and the (rumored) thumbprint scanner in the iPhone 5S home button?

Typing passwords is the worst typing chore in iOS right now. Strong passwords should be long, should contain capital letters, numbers, and punctuation, and shouldn't be auto-completed by UITextField. iOS could be made more secure, and security itself made far more convenient, through some kind of biometric recognition framework.

Come to think of it, maybe Apple's long-term plan for iOS (and maybe also OS X) is to eliminate as much typing as possible. Siri queries, dictation, and biometrics might be just the first steps toward that goal. And speaking your password would be terribly insecure. (Unless there's also a voice match framework to detect individual users' voices...)
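The password rules in the post above (long, capital letters, numbers, punctuation) can be sketched as a simple strength check. This is just an illustration in Python, not any Apple API; the function name and minimum length are made up:

```python
import string

def is_strong_password(pw: str, min_length: int = 10) -> bool:
    """Rough strength check mirroring the rules above: long,
    with capital letters, lowercase letters, numbers, and punctuation."""
    return (
        len(pw) >= min_length
        and any(c.isupper() for c in pw)
        and any(c.islower() for c in pw)
        and any(c.isdigit() for c in pw)
        and any(c in string.punctuation for c in pw)
    )
```

Of course, the whole point of the post is that users shouldn't have to type strings like that at all; a biometric check would sidestep the chore entirely.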

I had imagined the Settings options would be revealed by swiping the clock to the right or left (as shown below in my abysmal mockup) but I think yours is much more elegant with the single pull down.

Quote:

Or perhaps this idea from the internet is something that would work:

I like the way that looks, but I can't figure out how one would activate it. Long hold on the icon? That's taken. A quick double tap? Does that break the guidelines? Whatever they do, I hope they do something, as I find Notifications to be less than optimal. I only seem to check it for weather once every week or so, which means cleaning out all the old notifications.

BB10 has some good ideas. I don't think Apple is afraid of losing marketshare to them, or losing money to Android-based devices, but I do hope we get something that truly adds to the usefulness of iOS 7 without affecting current usability.

This bot has been removed from circulation due to a malfunctioning morality chip.

New frameworks and APIs? How about a biometrics framework? As in face recognition through the FaceTime camera, and the (rumored) thumbprint scanner in the iPhone 5S home button?

Typing passwords is the worst typing chore in iOS right now. Strong passwords should be long, should contain capital letters, numbers, and punctuation, and shouldn't be auto-completed by UITextField. iOS could be made more secure, and security itself made far more convenient, through some kind of biometric recognition framework.

Come to think of it, maybe Apple's long-term plan for iOS (and maybe also OS X) is to eliminate as much typing as possible. Siri queries, dictation, and biometrics might be just the first steps toward that goal. And speaking your password would be terribly insecure. (Unless there's also a voice match framework to detect individual users' voices...)

Great points! I agree, I'm constantly typing in PW's in my iPhone. Would love just to place my thumb on my iPhone.

A new version of an operating system (any operating system) with no new APIs would be truly baffling.

I was thinking the same thing. I am under the impression there are hundreds of new APIs for iOS each year, but perhaps we should be focusing on frameworks. How many new frameworks appear?

Frameworks are usually collections of APIs, and are useful for interfacing with new core features, such as third-party apps being able to communicate with Passbook or Maps, for example. Each framework might contain dozens of individual APIs, and that communication can be streamlined for the developer by going through the framework.
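The relationship described above (a framework as one streamlined layer over many lower-level APIs) can be sketched abstractly. This is a toy illustration in Python, with entirely made-up names; it is not modeled on any real Apple framework:

```python
class PassStore:
    """Stand-in for the low-level APIs a Passbook-style framework might wrap."""
    def __init__(self):
        self._passes = {}

    def add(self, pass_id, data):
        self._passes[pass_id] = data

    def get(self, pass_id):
        return self._passes.get(pass_id)


class PassKitFacade:
    """The 'framework' layer: a single streamlined entry point for the
    developer, hiding the individual lower-level calls behind it."""
    def __init__(self):
        self._store = PassStore()

    def add_boarding_pass(self, flight, gate):
        # One high-level call performs several low-level API calls.
        self._store.add(flight, {"type": "boarding", "gate": gate})
        return self._store.get(flight)
```

The developer calls `add_boarding_pass` once rather than juggling the store directly; that is the streamlining the post refers to.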

I've never understood these; what's the point? It seems like a waste of RAM/temporary storage, and is anything these previews could show really important or large enough to be meaningful to see like this?

Hit-or-miss AI member PhilBoogie's publication on iOS 7 concepts forgot to include a much-requested feature found online that is already in the current version of iOS: a password-protected login screen:

New frameworks and APIs? How about a biometrics framework? As in face recognition through the FaceTime camera, and the (rumored) thumbprint scanner in the iPhone 5S home button?

Typing passwords is the worst typing chore in iOS right now. Strong passwords should be long, should contain capital letters, numbers, and punctuation, and shouldn't be auto-completed by UITextField. iOS could be made more secure, and security itself made far more convenient, through some kind of biometric recognition framework.

Come to think of it, maybe Apple's long-term plan for iOS (and maybe also OS X) is to eliminate as much typing as possible. Siri queries, dictation, and biometrics might be just the first steps toward that goal. And speaking your password would be terribly insecure. (Unless there's also a voice match framework to detect individual users' voices...)

The thing is, nothing is more secure than data locked away in your brain. Retinal scanning, colonic mapping, fingerprinting, voice authentication, and facial recognition can all be copied and used without you knowing. The only way to get your password is to have you give it to the intruder in some way, or through a brute-force attack. Sure, you'd have to give your fingerprints to someone too, and they could technically try various patterns until enough of the points match to fool a scanner, but they don't need to do that, because you are giving them away every time you touch something.

Personally, I'll never rely on just a fingerprint if it's added, and I doubt it will be faster than typing in my four-digit PIN.

A way I can see fingerprint recognition working for Apple in the long run is not for security identification, but for usability identification. Making no sense? Allow me to explain with a scenario. You come home from work. Your kid is watching TV, some cartoon or whatever, but it's now your time to watch the big screen of your 120" 4K IPS IGZO TV set. You pick up the remote control, and sensors on it relay your biometric data back to the television, so now everything you do is tailored to your settings, your preferences, your viewing history, and so on. No more single DVR that saves all the viewers' data in one giant unmanageable clump. Your content is queued up and ready to go. You can speak to your remote control through the built-in Siri microphone to tell it what to do next.


I like this, but that's already taken up by Notification Center. They can either move Notification Center back to the very top of the lock screen (so then it would be consistent through the whole OS) or get rid of it entirely.

Note that it's invisible when not in use; there is no Notification Center drop-down on the lock screen. If any notifications had come up, the swipe down would simply move them down a bit. This concept works very well.

Quote:

I've never understood these; what's the point? It seems like a waste of RAM/temporary storage, and is anything these previews could show really important or large enough to be meaningful to see like this?

This JB solution looks nice, but I'm not a fan. It shows a lot of pointless info that doesn't make it more usable. Finding the app by its icon is faster than by a condensed version of the page you were on. If you're a web developer checking the same pages in multiple browsers, the whole thing becomes pointless and the screenshot gets in the way of the icon.


I love this idea! We need something that lets me quickly turn Wi-Fi and Bluetooth on and off. I don't leave my Wi-Fi on outside the house, as random coffee shops have free Wi-Fi but you have to go to their webpage in Safari before it works. That's a long process, and I prefer to just use 3G to save time.

I love this idea! We need something that lets me quickly turn Wi-Fi and Bluetooth on and off. I don't leave my Wi-Fi on outside the house, as random coffee shops have free Wi-Fi but you have to go to their webpage in Safari before it works. That's a long process, and I prefer to just use 3G to save time.
Having a quick way to access these controls is really nice.

I would add brightness controls to that list in some way. I will often adjust my brightness from min to max, but when you come from a dark enclosure into a bright one, like direct sunlight, it can be impossible to see the screen. Going the other way, it can be so bright it's distracting to others. I propose a visually unobtrusive solution: simply run your finger in a clockwise or counterclockwise circle on the lock screen to adjust the brightness up or down, respectively.
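The circular-gesture idea boils down to mapping angular finger motion to a brightness delta. A minimal sketch of that mapping, in Python purely for illustration (the center point, sensitivity value, and function names are all made up):

```python
import math

def brightness_delta(cx, cy, p0, p1, sensitivity=0.25):
    """Map a finger movement from p0 to p1 around center (cx, cy) to a
    brightness change: clockwise raises it, counterclockwise lowers it."""
    a0 = math.atan2(p0[1] - cy, p0[0] - cx)
    a1 = math.atan2(p1[1] - cy, p1[0] - cx)
    # wrap the angular difference into [-pi, pi)
    d = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    # screen y grows downward, so visually clockwise motion increases the angle
    return d * sensitivity

def apply_delta(brightness, delta):
    """Clamp the adjusted brightness to the [0.0, 1.0] range."""
    return max(0.0, min(1.0, brightness + delta))
```

A quarter turn clockwise from the right edge of the circle to the bottom yields a positive delta, so continuous circling smoothly ramps the brightness without any on-screen slider.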


I would add brightness controls to that list in some way. I will often adjust my brightness from min to max, but when you come from a dark enclosure into a bright one, like direct sunlight, it can be impossible to see the screen. Going the other way, it can be so bright it's distracting to others. I propose a visually unobtrusive solution: simply run your finger in a clockwise or counterclockwise circle on the lock screen to adjust the brightness up or down, respectively.

I was going to say that also, but I find the auto brightness works fairly well (though it can be slow sometimes). I also don't want too much clutter on the screen, but a solution like what you are suggesting wouldn't require another slider or switch, just a finger gesture. I like it!

I was going to say that also, but I find the auto brightness works fairly well (though it can be slow sometimes). I also don't want too much clutter on the screen, but a solution like what you are suggesting wouldn't require another slider or switch, just a finger gesture. I like it!

My main complaint with the auto brightness was that it used to go torch-bright on the unlock screen, which is painful in the dark. It's very puzzling why they did that. It looks like they fixed it in iOS 6.1. I thought I'd try it again on a lark as I was writing this post.

Edited by JeffDM - 1/23/13 at 2:33pm