iPhone PL lockdown

3.3.1 — Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs. Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited).

Now think about this. Basically, they have said "you can only use these PLs to originally write your code, use of any other PLs breaks this agreement." Now, assume iPhone continues to be widely successful and that...eventually PCs are replaced by iPad devices. And say everyone starts copying Apple's locked fortress model, where will PL innovation go?

I find it detestable as well. For reasons of PL innovation and plenty of other reasons beside.
Apple's stance continues to worsen at an accelerating pace. I wish I could believe that this will hurt their developer mindshare.

I hadn't noticed this before. This is appalling. While it should be easy enough to circumvent, and Apple will probably not sue or ban you unless you are Adobe, if this is accepted as legitimate behavior (which is unlikely) it could be a watershed moment for the industry.

Does this preclude writing apps in, say, an ML-to-C++ language implementation? How in the hell would Apple ever know?

Or if they did "find out," how would this be different from utility code generators such as lex/yacc, or a utility that generates a bunch of boilerplate screen forms? Does Apple really wish to preclude using lex/yacc in iPhone applications???

In addition to being offensive, this seems like a pretty stupid and unenforceable restriction.

I came to the realization not too long ago that every program we write implements a language. Some are quite simple, some are more 'complete', some are graphical, some are 'mouse click' languages, some might even be audio languages. But every user interface amounts to a set of linguistic rules and expressions.

I think this awareness has informed and perhaps improved my own application programming. It is a useful way to look at user interfaces - what sort of language are we forcing a user to learn? How different is it from other languages the user is familiar with? Ideas such as the 'principle of least surprise' fit into this model.

For an obvious example, the program that implements GHC; for a less obvious example, the program that implements a music player like Amarok. Both have in common a set of expression rules and syntax that they allow.

You might want to review the work of Clarisse de Souza. Her writings on Semiotic Engineering of human-computer interaction share your insight that user interfaces are a form of language.

The semiotic approach (the study of signs and communication) explores the relations between the strict programming language and the natural language used by the humans creating and using the application. Mismatches between the operations available in the formal code (which supports a limited set of allowed meanings) and the informal users (who can generate unlimited interpretations of the program) are a major source of user errors.

In particular, the author's intent when creating the tool must be somehow transmitted to the user through the application interface. Users will infer their own meanings, and if the intent is not well expressed, those meanings will be faulty.

If Apple does "find out", and decides you are worth suing, they will compel you to produce your original source code by discovery, and then have you testify under oath that it really is original. It's up to you if you want to commit perjury in such a situation.

This is consistent with Apple's approach to their locked-down devices, so not very surprising. Their complaint about Flash being a performance hog (relative to, say, Apple's canvas support in their own browser, for the same developer base) shows this isn't really a technical issue.

Understanding the rationale can help you anticipate the long-term implications. For example, one speculation is that Apple is primarily concerned with abstraction layers which let you create portable apps. If that's correct, then it might be possible to get them to tailor the developer agreement more narrowly, so that any PL you use has to expose all the UI APIs directly, and not offer any of its own.

On the other hand, if Apple's goal is monopoly for its own sake, then they're never going to open up even that much. If you want to discuss the implications, then it'll be useful to know which scenario we're looking at.

Or you might decide that it's too early to talk about it, since none of us knows what Apple wants.

Adobe has asked Apple to expose certain APIs that would reduce power consumption. Apple has refused, probably on the basis that these APIs are unstable and will not exist in future releases - but also because Apple views their own APIs as all the user should need to use the platform. Things like QuickTime are differentiators.

There is no reasonable way of reducing power consumption without a better instruction set to target (e.g., Apple's hardware acceleration APIs). Jan Ozer, who consults for Adobe, has written about this in the past. This power consumption problem with Flash applies to Mac OS X laptops, too.

I guess testing goes a long way, but formally you run into the halting problem and such.

It's not a bad decision on their part. They keep the APIs and libraries light so as not to clutter the OS with all kinds of bits, and, I assume, to give reasonable performance with low battery consumption. Can't say I am that against it. What it doesn't take into account is that you can abstract all you want within an application, like embedding Lua. But even that still makes sense from Apple's point of view: maintenance on their part remains low, and people can expect reasonable performance without too many software issues from the device they bought.

I find it pretty fair, albeit that it hints that there should be better ways of doing so.

There are actually two potentially relevant things that have been bugging me about their policy.

First, Apple is attempting a quality-control move based on language difficulty: their high-level languages (those in the browser) have horrendous performance, making them suitable for pretty much only static content, so their policy limits interactive applications to their low-level languages. This, in turn, drives away much of the developer community. Those already invested in the platform benefit from this (and I generally perceive an undertone of this when they discuss non-inherent issues like consistency). So: are there software quality benefits from *enforcing* usability barriers? Forcing the use of Objective-C on those unaccustomed to it would make me think not. However, obviously taking a bit of a jab, maybe Haskell code is beautiful because the audience has been screened.

Nebulous concept, unenforceable, invalidates the contracts of all existing app creators (XCode generates starter code too, which nobody "wrote"), prevents porting and code reuse in its strict interpretation ... and the original language in which all programs are written is the language of neuronal firings, ignoring which the language of a specification document is the "originally written" language.

It's nebulous and overly broad precisely so that it can be selectively enforceable. Legal documents are not programs which Apple's lawyers are obligated to blindly execute. A better reading of something like this is "you're probably doing something wrong, but as long as you stay on our good side and don't flagrantly violate the agreement, we won't say anything".

Which isn't all that unusual. If even a tenth of what's written in contracts, licenses, and laws was actually enforced fully, literally, and universally, society would probably collapse. This often seems bizarre to many programmers, being the precise opposite of a computer's tendency to do only, exactly, and all of what you tell it to.

Strained justification of LtU relevance for this comment: Legalese is a terrible programming language, I wonder if it could be improved?

The general purpose computer will exist so long as there are general purposes. I might buy an iPad so my mother can go online and see pictures of her granddaughter, but it's hardly going to be a suitable replacement for gamers, musicians, artists, software developers, accountants, architects, retailers, telemarketers, computer engineers, police departments, hospitals, universities, power plants, airports, government agencies, and everyone else who wants something you can't find in the app store.

The general purpose computer will exist so long as there are general purposes.

Not necessarily. Computers just have to be general enough for almost everybody. If locked-down machines like the iPad become the norm (which has got to be Apple's long-term goal), truly general machines will be pushed into smaller and smaller niches. Eventually, if you want to do PL research, you'll need a machine 10 or 100 times as expensive as what you can get at Best Buy.

I am writing this on my second MacBook Pro. I believe Steve Jobs must have personally approved the new 3.3.1 and thus changed Apple's paradigm to something more locked down than I anticipated.

It is said that some of what is sent to sjobs@apple.com reaches Steve Jobs. I shall let him know that I dissent.

For me, it is time to chart a course from OS X back to Linux, because the attitude that wrote 3.3.1 cannot be good for the future of OS X. Apple makes good hardware, but if I am willing to pay for it then I can find good hardware elsewhere.

The iPhone/iPad OS and toolchain are so ingrained in the platform that it seems foolish to consider anything other than any of the fully supported C variants. And this is par for the course with embedded-style systems. If you write a game for the Wii, you use Nintendo's development system. I love using different programming languages, but clearly some sense of reality has to take precedence.

Regardless, you still can embed Lua (or any other interpreted language) in your C code, and write much of your app in it. But you need to keep the iPhone library interfacing in C, and not write a full bridge that's accessible via a custom language.

Actually, embedded interpreters, compilers, and virtual machines were already banned on the iPhone, even before the latest rev of the dev license. This new version limits freedom even further, prohibiting even source-to-source translation.

Actually, not quite. The ban was on interpreters running "downloaded code", not interpreters in general. I have it on pretty good authority that interpreters that didn't run downloaded code were acceptable.

Now, note the use of past tense; I'm not sure what this new agreement means for the "embedded interpreter" scenario, but I'm guessing it's not good. My only thought is that a number of App Store games must be using some sort of interpretation of scripts, and I doubt they'd be banned given Apple's attempt at courting the casual games sector.

If you are finishing up your Masters in Computer Science, have determined that a Ph.D. in Comp Sci. is not for you, but you also dread going into the real world to become a "software engineer" or project manager, then apply to Law School and see if you can get an internship at the Software Freedom Law Center with Dan and the gang! ;-)

The iPhone/iPad OS and toolchain are so ingrained in the platform that it seems foolish to consider anything other than any of the fully supported C variants

Why? MonoTouch provides a .NET platform for running on the iPhone, and Unity started down that road by providing a successful .NET game development platform for the iPhone. To my knowledge, these are the best-supported (maybe the only?) memory-safe, garbage-collected platforms for the iPhone.

Apple's previous terms were much more sensible, as they required only compiled code, not specific languages. This requirement is absurd.

Regardless, you still can embed Lua (or any other interpreted language) in your C code

I think the interesting/unfortunate part of the change is that it will almost necessarily lead to selective enforcement. After all, one of the main things we learn with implementing PLs, compilers, etc, is that, in a concrete sense, all programs are interpreters.

I don't see it having a huge impact on anyone other than Adobe and the Flash developers who will now have to learn Cocoa instead.

And, of course, anyone who wants to use a programming language other than C, C++, or ObjC. Imagine if you had been forbidden to use graph algorithms in your research, and you have a sense of the absurdity of this decision from my pov.

Now, of course it's true that's not the reason Apple did it, but I'm not actually obliged to adopt their point of view in preference to my own. (Maybe that will be in the next version of their developer license. :)

There will certainly be some Schadenfreude, because Apple can disrespect developers and render their many innovative languages, frameworks, and systems irrelevant, at least for its much admired products. Apple acts like an authoritarian project manager who hears "freedom" and expects chaos, uncontrollable growth of hobby horses, and added dependencies. Suddenly Apple has to support them and gets locked in by others instead of maintaining its hortus conclusus. They lose their freedom in favour of developers' voluntarism. In a world of Open Everything, Apple is the revolutionary conservative: an author, a designer, not a wise crowd of hobbyists and proletarians with random taste.

On the other hand, I don't quite understand why Apple's behaviour should suddenly become generic. It never was. Other companies have copied their style and innovations but never pretended to be like the computer couturier Apple. Apple doesn't represent a paradigm shift in software engineering but a perturbation in a promiscuous software landscape where everything is coupled to everything.

Lockdown is common in underpowered devices, but the iPhone represents the (by far) leading smart mobile device and the tipping point for them. When you include Kindles and the iPad, which seem to be the next wave of smart devices, we have the locked-down embedded mindset being applied to systems that don't really have the same limitations, yet companies are bending over backwards to impose it anyway. When you view it in terms of computing classes (in the sense of Bell's Law) -- mainframe, PC, laptop/netbook, cloud, and mobile -- this is quite a shift. I don't think it's sustainable given competitive pressure, but we'll see.

Actually, on SmartCards it is the JVM, demanded e.g. by Telecom providers who wanted to load their own applets, which opened the system a bit.

I wonder whether we would be having the same discussion if Apple prescribed using their API but utilized LLVM more thoroughly for their own purposes instead of sticking with GCC. I remember Thomas Lord wrote something illuminating a while ago about Richard Stallman's own, very different, lock-in intentions regarding GCC. The bewildering and backward-oriented language decision made by Apple might be a remote echo of this and isn't entirely without irony.

One suggestion not discussed here yet is that Apple imposed this restriction because their implementation of multitasking requires some internal knowledge of an application's state in order to suspend the application's activity. Using a VM or a language-to-language translation layer would change the application's pattern of resource usage beyond what the iPhone's OS expects.

It all sounds very woolly to me but is there anything to this? I'm not proposing this was the sole reason! Whatever the cause I'm hugely disappointed; I was hoping to harness the Mozart/Oz team's work to port their runtime to Android and embed it in a couple of iPhone/iPad applications. Going back to pure C/C++/ObjC is not an enticing prospect.

I am working with Peter Van Roy's researchers on porting Mozart to Android. We already manage to run Mozart applications on Android, but there are performance issues because of the Java GUI. Mozart itself is running fine, and some improvements should help the GUI become more responsive. We would like to port Mozart to the iPhone/iPod/iPad, but there are some problems that we need to think about first. Mozart should be released in version 1.4.1 soon, and this release will be the starting point for the mobile support.

If you want to collaborate on this work, you can send me an e-mail to Jeremie.Melchior@uclouvain.be

The iPhone (and iPad) are a return to the cooperative-multitasking universe that characterized Apple computers up to the release of OS X. If a program is badly behaved, it can block all other programs from running. One of the reasons Apple needs the App Store is to prevent badly behaved programs from running in the first place.

Rather than hardening the OS itself against untrusted code, which would take extra hardware, extra battery to power it, extra time to run checkers and verifiers, extra development effort, and extra software infrastructure, the App Store, at least in theory, allows them to prevent untrusted code from running, or even retroactively remove it if it gets out before its bad behavior is discovered. It's an interesting theory, and may turn out to be a workable solution for keeping iPhones from becoming parts of botnets, running malware, sending spam without the owner's knowledge, etc.

So it is reasonable to require everyone to use the documented interfaces and not make calls to the private interfaces that allow programs to be badly behaved. Those are the ground rules Apple has to force everybody to follow in order to avoid spending massive effort on securing the devices against untrusted running code.

I think the language lockdown is part of Apple's efforts to prevent development tools and environments not controlled by Apple from proliferating. In the first place it is hard to check a development tool against the ability to create badly behaved programs. Apple is declaring, up front, that it will not attempt to do so.

In the second place, such proliferation would increase the degree, over time, to which it could become practical to own and operate a "hacked" iphone without access to the App Store. This is not such a big deal until you consider that the App Store is both Apple's profit center and primary security tool for this device.

I mean, seriously, if someone produces a kernel-mode compiler for the iPhone and ports a competing OS to it, whole, with all APIs and multitasking and familiar applications out the wazoo, then the game is over for Apple. They can refuse to license the kernel-mode compiler for distribution through their App Store, but if I can install Linux with all its stuff, then I no longer give a crap about not being able to get apps from the App Store.

Allowing development tools is a very dangerous game when your profit and security both hinge on "nobody can distribute badly behaved programs." Not only do you have to verify that something isn't a kernel-mode compiler or otherwise badly behaved program, but you have to verify that there's no way to use it to make a kernel-mode compiler or otherwise badly behaved program... an infinite regress.

If people ever have enough stuff to thrive, prosper, and comfortably get by without the App Store, then Apple loses. Every development tool out there is a way for more stuff to get created, and a risk that if there is enough stuff, someone will distribute it through other channels.

Trying to argue Apple's position on technical grounds seems absurd given that they use Objective-C, but...

Even if that's the case, I would trust a language to generate cooperative code better than a random entrepreneur trying to write Objective-C.

In the first place it is hard to check a development tool against the ability to create badly behaved programs. Apple is declaring, up front, that it will not attempt to do so.

Red herring -- Apple can do what they've already been doing: checking programs (which I doubt is a good check to begin with).

primary security tool for this device

That sounds like a really, really bad idea (though it might be OK in a setting like App Engine, where a high-level and instrumentable language is used). However, I also don't think it's true. Security comes in layers, and even how the hardware is structured in phones starts to provide some.

Finally, the undocumented API stuff is more of a software engineering mistake. There are many ways to have hidden these APIs from non-Apple applications if they so desired.

I'm not sure what your point is. Linux can already be installed on 'jailbroken' iPhones and probably iPads, and of course an iPhone running Linux is not going to access the App Store. There's very little interest in it because the iPhone/iPad hardware is not cross-subsidized by App Store revenue, although I suppose that strategy could be made to work in the future. Accordingly, Apple ought to care little if you wipe out iPhoneOS from your device and replace it with Linux, whereas they would be quite justified in worrying about jailbroken iPhoneOS devices running bootleg apps.

Adobe, after successfully fending off Microsoft (Silverlight), is now attacking Apple's platform in order to place its meta-platform on it.

This is what 3.3.1 is really about. It's not about programming languages - they are just a casualty.

Adobe was denied placing the Flash runtime in the iPhone OS. They now have new technology that statically links the Flash runtime into every app, with the interpreter replaced by subroutine calls into the runtime. This solution produces large apps and drains resources (it is very inefficient). Adobe openly admits this is sneaking the Flash runtime in through the backdoor.

And as Adobe is releasing CS5 in the coming week, Apple had to go out now so Adobe developers know the rules before they invest.

You should consider how many problems Adobe's previous platforms have caused Apple. When Apple moved to Intel CPUs, it took Adobe over 2 years to follow. And Adobe is still running 32-bit apps under the compatibility layer from System 7.

And in the mobile world, especially with Apple now using its own private version of the ARM CPU, it is very important for Apple to avoid the type of lock-in Adobe caused them on OS X.

If Apple controls the tools, they can quickly move functions to hardware for acceleration or battery conservation (probably within 3-6 months). With Adobe and others on top, we are talking years.

I think we have to accept there is a difference between living in a consumer-device world and a computing-device world.

Personally I have a great deal of respect for Apple's decision to rid itself of the proprietary, closed web in favor of a free, open web by fighting Flash. Perhaps they have all the wrong reasons, but at least they are doing something - the "free internet" crowd seems to prefer just to sit around talking about it.

As noted above: Only programming language related aspects of this decisions are relevant for LtU. Some speculation is hard to resist, too much of it is tiresome. As usual, long time members in good standing will receive more leeway than newcomers when it comes to judging whether posts are on topic or not. While I understand the issue is a hot-button one for some, I highly recommend people not make fools of themselves.

I don't know anything about smartphone development in general, but I have two questions:

1. Is it possible to target Dalvik directly, or are you required to go through Java bytecode? A bit of searching suggests that yes, Dalvik can be targeted directly, although I haven't found a definitive answer.

2. Does Dalvik have reasonable support for proper tail call implementations, or is it just a register-based JVM knockoff?

At the very least, you can download the Dalvik source code here. Unfortunately, I haven't yet found much in the way of documentation.

Steve Jobs gave out an enigmatic clue as to Apple's thinking and there is some PL relevance here. I'll tell the brief back-story, point out the meat of his reply, and then state the PL significance as I see it.

* Backstory

There is another blog run by Paul Graham (et al.) called "Hacker News". The blog is for "hackers", sure, but particularly for hackers who are, wish to become, or wish to understand what it means to be, entrepreneurs. Thus, it is full of readers (and commentators) who watch and/or participate in the "App Store" phenomenon with great interest. For example, even if an Apple app is unlikely to make you rich: (a) it might create users for your larger product; (b) it might win you a chunk of change as an alternative to having to seek out and win angel investor capital.

Consequently, there was a fair amount of outrage about this move by Apple. One reader thought to write to Steve Jobs directly, since it is well known that sane messages to his email address receive replies fairly frequently. I will crudely paraphrase the question that was put to Jobs as "What in the world are you thinking? Aren't you just shooting yourself in the foot?"

Jobs (or his staff) replied tersely, but pointed to a third party blog which he said gave an insightful and non-negative analysis. I'm making this LtU post self-contained but here are the cites:

b) the "Hacker News" story about that (this link will expire in a few weeks, unfortunately):

http://news.ycombinator.com/item?id=1255858

c) the blog entry that Jobs described as "insightful and non-negative":

http://daringfireball.net/2010/04/why_apple_changed_section_331

* The "Insightful and Non-Negative" Analysis

For our purposes on LtU, the meat of the analysis that Jobs endorsed is this (my loose paraphrase):

~ given a new platform, like iPhone / iPad, people will often try to use it as a target for a meta-platform. A "meta-platform" is a development environment that lets you "write once and run anywhere". For example, at one time, Tcl/Tk was a kind of meta-platform encompassing unix, Windows, and some other platforms. You could write a complete application, including a GUI, using Tcl/Tk - and run with few or no modifications on very different underlying platforms.

~ meta-platforms discourage development that is exclusively for just one of the underlying platforms. A third-party developer wants the largest potential audience, usually. So they'll write to a successful meta-platform when they can.

~ if enough applications are written to a meta-platform we would say, in business terms, that each target platform loses product differentiation. E.g., there will be no special incentive to buy an iPhone because the most important apps run just as well on every other kind of "smart" phone. The App Store will have no special charms, for all the interesting apps will be available elsewhere, at least for other phones.

~ this would undermine the business model Apple is using for the iPhone (and iPad, and iPod touch). Therefore, Apple chooses to banish anything that looks or smells like a meta-platform or potential meta-platform from the App Store and to contractually forbid the porting of a meta-platform to iPhone (and other products) as part of the developer kit agreements.

In a separate message, Jobs also added that (again, paraphrased):

~ typically, applications developed for Apple products via a meta-platform fail to live up to Apple's high standards of user experience quality. Continuing the analogy: Tcl/Tk may have allowed "native looking" GUI apps on Windows, but these could never quite have all the bells and whistles of an application that was written directly to, and embraced, the native Windows APIs. The abstractions that enable a meta-platform also undermine the "UI guidelines" of the underlying platform. (In the case of the iPhone, it's not necessarily just UI. For example, it is also mentioned in the materials linked above that Flash programs, "compiled" to run on the iPhone, are bad citizens with respect to managing battery life, at least when using the translator that Adobe has developed.)

* Relevance to Programming Language Theorists, Inventors, etc.

Some short observations and speculative analysis:

~ Programming languages and their environments can compete against proprietary operating systems by abstracting away from them such that the underlying operating system becomes a commodity! Well, that's been conceptually true and known for a very long time but often has not worked out well in practice for at least three reasons: performance loss relative to a native platform; a comparative poverty of features and capabilities relative to a native platform; the incentive created by overwhelmingly popular native platforms to write low-level platform-specific programs. Perhaps those negatives are starting to expire around this time in history - hence Apple's concern.

This raises the questions: should programming language designers more explicitly aim to compete not only with other programming languages, but with traditional operating systems? Should programming language theorists begin to regard OS theory as an important sub-specialty of the field?

~ There is a political matter that should be of grave concern to programming language theorists and developers. Apple illustrates the concern but it is surely not limited to them:

Apple has deployed a system of contracts, technological means (DRM), patents, copyright (and perhaps trademarks, but I'm not sure) all with the aim of legally preventing the use of certain forms of abstraction on the computing systems that comprise a very popular platform.

If the Congress of the US or a state legislature were to pass a law forbidding the use of some forms of abstraction in programming, it would take only as long as it takes to process the paperwork and schedule a brief hearing before the courts issued a preliminary injunction against enforcement of the obviously unconstitutional law (on first amendment freedom of speech grounds, among others).

In the US system, at least, contracts are not by default constrained quite so sharply as the legislatures.

And so there is, really and truly, a slippery slope here: if many economically significant platforms are subjected to measures similar to what Apple is doing, the programming language theorists and many other sorts of software theorists and engineers ought to just pack up and go home: the society will be legally barred from developing more sophisticated approaches to programming.

~ We are entering an age of commodity computing, in which typical users have less and less say over what software can run on their systems. The theoretical significance of "legally banning new abstractions" is of growing, real significance because the day to day computing that people and businesses do is migrating towards (a) centralized computing utilities owned by third parties, leased under contract, and accessed under contract; (b) end-nodes (handheld devices up through "workstations" and "servers") which come with built-in DRM and strong contract terms that limit not only what music files you can play or what movies you can watch, but what software you can run.

As an example of one implication, suppose that other platform vendors and commodity computing providers continue to follow Apple's lead. Academic programming language research and development can and will certainly continue but on what terms, and with what relevance?

If it becomes a de facto Mortal Sin to invent a meta-platform, then what relevance will remain to serious programming language innovation?

This may seem like an exaggeratedly dark question. Surely Apple will just do what it does, but they are hardly going to impose these restrictions on computing *in general*. Plenty of major vendors will decline to follow Apple's lead, so what's to worry about?

What *I* worry about is the very strong commercial trend towards fairly thin end-nodes (of which mobile phones are just a clear example) combined with the trend towards "commodity computing" - by which I mean not owning and controlling your own main computers but rather leasing cycles over the net from some firm that operates a number of large-factory-sized compute clusters.

This is no longer the idea of renting a server that we saw in the early days of "dot com". The main difference is the "vertical integration": people rent applications, not computers. Universities consider handing over their campus email to Google. Small businesses look for on-line apps as an alternative to internally managed IT. Firms like YouTube, Twitter, and Facebook help to undermine the assumption that our most important communication networks are subject to strong regulation in the public interest. It is becoming an unquestioned habit of upcoming generations of individuals and businesses to give up, as much as possible, control of their own computing needs, and to perform their computing and communications under relatively unconstrained contractual restrictions.

I fear that we face the very plausible risk of the de facto complete privatization and centralization of computing as a field of research and development, and as a tool for day to day life.

That contractually enforced restrictions against programming languages are in the news today, on a commercially quite significant platform, is an illustration of why I say "plausible risk".

Thomas, you summed this up beautifully. I had wanted to write something similar, but was struggling immensely with the big picture - you nailed it.

Some comments:

This raises the questions: should programming language designers more explicitly aim to compete not only with other programming languages, but with traditional operating systems? Should programming language theorists begin to regard OS theory as an important sub-specialty of the field?

Competing with operating systems requires better support for resources. Languages should start with a TCP/IP networking stack that is more modular than Linux's, but equal in performance to what Linux gets from its monolithic structure. Ian Piumarta has started in this direction, Walid Taha has done some interesting stuff with resource-aware programming, and the Austin Compiler Project has done interesting research in transforming a clear protocol specification into efficient executable code.

~ We are entering an age of commodity computing, in which typical users have less and less say over what software can run on their systems. The theoretical possibility of "legally banning new abstractions" is of growing, real significance because the day to day computing that people and businesses do is migrating towards (a) centralized computing utilities owned by third parties, leased under contract, and accessed under contract; (b) end-nodes (handheld devices up through "workstations" and "servers") which come with built-in DRM and strong contract terms that limit not only what music files you can play or what movies you can watch, but what software you can run.

This is one of the things the EFF and FSF have been fighting against, and losing. Linus Torvalds and Richard M. Stallman simply delayed the inevitable. More broadly, net neutrality is already a myth - what you access through the iPhone is a reflection of what you access through the App Store. The recent appellate court ruling against the FCC is irrelevant. It's also irrelevant whether Congress decides to pass net neutrality legislation or not. You cannot defeat the market, and the market is controlled by mega-corporations.

The upshot of this is that I think services with contracts based on capabilities will become more widely endorsed, out of necessity.

As an example of one implication, suppose that other platform vendors and commodity computing providers continue to follow Apple's lead. Academic programming language research and development can and will certainly continue but on what terms, and with what relevance?

Researchers will go to China, and Sean McDirmid will teach them Mandarin, Cantonese, and Altanese. Half serious. American cryptography researchers did go to Europe in reaction to the DMCA. So did *BSD developers. Researchers go where the laws won't apply to them. Do not underestimate labors of love.

The "unwashed masses" (sorry) of computer users have more or less used their magical machines first to type, then create small spreadsheets, charts, contact databases, and now to communicate with other people and read lots of Web published stuff.

So what? Today's users gobbling up pre-installed software is no different to me than WordPerfect users circa 1990 who had no concept of DOS directory structures and kept hundreds of docs in one directory that they *only* accessed from within WordPerfect, completely ignorant of the operating system.

A potential saving grace (as it has been in the past) is all the small-to-medium industry applications. Awkward and ugly stuff, like software to manage a textile printing shop - everything from contact management to Pantone colors to bookkeeping. Not exciting stuff, but stuff like that is ubiquitous in small business (that's most business) and it still drives lots of hardware, system, software and training sales. Outside the relatively mainstream hype, this (useful) crap kind of makes a big (boring) part of the small computing world go round and round.

Many/most of these folks won't ever learn anything like C++ or Objective-C or try to stuff their (Pascal and Basic, mostly) apps into a browser using JavaScript. Just won't happen. Apple will come around or limit their market strictly to "consumers" - which will cost them serious change.

My current neurologist has the most automated office I've ever seen. Every doctor and nurse has a pad-type device and all past records are at their fingertips, future appointments are scheduled on the spot, and prescriptions just spit out of a big fancy Xerox printer thingy. I imagine a whole bunch of companies are making a VERY PRETTY PENNY configuring systems like that for the health care industry.

Etc. blah blah blah. I wouldn't worry. A more open platform simply *WILL* exist sooner rather than later - because there's filthy lucre to be made :-)

As for Apple? I've never really assigned them any special status in my picture of the computer platform world. WHO CARES?

I've seen what Jecel Assumpcao, Jr. did with Merlin programming language/operating system and building his own laptops using FPGAs and bitblt for graphics, all by himself. I know creating a counter-culture will not be a serious issue (following up on naasking's point about a new Apple counter-culture revolting). However, pragmatically speaking, counter-culture movements have always resulted in consumers being denied access to fun leisurely enjoyments like modern 3d games.

If the market is indeed dominated by large corporations, then it is not a free market but a corrupt one, and it can and, historically, will be beaten. Corruption cannot be destroyed, but particular corruptive tools can be broken, as Thomas Covenant says.

"Dominance" and "empire", those militarist categories, hardly ever apply to markets, although they are very seductive metaphors. Market share must be conquered like territory. On the backside of success, being a "monopolist" has a negative appeal and is almost immoral and calls for resistance. The modern day liberal, worker-consumer still loves warfare, in particular when no one gets hurt.

if many economically significant platforms are subjected to measures similar to what Apple is doing, the programming language theorists and many other sorts of software theorists and engineers ought to just pack up and go home: the society will be legally barred from developing more sophisticated approaches to programming.

I'm not particularly concerned. Like Microsoft, only a counter-culture will compel more openness from Apple, or relegate them to a niche where they might be happy. Anti-trust convictions did very little to curb MS, but the open source counter-culture to MS's lock-in is very effectively dismantling its stranglehold.

Fortunately, while Apple got a significant first-mover advantage here, Google preempted Apple's draconian measures and already started the counter-culture with Android and the Dalvik VM. It seems to be going pretty strong so far.

As a first-gen iPhone user, I'm fairly certain I'll be getting an Android phone next, because iPhone development has just lost all the exciting PLT potential in my eyes.

It is becoming an unquestioned habit of upcoming generations of individuals and businesses to give up, as much as possible, control of their own computing needs, and to perform their computing and communications under relatively unconstrained contractual restrictions.

I'll just note that most "owners" are generally not themselves technically savvy, and thus they had already surrendered control of their computing infrastructure to IT staff. I'm not sure handling staffing is any less of a headache than dealing with a third-party service. I suspect it just shuffles the headaches around a little.

This raises the questions: should programming language designers more explicitly aim to compete not only with other programming languages, but with traditional operating systems? Should programming language theorists begin to regard OS theory as an important sub-specialty of the field?

The idea that Operating Systems are Languages is probably pretty uncontroversial and unsurprising. But the converse is also true: Languages are Operating Systems, it's just that most are pretty bad ones.

Not all meta-platforms are Tcl/Tk. Tcl/Tk, I will agree, tends to make applications that are quirky and unpolished, and that do not mesh well with the OS. However, what about Java, .NET, and web software?

As a sniff test, anyone who has used an Apple desktop or laptop is invited to consider how much of the software they use was developed specifically for Macs. Personally, I tried the experiment, and it went heavily in favor of the meta-platforms. I started out trying to live the Apple dream and using all of the software that is optimized for Apple. Over time, I grudgingly dropped mail, contacts, and calendar for Google equivalents. I never even tried XCode; as a primarily Java developer I use Eclipse, which is based on Java. I have used a variety of web browsers, but all of the big ones are multi-platform, including Safari. At this point, practically everything I use, except the core OS, is cross-platform.

The case against meta-platforms in general is interesting but does not look especially strong in practice. There is certainly a lot of crummy cross-platform software written. At the same time, the very best software is usually cross-platform!

I don't know why this is, but to speculate: cross-platform software has a larger user base, which through a mysterious step two leads to more resources to further develop the app. There are a variety of ways to fill in step two. Pay-per-copy software has more profit if it's cross-platform, which will partially be redirected back into further development. Open-source software will attract more developers, because it can scratch more people's itches. Popular software has more critics and commentators, which lead to a better pool of ideas existing for the developers to draw from. Popular software causes other software to interface with it, making the original software more valuable with no work from the developers. It's good to be popular, and cross-platform software has more room to be popular.

So are 4GL code generators. The essence of these code generators is to generate 3GL code that optimizes for nonfunctional requirements given your functional requirements. This equates to 100% reuse of nonfunctional requirements code across all applications. -- Some software factories and software product lines are basically code generators/compilers in this respect.

If you actually look at the quality of code produced by many modeling tools these days, like Eclipse's modeling framework, especially for managed languages, it is often superior to the quality of code output by developers. Modeling tools can also mask away performance optimization differences across versions of a managed runtime environment, e.g. javac transforming pure string concatenation into StringBuilder calls.

The same logic applies to GWT, and also for that matter mapping OCaml code to C or JavaScript, or even the published uses of Haskell for real-time systems in IEEE journals, through the use of FRP.

One aspect of this that I find troubling, and that seems not to have come up in the discussion so far, is how much this move tends to bolster myths about programming languages. It seems that some will conclude that (a) some languages (as opposed to implementations) are slower and more resource heavy than others, (b) libraries are inherently inefficient, (c) code in C-like languages is more amenable to analysis in order to prove good behavior than other languages, (d) even when executables are what is being distributed, clients should impose restrictions on the programming language used, and similar notions. I am being a bit vague here, but I hope the kinds of views and over-generalizations I have in mind are clear. These and other myths have been the target of LtU discussions, and great inroads have been made.

Needless to say, whatever Apple's intentions are, the bolstering of such views would be an unfortunate outcome. One reason why discussing their motives critically may be important is that it can show that the technical rationale should not be accepted as representing the actual state of the art in the field.

If you bar the mythical sufficiently smart optimizer, constrain the extent of white-box composition, and impose a particular process and IO model from the outside, at least (a) and (b) have a reasonable basis. Both language definitions and the environment inherently constrain language implementation.

For point (d), consider an intermediate possibility: executables are distributed, but only after static analysis, certification, and compilation by a trusted third party. Would it be reasonable to limit the language in this case? i.e. to Joe-E? Or perhaps a language that supports analysis of memory and CPU and power requirements? or process accounting? I think so.

That said, this isn't an argument capable of justifying the "originally written in" constraint. A high-level language can be an intermediate compilation target.

I am a bit curious as to whether Apple's judgement will lead us to develop "code translation" tools that attempt to preserve style and comments and such to beat the system. The floundering 'language oriented programming' movement might see a little more interest.

As long as we are addressing this, figure I might as well open up another thread by hitting these myths from how I believe Apple would view them. I think you might be pleasantly surprised.

(a) some languages (as opposed to implementations) are slower and more resource heavy than others,

I would say Apple would agree with that myth. But more importantly, they don't care about the distinction between languages and implementations. They aren't deciding between languages in some platonic sense but rather between specific implementations that exist and are widely used for their platform today. To them, an implementation is a language. And in their development documentation they go even further and talk about conventions as essentially being a language.

I.e., what they are concerned about is Apple's Objective-C as compiled by LLVM/XCode using the conventions in their development documentation, as contrasted with a competitor's language/compiler/convention trio.

Arguably given their focus on convention in their documentation they might be advancing this aim. It is hard to know how much the average developer is absorbing of these more abstract notions.

(b) libraries are inherently inefficient,

I don't follow how that would come about. Apple preaches the virtues of using their libraries all the time. It is because they want people to use their libraries that they encourage Objective-C.

(c) code in C-like languages is more amenable to analysis in order to prove good behavior than other languages,

I don't think they are saying this. Take, for example, the latest set of developer documentation, which focuses on the issue of garbage collection vs. reference counting. They are pretty up front that because C-like languages allow casting to and from void pointers, they have essentially failed at getting garbage collection to work in practice for Obj-C apps. It is precisely because C-like languages are less amenable to analysis that they have to use a fairly complex conventional reference counting system to provide some of the advantages of automated garbage collection. And BTW, they agree this applies to their OS code, so they aren't saying this is only something that happens to bad programmers.

Again I'm not sure if the average developer is reading the "why" parts of the ARC (memory management) documentation and not just the "how" parts. But the why parts could have been written by any LtU regular.
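The retain/release discipline that documentation describes can be sketched in plain C. This is a hedged illustration of the convention only, not Apple's actual runtime; all the rc_* names and the header layout are invented for the sketch:

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical sketch of a retain/release convention, in the spirit of
 * what Apple's memory management documentation formalizes. Not Apple API. */
typedef struct RCObject {
    int refcount;
    void (*dealloc)(struct RCObject *self);
} RCObject;

/* Allocate an object whose first field is the header above;
 * the caller starts out owning exactly one reference. */
void *rc_alloc(size_t size, void (*dealloc)(RCObject *))
{
    RCObject *obj = calloc(1, size);
    obj->refcount = 1;
    obj->dealloc = dealloc;
    return obj;
}

/* Claim shared ownership of an object. */
void rc_retain(RCObject *obj) { obj->refcount++; }

/* Give up ownership; the last owner finalizes and frees the object. */
void rc_release(RCObject *obj)
{
    if (--obj->refcount == 0) {
        if (obj->dealloc) obj->dealloc(obj);
        free(obj);
    }
}
```

The point the documentation makes falls out of the sketch: because C freely casts to and from void pointers, a collector cannot reliably find every live reference, so the bookkeeping has to be carried by programmer convention instead.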

(d) even when executables are what is being distributed, clients should impose restrictions on the programming language used

They are being pragmatic. They want people to follow their guidelines for application construction, for example the Human Interface Guidelines. In general, people writing in languages not supported by XCode are people who don't care what Apple thinks about how applications should be constructed, and thus violate guidelines arbitrarily, which diminishes the user experience for the platform as a whole.

As I noted elsewhere, Apple has effectively banned the use of safe languages. The negative security implications for iPhone/iPad users are significant.

I'm one of the people who reserved an iPad for day one. It's not a perfect device, but it's quite good for my purposes. Because Apple is now demonstrating that they are committed to preserving the vulnerability of this machine, I'm probably going to return it. I'm just grateful that they stuck their foot in it while the return period was still valid.

Thankfully, the forthcoming Android tablets won't suffer from any of this.

Want to get Steve's attention? Go buy an iPad, wait 13 days, and return it citing the security consequences of 3.3.1 as your rationale.

They support JavaScript, so perhaps you mean "fine-grained security" or "security with reasonable performance." Oddly, at least for multiprincipal systems, I view JS as safer than most mainstream languages.

Given the big money stakes for the iPhone and now iPad, Apple need only throw a relatively tiny bit of money around to various PL projects in order to create "custom" versions of high level languages for their hardware and software platform. They need only pick which to help fund: Scheme, ML (OCaml, SML or Alice ML), Haskell, Scala, a few "pop" scripting languages, notably Python and Ruby, some BASIC dialect (really important for smaller industry applications, actually) and maybe a few more speculative language offerings out there (Clean, Mozart/Oz, some Forth-like offering - who knows?).

The money involved is way small change, it will yield quality "lock in" bindings to their OS API, mostly satiate the PL/PLT community's needs, and get rid of most of the bad press lickety split (which does Apple NO GOOD with potential application developers).

With a little more time, and without even spreading any additional "bribe money" to the PL tools community, they can then devise some sort of "certification program" for other new languages for the platform.

If they don't want to certify a Flash implementation that abstracts their proprietary APIs, then frankly, they don't have to. Same would go for Java's and even Squeak's abstracted GUI/OS APIs.

The point is, Apple can satiate (1) developers and (2) the PL and tool builders who love them, without sacrificing their "lock in" goal. A more-or-less happy compromise CAN be achieved if Apple has the will.

They have been funding a Ruby implementation called MacRuby. It is still only at version 0.5, and it's not currently intended to work with the iPhone platform, but my guess is it'll wind up there soon after 1.0.

The idea is to automatically resolve non-functional requirements, such as platform mapping. The idea originated (as far as I'm aware) in code generation tools such as the ones James Martin wrote about in the early '80s (but these were not very sophisticated) [edit: Although I suppose the use of microcode by Xerox to build portable platforms like the Alto is an earlier example]. In Model-Driven Architecture circles, where you have a Platform Independent Model and a Platform Specific Model, there is intended to be a clean separation of these models. The goal is to bury non-functional requirements from the purview of developers and the domain expert.

A meta-platform is not a systems programming language. It is a translation tool. The goal is actually to ignore the system as much as possible. Once you understand the system and its performance model, there is little need to keep writing directly to the system. This facilitates reuse.

edit: recently media outlets like TechCrunch have latched onto "meta-platform" as a buzzword. Surprisingly, they are using it correctly. For example, Google's meta-platform for social networking subsumes Facebook. It commoditizes Facebook.

I think section 3.3.1 is not that dramatic and could actually lead to more programming language theory out there in the field :-) I have written a couple of iPhone apps sold through the App Store, learning Objective-C in the process. Objective-C is not that bad; it definitely is better and more uniform than C++. The biggest pain on the iPhone is the missing garbage collection, which leads to an attitude of "who cares, memory is reclaimed anyway when the user shuts down the app". Therefore I am developing my own programming language that generates, among other targets, Objective-C code. Now, is that prohibited by section 3.3.1? In theory, yes; in practice, no way! Because if I keep my Objective-C generator to myself, Apple has no means of enforcing section 3.3.1. I also think that my language will give me a strong competitive edge over other app programmers, because writing Objective-C code is just slow. So what does this mean?
It means that in order to write competitive apps in the future, you need to employ programming language theory in-house. Good prospects!
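To make the idea concrete, here is a minimal hypothetical sketch of one corner of such a generator back end, written in C: given a (type, name) pair from some model, it emits the Objective-C property boilerplate a programmer would otherwise type by hand. The emit_property helper and its signature are inventions for illustration, not part of any commenter's actual tool:

```c
#include <assert.h>
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical generator back end: turn a (type, name) pair into an
 * Objective-C property declaration plus a matching setter prototype. */
int emit_property(char *out, size_t cap, const char *type, const char *name)
{
    /* Objective-C setters capitalize the property name: title -> setTitle. */
    return snprintf(out, cap,
        "@property (nonatomic, retain) %s *%s;\n"
        "- (void)set%c%s:(%s *)value;\n",
        type, name, toupper((unsigned char)name[0]), name + 1, type);
}
```

Running it with ("NSString", "title") prints the two declaration lines; multiply that by every property, accessor, and dealloc method in an app and the "writing Objective-C is just slow" complaint, and the appeal of generating it, is easy to see.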

Back in the NeXT days, some of us involved with the NeXT in one way or another used to talk about Steve Jobs' "Sales Prevention Program" because it was all too common for him to implement policies or do something that would kill a particular sale (in some cases a sale of many machines), or destroy NeXT's prospects in a market.

It was the flip side of his amazing intuitive sense of elegance and form. This sense allowed him to instantly 'know' the right way to do things, most of the time. But because he lived by that intuitive sense, when it came up wrong he had no way to rationalize an alternative. Just as there was no way to express why he was right in the 'right' cases, there was no way to argue with that intuitive sense when it was wrong - and quite often there was no way to know whether he was right or wrong until later.

I'm asking because I'm not sure they thought at least twice before writing this down. Either way, now that it is written down, I take it as a pretty clumsy move, marketing-wise and vis-a-vis the global PL community. I mean, your average iPhone-addicted end user won't really care (if at all) about such a lockdown, but that user won't be the one capable of bringing new ideas in, or of investing time and effort to build apps upon the thing, will he/she? Even though most enthusiastic programmers aren't afraid to learn new PLs if only for fun, I see here a pretty harsh and likely unsuccessful attempt, sooner or later, from those iPhone folks "in charge". Well, let's wait and have the market see(?)

Apple has changed its license, yet again. I don't know what this means for Flash, but Miguel de Icaza has tweeted his excitement over MonoTouch: "Incredibly psyched about Apple changes to 3.3.1 officially allowing MonoTouch use. Thanks to everyone that stood with us all along!".

We have listened to our developers and taken much of their feedback to heart. Based on their input, today we are making some important changes to our iOS Developer Program license in sections 3.3.1, 3.3.2 and 3.3.9 to relax some restrictions we put in place earlier this year.

In particular, we are relaxing all restrictions on the development tools used to create iOS apps, as long as the resulting apps do not download any code. This should give developers the flexibility they want, while preserving the security we need.

The download is not restricted in the case of JavaScript if it's run through WebKit's JS engine.

3.3.2 An Application may not download or install executable code. Interpreted code may only be used in an Application if all scripts, code and interpreters are packaged in the Application and not downloaded. The only exception to the foregoing is scripts and code downloaded and run by Apple's built-in WebKit framework.

[Or maybe it'll be like pornography: I can't tell you the difference, but I know it when I see it.]

Funnily enough, they use that line when talking about acceptable apps. From the App Store Guidelines:

We will reject Apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, "I'll know it when I see it". And we think that you will also know it when you cross it.

How hard will it be to provide acceptable content during the review process and then later release the stuff that's "over the line?" How much effort does Apple put into or plan to put into policing application content after initial approval?

I was wondering the same thing as soon as that restriction came out. You can download images. At what point is a decompression algorithm complicated enough to consider the source data "code" that is executed by the decompressor? Is a declarative language like a data file format a programming language? What about shaders?

I guess it will be possible (and allowed) to download a code package file via the browser and have an app like Codify handle opening of the package? Any comments/observations on whether that is license compatible and whether Apple will reject such apps?

Codify has an unapproved add-on that allows for code sharing. That's going through its own approval process. If it were a small number of code packages, the most obvious approach would be for Codify to have code repositories that are application add-ons; those can be in the App Store and distributed as application add-ons. If Codify ran its own "website" and vetted the applications on their side, they could most likely have thousands of apps, and I would suspect it would get approved....

But downloading via a browser and installing... Applications aren't allowed to pass arbitrary data globs to one another. Any kind of state sharing (and they do use this term, BTW) between applications is held to a much higher level of scrutiny than ordinary application approval. There are some approved state shares, but the approval is of the form: application X is authorized to pass data type Y to application Z. Basically Apple does not want you creating your own OS components, that is, applications providing services to other applications. They don't even really like this on OSX and have grumbled about it many times.

To be more general, having a web browser download and install code from an arbitrary website, directly on the iPhone rather than through iTunes, for random users, would get the app rejected in a heartbeat. The same way the FDA isn't going to agonize over allowing a non-doctor to mix up boxes of unapproved drugs and sell them to people out of the trunk of his car, regardless of who opens the box.

Apple provides a mechanism for applications to add additional content called in-app purchases. Lots of applications download additional content inside the app and through the network. However, they need to be uploaded by registered developers who are associated with the application and Apple might ask for additional information.

And of course Apple is fine with applications downloading "data" from websites. For something fun like Codify, they are unlikely to be terribly strict about any sorts of restrictions, providing the developer of Codify has made sure that Codify isn't creating backdoors into the system.

The distinction is very murky. Is markup code? Is a Word document that consists of markup code? What does interpretation mean? When is rendering interpretation? Rendering a static document? Rendering an animation? When is simulation interpretation? And so on...

In the strictest reading of their rules, all applications are disallowed, of course, since there are no clear-cut distinctions. All that Apple can do is be subjective.
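One way to see how murky the line is: a handful of bytes driving a trivial interpreter are, in kind, no different from an image fed to a decoder. A toy sketch (the machine and its opcodes are invented for illustration):

```c
#include <assert.h>

/* A toy stack machine. Is the byte array fed to it "data" or "code"?
 * Hypothetical opcodes: 0 = push next byte, 1 = add, 2 = halt. */
int run(const unsigned char *program)
{
    int stack[16], sp = 0;
    for (int pc = 0; ; pc++) {
        switch (program[pc]) {
        case 0: stack[sp++] = program[++pc]; break;       /* push literal */
        case 1: sp--; stack[sp - 1] += stack[sp]; break;  /* add top two  */
        case 2: return stack[sp - 1];                     /* halt: return top */
        }
    }
}
```

Feeding it the six bytes {0, 2, 0, 3, 1, 2} computes 2 + 3. Whether such a buffer fetched over the network counts as "downloaded executable code" under 3.3.2, while a compressed image or a shader does not, is exactly the subjective call the rules leave to Apple.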

That's the point: in this case people are downloading code from the App Store, not from a website, as an add-on. But the mechanism of download is not really what's important; what is important is that each of those add-on programs has a registered developer associated with it, and that developer has an official tie to the person who registered the interpreter. This creates a chain of accountability for the end user and for Apple.

Websites are another thing entirely, and obviously, since that chain of accountability for the scripts doesn't exist, the application is subjected to more scrutiny. But as I mentioned in the other thread, there are tons of apps where people do download and install code from websites. Apple will allow it providing all sorts of other criteria are met. Gambit Scheme, which we discussed, has large web repositories and loads data from them into its iPhone version via the iTunes interface.

And of course the big thing that everyone keeps skipping is... developers do not need to use the App Store to load software. If you just use the developer interface, you can load whatever software you want, and that software can do whatever you want. Apple's protections exist solely for people who either are not developers or do not have access to direct IT support. What keeps being quoted are the App Store guidelines, the criteria you need to meet for Apple to be a distributor of your software. The term for the sorts of people who need to freely load code from external sources is "developers".

Once a registered developer or a registered support group takes direct responsibility for software X running on device Y, Apple doesn't need to act as a gatekeeper preventing Y from running into trouble.

That said, this kind of project kind of screams for the social aspect of something like Scratch, so it is unfortunate that the App Store rules make that impossible.

I don't know if you are from the era of typing in BASIC programs from magazines. I'm not sure that source code listings with discussion aren't a better way to educate than allowing free sharing. That being said, there are lots of tools that allow you to take code from one application's directory on iOS and put it in another directory:

So sharing already exists, just slightly more controlled. If Codify cleans itself up enough that it is ultra secure, then I suspect that Apple would allow it to drop steps 2 and 4. Apple, for example, has their own sharing server for iWork which their apps can pull from and push to. Some third-party programs do as well.

I've commented on other threads that I don't think the Scratch ban is permanent. My guess is Codify is good for Apple; because they are part of the culture, they'll understand better how to address Apple's concerns and carve a path for Scratch.

Codify is taking a big risk in hoping Apple will change, because they might not. And why should they? Programming is still not on their radar, and Codify will have to be really successful even with its limitations for this to occur.

Sharing is too hard to be practical right now. You are right, it is possible, but in practice there are too many barriers, which are all artificial.

You and I actually disagree a bit on how closed Apple is. I tend to view Apple's protections as more like a baby gate, designed to keep the really ignorant from hurting themselves but not doing anything to interfere with adults. It seems like you tend to think those doors out of the system aren't meaningfully there; that Apple is more like a prison, a prison with very nice window treatments but a prison nonetheless.

Tim Cook is highly dedicated to Apple offering "revolutionary technology"; he is not an open-systems guy any more than Jobs was. His fundamental direction has been partnering with manufacturers. My guess is that Apple moves more in the direction of non-standard components for their platform under Cook, i.e. more proprietary, like Apple was around 2000. I would expect more initiatives like Thunderbolt. I don't know what his opinions are about software, though. So... in general I think he is going to be a Steve Jobs successor and keep on the same path.

Codify is taking a big risk in hoping Apple will change, because they might not. And why should they? Programming is still not on their radar, and Codify will have to be really successful even with its limitations for this to occur.

Today Angry Birds hit the half-billion download mark. I think they are at around $50M in gross sales at $0.99 each. Codify is $8, which means after Apple's cut it is $5.60 per copy. Codify, being one of the first child-friendly programming platforms, is going to do fine regardless of whether Apple opens up any further or not.
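The per-copy figure above is just Apple's standard 70/30 revenue split applied to the $8 price; a one-line check:

```python
# Apple's standard App Store split: 70% to the developer, 30% to Apple.
price = 8.00
apple_cut = 0.30
developer_share = price * (1 - apple_cut)
print(f"${developer_share:.2f} per copy")  # -> $5.60 per copy
```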

1) Right now Codify has a sharing app under review. They have started the discussion.
2) They are getting ready to release their back-end Lua part as open source, which means Codify developers will be able to code for it on their Mac.
2a) Of course members of the Apple development community can run Codify on the iOS SDK under virtual hardware and then sharing is on.
3) They have hints offline (in their FAQ) about using iExplorer to load stuff into the application from a Mac. And they are running a forum with shared source already.
4) They are starting to support user-created sprites, so the code on iOS is starting to be effectively modifiable, i.e. effectively user-created functions, not just scripts of system functions. They have gotten a positive response from Apple on that.

I do think Apple will open up further; Codify is the kind of innovative software Apple likes. Apple has consistently been willing to compromise their rules for developers that work with them. Two Lives Left gets Apple.

_____

I do understand your bigger point you raised elsewhere: that embedded systems like iOS devices, the Xbox 360/PlayStation/Nintendo consoles, TiVo... could replace general-purpose computers. But so far it has been rather easy to take any of those embedded systems, load a new OS, and turn them back into general-purpose computers. Right now there isn't much demand since general-purpose computers are ubiquitous. But if they weren't, people would just use Xboxes, which do make nice computers.

Apple needs to maintain control, and will be conservative about what they accept in order to keep that control. I don't think Apple is being overtly evil; they are doing what they think is necessary, and it affects us negatively. Apple cannot let users share code freely since this could become a new uncontrolled channel for app distribution. As soon as you allow someone to share scripts, they can give you scripts that subvert the app approval process.

Actually, any app that supports unwashed content can subvert the app store. Do you allow users to share sprites? Porn! This is the problem with manual vetting: it can only really work if apps are closed boxes. In order to ensure that vetting works, they try to make apps as closed as possible, they really don't have much other choice if they want to maintain control. Google, Microsoft, Amazon, to varying extents, suffer from the same problem here: it is not Apple being evil, it is the rise of the closed platform that we are afraid of.

I don't think Apple CAN open up further, so I'm assuming they won't, since they are generally very smart people and not very ideological. Whether some engineers at Apple like Codify is irrelevant; they know what they are doing!

On an aside, Codify provides a crappy coding experience; it's basically a text editor with some keyword code completion and no member completion. Nice first try for them, but I hope it doesn't negatively affect our movement.

Actually, any app that supports unwashed content can subvert the app store. Do you allow users to share sprites? Porn!

Porn is a good example of where and how they were able to loosen. It is also a good example of the kind of controlled loosening that Apple does. Apple doesn't want to sell porn; they don't want a cut from porn sales, nor do they want to be distributing porn. On the other hand, they don't want their avoidance of porn to be a backdoor where other media they do want a percentage of gets on without them getting a cut.

Moreover, when the iPhone came out they didn't have a mechanism for knowing the age of the person using the phone. So even applications which could create access to porn were forbidden. The rules were very tight, and people used the "no porn on the iPhone" line, which again was an exaggeration.

So what Apple did was:

1) They created the 12+/17+ mechanism, which parents can disable on Apple's side. That allows them not to worry about whether the phone is being given to a child. The parent has essentially said: G only, PG-13, or R.

2) They encourage content to be separate so that parents don't feel strong-armed into approving R to get access to something else. For example, they have several book stores that sell erotica but use this mechanism.

3) Porn is never sold through the Apple store directly but rather loaded in via some third-party route. So for example they have never done anything to disable the sites loaded with iPhone porn (i.e. movies cut to the exact screen dimensions of the iPhone, with video codecs designed to play fast in iPhone QuickTime)...

____

As soon as you allow someone to share scripts, they can give you scripts that subvert the app approval process.

Not if the sharing mechanism is enforcing policies similar to the ones you would want them to enforce. For example allowing app sharing inside of Codify (or Scratch) where either:

a) The person getting the script personally knows the person who originally wrote it.
b) The code has been checked by a knowledgeable 3rd party.

In particular, no chain sharing where X wrote it, X gives it to Y who isn't knowledgeable, and then Y gives it to Z. This is the policy that exists today for Obj-C applications.

c) And if there is any kind of sale, Apple gets a cut. No creating a backdoor store.

Remember, Apple can unapprove apps if Codify were to start to get out of hand.

We know Apple is willing to loosen because they have. Look at the exceptions they have made:

Gambit Scheme, ND1, iLuaBox -- Likely end users are able to self support.
Corona -- Likely end users are not able to self support so approved after they added an additional security mechanism.
Codify -- Separated their sharing mechanism and their low-level mechanisms from the rest of the application, so the scripting language as distributed is crippled (safe).

Further, there is competitive pressure. Apple cannot be so closed that Android, BlackBerry, or Windows Mobile ends up offering a better software platform.

Nice first try for them, but I hope it doesn't negatively affect our movement.

I'm not sure if by "our movement" you mean the LtU push towards LISPish style languages or the GNU project.

GNU project -- Apple has been a serious threat: the one proprietary Unix that is doing well. The process they are creating of iPhone app -> iPad app -> OSX app will result in a huge number of applications, and types of applications, that will be almost impossible to port to free operating systems. No different than the problems with Visual Basic or ASP applications 10-15 years ago. The best response to this is to really pour effort into the GNUstep project so that applications can be ported.

Apple has generally embraced free software and has well-documented standards, unlike Microsoft. They participate heavily in lots of open source projects. I'd give them a B for GNU-type issues.

LtU -- I hate to sound stereotypical, but LtU needs to get out of the ivory tower. They absolutely should register with Apple as developers if they want to develop software for iOS. Apple is not going to facilitate people refusing to work within their system.

Lispers need to start having a presence at things like WWDC if they want their voice heard. Apple is right now deciding what they are going to do language wise over the next decade or two, for example what role is MacRuby going to play. There is every reason to believe that Apple is open to a dialogue on their platform languages.

Apple, because of how profitable the iPhone store is, is right now educating tens of thousands of developers with their philosophy. LtU needs to be knowledgeable enough about Apple culture to participate in this dialogue. Otherwise the dialogue is going to be LtU with the LLVM guys (it wouldn't shock me if the LLVM guys are LtU guys). LtU will have input, but not nearly enough to affect the kinds of policies you are concerned about.

Apple hasn't so much loosened as they have given up control when it is impossible to maintain. Of course you can view porn on your phone, there are just usability barriers because Apple doesn't make it easy. Same thing with financial transactions that don't go through the app store. You can't embed the URL link to buy something in your app anymore, rather it has to be text and the user has to copy and paste.

Making something hard often means making it so undesirable that your market is severely neutered; i.e., the iPhone is not porn friendly.

Having scripts go through vetting to be shared is a no-go since this is not financially viable for the developer or for Apple. Rather, you have to create entire app experiences to share, which is fine if you want to build an app but often this isn't what you want to share.

LittleBigPlanet has an interesting take on sharing via community moderation to identify IP violations and inappropriate content, although this is a bit heavy-handed (you can't really appeal if someone misflags your content). Sony can't vet every level that users want to share, and making sharing hard would defeat the purpose of the game. Suffice to say, something like LittleBigPlanet is not currently possible on the iPhone/iPad (you have to wait for a Vita), and no developer in their right mind would develop this for iOS without forward assurances from Apple that they would change their rules.

I'm not sure if by "our movement" you mean the LtU push towards LISPish style languages or the GNU project.

I meant our movement of touch-based programming; not LtU-related, but there are a few of us who are doing this. It's much harder than throwing an editor on an iPad, which is what Codify did, and no, we aren't in ivory towers (rather glass towers, often working for major corps).

Having scripts go through vetting to be shared is a no-go since this is not financially viable for the developer or for Apple.

I'm not so sure. I can imagine this being doable. Say, for example, a collection of simulations connected to a textbook for iLuaBox (Lua seems to be very popular with the iPhone crowd; that's the reason I'm picking it), or answers to problem sets like the Euler problems.... Random scripts, no.

LtU -- I hate to sound stereotypical but LtU need to get out of the ivory tower. They absolutely should register with Apple as developers if they want to develop software for iOS. Apple is not going to facilitate people refusing to work within their system.

This is a strawman. Few people on LtU object to the idea of paying a developer license to distribute software through Apple's store. I certainly agree that that requirement is very reasonable.

The objection I and others raise is about paying Apple more just to use and program my own device the way I want. Apple is forcing people like me to pay an extortion fee to unlock my device's inherent capabilities. Obviously that's a little hyperbolic, but not by much.

It isn't meaningfully your hardware unless you are a developer. You are buying a condo with a tough owners' association, not a house.

And now we come to the crux of the matter: Apple has the magical power to wave away your property rights. There is no equivalent to the Condominium Act delineating the rights of unit owners vs. the association for your phone; at the moment, whatever the "condo association" says, goes.

I'm honestly surprised that you are seriously advocating the position that purchasing something doesn't grant you ownership.

You should consider the cost of the SDK just part of the purchase price.

I consider the purchase price to be the purchase price. Of course, Apple would never say that the purchase price doesn't actually grant you ownership over your device, because that would be fraud, and offering to give you ownership for $99 would be extortion.

Richard Stallman is correct to be very wary of the iPhone platform; by using it, you abdicate all rights to Apple's whims, and you have no recourse, no appeals process, no transparency. I'm troubled that you seriously don't find this troubling. That's all I have to say on the matter.

I'm honestly surprised that you are seriously advocating the position that purchasing something doesn't grant you ownership.

Because I don't find it all that unusual. I've owned lots of cars. The states of New Jersey / California / New York / Minnesota had no problem regulating my usage and limiting my property rights. I own a home; the township regulates my usage. I can buy non-prescription drugs which, when filtered and combined, would make methamphetamine; if I were in fact to do that, I'd go to jail. I own a scalpel, gauze, medical sponges, and a suture kit; if I were to use my property to perform surgery, I'd go to jail. Property rights are restricted all the time.

And in particular, my cell phone is highly regulated by my carrier. For example, when I first started using cell phones, I could share a number between multiple phones; that privilege resulted in widespread fraud, so for the common good carriers eliminated it, and trying to circumvent those protections would be a crime today.

Property is often regulated. Frankly, I'm thrilled that a technical company is stepping up to the plate and acting as a regulator in the public interest. Microsoft quite often completely failed to do that. For example, when spam started to be a serious problem, there were dozens of easy technical solutions involving changing or replacing SMTP; all that would have been required was an organization to get most everyone to shift at once, which Microsoft was in a position to do. Seventeen years later we still have a serious spam problem because Microsoft didn't take ownership of problems with their platform. Conversely, when Microsoft did step up and address communal problems, like solving "DLL hell", life in general got much better for end users.

I'm thrilled to see a company stepping forward, playing the role of the government, and setting up a regulatory framework for end-user computing. I'm tired of living in the anarchy of Somalia; even though I own body armor and an assault rifle and am a good shot, I welcome a sheriff. I get:

Absolute freedom from viruses. Total immunity, even with hundreds of thousands of applications written by small developers.

Safe runtime performance for all applications. An application cannot change my system to increase its performance at the cost of other applications' performance.

A safe application purchase and upgrade system including publicly available ratings and contact information.

Frankly, the only freedom your side has been able to cite that I've lost is the right not to register with Apple as a developer if I can't handle things the way a civilian would and instead want to use my rifle and armor. I have far more freedom on my iPhone than I had on the LG Touch to modify the system.

I've been a supporter of Richard Stallman for over 20 years now. And Richard himself argued quite forcefully in the mid-1990s that Debian, as a platform, should not allow applications/code that compromise the copyright status of the entire OS. That was why he fought so hard against KDE/Qt under the QPL. Debian most certainly does regulate developers, putting a trusted Debian developer in between each and every single application and the end user. There is no freedom to arbitrarily distribute code to end users via apt. Getting into the Debian distribution means going through an approval process, and applications are rejected.

The information on the FSF website regarding Apple's policies is simply factually untrue. Apple's role is little different from what Debian does with its distribution. The main difference has been that Debian has from its earliest days existed in an environment where there are multiple Linux distributions, so Debian policy is not effectively Linux policy and the two don't get confused.

Driving is a privilege, not a right. All the other regulations you cite are because the changes to your property are not isolated to your property, which is not the case here.

And you still seem to blatantly ignore the fact that you have no rights at all in Apple's ecosystem, except what they say you do (subject to change at any time), and no appeals process when you get screwed.

And your platform safety properties are all myths. I'm amazed you actually think you're free from viruses and trojans on iOS, and, finally, a sheriff is not even needed and cannot even provide these properties.

All the other regulations you cite are because the changes to your property are not isolated to your property, which is not the case here

And platform standards that other users establish are not isolated to them. The fact that other users downgrade, complain, and won't buy applications that violate Apple's HIG is the reason I have a consistent experience with Mac applications. The fact that Unix/Linux customers are willing to tolerate software without good defaults is the reason I've lost weeks every year for the last 20 years figuring out how to configure applications to do the basics.

Absolutely, users create a collective platform standard, and then they are all affected by those standards.

And you still seem to blatantly ignore the fact that you have no rights at all in Apple's ecosystem, except what they say you do (subject to change at any time), and no appeals process when you get screwed.

Because, as we have discussed multiple times, that is simply not true. I have the right at any time to change to another group of master servers and be completely unaffected by Apple's rules, and I have the right to a developer key which gives me the right to bypass any rule on a one-off basis. The only reason people are influenced by Apple's rules is that they like Apple's rules.

And finally, of course there is an appeals process. Apple funds the App Review Board to provide you the opportunity to appeal the rejection of an application if you believe that the functionality or technical implementation was misunderstood. You will be able to submit details that the App Review Board will use to determine if the rejection of your app should be reconsidered.

I'm amazed you actually think you're free from viruses and trojans on iOS

OK when you get over your amazement let me know where I'm wrong. How exactly is a virus going to spread from system to system?

I don't know if you are from the era of typing in BASIC programs from magazines.

I'm just barely old enough to have had this sort of experience; I'm still young enough, though, to hope that we can reach a point where "sharing code" for hobbyists doesn't just mean copying the text of a particular static version (by whatever technical means we achieve it).

Ideally, I'd imagine that sharing code in a modern hobbyist system should be "live" by default. I point at an interesting sprite, widget, or behavior my friend has authored and I automatically run against the most up-to-date "released" version (with the ability to browse their version history if needed). Something like a halfway point between GitHub and a language-specific package manager like Cabal, but simplified and integrated into the language/IDE/runtime (since they are one and the same).
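Nothing like this exists yet, but the "live by default" resolution rule can be made concrete with a toy sketch (all names here are hypothetical, not any real package manager's API): a share reference names an author and artifact, and resolution picks the newest released version unless the consumer pins one.

```python
# Hypothetical sketch of "live" code sharing: references resolve to the
# latest released version by default, with older versions still browsable.

class ShareRegistry:
    def __init__(self):
        # (author, artifact) -> ordered list of (version, code) releases
        self.releases = {}

    def publish(self, author, artifact, version, code):
        self.releases.setdefault((author, artifact), []).append((version, code))

    def resolve(self, author, artifact, version=None):
        history = self.releases[(author, artifact)]
        if version is None:               # "live" default: newest release
            return history[-1][1]
        return dict(history)[version]     # or pin/browse an older one

reg = ShareRegistry()
reg.publish("friend", "sprite", 1, "bounce-v1")
reg.publish("friend", "sprite", 2, "bounce-v2")
print(reg.resolve("friend", "sprite"))     # -> bounce-v2 (live)
print(reg.resolve("friend", "sprite", 1))  # -> bounce-v1 (pinned)
```

The point of the sketch is only the default: pointing at a friend's widget means tracking their releases, not copying a static snapshot.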

It seems like you tend to think those doors out of the system aren't meaningfully there, that Apple is more like a prison, a prison with very nice window treatments but a prison none the less.

You are right that one can usually circumvent the borders of the various "walled garden" products out there. It is a bit much, however, to expect the target user of something like Scratch or Codify to do so.

No barrier can stop the most determined users, but almost any barrier can stop new users from giving something a try.

On an aside, Codify provides for a crappy coding experience, its basically a text editor with some keyword code completion and no member completion. Nice first try for them, but I hope it doesn't negatively affect our movement.

Agreed (now that you've clarified the "movement" to which you were referring).

Another (related) challenge is that by selecting an off-the-shelf general-purpose language, Codify forces users to understand both the execution model and semantics of Lua as well as the execution model and abstractions of the Codify "engine." A system where the engine and "language" are more tightly coupled (LBP, or YinYang) ultimately has fewer concepts to learn.

Ideally, I'd imagine that sharing code in a modern hobbyist system should be "live" by default. ...Something like a halfway point between GitHub and a language-specific package manager like Cabal, but simplified and integrated into the language/IDE/runtime (since they are one and the same).

iPhone really isn't a hobbyist system. Honestly, none of the cell phones are that open; the carriers would hit the roof. That being said, someone who wants to use Git, Cabal, an integrated IDE... is a developer and should register with Apple as one, and then... there are far fewer restrictions.

You are right that one can usually circumvent the borders of the various "walled garden" products out there. It is a bit much, however, to expect the target user of something like Scratch or Codify to do so.

I'm not expecting the target end user of Scratch or Codify to do so. I think that's why Apple has no problem with Codify telling end users how to scale the walls. Basically it is Apple protecting you from accidentally shooting yourself in the foot, not from intentionally shooting yourself in the foot. I'd assume that Apple probably figures that if you are ready to scale the walls to get Codify stuff, you are either a PIA or close to stepping up to a developer's license. Either way they don't need to worry about it. The "adult version" of Codify is Corona, and Apple has been supportive of Corona to the tune of selling tens of millions of dollars of Corona apps per year.

Corona does the same thing: Corona is a dialect of Lua with many more goodies than Codify, running against an Objective-C engine so that the Lua code has access to wrapped versions of the Cocoa API.
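A rough analogue of that architecture (hypothetical Python, not Corona's actual API): user scripts see only a namespace of vetted wrapper functions, and the native engine is never exposed directly. Note that `exec` with stripped builtins is illustrative only, not a real security boundary.

```python
# Sketch of the wrapped-engine pattern Corona/Codify-style systems use:
# user scripts run against a namespace of wrapper functions only.

class Engine:
    """Stands in for the native Objective-C engine."""
    def __init__(self):
        self.rects = []

    def new_rect(self, x, y, w, h):
        self.rects.append((x, y, w, h))
        return len(self.rects) - 1  # hand back a handle, not the raw object

def run_user_script(source, engine):
    # The sandbox exposes only the vetted wrapper; no builtins, no imports.
    # (Illustrative only -- stripping __builtins__ is NOT a real sandbox.)
    sandbox = {"__builtins__": {}, "newRect": engine.new_rect}
    exec(source, sandbox)

engine = Engine()
run_user_script("newRect(10, 10, 100, 100)", engine)
print(engine.rects)  # -> [(10, 10, 100, 100)]
```

Because the only path into native code is the wrapper layer, the platform owner can reason about what a script is able to do, which is presumably why statically bound engines like this pass review while arbitrary code loading does not.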

I think I must be missing part of your position here. You keep saying that people who want to share code more freely should sign up for Apple's developer program and use the "adult version" of these tools, or else play with side-loading sorts of workarounds. But what about when the user is a teenager or younger? They may not have the money to become a registered developer, and may not have the technical ability, equipment, or parental permission to "scale the walls" and side-load or jailbreak.

Sure, "grown up" developers should use grown-up developer tools. Grown up developers also probably have a dedicated development machine, in addition to one or more mobile devices, so the lockdown has even less of an effect on them.

I'll go further: what about the grown up with an iPad in their backpack but no laptop. I know a serious developer would wait until they got home to use their PC, but there are less serious developers who just want to create when they feel like it.

I was referring to things like Codify and Scratch, which are hobbyist systems.... [old -- I'd imagine that sharing code in a modern hobbyist system should be "live" by default.]

Oh, I see what you meant. What I'd say is that this opens up potentially some very serious cross-contamination. For example, do you want to allow a script that can erase or change all the other scripts? While I don't consider "Word viruses" to actually be viruses, end users do.

iOS only has security at the Codify:Documents level; below that, it would have to depend on Codify's sharing system's security to "do the right thing". Codify is right now in negotiations with Apple to get approval for a sharing application, likely trying to work out what "the right thing" is for young hobbyists. For older hobbyists Apple has had a laissez-faire attitude. With kids they are going to need additional protections, probably something like: user scripts are protected, and groups of scripts are protected from influencing one another. Also I think they are going to want some level of checking of the Codify engine itself, maybe something like a security audit. In a year these issues will be worked out.

For Scratch, the problem is that the MIT Media Lab wasn't the author of the application; it was a 3rd party. So the application author couldn't negotiate.

You keep saying that people who want to share code more freely should sign up for Apple's developer program and use the "adult version" of these tools, or else play with side-loading sorts of workarounds. But what about when the user is a teenager or younger?

Part of the problem is that the person being discussed keeps changing. I think kids fall into a few different categories:

1) Supported by a school or university. In which case their device should be registered under the Enterprise SDK registered to the school or university, and exceptions to policies can be made by the system administrators supporting them. So, for example, a college that wanted to use iPads wouldn't be affected by Apple's policies unless they wanted to be.

2) A knowledgeable adult is supervising them. My daughter, for example; in which case the knowledgeable adult's developer license allows them to do these things. And of course the knowledgeable adult knows how to scale the wall a half dozen different ways. The limitations facilitate the knowledgeable adult because the environment defaults to safe, and the knowledgeable adult has to directly introduce dangers.

3) The kid is self-supporting. How to do these things is well documented, but not obvious. In which case the kid will search the web and learn how to do this stuff. Very similar to how kids 10-15 years ago used to install Linux; back then a good rule of thumb was that if you knew enough to get Linux to run, then you knew enough to run Linux.

4) The kid is not self-supporting, not supervised by a knowledgeable adult, nor by an institution. In that case Apple is having to act in loco parentis, and I think caution is justified. Apple needs to take care of that kid's best interest. And they are doing that. Remember, the parents made a deliberate decision to get an iPad (for example) instead of an Android tablet, a BBOS tablet, or a QNX tablet, even though most of those tablets are available for considerably less money. They did that because they wanted Apple's management, and the advantages that brings.

What about the child who has an iPad, but no PC?

Well, up until very recently that wasn't even a possible configuration. It still isn't a recommended configuration. Apple is unequivocal that iOS devices are limited-purpose adjuncts to main computers, not replacements for them. My main response to kid type 4 and their parents is "doing stuff Apple doesn't recommend is likely to be bad". In which case the hypothetical kid either needs to:

a) Live within a very constrained universe, on a crippled piece of equipment designed to do far less than a general purpose computer. Programming would just be one among a hundred areas where the experience would be deficient.

b) Make the iPad into a semi-general purpose device. Get access to a computer for a short period of time and install an alternate OS like iPhoneLinux that isn't designed for a secondary device. That can be either the exclusive or the secondary OS.

c) Trade the expensive iPad for a less expensive laptop and have a general purpose computer.

Why would a parent buy an iPad for a kid that doesn't have a laptop and is old enough to operate a keyboard? IMHO the question is kinda like "what should a kid do to bike to school if his parents won't replace flat tires on his bike"

I agree that the category of user here matters a lot, and I appreciate that this has sometimes led to people in this thread talking at cross purposes.

I think the line you draw between your cases (3) and (4) is a bit harsh, though. I can think of a lot of children who are "self-supporting" enough they can do great things in something like Scratch (or Logo, Alice, Squeak, etc.) without a lot of parental hand-holding, but who don't have the requisite knowledge (or tolerance for arcana) to do something like install Linux on their phone (or their one-and-only computer).

I may have a bit of a bias on this subject, though. I still remember the copy of HyperCard that came installed on my family's first Mac, and the impact it had on me.

Remember, these sorts of apps already do exist for iOS. There is a good Logo. There are a ton of BASICs.... That exists, and more are being created. The issue you had discussed was sharing. And HyperCard was a very serious vector for viruses on Macs; lots of people credit it as the first "macro virus" platform, beating Word by a few months. It enabled inexperienced programmers of all types to construct custom software, and many a young virus writer got their start.

And I agree with you about 4's. Eight years ago, my wife, who doesn't know what a for-loop is and certainly doesn't know what a device driver is nor what one does, wrote essentially a custom device driver for herself in AppleScript. I think she's a perfect example of the kind of (4) user we're discussing, and of how these easy tools allow people to do the cool stuff you are talking about. I don't think that ever would have happened on iOS with its policies against changing functionality; we aren't disagreeing on the negative consequences. I can tell you from personal experience, she benefits enormously from the safety of iOS; she loves the fact that nothing she does can hurt her system. The safety allows her to experiment much more freely than she can on OSX, where she has to worry about the possibility of hard-to-reverse changes.

Since you picked HyperCard: one of HyperCard's direct descendants is Runtime Revolution's LiveCode. The RunRev personal and student editions of LiveCode for iOS run on OSX; that is, they require a cross-compile. They expect you to also buy the developer SDK from Apple. So even students who want to deploy have to pay $100 for LiveCode (vs. generally $300 for the pro version) and $100 for the SDK (assuming the school doesn't use the Enterprise version).

That is, even HyperCard (essentially) has to follow Apple's model of building software for virtual hardware running on the Mac and cross compiling, rather than treating the iPad as a development platform. And I agree that this modern HyperCard loses the casualness that allows for discovery; someone has to be at least semi-serious to use LiveCode at all. Apple is aware of this and is trying to make things easier: Apple is seriously considering allowing RunRev to bundle, for free with the student version of LiveCode, a mini version of the iOS SDK that allows for the creation of private apps only (no App Store).

Also, LiveCode for iOS doesn't use the runtime library; rather, it produces a statically bound application. RunRev met with Steve Jobs in early 2010 and knows what he wanted in terms of making LiveCode safe enough for the runtime engine, and they are slowly moving their product in that direction. No one has published the specifics, so I don't know what Steve Jobs's issues were, but according to the CIO of RunRev they were not small things they could knock off in a few months.

Basically, things are better than people say, the situation is much more nuanced than the absolutist language you hear, and in general Apple is slowly finding an appropriate way to meet everyone's needs. No one has ever done what Apple is doing, in terms of creating an environment this configurable for end users while at the same time this safe. It is very easy to say "iPads should do X," not so easy to balance out all the various pluses and minuses. By 2015 or 2020 it is entirely possible that iOS (I suspect on the MacBook Air) will be a premier development platform for kids and schools, because they can do anything without any fear at all of harming the system.

Does this mean that applications have to be submitted as source code? Is Apple then going to read that code to determine whether it was the result of compiling some other language to Objective-C or whether the code was written by hand?

Apple no longer has the Objective-C-only requirement; they allow code from other languages. They also allow interpretation, as long as you don't side-load code from an outside source.

Terms of the license are not enforced technically; rather, you have to disclose what your app does. If you are honest about a violation, you will simply be rejected up front. If you are dishonest, your app will be banned after the unwanted behavior is identified, and your account will possibly be suspended if the violation is blatant. This is a very effective filter, since most people are honest up front. Only the virus writers will be sneaky about this, and they will probably eventually be found out via the community of users.

Worms aren't the only kind of virus, and even worms are viable if you can compromise the system.

Any system has an attack vector; it's only a matter of how hard it is to exploit, and given the interest in Apple systems recently, I'm sure people will find holes. Technological solutions include static analysis, sandboxing (including via processes), managed languages, and so on. They are only so effective (they can't filter porn yet), and they come at the cost of flexibility, so they aren't pushed too hard. Instead, social/legal/economic solutions work well enough, at least for now.
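As an aside, the static-analysis approach can be illustrated with a toy sketch (my own example; it has nothing to do with Apple's actual review tooling). A reviewer-style filter might walk a program's syntax tree and flag calls that load code at runtime:

```python
import ast

# Hypothetical denylist for a toy review filter: names whose presence
# suggests runtime code loading. Purely illustrative.
FORBIDDEN_CALLS = {"eval", "exec", "__import__"}

def flag_dynamic_code(source: str) -> list:
    """Return line numbers of direct calls to denylisted functions."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in FORBIDDEN_CALLS):
            hits.append(node.lineno)
    return sorted(hits)

print(flag_dynamic_code("x = 1\neval('x + 1')"))  # flags line 2: [2]
```

A denylist like this is trivially evaded (e.g. via getattr or building the name from strings), which is exactly why such techniques are "only so effective" and take a back seat to social and economic enforcement.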

They only allow a fixed set of languages, right? They don't allow compilation to languages in that set? (Whatever "compilation" means; many companies use code generators.) It's a bit unclear what's allowed and what's not.

No, that isn't right. Corona, which is a very heavily used commercial Lua system, is a good example. There were terms until September 2010 banning other languages in theory; in practice they were mainly used to ban Flash game developers.

Mainly the rules seem unclear because people are trying to handle this more abstractly than Apple is. Here, IMHO, is what the unofficial guidelines are:

1) Apple will do whatever it thinks is in the best interest of the platform. In practice the only rule is that Apple is the final authority; there are no rules, only guidelines.

2) The more knowledgeable your end users, the more leeway you will have.

3) The more consistent your application is with the iPhone interface, the more leeway you will have.

4) The more consistent your application is with Apple culture, the more leeway you will have.

5) The more obscure and niche your app, the more leeway you will have.

For languages which are heavily used, Apple will handle things on a case-by-case basis.
For example, the original ban was mainly aimed at the Flash CS5 compiler. That ban came directly from Steve Jobs:

This becomes even worse if the third party is supplying a cross platform development tool. The third party may not adopt enhancements from one platform unless they are available on all of their supported platforms. Hence developers only have access to the lowest common denominator set of features. Again, we cannot accept an outcome where developers are blocked from using our innovations and enhancements because they are not available on our competitor's platforms.

For Java there are ongoing discussions with Sun/Oracle. Apple appears interested in Jazelle (a Java bytecode interpreter built directly into the ARM processor). So for that one I'd say Apple's position is "soon but not yet."

For CLI on ARM there is no question that Microsoft and Apple have had numerous discussions but the rumor mill is unclear where this stands.

For most of the languages the LtU people care about, I suspect Apple would be genuinely supportive.

Why did you ask the question about code generators? I started the whole post with a discussion of a specific example of a code generator that does over a million a month in business with Apple. Why ask the question if you aren't interested in the answer?

On the contrary, I am very much interested in the answer, because I was thinking about creating a development tool targeting iOS applications. But with the current uncertainty it's not worth the risk. Hearing the policy from an Apple employee, or from somebody who heard it from an Apple employee would certainly reduce that risk.

A development tool (I'm assuming professional), and this is for creating apps for widespread distribution (i.e. not small-scale or corporate)? First off, that kind of app would be running under OS X, not under iOS, so most of this conversation is irrelevant.

If the tool is iOS-specific (i.e. Cocoa), then in general Apple is going to be supportive. If it isn't, Apple is going to make quality judgements either on the output of the tool or on your customers' apps. There is no way to eliminate the risk regarding development tools that aren't in line with Apple's preferred methodology. Developers and companies with two decades of connection to Apple have had problems on this front, but generally they have been able to work through them. On the other hand, iOS App Store revenue is about 7x that of BlackBerry App World, Android Market, and Nokia Ovi combined. So if you want to make a commercial tool, the only developers making much money right now, on average, are the ones doing Java ME or the ones doing iOS.

They don't define code or data formally, so they have discretion to decide when data crosses the line and becomes code. Most of the time, it's easy to tell when the line is crossed, so they don't need to worry about the gray distinction between the two terms. Again, they aren't playing scientists, or even lawyers, where terms have to be defined unambiguously. And to be honest, can you really blame them?
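That gray line can be made concrete with a toy sketch (my own Python example, unrelated to anything Apple does): the same kind of string is "data" while a parser bounds what it can mean, and becomes "code" the moment it executes with the host program's authority.

```python
import json

# Treated as data: the string is parsed, and the grammar of JSON bounds
# what it can express -- it can describe values, but it cannot act.
settings = json.loads('{"retries": 3}')
print(settings["retries"])  # 3

# Treated as code: a string handed to eval() runs with the full
# authority of the host program. This is the line the policy cares
# about, however informally it is drawn.
result = eval("settings['retries'] * 2")
print(result)  # 6
```

The interesting cases sit in between (macros, templates, downloaded scripts), which is exactly why an informal, case-by-case judgement is easier to apply than a formal definition.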

You are getting good answers below. The big thing is Apple doesn't play computer scientist here. Code and data are defined in a business sense.

1) Does the end user think of it as code or data? PDF documents, for example, are obviously code but are thought of as data.

2) How safe is the interpreter? Does the interpreter create security holes? And relative to how big the security holes are, who is the app marketed to? Apple is much more liberal on applications designed for the sophisticated user and much less liberal on apps designed for children.

3) Do the interpreter and the code running on it fit with, or detract from, the overall platform direction? For example:
a) Does the end user experience an iOS look and feel? For example, iOS applications have graphics built for the screen size, while Android developers tend to use scalable graphics. Which does the end user experience?
b) What is the download size? Does it make images smaller or larger?
c) Does it create the possibility for external markets and commerce?
d) Are the code developers fundamentally iOS guys? As Apple announces new directions for the platform at WWDC, can the interpreted code be expected to follow?

____

If you are using an interpreter to make "insanely great" software available, that's good, and Apple is going to be looking to qualify your application if they can do so safely. If you are using an interpreter to save programmers the trouble of having to learn anything about the iOS platform and to churn out generic applications without any concern for the end-user experience, that's bad, and the rules will be strictly enforced.

Quite literally the same interpreter app could pass or fail depending on soft factors like target marketing.

Apple doesn't want a permanent, clear-cut policy, because they don't want people to think of ways to comply with the letter but not the spirit. For example, low-value apps hadn't occurred to Apple until they pulled several hundred of Khalid Shaikh's apps (topic-specific news aggregators at a higher price). The issues with RunRev and Corona are well documented (both interpreters that Apple has been somewhat supportive of).

It is my description of what they do in practice. If you look at the objections to applications, I think it is pretty easy to derive. For example, look at a classic like "Thoughts on Flash." The objections Jobs lists:

Apple doesn't want a competitor's closed platform. Platforms must be either open or under Apple's control.

Apple wants to implement technologies in ways optimized for their hardware.

Apple wants Cocoa developers, not generic developers.

Apple considers quality of implementation important in whether to allow larger systems.

Apple wants software that meets their interface standards and uses the interfaces provided.

Apple will be stricter the more important the interface; they don't want widely used software holding the platform back.

I'll quote the last paragraph:

Our motivation is simple – we want to provide the most advanced and innovative platform to our developers, and we want them to stand directly on the shoulders of this platform and create the best apps the world has ever seen. We want to continually enhance the platform so developers can create even more amazing, powerful, fun and useful applications. Everyone wins – we sell more devices because we have the best apps, developers reach a wider and wider audience and customer base, and users are continually delighted by the best and broadest selection of apps on any platform.

You are never going to get a binding official statement because people always think of ways to do stuff that Apple doesn't like.

I wasn't looking for an official binding statement, just some reference on the spirit argument that I could quote when further work is done. It makes total sense to me that Apple would avoid a strict legalistic or structural definition of what makes a "valid" app, given that so many of the criteria are necessarily subjective. As engineers and researchers, we might not like this, but I don't see a decent alternative that allows them to maintain control.

Eventually, we should start looking at open ecosystems again: how can we make our open, unwalled gardens safe without building real walls? Can we leverage crowdsourced reputation and editorial controls, such as those used on wikis, to create a high-quality garden where noise and malicious apps are somehow vanquished, or at least obscured?

For the spirit, I don't think you can do much better than "Thoughts on Flash." That is the bible; anything else is biblical interpretation. :)

As far as crowdsourcing goes, my argument is that a lot of the objections to Apple's policies are based on misrepresentation of what the policies actually are, conflating the worst Apple could do in theory with what they are actually doing (the FSF, whom I normally like, being one of the worst offenders here).

As I see it, there are tons of ways that a crowdsourced solution could in theory be better than the one Apple offers. In practice I doubt it. For over 15 years there have been lots of good solutions to the spam problem, of which zero were implemented, because it was impossible to get consensus between stakeholders. Lambda the Ultimate is all about advances in programming languages and how much better we could be doing if we could get all the various stakeholders to examine better options than slight variants on C for application programming, and after a decade we have LINQ. Emergency responders still can't talk to one another on their radios, a decade after 9/11.

Good choices and the right sorts of things happening are so rare that I'm thrilled to see it. I've helped out Wikipedia and seen them make all kinds of boneheaded decisions about what goes in and what goes out, choices far worse than Apple's, which have been far more damaging to Wikipedia as a standard. Debian is quite often vastly more restrictive and less pleasant to deal with than Apple. And that is the best of crowdsourcing. No, I don't see a problem; I see pretty much one of the most ideal solutions possible. Genuinely good government, provided by knowledgeable and interested officials with the power and resources to actually carry out their policies, is rare enough. But then add to that Apple having the judgement to provide reasonable bypasses when their policies are severely mistaken. Sorry, I see the government I get from Apple as being pretty close to perfect. I picked Apple over Android mainly because I like the benefits of full integration.

My phone before I got an Apple ran Qualcomm's BREW OS. BREW has an even more restrictive agreement, where every application had to be tested twice, once by Qualcomm (True BREW) and once by the carrier, and nothing went out without both of those two agreeing. Under BREW there were maybe 1,000 applications available to me, tops. I don't remember any agitation about it. Subaru and BMW have a couple dozen computers in my cars, and they try really, really hard to make life miserable for the hacking community. There is nothing like a Subaru SDK that lets me reprogram my car.