He then discussed the architecture of rich internet applications, focusing on the challenges.

The first challenge he discussed was local storage; he highlighted how Google Gears is solving that problem and how AIR intends to cooperate with Gears.

The next challenge was searchability and deep linking. He proposed that we as a community use # as a standard for maintaining state in Ajax application URLs. Adobe has sent a proposal to the OpenAjax Alliance to standardize # and begin the process of standardizing deep linking.
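The idea behind the # proposal is that an Ajax app records its view state in the URL fragment, which browsers track in history without a page reload. A minimal sketch of the technique (the function names and URL shape here are my own illustration, not the proposed standard):

```javascript
// Serialize view state into a hash fragment and parse it back, so a URL
// like http://example.com/mail#folder=inbox&msg=42 can deep-link into an
// Ajax application's state.

function stateToHash(state) {
  return '#' + Object.keys(state)
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(state[k]);
    })
    .join('&');
}

function hashToState(hash) {
  var state = {};
  hash.replace(/^#/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    var parts = pair.split('=');
    state[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || '');
  });
  return state;
}
```

In a browser you would set `location.hash = stateToHash(state)` on each state change, and call `hashToState(location.hash)` on load to restore the view, making the state bookmarkable and crawlable-by-convention.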

Cross-domain access came up next. He reviewed the problem and why the cross-domain security policy exists, then discussed a proposed “crossdomain.xml” permissions file that lets a site declare exceptions to that policy. It looks like this:
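A minimal policy file, served from the site root (the domain shown is illustrative):

```xml
<?xml version="1.0"?>
<cross-domain-policy>
  <!-- Allow pages served from any example.com subdomain to load this
       site's data; "*" alone would open access to everyone. -->
  <allow-access-from domain="*.example.com" />
</cross-domain-policy>
```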

It turns out this file already exists on 36% of Alexa’s top 100 sites in order to support cross-domain Flash behaviors. Because Flash already supports this mechanism, you can use Flash today as a hidden communication mechanism to allow cross-domain behaviors (see adobe.com/go/crossdomain).

Kevin hopes this same mechanism can be implemented in browsers, and he offered to work with standards bodies to make that happen.

Next, Kevin chatted about the Tamarin project, describing it as “JavaScript from the future” and reviewing key features such as:

Kevin showed off the adoption of Flash Player 9: it reached 83% of the web in nine months, which he called “the most ubiquitous platform in the world, even more than operating systems” and the “fastest deployment” ever for a new platform.

Kevin announced a new free, open-source Flash/Ajax Video kit that allows really simple syntax for playing movies.

This toolkit is available at adobe.com/go/favideo. He went to hbovoyeur.com to show off how you can use the video capabilities of the Flash player to do some really cool interactive stuff.

The next weakness Kevin highlighted was developer productivity. He argued that declarative development is more productive than procedural mechanisms, and that Flex’s MXML offers a much richer declarative mechanism than HTML. To this end, he noted that Flex 3 is now open source, with a public bug database and daily builds; the project will be fully up and running by the end of the year under the Mozilla Public License.
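For flavor, here is a minimal sketch of what declarative MXML looks like (my own illustration, not from the talk): the control and the label bound to it are declared in markup rather than created and wired up in script.

```xml
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml">
  <!-- The {name.text} binding updates the label whenever the input
       changes; no event handlers or DOM plumbing are written by hand. -->
  <mx:TextInput id="name" />
  <mx:Label text="Hello, {name.text}" />
</mx:Application>
```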

Data synchronization. This is clearly a hard problem, and Kevin reviewed how Adobe is currently solving it with LiveCycle Data Services. He pointed out that Ajax applications can talk to LiveCycle via HTTP or Adobe’s own RTMP protocol (RTMP adds push capabilities). He had a demo showing Dojo using the sync services, but sadly the demo was broken. He did show the code: about six lines to bind a JavaScript data collection to the LiveCycle sync services.

And now the transition to AIR, which Kevin described as a way to bring Web apps to the desktop. He highlighted that AIR adds these services to web applications:

He also reviewed that AIR applications can be written in two styles: HTML or Flash. In both cases, you can seamlessly integrate PDF documents into the application. He also highlighted a capability I hadn’t seen before: support for deploying to “Device OS’s”. He then showed off some AIR applications:

- Simple Tasks, an Ajax application written by Jack Slocum of Ext JS, running as a local AIR app.
- Finetune, a Flash application to stream music
- Buzzword, a high-quality word processor that shows off some very sophisticated layout and UI features. From his quick demo, it seemed more powerful than Apple’s Pages, and it adds collaborative features like co-editing with other users over the network (though editing is turn-based rather than concurrent)
- Adobe Media Player, a way to play Flash video on the desktop (similar to the Quicktime Player)
- Pownce, a client for Kevin Rose’s new Twitter-esque service

I always like Ben’s writeups; they are comprehensive. Many thanks for the great info! …Although crossdomain.xml for Flash has been around a while, I like that Adobe is getting the word out about it. It would seem ‘cleaner’ for a straight Ajax app to be able to get info cross-domain in straight JS through the browser, but it’s not that much extra work to include a Flash .swf and write a little code to accomplish a similar cross-domain transport. (Likewise, I always like it when there is a ‘push’ from a group or company to motivate other platforms toward a worthwhile technology.)

IMHO, crossdomain.xml won’t solve anything but will increase the attack surface. Keep in mind that there is a big difference between XMLHttpRequest and JavaScript remoting. I cannot see why simple JavaScript includes should be restricted by the crossdomain.xml file. Therefore, given that almost every service provides JSON output, you will increase the attack surface by making XMLHttpRequest calls available not only to the current origin but also to other websites. Correct me if I am wrong.

There are different attacks being addressed here. Same-domain restrictions on data loading (e.g. XMLHttpRequest) are intended to protect the server hosting the *data*. Imagine, for example, servers behind a firewall: they should not be automatically queryable by external content running in a browser.

However, this restriction gave servers no means to *permit* cross-domain data access. crossdomain.xml addresses that limitation with an explicit permission mechanism. JSON output also works around the restriction, but it opens up an opportunity to attack the server hosting the *requesting application*.

Loading a .js file from another domain opens up your web app to attack. The loaded JS is essentially imported into your app’s domain, giving it full privilege to query your app’s server — complete with session credentials!
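The risky pattern described here is the cross-domain script include (the basis of JSON-with-callback data loading); a minimal illustration, with domain names purely hypothetical:

```html
<!-- Page served from A.com. The include below pulls code from B.com,
     but that code executes inside A.com's origin: it can read A.com's
     DOM and issue requests carrying A.com's session cookies. -->
<script type="text/javascript" src="http://B.com/data.js"></script>
```

By contrast, data fetched via XMLHttpRequest arrives as inert text, which is why a crossdomain.xml-style opt-in for XHR shifts the exposure away from the consuming site.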

So wider use of crossdomain.xml for real data loading via XMLHttpRequest should reduce the use of cross-domain JSON for otherwise safe data-loading operations.

I understand what you are saying. However, what I am trying to say is that with or without crossdomain.xml, XMLHttpRequest objects and JavaScript remoting hacks work. Let me make it clearer. :)

Due to the Same Origin Policy, JavaScript can access only the current origin. Even if you implement the crossdomain.xml file, JavaScript will still be able to access the current origin. Why? Compatibility issues; we cannot move to a new technology overnight. With or without crossdomain.xml, JSON (or JavaScript remoting, if you like) will still work. The only thing that will change is an increased attack surface due to the trust relationship between apps. Let me explain.
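The Same Origin Policy check under discussion can be sketched as follows (a simplified illustration, not the actual browser implementation): two URLs share an origin only if scheme, host, and port all match.

```javascript
// Extract a normalized "scheme://host:port" origin from an http(s) URL.
// Parsing here is deliberately minimal for illustration.
function originOf(url) {
  var m = /^(https?):\/\/([^\/:]+)(?::(\d+))?/.exec(url);
  if (!m) throw new Error('unsupported URL: ' + url);
  var port = m[3] || (m[1] === 'https' ? '443' : '80');
  return m[1] + '://' + m[2] + ':' + port;
}

// XMLHttpRequest is allowed only when both URLs resolve to the same origin.
function sameOrigin(a, b) {
  return originOf(a) === originOf(b);
}
```

So `sameOrigin('http://A.com/app', 'http://B.com/data')` is false and the request is blocked; a browser-side crossdomain.xml mechanism would let B.com opt out of exactly this check for chosen requesters.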

Let’s say we have an app on A.com and another on B.com. B.com says that A.com can access its data. Effectively, this means that if I can get an XSS on A.com, I will be able to read the data on that domain, including the data on B.com, due to the trust relationship. Today this is not possible; I need two XSS vulns rather than one.

In your example, B.com’s data is more vulnerable, since any client it permits (A.com) may have XSS vulnerabilities.

However, A.com is *less* vulnerable, since it no longer has to rely on JSON importing to access B.com’s data. It can load the data via XMLHttpRequest without fearing that a man-in-the-middle attack or ownage of B.com would render its own site (A.com) vulnerable.

That’s the point I was trying to make. As a whole, I think it’s an improvement, since the risk is assumed by the data provider, rather than the consumer. But depending on your role, it’d be reasonable to feel differently. If you’re the data provider, you might prefer that your clients assume all the risk.