137 Comments

"Contrary to other mobile platforms such as iOS, Windows or Tizen, which run software compiled natively to their specific hardware architecture, the majority of Android software is based around a generic code language which is transformed from “byte-code” into native instructions for the hardware on the device itself."

What I don't get is why people say the Windows Phone store's cloud backend compiles to native every time someone downloads an app. Can't they just keep the native code for each device, and compile it at the full level of optimization just once? Yes, they have more devices than Apple, but they also tightly control which SoCs WP uses.

.NET on Windows (desktop) has supported AOT compilation since at least version 2.0, possibly before (I don't recall). It also caches the JIT images, so it's not 100% comparable to the way Dalvik works. Heck, even the user can generate native images for .NET programs by running the ngen.exe tool on .NET code.

Most commercial .NET programs either use AOT compilation or compile the entire program on first run.

The first time I booted the Nexus 7 2013 on the L preview I actually killed the boot process. It was taking so long, it had to be frozen. I must've screwed up the flashing, I thought. So I flashed again, and this time was more patient. The initial boot took quite a while, but turns out it was probably related to these underlying changes in Android.

The Nexus 7 2013 never felt slow, but I didn't know it could run this fast. The browser scrolls with an almost iOS-like smoothness. I say almost because there are (very) rare hiccups in the FPS, but I actually believe those may be due to the "preview" nature of this OS.

I am very happy and excited for where Android will go in the next year. I think Android can finally bring it to iOS and Windows Phone when it comes to interface/GUI smoothness.

I experienced the same thing with my Droid Bionic. After flashing it, I think I had to wait 30 minutes before I could even log in due to ART compiling apps. I just left my phone on and plugged in and walked away.

But the experience was much better after that long wait. I assume that production firmware would already have that step included in the restore.

I have the 2012 Nexus 7, and while the performance is far from perfect (it is notably slow when loading apps and horrendous when anything is accessing the SSD), the latest Chrome Beta has brought nearly flawless performance and (seemingly) 60fps scrolling with little, if any, jank for all but very heavy sites. In fact, there are a number of apps that I use regularly that scroll flawlessly on this ageing system.

There are still apps that run slower than they should, and typically apps load slower than I would like, and many exhibit jank when scrolling (in fact anything with a list of pictures), and I'm hoping that Android L improves these situations (if the 2012 Nexus 7 is updated at all). But of late, I'm pretty happy with the optimizations made. Ever since the recent 4.4.4 update, it seems that the OS is running apps much more smoothly than ever before.

If L improves the performance as you describe, then Android will attain that fluidity and swiftness that iOS and WP have been known for across most apps, which will be very welcome.

I agree to an extent. Google is making great strides, and the L release appears to bring some very useful user experience "fixes." The only issue at this time is the color schemes we have seen recently in Google app updates and in the demoed Material Design Gmail app. The colors remind me of the colors from the basic 12-pack of Crayola crayons. They don't quite fit the slick new interface.

What has me most excited is four initiatives by Google to take back control over Android. First is the lack of customization for 3rd parties using Android Wear, Android Auto and Android TV. Second is the introduction of stock Android phones under the One program. If the initiative takes off, that would be a lot of phones running Google-controlled and -delivered stock Android. Third is the as-of-yet unofficial Android Silver program - bringing Google Play Editions to carriers, with the software side apparently also to be controlled and delivered by Google. Putting "Silver" devices running stock Android in direct competition with the manufacturers' skinned phones should and hopefully will force the Samsungs of the world to up their game. Fourth is an iOS-style system for introducing new versions of Android. This sneak peek will, hopefully, allow the manufacturers to do their appropriate skinning and get updates out in a much more timely manner.

All told, exciting times for those who appreciate technology and the advances we've seen over the last 15-20 years. I'm not sure what that next BIG product category is. I'm not sold that it's smart watches. What is the elevator speech for a smart watch? It's not an intuitive buy or justification for a lot of folks.

For watches, once they are hard to distinguish from classic analog watches (thin design, top-quality screen tech, decent battery life) then the pitch is "high-tech fashion accessory" which you will be able to buy from Rolex and other expensive watch manufacturers.

But if you're talking about wearables as a class (i.e. not necessarily watches) then I think it has to be personal health monitoring. At first, it'll be just basic stuff like heart rate, blood pressure, exercise monitoring, but eventually (years from now) as the medtech improves, it may be able to do things like warn you of an impending heart attack or stroke, or perhaps a vitamin deficiency, etc.

If Google is willing to ship an update that effectively freezes the system on first launch for 30 minutes while providing no UI to explain what is going on, I don't think Apple has much to be worried about...

(Apple is not perfect on this score; in particular, there are times when OS X shutdown leaves one with an uncomfortably long period of watching a spinner while the system is doing god knows what. But they at least understand the principle of user feedback, ESPECIALLY during first boot.)

On current Android devices, when you switch the runtime to ART, you get the "Android is upgrading; X of Y" progress bar on first boot. I'm sure once the L release is finalised, it will have a similar UI.

What's perplexing is why this isn't currently in place on the Dev Preview.

Seems the rumor is the old Nexus 7 might not see L... It'll be over two years old by the time L arrives officially, and being based on an old Tegra platform that not much else is using anymore, its chances are probably on the low side. I think ART was never enabled as a dev option under KK for it either, but don't hold me to that, you can check yourself tho (I've got a 2013, gave my sister a 2012 as a gift tho).

Good update, although I find it interesting that all of a sudden Android WAS, after all, not as smooth as iOS (which it indeed never was, really).

What I'm still missing (and I hope L will address this at some point) are more privacy controls. If (stock) Android grows a way to manage permissions after an app is installed I would be very glad.

The way permissions work on Android is that enforcement is based on whether the app ever uses a specific permission, and the full set is announced on install. You can't have after-the-fact permission management; the app either has all its permissions or it doesn't run. I believe it was done like that for performance reasons. It's also a hell of a lot easier for developers, because you don't need to constantly check permissions before doing things.

If you don't like an app's permissions, don't run it. A system like you describe would be as horrible as the system that classic BlackBerry used, and that sometimes required explaining to users to go in and manually give apps X, X, X and X permissions.
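For context, this is what the install-time model looks like from the developer's side: permissions are declared statically in the app's manifest, and the user accepts the whole set at install time (the fragment below is illustrative; the package name is made up):

```xml
<!-- AndroidManifest.xml: permissions are declared up front; the user
     accepts or rejects the whole set when installing the app. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.app">
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.READ_CONTACTS" />
</manifest>
```

Because the set is fixed at install, the framework can enforce it once per process rather than prompting on every access.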

Once rooted, you will have more choice to control app permissions. I am currently using XPrivacy (Xposed Framework), which has fine-grained control over what an app can access. It almost becomes annoying with so many prompts for permission when an app runs.

Oh for crying out loud, give it a fscking rest. You're like the people who will excuse ANYTHING Apple does. The current Android permissions handling is a complete abortion, and it's obviously going to be changed to something more iOS-like in the future. And what are you going to do then, Mr "We have always been at war with Eastasia"? Complain that Google is making things worse with the new permissions system they introduce in Android P?

@darkich As difficult as it may be for you to believe, Dolphin 11.x with Jetpack enabled is not a consistently smooth experience across all devices. My HTC One M8 is such an example. A "heavy" website such as the newly redesigned androidcentral.com is buttery smooth on Chrome and the built-in HTC Internet app, but on Dolphin it loads slower and is a bit jittery when scrolling through the page. I realize that optimizing an app such as a web browser to be smooth across a large number of devices is difficult, but calling people flat-out liars because their experience differs from yours (on a different device, no less) just shows you don't know what you're talking about.

I know precisely what I am talking about. He was referring to scrolling performance specifically, and in general, scrolling on Dolphin Jetpack is by far the fastest and most fluid out of any browser. Yes, I can also confirm that Dolphin has issues on some pages, but that doesn't change the overall picture when we talk performance and fluidity. Show me a browser that handles every page flawlessly and then I will give you a point.

As for Android Central, well, I just tested it on Chrome and Dolphin. A single swipe on Dolphin scrolls through the entire front page in desktop mode (in typical Dolphin Jetpack fashion). Chrome? Gets only about halfway through! And Safari is even far worse. There is just no comparison.

"Number of swipes to reach end of page" is not the same metric as "scrolling is buttery smooth at all times". In fact, they aren't even remotely related. The two of you are talking about completely different things, almost orthogonal to each other.Reply

That was completely nonsensical. By the most basic and obvious logic, the speed of scrolling is the very first metric of its smoothness. If you have two wheels and spin them with the same amount of force, and one spins twice as long as the other - which one would you regard as the "smoother" one?

In the past I would have singled you out as being stupid, but I've seen a number of Android users make the exact same utterly bizarre connection between scrolling speed and smoothness. Has it ever occurred to you that high velocity is used to hide jank and stuttering?

Setting scrolling velocity is just a decision of the developers; it's just a parameter. You can make the crappiest hardware scroll like mad. Smoothness (and getting the behaviour close to a believable, consistent physical model of inertness and friction) is really hard work that requires lots of things in the system working right to even allow trying. Android was never good (or consistently good) at that. Google has improved it with every version though.

"and getting the behaviour close to a believable, consistent physical model of inertness and friction"

.. And that is exactly where Dolphin trumps everything else, at least for me. Sure, it's not perfect at all times and on all sites (no browser is, again), but at its best, Dolphin Jetpack is the prime example of the description you gave.. It feels like a real, oily smooth mechanism.

Darkich: perhaps this helps: 'smooth' is about the dropping of frames or (in)frequently stalling the drawing. It has nothing to do with how quickly you get to the bottom of a page, as the browser can simply stop drawing for 1/10th of a second and show the bottom of the page and be fastest to the bottom - yet it would not be smooth at all.

So uhuznaa is right, scrolling speed has nothing to do with how smooth and fluid the UI is. It can be slow but never drop frames, or fast but drop frames all the time.

I don't know if this is the reason, but on my old iPhone 4 I can install 5 apps and continue to use an app at the same time without even noticing the installs going on in the background, while doing the same on my Nexus 7 only leads to frustration. Same with loading lots of emails or anything else going on in the background. My Nexus always gets seizures and seems to hang for seconds when this happens. iOS seems to prioritize user interaction over everything, while in Android user input seems to be treated as just another task to be handled sooner or later.

It is possibly part of the reason, but for what you talk about the main reason, I think, is that the Linux kernel is not good at handling I/O while maintaining interactivity. This is actually currently being taken care of, but with the Linux kernel in Android so far behind mainline (Linux is at 3.15, Android at what, 3.5?) this might take a while to get fixed.

Maybe Google and other app stores in the future can do the native compilation for x86 and ARM on the server, and users would download the bytecode and their native platform code; or even just the native code.

Shouldn't be any harder. You compile once per device (SoC?) and store the result for future downloads. The initial conversion would need a ton of CPU time, but they could do that in advance using spare data center capacity during slow times.
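The compile-once-per-SoC idea could be sketched roughly like this (a hypothetical illustration; the class and method names are made up and don't correspond to any real store backend):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a store backend that compiles an app's bytecode
// to native code once per SoC target and caches the result, so later
// downloads for the same (app, SoC) pair are served from the cache.
public class NativeImageCache {
    private final Map<String, byte[]> cache = new HashMap<>();
    private int compileCount = 0; // how many real compilations were run

    // Stand-in for the expensive AOT compilation step.
    private byte[] compile(String appId, String soc) {
        compileCount++;
        return (appId + "-native-for-" + soc).getBytes();
    }

    // Returns the native image, compiling only on a cache miss.
    public byte[] download(String appId, String soc) {
        String key = appId + "|" + soc;
        return cache.computeIfAbsent(key, k -> compile(appId, soc));
    }

    public int getCompileCount() { return compileCount; }

    public static void main(String[] args) {
        NativeImageCache store = new NativeImageCache();
        store.download("com.example.app", "snapdragon800"); // compiles
        store.download("com.example.app", "snapdragon800"); // cache hit
        store.download("com.example.app", "tegra3");        // new SoC, compiles
        System.out.println(store.getCompileCount());        // prints 2
    }
}
```

The point of the sketch: however many users download the app, the expensive compile runs only once per SoC target.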

The obvious thing to do would be for at least some Android vendors to take responsibility for this, liaise with Google, and deliver this as a value-add for their phones. But that would require an Android vendor to think more creatively than "for my next phone, I shall triple the screen resolution, double the number of cores, halve the battery life, and thereby take over the market".

Maybe Xiaomi could do this? They seem to be the only Android vendor that isn't 100% idiotic. Amazon is the other possibility, and they actually have the tech skills to do it --- but of course there is the tricky problem of their negotiating with Google... (Do Amazon and other forkers even get ART, or is that now a Google exclusive?)


Unlikely. Java applications should be the ones that worked anyway on x86. The applications least likely to work would be native applications, which a developer may not compile and distribute for x86. Those are most likely to be games, particularly since Google (bafflingly) discourages use of the NDK.

It's not a great reason to discourage the NDK. Many applications are written to be cross-platform and run successfully on multiple architectures. It doesn't even increase your test load, since even if you're writing your application in Java you still need to fully test it on every platform. Testing is usually the expensive part. The exact wording of the page is just odd. It says that preferring C++ isn't a good reason to write your application in C++. That's a pretty obviously false assertion. Preferring C++ is a great reason to write your application in C++.

The frame drop counts seem very odd with respect to the total milliseconds delayed. (Or I'm bad at math.) At 60 FPS a frame is 16 ms. A 4ms GC sweep might drop a single frame at 60fps. The output indicates it dropped 30 frames. That's 7500fps. Plausible if you're running without vsync or framerate limiting on a static screen like a splash screen, but that's not really a meaningful example, nor is it especially noticeable to the end user. More interesting would be the frequency of frame drops in an application with extensive animation running at an average of 30fps. That's going to be a situation where you notice every frame drop.

It's a mistake in the article. Those log lines have nothing to do with each other. The Choreographer is reporting a huge number of dropped frames because &lt;unknown&gt; took a really long time on the UI thread, *NOT* specifically because the GC took that time. This is actually pretty normal, as when an application is launched the UI loading & layout all happen on the UI thread, which the Choreographer reports as "dropped" frames even though there weren't actually any frames to draw in the first place, as the app hadn't loaded yet. So the 30 dropped frames there mean the application took about 500ms to load, which isn't fantastic but it's far from bad.
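For the curious, the arithmetic behind that last sentence is just the 60 fps frame budget mentioned earlier in the thread (a toy calculation, nothing Choreographer-specific):

```java
// Toy calculation: how long must the UI thread stall to account for
// a given number of "dropped" frames reported at 60 fps?
public class FrameMath {
    public static void main(String[] args) {
        double frameMs = 1000.0 / 60.0; // one frame budget, ~16.7 ms
        int droppedFrames = 30;          // what the log reported
        double stallMs = droppedFrames * frameMs;
        // 30 frames * ~16.7 ms/frame = ~500 ms of UI-thread stall
        System.out.println(Math.round(stallMs)); // prints 500
    }
}
```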

Re: "Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too."

There shouldn't really be any difference. Dalvik was carefully designed so that its odex format could be mmapped in to RAM, allowing the kernel to do the same page handling as with ELF executables. (Actually a little better than regular ELF executables, since odex doesn't need any relocations that cause dirty pages.)

The "ProcessStateJankPerceptible" and "ProcessStateJankImperceptible" are the process coming in and out of the foreground. When it goes out of the foreground, the GC switches to a compacting mode that takes more time to run but saves more RAM. When it switches back to the foreground, it switches to the faster GC you have been looking at. The GC pauses here won't cause any jank, because these switches are never done when the app is on screen.Reply

KSM mostly kicks in on virt loads, but the ksmd overhead is so small that including it on memory-starved, multiprocess devices isn't a bad idea. BTW, I believe that Android uses the dalvik-cache partition to avoid unnecessary re-jitting, but the space is limited, and therefore dynamic, so apps can be evicted.

I can't believe Google hasn't also adopted F2FS in Android L. It would've been perfect. How is it that they put it in Motorola devices a year ago, and they still can't make it default on stock Android?

If it's a "nuke'n pave" restore (like the Dev Preview or System Images), then it's not an issue. Backup your data to the PC/cloud, reformat all partitions, install, carry on.

If it's an in-place upgrade, then it becomes tricky. Unless, of course, you are using F2FS for the /data filesystem, which (really) is the only one that benefits from it. You don't need to make /sdcard (internal storage) F2FS, and you don't want to make /ext-sd (SDCard) F2FS as then you lose all non-Linux reader support. Nothing stopping you from using those as F2FS, though.

I'd really like to get a custom recovery for the G2 that allowed you to select which FS to use for each partition, and a ROM with a kernel that supported it, though. Just to try it out, and see how it works. :) Any takers? ;)

> Google claims that 85% of all current Play Store apps are immediately ready to switch over to 64 bit - which would mean that only 15% of applications have some kind of native code that needs targeted recompiling by the developer to make use of 64-bit architectures.

Does this mean that OEMs could soon use "pure" AArch64 architectures? I think you can use ARMv8 purely for the 64-bit mode, with no compatibility for 32-bit, too. I imagine that would make the chips less expensive and also more efficient for OEMs.

I'm not familiar with how Intel has its chips, but I think it would be a lot harder for Intel to get rid of the "32-bit" parts, and they are pretty much stuck with their chips being both 32-bit and 64-bit, at least for the next few years, until nobody in the world needs 32-bit anymore on any platform Intel chips run, and then they could just redesign their architecture to be 64-bit only.

I've long suggested that this is exactly what Apple will do. I don't think they'll ditch 32-bit support for the A8, but I honestly would not be surprised if the A9 comes without 32-bit support and iOS9 has a 32-bit SW emulator to handle old apps. Then by iOS 11 or so they just ditch the 32-bit emulator.

Other vendors have the problem that they don't have tight control over the entire ecosystem. Qualcomm, for example, are not making Android chips, they're making ARM chips --- for anyone who wants an ARM chip. It's something of a gamble to just ditch 32-bit compatibility and tell anyone who wants that "Sorry, you should go buy from one of these competitors". Most companies (foolishly, IMHO) weigh the cost of backward compatibility as very low, and the cost of losing a sale (even if it's to a small and dying industry segment) as very high; so I suspect they're not even going to think about such an aggressive move until years after Apple does it.

"Google was not happy with this and introduced a new memory allocator in the Linux kernel, replacing the currently used “malloc” allocator" - Malloc allocator is not in the kernel. I dont think there was any change to the linux kernel in this. Malloc and Rosalloc are both done in user space in the ART lib. Both probably use the sbrk() system call to get memory from the kernel. Also a quick look at Rosalloc.cc code shows it is written in C++. So definitely cannot be in the linux Kernel.Reply

The article mentions that startup times for devices will be worse with ART, but I don't understand why; surely if the code has already been compiled it will simply be cached somewhere, so it's just a case of executing it directly. In fact, this should mean that startup should be faster than normal.

In fact, the space requirement is another question mark; once an application has been compiled, does the byte code even need to be retained? Surely it can be discarded in that case? Though I suppose it's required to ensure that signatures don't change, it seems like the OS could enforce that differently (i.e - as long the byte code validated pre-compilation, then the compiled code is considered signed as well)?

I dunno, it just seems to me like there are plenty of ways to not only avoid slow-downs or extra storage use, but in fact there are ways to use ahead-of-time compilation to accelerate startup and reduce storage use.

It only makes sense that the application's first startup will take a long time. That first startup is where the ahead-of-time compilation is happening. Where else would it happen? Application startups after that will be much quicker, though, since the AOT compilation was already done beforehand.

Use ANY other benchmark. Who the hell knows how AnTuTu works? For micro benchmarks try Geekbench. If you're willing to do some compiling, Linaro has a bunch of benchmarks it uses to determine progress.

I didn't realize you had to pay for it. Regardless, AnTuTu is junk. Why? Because we don't know exactly what it does, or how it does it. The other option I mentioned is to pick some of the Linaro benchmark tools and compile them. I won't call you crazy for not buying apps because I don't know your situation. What I do, however, is try free versions and if they are good I buy them. They don't cost much and I don't waste battery with ads I'll ignore.

It never ceases to amaze me how many problems that were solved decades ago in computing are problems on modern computing platforms.

Real compilation of code has been around forever -- the norm, in fact, for desktop and server computing with a few notable exceptions. Yet somehow taking what effectively amounts to interpreting code (just-in-time compilation is very similar to interpretation) and switching to compiling it ahead of execution is being touted as a new idea.

The fact that Android has pretty much been completely reliant upon JIT running in a VM has always made me scratch my head. As clearly spelled out in the article, it causes huge performance issues, along with significant hits to battery life. And we're talking about mobile devices where we've got relatively low-power CPUs and GPUs, little memory, and finite battery capacity. But it has been the way that Android has worked from the beginning. Crazy that it hasn't really been addressed until now.

And the idea that operating systems and development languages should be in charge of garbage collection, and people being surprised that it causes performance hits, seems odd to me too. Managing your own memory isn't that hard to do. And it is a hell of a lot more efficient doing it yourself than making the language or OS figure out how to do it. It's a "clean up your own mess and put things back where you want them" vs. "make someone else do it and let them try to figure out where things go" situation. It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important.

Because the developers that I work with aren't accustomed to managing memory, we're constantly running into issues. We've got scripts that allocate dozens or hundreds of megabytes of RAM and don't free it when they're done. They'll go through 3, 4, or 5 more of these processes within a single script, not freeing memory they're done with along the way, so by the time the script is done running hundreds of megabytes that aren't needed are still tied up. Because the language can't be sure if data is going to be used again it hangs around until the script has finished running.

Create dozens or hundreds of instances of one of those scripts and you've got a performance nightmare. Relying on a language or OS to do garbage collection will have the same net result.
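The failure mode described above is easy to demonstrate: a collector can only reclaim what is unreachable, so references held in a long-lived structure pin memory no matter how good the GC is. A minimal sketch (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// The GC can only reclaim objects that are unreachable. Holding references
// in a long-lived list "leaks" memory even in a garbage-collected language.
public class PinnedMemory {
    private static final List<byte[]> retained = new ArrayList<>();

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) {
            retained.add(new byte[1_000_000]); // ~1 MB each, still reachable
        }
        System.out.println(retained.size()); // 5 buffers the GC cannot touch

        retained.clear(); // drop the references...
        // ...and only now is that ~5 MB eligible for collection.
        System.out.println(retained.size()); // prints 0
    }
}
```

The fix is the same in Java as in the scripts described above: drop references to data you are done with, instead of letting them accumulate for the lifetime of the process.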

Nothing you say is wrong, but I think you hit the nail on the head with this sentence when it comes to Android: "It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important."

I personally think Android didn't care that performance was so bad in the early days. The point of Android, from what I can tell, was to make things open source and make it easy for developers. As you said, having the OS manage memory itself is meant to make programming easy. I think that's what made it attractive to the likes of Motorola, Samsung, and HTC in the beginning. I think that's what made it popular with the OEMs, and eventually, that's what users were getting used to.

Yes, precompiled code in interpreters is nothing new. But ART is changing what Android can do. It's not a new concept, I agree with you. But again, Android has had different priorities from the beginning than, say, writing purely in C and/or assembly for mission-critical or safety-critical systems where real time better be real time or else that car/plane/space shuttle will crash, or even other not-as-critical embedded systems like HDDs and SSDs where performance and power matter more than anything. I think Android has always been about the easiness of its development environment, just like Java, and that's just where they put their priorities first. Now that their development environment is pretty well founded, I think they're making the right steps with improving performance, first with the JIT compiler in 2.2, "Project Butter" in Jelly Bean, and now making the default runtime ART instead of Dalvik in Android "L". They just had different priorities, and well... look at where Android is now.

I think you're completely right about ease of development being the priority for Android early on; after all, they had to establish a market and needed apps quickly and easily. After Google bought the OS it suddenly got lots of developer attention and they just ran with the setup as it was. If Google had made lots of changes at that time they might as well have rolled their own.

Google adopted Java for Android because it was a mature programming language, popular with developers, that they didn't have to create from scratch and had features (i.e. running in a VM) that made it easy to create secure apps that would run on a multitude of different hardware platforms. Java also had an affordable (i.e. free) development environment (Eclipse) that Google could build their development tools around.

Clearly, with the incredible growth Android has enjoyed over the last six years, the decision to go with Java was anything but a mistake.

As for compiler technology, the necessity to run the same apps on multiple hardware architectures precluded the use of traditional desktop and server based compilers, and the technology behind JIT compilers certainly hasn't been standing still over the last decade. The performance and battery deficits caused by the current VM environment are certainly not as bad as you think they are, given that modern Android tablets come pretty close to matching iOS, which only has one hardware platform and architecture to worry about and where the software can be tightly integrated with that sole platform. It's not as good, no, but it's good enough for Samsung to sell millions of phones in direct competition with the iPhone.

Yes, the time has come for Google to move on, but there should be nothing amazing about their use of a Java-based platform that has served them very well over the past six years. It was the right decision at the time.

I think they could have produced a much better product if they had used C++ instead - native performance and battery life when it was needed in the early days, and probably faster-than-iOS performance today.

I think you are absolutely right there. I doubt that merely doing AOT compiling is going to produce faster results and that's exactly what I experienced when I switched from Dalvik to ART in 4.4. Of course there are going to be more improvements in L since the code itself has improved. I mean who was launching an app on Android and wishing it would *launch* faster? There may have been apps that took their time launching. But not too many. On the other hand, better garbage collection and other improvements will certainly help in run-time performance. AOT is not doing anything much compared to JIT.

I always wondered why Google didn't buy Sun. The two companies have similar DNA (certainly more similar than Oracle and Sun), and Android could have used all the expertise Sun had in building JVMs and Real Time Java in Android and the rest of Google. They could have sold off the hardware division to IBM/Oracle and not have had to deal with the heartache and drama of the lawsuit.

You'd be amazed at how much a compiler can evolve while still in development. Most of the performance advantage from ART comes from AOT compilation. It can take the whole code and optimize it aggressively. For example, when compiling with GCC at the most aggressive optimization levels you can get the whole program executing in the main function, with loop unrolling and vectorization, while taking into account the difference of having the functions inlined, optimizing references to variables and parameter passing.

A JIT can only focus on the "hot spots", improving some parts of the program but it can't improve it as a whole because there's not enough performance history storage space to achieve that.
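As a toy illustration of the whole-program optimization described above: a compiler that can see the entire program is free to inline a small function into its caller, unroll the loop, and constant-fold the result, while a JIT typically only gets there after the code has already run hot. Doing the transformation by hand shows the two forms compute the same thing (this is hand-written illustration, not generated compiler output):

```java
// Toy illustration of inlining + loop unrolling + constant folding,
// the kinds of transformations an AOT compiler can apply program-wide.
public class UnrollDemo {
    static int scale(int x) { return x * 3; }

    // Original form: a call inside a loop.
    static int original() {
        int sum = 0;
        for (int i = 1; i <= 4; i++) {
            sum += scale(i);
        }
        return sum;
    }

    // What the compiler can reduce it to after inlining and unrolling:
    // 1*3 + 2*3 + 3*3 + 4*3, which constant-folds to 30.
    static int optimized() {
        return 30;
    }

    public static void main(String[] args) {
        System.out.println(original() == optimized()); // prints true
    }
}
```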

So many incorrect statements about JVMs in this article it would take a half hour to list them all. Plus, nothing at all was said about Google's major motivator, which is that it is obvious Dalvik was stolen from Sun and the lawsuits aren't over. Finally, this is still a long way from true 64-bit and its benefits. For example, the only reason Apple can encrypt and decrypt fingerprints in real time is because encryption operations are dramatically faster in 64-bit. Way beneath AnandTech standards.

You are correct that Apple's decision to use 64-bit was partly because of the fingerprint scanner, but you are wrong that L is not fully 64-bit compatible. In fact, it is easier for Android to move to 64-bit because of the VM it runs on. The Linux kernel has always supported 64-bit, but Google's runtime and libraries have not, and consequently neither have the apps. Android L replaces the libraries and runtime with 64-bit compatible versions and "enables" 64-bit support for 85% of apps automatically with no work from the developers. That's pretty impressive.


"they are using reference compression to avoid the usual memory bloat that comes with the switch to 64-bit. The VM retains simple 32-bit references." This feature was implemented in JDK 6. Google just imported it into their new VM once ARMv8 (64-bit ARM) became available. Still, 64-bit Android applications will use more memory if they're compiled by ART for 64 bits.

Because ART compiles an ELF executable, the kernel is now able to handle page handling of code pages - this results in possibly much better memory management, and less memory usage too. I’m curious what the effect of KSM (Kernel same-page merging) has on ART, it’s definitely something to keep an eye on.

Also, the work won't end with this release. Like Dalvik before it, ART will be improved as time goes by.

"and is at the whim of the system to correctly manage things in an optimal manner"

You're showing your bias. Aside from large heaps, show me where the JVM is not handling memory in an optimal manner - and, to clarify, this should, from a cost-benefit perspective, outweigh the time it would take to implement in a lower-level language.

Millions pay the penalty thousands of times for successful programs - I think lots of development time could be justified if you looked at everyone's time. Imagine Android not needing so many tries at optimization and speed-up, and how that development time could have been spent instead.

Shortly after updating to Kit Kat 4.4.4 on my Nexus 5, I switched to ART. It took about 10 minutes to recompile. I really didn't notice any significant storage loss. However, I noticed a significant improvement in speed and overall responsiveness. For me, very noticeable at first, but now it's become the norm... as it should be. The N5 is already fast, but since ART... it flies. Stock Kit Kat with ART on 4.4 on a Nexus 5 just smokes. Love it!

Now would be a very good time for those Android liars to come out and admit the old Android simply isn't up to the iPhone's standard. I would know; I believed their lies and bought a Note 3, and it lags like I am using a single-core computer back in the 2000s.