Samsung, which was undoubtedly surprised by the usually-slow-to-market Apple announcing that the iPhone 5S would be the first 64-bit smartphone on the market, has announced that its “next smartphones” will also have “64-bit processing functionality.” This presumably means that the Galaxy S5, or perhaps the phone after that, will have a 64-bit SoC — and, perhaps more importantly, that Android will make the leap to 64-bit, too. Whether smartphones actually gain anything from 64-bit processing, or whether this is merely a 3D TV-like marketing ploy, is another question entirely.

At the iPhone 5S launch last week, Apple surprised us all by announcing that its new smartphone would be the first in the world to support 64-bit software. In true Apple style, we don’t know a whole lot about the A7, except that it’s Apple-designed and based on the new ARMv8 instruction set. ARMv8 is 64-bit, but it is backwards compatible with 32-bit ARMv7. Almost every mobile chip in the last few years has been based on the ARMv7 Cortex-A7 or -A9 cores, with the exception of Qualcomm and Apple, who designed their own cores (which are still fairly similar to ARM’s reference cores). While there is a Cortex-A15, its large power envelope makes it almost unusable in smartphones — and then there’s Cortex-A53 and -A57, which are ARM’s first 64-bit reference designs.

As far as we can tell, judging by the limited info released by Apple, the A7 SoC uses an Apple-designed core that lands closest to ARM's Cortex-A57 in capability. Samsung, when it releases a 64-bit phone, will almost certainly use an Exynos SoC built around the Cortex-A57 CPU core. This is all well and good, except for one thing: Cortex-A57, ARMv8, and 64-bit computing are essentially useless in mobile devices. They are primarily low-power, high-density, server-oriented technologies.

There is almost no reason for Samsung to follow in the A7’s footsteps, except to counter any marketing FUD that Apple throws up. There are minimal performance gains to be had from switching to 64-bit, and in some cases there might even be a small hit to performance and battery life. You might see this as long-term future proofing, but even on desktop PCs, where x86-64 chips have been the norm for 10 years, the impact of 64-bit computing has been negligible. Yes, we need to switch to 64 bits at some point, but there are probably more important things that Apple, Samsung, and other mobile device makers should be focusing on at the moment.

And then there’s Android. As it stands, Android is completely 32-bit, and if a lot of work is being done on 64-bit, it certainly isn’t being done publicly. I would be extremely surprised if Android 4.4 KitKat turns out to be a 64-bit OS. Porting Android to 64-bit would be a fairly large undertaking, and I doubt that Google would invest the time and effort unless there was a valid reason to do so. In this regard, it will probably come down to how loudly Apple beats the 64-bit drum; if Apple can somehow get the public to think that 64 bits are better than 32 (which wouldn’t be hard), then Google will have no recourse but to make the jump.


Yay, 64-bit phones, for even faster text messaging. I must rush out to buy one.

arabsrulechina

Actually, 64-bit is for faster biometrics. On a 32-bit phone it would take 5 seconds to compute the fingerprint scan; the iPhone 5s takes less than 1 second.

VirtualMark

Read the article on this site about the 5s performance.

Singh1699

I saw a Vine on Facebook of a girl unlocking her bf’s phone and asking random questions.
I think this is the secret Apple objective: to try to close everything up, including your marriage/relationship :P.
More performance is useful for running heavier apps or interfaces, but the physical limitations in terms of screen size etc. are obvious. Idk, we shouldn’t have an “I need all the power I can get” mentality for PC and reject it for mobile.

howardbamber

Right… Apple went to all that expense just to make a biometric scanner 4 seconds faster. Fool

Zageron

It wouldn’t take very much effort for Apple to convince the public that 64-bit is better. All it took was listening to the radio for a few minutes and both of my parents were convinced that it was groundbreaking and faster. It took me around 5 minutes to explain why it didn’t make a difference.

VirtualMark

It’s always been the way – they throw a number at people that sounds impressive, followed by some technical word like megapixel, bits, MHz, etc., and everyone thinks they need this marvellous new technology. Advertising works (on the weak-minded, at least). There’s even a 4K-resolution phone out now, which will offer no benefit and several drawbacks on such a small device.

Singh1699

It only works because dumb people want to feel smart. (No offense to your parents.)

Phobos

I really doubt old people are interested in technology, let alone an overpriced product like Apple’s.

Singh1699

Old meaning 35+ not 65.

Phobos

35 is old? Holy shit, then 65 and up are mummies.

Singh1699

The ones that don’t learn might as well be. Compare a student taking notes online to one using a paper agenda: if both lose their ‘devices’, one is still unaffected if you factor in insurance. Gaps are always about application of skill; through my experience I’ve seen that race doesn’t exist, and neither does tribalism. Ideology and attitude are the differentiators.

Ty Smith

Piss on you, I’m 43 and have forgotten more tech than you will ever know!

Singh1699

Don’t remember what my original comment was about so we are in the same boat old sir. :P

howardbamber

Piss on you… I’m 50, senile, and can’t remember a thing!
SERIOUSLY, as has been stated, 64-bit and 4K screens are crazy and have negative effects on mobile phones.

beachguy757

Actually, if you use a 64-bit architecture there are significant benefits. On this platform it’s less about raw data processing and more about battery life and multitasking. Addressing memory over 4GB is not the only reason to move up to 64-bit: multicore processing at a lower frequency lowers power consumption and the heat produced, on top of still being able to run 32-bit instructions. You misled your parents because you have biased glasses on over a company, instead of looking at the larger picture of where this leads.

RS

Totally agreed, and Zageron and others have proved that they have no insight into 64- and 32-bit architectures. Go back to school and learn about it before blabbering here. Just because Apple has an OS that can take advantage of 64-bit and Android cannot doesn’t diminish the benefits and the apps that will be designed around 64 bits.

Asmodai

You vastly underestimate the rate at which the mobile market is advancing. The upcoming Samsung Galaxy Note 10.1 has 3GB of RAM, so it won’t be long before tablets, at least, have 4+ GB. It’s not uncommon for top-tier phones to have 2GB, and RAM typically doubles, so there will likely be phones with 4GB before the end of next year.

As for Android being 64-bit ready: it’s based on the Linux kernel, which has been 64-bit for some time. Furthermore, most apps are not native code but Java-like bytecode, so if a 64-bit JIT engine is released they’ll become 64-bit automatically; only those who used the NDK will have to make changes.

You are correct that 64-bit isn’t needed for the things smartphones do now, but that’s not the point. The point is to make them do even more, and if you have a 64-bit OpenGL ES 3.0 powerhouse in your pocket with gigs of RAM, then many average users can start dropping things like laptops and eventually even desktops, and instead just use wireless connections to monitors, wireless keyboards, etc., with their phone as their sole computing device.

VirtualMark

Yeah sure, I’ll get rid of my i7 for a phone CPU.

Phobos

LOL
I wonder how long it’s going to take phones to reach the performance of a desktop? And by the time they do, what kind of performance can we expect from desktops?

VirtualMark

Exactly – desktops will always be way ahead, unless the laws of thermodynamics change.


Scott Jackson

Yes, desktops will always be more powerful, but there comes a point when you have to ask: how much power do you need? Eventually phone CPUs will be powerful enough to satisfy all our computing needs, and then we can just have one computer that you hook up to different screens depending on the form factor you want to use. I for one think it would be nice to not have to use a separate phone, tablet, laptop, and desktop, and just use one device for everything. I get where Asmodai is coming from, as 64-bit CPUs in your phone will make sense at some point – just not right now.

VirtualMark

How much do I need? Are you serious? I think you want the alternative site MediocreTech lol.

Seriously, I want as much computing power as I can get. It’s always been that way.

Robert Bray

Why?
Also, why do you think this will always remain true?

VirtualMark

Go back to 1980 and ask the same question, then look at what computers can do today that they couldn’t do then.

Robert Bray

Then they couldn’t do a huge amount of things that even basic users required.

Today? Not so much. Times have changed and hardware now easily outstrips the average user’s requirements.

For instance, in the 90’s or early to mid 2000’s if someone had needed a computer for desktop processing and browsing the internet, I’d have put a large amount of thought into the design and requirements.

Now except for gamers (and this is becoming rarer and rarer) my response is “Buy anything, you can’t find one that’s not powerful enough to do what you want”

So. Now that I’ve demonstrated the difference between now and 30 years ago, perhaps you’d like to try giving an actual specific answer to my question.

VirtualMark

“if someone had needed a computer for desktop processing”

“desktop processing” – what is this?

I was simply trying to illustrate that as computing power goes up, so do the capabilities. Use your own imagination if you want to know what they’ll be able to do in the future.

Believe it or not, in the 1980s computers did everything their users needed them to. There wasn’t the software around that we have today, but the software people used worked just fine on the computers back then. And in the 90s, any old computer could use the internet just fine. I had a Pentium 75MHz with 16MB of RAM, and it managed to browse web pages back then.

Look at a modern program like Photoshop – the advanced image processing that now takes seconds would have taken days and been impractical back then. Even fitting a large image in memory would have been impossible.

Look at what supercomputers are doing today, for example the AI projects. We do not have nearly enough power to run intelligent AI in real time yet, in a few years we may well have. Imagine having a personal assistant to do your chores around the home?

Games are another prime example – just look at games today with advanced AI and physics. This really should need no explanation. Even today, games are a pale imitation of real life; we have a long way to go to convincingly simulate the real world.

I make electronic music, and always like to play with new toys. We still haven’t found all the ways to manipulate sound, and modern programs are sounding better and better. Things like real time morphing between sounds, circuit simulation to bring analog synths inside the box, higher internal sample rates, advanced algorithms etc all take a lot more computing power.

I like to use modern software, and usually it is more taxing on hardware and needs newer hardware to run optimally. It’s as simple as that – I can’t tell you what programmers will write when we have 10x or 100x the computing power of today, but I’m certain that I’ll be using some of the programs. I think that in 20 years time, you’ll look back to your comment today and realise how short sighted it was.

Robert Bray

Word processing. General things you’d do in an office.
The things that the vast majority of computers are used for.

I also remember computing in the 90’s and remember how frequently upgrades were required. How much tweaking was required to get certain games running and how rarely it was possible to run any game on max graphics. I also have clear memories of Office running like a complete dog.

What proportion of users do you believe need anything approaching top-of-the-line today? What proportion use a computer in the same way you do?

My current computer is a two year old i7 with 12GB of RAM (mainly used for virtual servers) and a mid-range (for the time) graphics card. Except for RAM, for the first time I’m nowhere near finding uses for all the power it can provide. I’ll have to check my CPU performance on R2 when I get home, but I don’t recall it using an impressive amount.

As technology advances, the proportion of people who will really require the grunt that an X86 can provide will get lower and lower. For everyone else, they’ll quite happily accept the reduced power bills and cost of hardware.

VirtualMark

Herein lies the problem – you’re judging me by your lowly standards. I regularly max out my CPU. I don’t give a fuck about what other users do, my statement was that I will always want more computing power, understand?

Upgrades weren’t “required” in the 90s, computing power just grew faster. As always if you stuck with the same software, your computing requirements didn’t grow. But if you wanted newer software, you needed a newer computer.

And as for games running at full spec? You’re deluded. Try one of the more demanding games today on a mid range PC, there is no way you’re running that on max settings at a decent resolution.

Like I said, please revisit this question in 20-30 years time. I forgive you for your stupidity and short sightedness. I understand that all you like to do is word processing and can see how any computer will fit your needs. I honestly don’t think you read any of my reply, and feel that I am wasting my time on another stubborn and thick headed idiot. Sure buddy, we don’t ever need more computing power – humans millions of years in the future will look back on today as the peak of civilization.

Robert Bray

>> Herein lies the problem – you’re judging me by your lowly standards. I regularly max out my CPU. I don’t give a fuck about what other users do, my statement was that I will always want more computing power, understand? […] Like I said, please revisit this question in 20-30 years time. […] Sure buddy, we don’t ever need more computing power – humans millions of years in the future will look back on today as the peak of civilization. <<

Keep trolling, buddy. An aggressive tone makes neither argument nor accuracy.

VirtualMark

I really believe you to be one of the stupidest people I have ever come across. For example – your last comment, instead of addressing my point about future humanity, you instead call me a troll.

So, stupid person, do you really believe that in 1000 or 10000 or 1000000 years time people will still be using the computers we have today, and doing the same things on them that you do? Lol!

Please don’t answer any further, I don’t like to have to teach retards how to think.

Robert Bray

Coming from the person who thinks that only x86-based CPUs will continue to advance while ARM-based CPUs will just… stop, right now, for some reason.

I did address your point about future humanity – if only you had the wit to understand it.

By your reasoning, this is as good as games will ever get, yes? Lol so dumb…

Please leave me alone now, you’re not worth speaking to.

Robert Bray

Your entire argument relies on you believing that ARM based or whatever low-powered processor becomes the flavour in the future will somehow not advance, while only x86 based processors will. As time goes on, those few years difference in performance will mean less and less.

I’m sorry if you can’t see this. Someone as easily angered as you probably wants someone to blame. You should start with your parents.

VirtualMark

Wow. Now I really don’t know what conversation you’ve been reading, nor do I care. You’re mistaking my condescension and impatience with your stupidity as anger – believe me I’m not angry. I just find it a bit sad that people like you ask such dumb questions and then ignore the answers.

Like I said, I forgive your stupidity, it’s not your fault. But I don’t really want to discuss technology any further with you. So thanks for your input, I’ll leave it at that.

Robert Bray

Maybe reminding you of the context of this discussion can help you:
“Yeah sure, I’ll get rid of my i7 for a phone CPU.”

Anyway, Angry Man. Call yourself whatever you like, just try not to take it out on children, women or animals.

VirtualMark

Stupid person – that wasn’t the comment you replied to. My English is just fine thanks, clearly it is you who has the problem with not only understanding English, but learning how to use a discussion forum correctly.

Secondly – I do not know how you draw the conclusions you did from my statement. Like I said earlier, nor do I care. I can see that your brain doesn’t work too well and don’t have the time or patience to try to decode your nonsense or trade petty insults.

RS

People in the past never thought they would have the same processing power in mobile, but they made it possible. Who knows about the future – maybe all you’ll need is to plug your mobile into a display and you’ll get all the CPU power of a desktop.

Singh1699

There will be different materials and techniques. Mobile is behind due to physics, not… I’m falling asleep.


VirtualMark

“Someone who’ll say something like that has no business commenting on the intelligence of others.”

And yet the 20 upvotes all got the joke. I’m sorry you didn’t, like I said it’s not your fault.

Phobos

You do make a good point, some games are just poorly coded or just a poor console port.

Singh1699

I’m a soldier, so I want enhancements to my brain, that are faster than my brain. See that power level coming any time soon? It’s over 9000.

Singh1699

I like to type one finger at a time like I’m 65 or a chicken.


Phobos

Now I agree with you, but in terms of AI I’ll say games are lagging behind, to the point that they got stuck in the mid 90s/early 00s. Yes, the graphics and the physics are always on top.

Singh1699

This. It’s like asking why didn’t we stop at the sword since all cultures seem so attached to it?

VirtualMark

Yeah, the guy is beyond stupid – he tried telling me that I won’t want more computing power, so he thinks he knows me better than I do. I cannot tolerate such people, they will be neutered when I rise to power.


Robert Bray

This. It’s great having the fastest computer available, but there is a time when most people ask themselves if it’s really worth the expense. Whether “Good enough” is what they’re after. If you’re not using the extra processing power, you may as well not have it.

The following is exciting for when mobile processors become good enough for pretty much anything bar what the most exceptional user requires. We’re a few years away from it, but it’s coming: http://www.ubuntu.com/phone/ubuntu-for-android

Personally about the only thing that begins to justify my desktop’s i7 is video conversion. Beyond that, my major bottleneck is RAM.


Chris Stuber

Dude… I am a developer and have not sat at a desktop in years. It’s all laptop and tablet for development, and the phones (and other devices) are the target.

Asmodai

I know non-computer-savvy people who still have systems running XP. I bet their latest iPhone is already faster than whatever ancient CPU is in their desktop. Sure, the newest desktops smoke their phones, but they aren’t running out to buy new desktops, because what they have does what they need it to do. The average person on the street doesn’t need the power PCs have now. Mobile chips don’t have to catch desktop chips; they just have to be good enough. Laptop chips aren’t as fast as desktop chips either – guess what? More laptops are sold than desktops now, and many people don’t even have desktops anymore because their laptop is “good enough”. The same thing is going to happen with tablets, then phones.

Phobos

I do get the feeling, looking at the specs of today’s phones, that they have the same computational power as desktops from 2003-04; so maybe in 8 years we could have an A8 or A10 in phones – not bad at all.

VirtualMark

“Laptop chips aren’t as fast as desktop chips either, guess what?”

So, please explain the differences between the desktop i7 and mobile i7?

tgrech

Last time I checked, desktop versions had around a 40% higher clock speed and a much larger uncore. That’s a fairly large difference. Even if real performance only increases 30%, that’s two generations of progress. And that’s before we get into i7-E (since i7M parts can get to similar prices).

Even comparing a $600 mobile part (3820QM) to a $565 desktop part (3930K) gives a ~40% speed difference (12000/8500), as I stated.

I don’t see where you were trying to go with that.

VirtualMark

No, you didn’t bring price into it.

Where am I going with it?

“VirtualMark
8 hours ago
There’s not much in it”

tgrech

“(Since i7M parts can get to similar prices).” I presumed that similar prices were obvious; if you had enough money you could probably make an i7-3930K-level processor work at a 45W TDP, so working with unlimited budgets doesn’t make much sense here. Even comparing max speed regardless of budget, 12000/9200 is still ~30%, which is again a figure I listed as a possible outcome, and still a very large difference.

VirtualMark

I was talking about the differences between desktop and mobile versions of each chip, not about price points. For example the GTX 780 is way more powerful than the mobile 780m.

Now obviously the high end is going to be better on the desktop, due to the higher thermal headroom. But the gap isn’t there in the lower chips, according to those benchmarks anyhow. I don’t know how reliable the clock speeds are, as they have turbo mode. But I don’t see a massive difference between mobile and desktop parts, certainly not 30%.

tgrech

So what similarities are you drawing in order to deem the chip similar(If not price or max. theoretical perf)? It sounds like you’re going off SKU name, which isn’t really a good way to compare anything.

VirtualMark

Just if there was any differences in architecture and performance at similar clock speeds. You said something about the uncore being different? I don’t know much about that. All I’m saying is that mobile parts seem to be almost on par with desktop parts clock for clock.

tgrech

Well, obviously the same cores at the same clock are going to perform pretty similarly – uncore won’t really affect CPU benchmarks so much as memory-intensive benchmarks/software or iGPU benches. But then comparing clock-for-clock speed is completely irrelevant to the original statement being discussed – “Laptop chips aren’t as fast as desktop chips either, guess what?” – because the main difference (apart from often higher-speed memory support, a larger iGPU, and other uncore-related stuff) between a laptop chip and a desktop chip is their stock clock.

Few people squeeze out the power available in today’s chips, and even fewer with mobile chips. Most people care about uploading photos; most people brag that their phone is more powerful than anybody else’s without being able to use that power. I know some douches claiming “hey, I can open AutoCAD files with my iPhone, duhhh”.


Roberto Tomás

They are about there now. The latest Snapdragon 800 / Tegra 4s are performing at about 66% of their maximum tested capacity, and compare not all that far behind an AMD Trinity A10-4800M or a low-end notebook-class i5 from Intel.

Singh1699

As things went from phones to consumption hubs, processing needs increased; as we move to information hubs they will again.

I’m looking forward to a smart-watch and a 5″ phone that I don’t have to pull out of my pocket constantly.

Even now, I know my dual-core phone is not fast enough to run its OS at desktop speeds. I want a desktop experience on a smaller screen – not necessarily a desktop OS, but it should feel seamless.

We’ll be there in a year, I think, with the generation after Snapdragon 800; I want a phone that feels like a small (weaker) PC, not a wannabe.

Basically, less UI lag. Plus, I’m big enough that a 2-3″ smartwatch screen doesn’t bother me. If we shrink bezels further, having a close-to-6″ screen in my pocket for bigger tasks will be king.

After that, these things will become commodities and increases in processing power a matter of course, and product differentiation.

McNo

And yet no phone can even handle Flash properly…

Phones may appear fast at some tasks because they have been heavily optimized for the hardware or use tricks to make light-weight computing tasks appear to do some heavy lifting.

Roberto Tomás

the problem with flash is that it uses too much battery, not that it can’t be properly rendered.

Singh1699

It’s true. I’m excited at what new possibilities will open up with new technology. I’m already using Evernote and the camera on my phone to make my life so much easier. And that excuse makes for harassment-free BBM/texting in class, he he he.



Asmodai

That’s why I specifically said “many average users”. People who read ExtremeTech probably aren’t “average users”; we’re more tech-savvy people who demand more from our systems than the average Joe. There are a great many people out there, though, who just use their computers to surf the web, check email, play around on Facebook, and other similar tasks that a phone/tablet CPU is more than enough for. If they can get to the point where they have a nice keyboard, mouse, and monitor setup on their desk that their phone wirelessly connects to, or even a cheap pseudo-tablet that is really just a wireless touchscreen monitor for their phone, then for many people that will likely be more than enough. Not you or I, for sure, but for a lot of the general public.

VirtualMark

Yeah you’re probably right, it seems to be going that way for the average user. I’ll never give up my large form factor computer though unless they stop making them!


rahuldey85

Well, you have the Mac Mini for that purpose, and some Windows variants of the same concept. Granted, the Mini (and its likes) is much bigger, but then it will always be more powerful than any phone due to the thermal and size restrictions of the phone.

Asmodai

Desktops, even tiny ones, will always be more powerful than phones, but the thing is most people don’t need that power. How often do you think Joe User really stresses their machine? Again, us ExtremeTech readers are a different story because we’re “power users” or whatever you want to call it, but we’re not the norm. If people are already carrying around smartphones and the smartphone is “good enough”, then there is no reason to buy a Mac Mini or similar even if they are technically more powerful. You’ll just have a keyboard, mouse, and monitor setup on your desk that your phone wirelessly connects to, and you won’t have to sync all of your software and settings to another device. A wireless connection to a cheap mobile touchscreen display would make a cheap tablet computer, with all the processing done in your pocket while you interact with the “tablet” in your hands.



Nkumba

The biggest single reason to go 64-bit is exactly because of physical address space. Your virtual address space needs to be a multiple of the physical one: when you hit 1GB of RAM, 32-bit virtual memory is no longer acceptable. You literally do need more virtual memory than physical.

Why do you think that Android actually needs to be fully 64-bit? Just rewrite the base to 64-bit, add multilib, and you’re done.

So where is your proof that performance gains are negligible – benchmarks? Where are the links to them?

symbolset

No. Just no. You don’t understand what’s going on here, yet you hold yourself up as an expert. You’re clueless.

tgrech

I think you’re confusing the benefits of x86-64 and ARM64 a little. ARM64 completely redid the old instruction set. It’s now much more efficient, much tidier, fully and truly RISC, and it makes much better use of the wider registers to bring a performance boost.
Whereas x86-64 made the x86 instruction set more complex and slower, ARM64 makes it much simpler and faster, while of course both allow the use of 4GB+ of RAM (which, as you acknowledge, IS still overkill on smartphones and probably will be for a while).

Asdacap Cap

I thought x86-64 gave a performance boost through its additional registers, right?

tgrech

Normally the speed difference is ±5% in the real world. There are times when the speedup is as great as 20-30%, but that’s rare. Slowdowns are normally caused by fitting fewer instructions and less data in the cache, since 64-bit code and pointers are larger.

Phobos

That’s great, now I can play Angry Birds without lagging.

q37

The CPU inside the Ascend D quad is Huawei’s K3V2 64-bit processor, running at 1.5GHz. That was last year: http://news.softpedia.com/news/MWC-2012-Live-from-Huawei-s-Conference-255121.shtml

tech01xpert

That’s a 64-bit memory bus in the K3V2. It’s a Cortex-A9, which is not a 64-bit CPU design.

Roberto Tomás

I fear to comment on this article because so much of it is written like the author just woke up from a 15-year coma and found out that mobile devices are now going 64-bit.
64-bit has uses in processing large amounts of data, as at a server farm, but also in compression/decompression, encoding/decoding video, etc. You actually *do* use the CPU for those tasks, and it makes sense that users want to keep up with their TV, etc., when they get a new phone. Mobile devices also have one really big reason they would benefit from 64-bit even more than PCs: there is no such thing as “double precision” in mobile GPUs, so some degree of heterogeneous architecture will allow high-quality bitmaps for 3D gaming, etc., in an environment that cannot afford, for power consumption reasons, to go fully 64-bit internally in the GPU.

http://theseonut.com/ Adam

I don’t get it. Why do we need 64-bit on a phone?

tachyonzero

Advanced apps or software.

symbolset

Same reason as >300 DPI.

Steve

Hey guys,
Does anybody remember Webtop from Motorola? It would make a lot of sense to start using 64-bit for a system like that.

Kasem Asawaprecha

I think that a 64-bit CPU is really needed for securing the fingerprint data. Apple is basically proposing to turn your fingerprint into your digital ID, which in turn is the key to seamless cloud computing. Currently, we are required to create and remember too many usernames and passwords. The iPhone 5S needs to be strong enough to guard this precious data. Furthermore, its patent means that the prime screen location has been booked for iOS devices.

symbolset

A fingerprint is maybe 120 bytes. That’s not the sort of thing that needs a new architecture.

niboned

Yes, but the encryption required to secure it (perhaps even beyond AES-256, if the allegations that the government can crack it are true) would make a huge difference in the processing time needed to encrypt even a 16-byte value.
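For a sense of scale, a 16-byte value is exactly one AES block, so encrypting it is a single block operation on any CPU, 32- or 64-bit. A minimal sketch using the standard javax.crypto API (the class name and the all-zero plaintext are illustrative assumptions):

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Hypothetical demo class: encrypting a single 16-byte value with AES.
public class OneBlock {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);                        // AES-256 key
        SecretKey key = kg.generateKey();
        // ECB with no padding is acceptable here only because the input
        // is exactly one block; real designs use an authenticated mode.
        Cipher c = Cipher.getInstance("AES/ECB/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key);
        byte[] ct = c.doFinal(new byte[16]); // one AES block in, one out
        System.out.println(ct.length);       // prints 16
    }
}
```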

Steve

No. Fingerprints require very little data processing. The biggest thing I see them doing is a Motorola Webtop rip-off. The Thunderbolt connection is fast enough to enable extra memory or heavy-duty graphics processing through the port. Just check it out on YouTube; Thunderbolt can technically allow for an external GPU.

cooldoods

Samsung could probably make Tizen a 64-bit OS. As for Android, unless there is official 64-bit support in Android, Samsung should not transition to 64-bit just yet. Though among all smartphone manufacturers, Samsung is the closest to the 4GB RAM limit of 32-bit memory addressing.

symbolset

It’s just time for 64-bit ARM, and knowing that, Apple was quick to the punch. Had they given it another iThing cycle, they would have been following the trend, and that’s not their thing. 64-bit ARM gives damned little: maybe 10%. But being first is sweet, sweet cachet.

So because they show Atari’s Jaguar console, does that mean Samsung is going to fail?

http://nsaclan.com/ Tbird83ii

Wait . . . since Android is a Linux-based OS, it already has native 64-bit support. How is this “misguided” when all they would be doing is including hardware which their OS could already utilize (with minor tweaking, if any)? Then all that’s left is a 64-bit app environment, and since Dalvik is based on the JVM, which has offered 64-bit versions since at least Java 5, the transition should be very simple (Java being cross-platform, and having little ambiguity in its int sizes).

Also, for many people, the speed of our mobile devices is important. I have quite a few of the functions of my house automated with RFID tags; a faster phone means programs start and close faster, less latency between automated tasks, etc. Now, “speed” and “64-bit” aren’t synonymous. The only way 64-bit will be significantly faster is if Apple and the Android manufacturers increase memory; if they stick with lower memory, then 64-bit instructions will just bog down their processes.
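The earlier aside about Java’s unambiguous int sizes is easy to check: the language spec fixes primitive widths on every VM, 32- or 64-bit, which is part of why Dalvik/JVM bytecode ports so cleanly. A small sketch (the class name is mine):

```java
// Hypothetical demo class: Java primitive widths are fixed by the spec.
public class FixedSizes {
    public static void main(String[] args) {
        System.out.println(Integer.SIZE); // prints 32, on any VM
        System.out.println(Long.SIZE);    // prints 64, on any VM
        // So bytecode compiled once carries no word-size ambiguity
        // when the underlying CPU moves from 32 to 64 bits.
    }
}
```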

BeaveVillage

As usual, Android is on the job to copycat once again. Watch for 64-bit Android phones coming in 2014 as a direct response to the 5S.

Start thinking for yourselves, will you, Google? Everything you have done, someone else already did before you, including your primary root: the search engine.

gecco

Well, Android is and always has been 64-bit; it only needs the hardware to match whenever the need is there. As of now there isn’t much of a hurry for anyone to push 64-bit, whatever Apple might claim. But regarding Android, the article is wrong: there would be no huge undertaking to make Android 64-bit, as it already is.

Astroboy888

Of course this article has some misinformation. Apple does not use a standard ARM core from ARM; it does its own custom implementation of the ARMv8 64-bit architecture while remaining instruction-set compatible.

ARMv8 has many advantages over ARMv7. Architecturally it allows for much higher throughput, because the wider word length allows for higher parallelism. Theoretically you should get a 2-4x performance gain at the same clock speed.

Recent benchmark data now shows that the Apple A7 is the fastest mobile processor on the market today (on par with Intel’s Bay Trail Atom), outpacing all Android CPU offerings by a solid distance. At 1.3GHz, the A7 beats Qualcomm’s Snapdragon 800 at 2.3GHz by a solid amount.

zoom44

Samsung was so surprised that they built the chips for Apple and didn’t notice. :rolleyes:

http://www.quietatheist.com/ Slugsie

Do you really think Samsung was surprised that the A7 processor was 64-bit? Honestly? Maybe you should do a minute’s research and Google who actually fabricates the A7. It isn’t Apple, that’s for sure. Yes, it is Samsung. So if you think they were even remotely surprised that the processor they were making was 64-bit, then you’re an idiot.

OK, so we know you don’t know what the heck you are talking about… in case anyone wants to know what really is going on:

Do a search for the article that lists the actual performance gains, the ones that make the iPhone 5S the fastest mobile device on the planet: being able to read your fingerprint in less than a second, take photos with phenomenal speed, live-edit 120-frame-per-second movies, and surf the web at amazing speeds…

Android, on the other hand, will have a heck of a time duct-taping 64-bit into their hodgepodge, seeing as hardly anyone is on their latest OS in the first place…

Yowan Rdotexe

“Porting Android to 64-bit would be a fairly large undertaking” — you are dead wrong. Only the Android runtimes are 32-bit. Once you reach the Dalvik VM layer, the number of bits becomes irrelevant: the APKs are portable bytecode (a by-product of Java code compiled into Dalvik bytecode), which targets the DalvikVM, which in turn interprets and translates it to the raw ARM instruction set. Android apps are not Linux processes; they are Dalvik executables that run on a Java-like virtual machine. Typical Android “.dex” apps are not native code the way all iOS Cocoa Touch apps are, so there is no need to port anything. AArch64 and ARMv8 support is already included in the Linux kernel.

Jake

What you are saying is right, provided the apps are all SDK-based.
Unfortunately (or fortunately), they aren’t.
All the crapware like “Hello World” is written with the SDK, while demanding apps use the NDK, with few exceptions.
I already feel sorry for those NDK app developers, since the transition will be extremely demanding in resources and time, with little to no return at all.
If I were one of them, I wouldn’t bother, and would stick with 32-bit binaries.

That’s where we are at. There will be hardly any third parties willing to jump on the 64-bit wagon on Android.
Google has no means of driving the move, unlike Apple.

Java and Dalvik are both fundamentally stuck with 32 bits from the ground up. You will see little to no benefit in moving to 64-bit; again, unlike Apple.

Last but not least, ARM32 and ARM64 don’t get along very well, unlike x86 and x64.

Switching between the two states costs an additional exception every single time.
If you run 32-bit apps together with 64-bit ones, Android’s “true” multitasking will kill the performance, in addition to the battery.

I almost think that Apple has been so restrictive with third parties’ multitasking because of this.

Google had better stick to 32-bit, or abandon Android completely and start anew with a 64-bit-only ChromeOS.
Or do the extreme: ditch the VM step by step, and omit backward compatibility for NDK apps in 64-bit Android. That would be the wisest choice for the long term.

Be evil, Google, if you want to survive.

AppToday1

I doubt Google can catch up with a cohesive OS across search, infrastructure, and mobile, at least not for several years.

ChrisGX

The writer is right that 64-bit offers no immediate benefits for phone users. His other claims are mostly wrong, however. Moving to 64-bit would be mostly seamless. 64-bit is a compile option for the Linux kernel, and very likely for the rest of the Android stack; at this level the effort to go 64-bit would be quite small, and it is very likely that Google has been making the relevant optimisations for some time now (64-bit ARMv8 architecture documents have been published for years). Also, it is a feature of ARMv8 that 32-bit applications run unchanged, so no effort is required there.

The place where effort is required is in updating development tools so that software developers can take full advantage of AArch64 and the A64 instruction set. That is not as daunting as the writer implies, and the process of improving and polishing development tools is in any case without end. You don’t need the best development suite ever in order to release well-designed and well-coded applications. You can count on Google having solid development tools ready well before 64-bit Android phones and tablets are released. The writer sees controversy where there is none.

slapppy

Now we know: the Apple A7 isn’t just marketing hype or BS. Some of you clueless haters just got owned!

Chris Stuber

OK, I have some questions about big.LITTLE and the new 64-bit ARM processors. The spec says “Virtual 44b PA” (whatever that is…), and I could not really glean anything new, but the http://www.realworldtech.com/arm64/ article suggests this isn’t fully developed (same goes for Intel/AMD); the industry needs to standardize on extended memory management and hardware-assisted virtualization (VT-x, and VT-d for direct I/O).

Developers pretty much *have* to own these kinds of devices (the latest CPUs) to even compile Android if they’re using VMware, and the industry is trending toward low-power flashable SoCs for just about everything. The key is having a processor that runs decent virtual machines for *any* operating system (preferably on an SoC). All I can say is: we want it faster, smaller, and more energy efficient. It’s not just Apple. We have to have these processors in laptops and tablets running 64-bit Linux, because Android cannot compile itself. Whether an end user needs 64-bit on their phone makes no difference to the user, but for the laptop/tablet market it matters, and in 10 years 64-bit will be the norm. It will take Apple/Google/Samsung/Intel/AMD/ARM to step up the game.

Some of it is bloat… like the GS4 getting a 20% increase in speed with Android 4.3 but losing 10% to Samsung Knox. Or not being able to uninstall bloat apps, only hide them.

I know you might think: what does that have to do with the functionality of the phone, my Facebook, online shopping, or text messaging?

But one little thing will tell you what a 64-bit chip will get ya, the next market for phones, and why 4K is necessary…

Infinity Blade 3.

Charles Humphries

Erm, Samsung announced plans for 64-bit processing in phones a while ago and estimated 2014 as the year it happens. Oh, and Samsung makes the A7. As for Android: do you really think there is no way better hardware can be an advantage to an open-source community running a mobile OS on the Linux kernel? 64-bit architecture was on the Android wishlist for the Galaxy S4, ya know?
