Apple just released their new iPhone 5s, where the biggest news is really the 64-bit processor core inside the new A7 SoC. Sixty-four bits in a phone is a first, and it immediately raises the old question of just what 64 bits gives you. We saw this when AMD launched the Opteron and 64-bit x86 PC computing back in the early 2000s, and in a less public market the same question was asked as 64-bit MIPS took huge chunks out of the networking-processor market in the mid-2000s. In servers, however, it was never questioned.
The answer for 64-bit x86 was really that it did not do much for the average user in the short term. It was a must-have in the server space, and it sealed the fate of Intel’s Itanium. Thus, it was hugely influential in the evolution of technology and markets. Still, it is only in the past few years, a decade later, that most end-user computers have started to come with 64-bit Windows, Linux, and OS X as standard. In the desktop space, 64-bit has been very slow in coming because the software has been slow in moving up to 64 bits (due to the interconnectedness of legacy in all parts of the software stack). Today, I find it absolutely necessary, as I see more than 4 GB of RAM being needed to run modern applications (and in particular many applications at once). In professional end-user usage, 8 GB of RAM or more is simply required for anything non-trivial (be it graphics, huge compiles, or running Simics). Thus, “bigger memory” marks the point where 64 bits finally goes from novelty to necessity.
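The 4 GB ceiling follows directly from pointer width: a 32-bit pointer can name at most 2^32 distinct byte addresses. A minimal sketch of the arithmetic (the function name is mine, purely for illustration):

```python
# A 32-bit pointer can distinguish 2**32 byte addresses; a 64-bit
# pointer can distinguish 2**64. That difference is the whole story
# behind the 4 GB limit on 32-bit machines.
def max_addressable_bytes(pointer_bits: int) -> int:
    """Number of distinct byte addresses a pointer of this width can name."""
    return 2 ** pointer_bits

GIB = 2 ** 30
print(max_addressable_bytes(32) // GIB)   # 4 (GiB) -- the 32-bit ceiling
print(max_addressable_bytes(64) // GIB)   # 17179869184 (GiB), i.e. 16 EiB
```

In practice, operating systems reserve part of that 4 GB range for themselves, so a 32-bit application typically sees even less than the theoretical maximum.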
In the embedded networking space, memory was not the answer, since the devices back then paired a 64-bit core and a 64-bit operating system with just 256 megabytes of memory. Instead, the value came from being able to manipulate more data in each instruction, which was hugely beneficial for TCP/IP routing and processing. I personally never dug deep enough into this to understand exactly how it worked, but the answer was consistent from those in the know. Moving to 64 bits and taking advantage of the new processors was easy, since the product developers could choose the processor, choose the OS, and write the software all by themselves. Integrated solutions are good places to take advantage of new fundamental technologies; fragmented markets like the PC are not.
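I never learned the exact mechanism, but the general principle is easy to show: a 64-bit datapath halves the number of word operations needed to sweep over a packet buffer compared with a 32-bit one. A toy sketch of that principle (the XOR-fold and the function name are mine, for illustration only, not actual networking code):

```python
def xor_fold(payload: bytes, word_bits: int) -> tuple[int, int]:
    """XOR all the words of `payload` together, returning the fold
    and the number of word operations it took."""
    word_bytes = word_bits // 8
    # Zero-pad so the buffer divides evenly into whole words.
    padded = payload + b"\x00" * (-len(payload) % word_bytes)
    fold, ops = 0, 0
    for i in range(0, len(padded), word_bytes):
        fold ^= int.from_bytes(padded[i:i + word_bytes], "little")
        ops += 1
    return fold, ops

packet = bytes(range(64))           # a 64-byte toy "packet"
_, ops32 = xor_fold(packet, 32)     # 16 word operations
_, ops64 = xor_fold(packet, 64)     # 8 word operations: half the loop trips
```

For code that sweeps over every byte of every packet, such as checksumming or header inspection, halving the trip count of the inner loop is a real win.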
In servers, 64-bit computing arrived very early (Wikipedia has a timeline that looks about right to me). Large words have advantages in handling large data, and the problems of being limited to 4 GB of addressing were clear already in the late 1980s. All the early solutions in the server space were launched by systems houses that built their own processors and their own operating systems, making the transition fairly quick (even with support for old 32-bit applications running on the 64-bit machines). Software makers in the server space also realized the advantages of 64-bit computing and followed suit fairly quickly.
The iPhone 64-bit transition is not like the PC case; it is much more like the server case. It is a classic example of the kind of innovation made possible by a true systems house that builds the processor, hardware, OS, and core applications all in a single organization. In comparison, in the Android world the OS, the hardware, and the phones that put them together come from different companies, so 64-bit support will most likely arrive piecemeal over a longer time.
From a software perspective, however, the transition feels like the 64-bit x86 case. There is a huge body of existing applications that will probably never update to 64-bit compiles, since they work well enough as they are. However, there is also a real gain in performance to be had from compiling to 64 bits, just as is the case for 64-bit x86 (I wrote about this a while ago), and there are surely phone applications that will quickly update to take advantage of this. Memory sizes in phones are already at 2 GB, and despite the power cost of large memories it is not inconceivable to see memories north of 4 GB in the next few years. This is especially true in tablets, where power is less of an issue and more memory comes to even better use for bigger and more sophisticated applications.
ExtremeTech was quite critical in its evaluation, but I think iPhone users might see more of a performance increase than it expects. Assuming that most code running on an iPhone is in the OS or in Apple-supplied apps, users might well see a quite noticeable increase in performance. Games and similar applications from third parties will not benefit immediately, but I think many users primarily spend their processor cycles on the built-in software, which will be updated. In any case, the OS tends to consume a very significant share of the execution time of modern applications, so unless the cost of switching between 32-bit and 64-bit modes is too high, any application using a standard API could benefit from the OS code being 64-bit. On the other hand, Apple has in the past been really slow in updating OS code to a new architecture: when they moved from the 68000 to PowerPC and then to Intel, they used emulation for quite a few years to keep some OS parts running without having to redo or recompile them.
I think that in moving to 64 bits Apple is preparing for the future and laying the groundwork for a future advantage – just like Linux gained a lot of traction in the years when it was the only reasonable way to run 64 bits on a PC, until Windows finally got its act together with Windows Vista. Currently, it would seem that the first non-Apple 64-bit phone is quite some time away: at least a year, until an ARM Cortex-A50-series core gets into a chip or Qualcomm builds its own ARMv8 processor, and Android goes 64-bit. Intel just released the Z3000-series Atoms, which do have 64-bit processor cores, but those also need a 64-bit Android or a tablet-friendly Windows that is natively 64-bit (no idea why it is not, but apparently it isn’t).
Some notes on the phone
Yes, there were some other new things in the iPhone 5S, such as a fingerprint reader and a new sensor chip, but overall it feels like a surprisingly uninspired launch. I do not care much for Apple’s products nowadays, but it is strange to see them stick to a low-resolution screen when the Android world is chasing 1080p or higher resolutions. It is beginning to feel like the early Macintosh days, when Apple’s overall systems had a solidness and system-level functionality that PCs just could not match but, detail for detail, were more expensive and lower-specced. I used to use Macs in the early 1990s, and loved them, despite the fact that I could get twice the CPU speed at a lower price in a PC clone. I am kind of surprised that Apple has not remembered that lesson: at some point these things start to matter enough to seriously erode market share and customer goodwill.
Maybe Apple is thinking for their customers and in their customers’ best interest, and has decided that 1080p on a 5-inch phone is silly (there is something to be said for that, considering the power cost of a high-resolution screen). If you want a movie machine, buy a gigantic 10-inch iPad instead (which, it could be argued, is the most competitive piece of hardware in Apple’s lineup). In any case, it leaves them quite exposed to competition until the next update. A good screen matters just as much as CPU speed and memory size (where Apple is doing a decent job pushing forward).
But it does start to feel an awful lot like the old Mac vs. PC battle again. Apple could at least try to match the breadth of offerings of the Android crowd. In particular, why is there no water- and weather-proof iPhone when that is quickly becoming standard in Android land and would be a perfect choice for parents equipping youngsters?
A7 is a Bad Name
Apple can obviously call their products anything they like, but A7 is just going to drown in the sea of talk about the ARM Cortex-A7 core that is being used in tons of SoCs right now. The previous A6 collides with a certain brand of car, and A5 also collided with an ARM Cortex-series core. But since Apple is not selling it stand-alone, it hardly matters.
I was Wrong
I was totally wrong about where 64-bit ARM would first appear when I wrote a piece on it back in 2010. Back then, I believed it would appear in a Microsoft chip powering a Windows tablet based on a now-dead branch of the Windows OS family. Instead, we got an Apple chip powering an iOS phone.