iOS 7

64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months

 

Apple’s A7 processor, the new iOS 7 and “house” apps are all industry firsts: genuine, shipping 64-bit mobile hardware and software. As we’ve seen before with the iPhone or the iPad, this new volley of Apple products is first met with the customary bursts of premature evaluations and counterfactual dismissals.

On September 10th, Apple revealed that the new iPhone 5S would be powered by its new 64-bit A7 processor. The initial reactions were less than enthusiastic. We were treated to exhumations of lazy bromides…

“I don’t drink Kool-Aid. Never liked the stuff and I think we owe it to ourselves to collectively question whether or not Apple’s ‘reality distortion field’ is in effect when we consider how revolutionary the iPhone 5S is and if Apple’s 64-bit A7 processor under its shiny casing will be all its [sic] cracked up to be when the device hits the market in volume.” [Forbes]

…and equally lazy “markitecture” accusations…

“With current mobile devices and mobile apps, there really is no advantage [to 64 bits] other than marketing — the ability to say you’re the first to have it…” [InfoWorld]

…and breezy brush-offs, such as this tweet from an industry expert:

“We’ll see just how good Apple’s marketing team is trying to leverage 64-bit. 64-bit add more memory and maybe registers. Period.” [Twitter]

Rather than wonder what these commenters were drinking, let’s turn to AnandTech, widely regarded as one of the best online hardware magazines.

Founded by Anand Lal Shimpi when he was all of 14 years old, AnandTech is known for its exhaustive (and sometimes exhausting) product reviews. The 14-section September 17th iPhone 5S review doesn’t disappoint. Among other things, it provides detailed iPhone 5S vs. iPhone 5 performance comparisons such as this:

[Chart: iPhone 5S vs. iPhone 5 Geekbench comparison, via AnandTech]

There are many other charts, comparisons, and considerations of the new 64-bit ARMv8 instruction set, the doubling of the 128-bit NEON floating-point registers from 16 to 32, the hardware acceleration of cryptography operations… It’s a very long read, but not a boring one (at least not for interested geeks).
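For the technically inclined, that last item is easy to make concrete. Below is a minimal sketch — mine, not AnandTech’s or Apple’s — of what the ARMv8 cryptography acceleration looks like from a programmer’s seat, assuming a compiler that supports ARM’s ACLE intrinsics (arm_neon.h) and a 64-bit ARMv8 target:

    /* One full AES encryption round on a 16-byte block.
       vaeseq_u8 maps to the AESE instruction (AddRoundKey + SubBytes +
       ShiftRows); vaesmcq_u8 maps to AESMC (MixColumns). */
    #include <arm_neon.h>

    #if defined(__ARM_FEATURE_CRYPTO)
    uint8x16_t aes_encrypt_round(uint8x16_t block, uint8x16_t round_key)
    {
        return vaesmcq_u8(vaeseq_u8(block, round_key));
    }
    #endif

A pure-software AES round costs dozens of instructions and table lookups; here it collapses into two.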

The bottom line is plain: The A7 processor is a substantial improvement that’s well supported by the 64-bit iOS 7. (And I’d like to meet the author and bow to his encyclopedic knowledge.)

Was it AnandTech’s cool analysis that led the doubters to change their tune?

“As I predicted, Apple A7 benchmarks well due to CPU arch (for IPC), new GPU, ARMv8” [Twitter]

Now that the A7 had become a Benchmarking Beast, the author of the previous week’s brush-off tweet (“more memory and maybe registers. Period”) has revised his position [emphasis mine]:

“The improvements Apple made with the A7 are truly incredible, and they really went against the grain in their choices. With an industry obsessed with more cores, they went with fewer, larger and efficient cores. With people expecting v8 and 64-bit ARM in late 2014, Apple brings it out in 2013 with full Xcode support and many performance optimizations.” [...] “Apple has done it again, but this time in unexpected fashion.”

That all-purpose defense, “unexpected”, provides a key to the wrong-footing of many “experts”.

When Apple entered the microprocessor field a mere five years ago with its acquisition of P.A. Semi (Palo Alto Semiconductor), the move was panned: Apple had no future competing with established industry leaders such as Intel, Qualcomm, Nvidia, and Samsung.

But with the successive, increasing refinement of the A4, A5, and A6, the designs were ultimately viewed as good, very good, roughly on par with the rest of the industry. What these processors lacked in raw power was more than made up for by the way they were integrated into Apple’s notion of a purposeful, usable mobile device: enhanced UI responsiveness, reduced power consumption, obeisance to the unique requirements of media and communications.

The expectation was that Apple would either fail, or produce a “competent” (meaning not particularly interesting) iteration of the previous A4-5-6 designs. No one expected a 64-bit processor that would actually work, with all of Apple’s in-house apps running in 64-bit mode from day one.

But let’s back up and rewrite a bit of history, ourselves:

On September 10th, Samsung announced its flagship 64-bit Exynos processor, supported by Android 5.0, the 64-bit version of Google’s market-leading mobile OS. The new Galaxy S64 smartphone, which will ship on September 20th, features both 64-bit hardware and software components. Samsung and Google receive high praise:

“Supercomputer-class processor… Industry-leading performance… Tightly integrated 64-bit software and hardware open a new era of super high-performance applications previously impossible on mobile devices…”

And Apple gets its just deserts:

“Once again, Apple gets out-innovated…This confirms the trend we’ve seen since Tim Cook took over… iPhones have become second-class devices… The beginning of a long decline…”

Apple can be thankful this is fantasy: The real world would never treat it like this (right?).

My fantasy isn’t without basis: Within 24 hours of Apple’s September announcement, Samsung’s mobile business chief Shin Jong-kyun said his company would have its own 64-bit Exynos processor:

“Not in the shortest time. But yes, our next smartphones will have 64-bit processing functionality…” [The Korea Times]

As for Android support, no problem: 64-bit versions of the underlying Linux kernel already exist. Of course, the system software layer that resides on top of the Linux kernel — the layer that is Android — will also need to be converted to take advantage of the 64-bit processor, as will the Software Development Kit (SDK) that third-party developers use to create apps. It’s a sizable challenge, but one that’s well within the Android team’s skills and resources; the process has certainly been under way for a while already.
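The flavor of that conversion work is easy to illustrate. A hypothetical sketch, in plain C, of the classic 32-to-64-bit porting hazard: on 32-bit ARM (the ILP32 data model) int, long, and pointers are all 4 bytes, while on 64-bit (LP64) long and pointers grow to 8 bytes and int stays at 4.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Prints 4/4/4 on an ILP32 build, 4/8/8 on an LP64 build. */
        printf("int: %zu, long: %zu, pointer: %zu\n",
               sizeof(int), sizeof(long), sizeof(void *));

        /* The classic bug: code that stashes a pointer in an int
           compiles on both, but silently truncates the pointer on LP64.
           intptr_t is the portable fix. */
        void *p = &p;
        intptr_t safe = (intptr_t)p;  /* correct on both data models */
        printf("%ld\n", (long)safe);
        return 0;
    }

Multiply that hunt across millions of lines of system code and you get a sense of the work awaiting the Android team.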

The real trouble starts outside of Google. Which 64-bit processor? Intel’s (the company says it will add 64-bit “capabilities” to Android)? Samsung’s? Qualcomm’s?

Who writes and supports device drivers for custom SoC modules? This sounds a lot like Windows device driver complications, but the complexity is multiplied by Google’s significantly weaker control over hardware variants.

Apple’s inherent control over all of the components in its platform will pay dividends in the speed and quality of the transition. There will be glitches — there will always be new, factory-fresh bugs — but the new 64-bit hardware is designed to run existing 32-bit apps, and it seems to actually do so in practice.
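Mechanically, that backward compatibility is unglamorous. iOS apps ship as “fat” binaries: Xcode compiles each source file once per architecture and glues the results together. A sketch, using the __LP64__ macro that Apple’s toolchain defines for 64-bit builds:

    /* The same source is compiled twice, once for armv7 and once for
       arm64, and both slices land in one binary. At launch, iOS picks
       the slice that matches the hardware; a 32-bit-only app simply
       runs in the A7's 32-bit compatibility mode. */
    #include <stdio.h>

    void report_slice(void)
    {
    #if defined(__LP64__)
        printf("arm64 slice: 8-byte pointers\n");
    #else
        printf("armv7 slice: 4-byte pointers\n");
    #endif
    }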

Now let’s go beyond the iPhone 5S. In his September 10th presentation, Phil Schiller, Apple’s Marketing Supremo, called the A7’s performance “desktop class”. These words were carefully calibrated, rehearsed, and approved. This isn’t a “Can’t innovate anymore, my ass!” aside, blurted while seized by religious fervor at last spring’s Apple Developers Conference.

Does “desktop class” imply that Apple could use future versions of its 64-bit processor to replace Intel chips in its Mac devices?

In the AnandTech post quoted above, several benchmarks compare Apple’s A7 to a new x86 chip, Intel’s Bay Trail, with interesting results:

[Chart: Apple A7 vs. Intel Bay Trail benchmarks, via AnandTech]

So, yes, in theory, a future Apple 64-bit processor could be fast enough to power a Mac.

But let’s consider a 3GHz iMac running a high-end media creation application such as Adobe’s Photoshop or one of Autodesk’s packages. That processor doesn’t want to be constrained by power consumption requirements; it’s optimized for performance (and this ignores the upcoming Mac Pro and its thermal management prowess).

Could we see a split in the Mac product line? The lower, more mobile end would use Apple’s processors, while the high-end, no-holds-barred, always-plugged-into-the-wall desktop devices would still use x86 chips. With two code bases to maintain and OS X applications to port? Probably not.

Apple could continue to cannibalize its (and others’) PC business by producing “desktop-class” tablets. Such speculation throws us back to a well-known problem: How do you compose a complex document without a windowing system and a mouse or trackpad pointer?

We’ve seen the trouble with Microsoft’s hybrid PC/tablet and its dual Windows 8 UI, which is considered to be “confusing and difficult to learn (especially when used with a keyboard and mouse instead of a touchscreen).”

The best suggestion I’ve seen so far comes from “a veteran design and management surgeon” who calls himself Kontra and proposes “An interim solution for iOS ‘multitasking’” based on a multi-slot clipboard.

If Apple provides a real way to compose complex documents on a future iPad, a solution that normal humans will embrace, then it will capture desktop-class uses and users.

Until such time, Macs and iPads are likely to keep using different processors and different interaction models.

JLG@mondaynote.com

 

Apple Market Share: Facts and Psychology

 

Remember netbooks? When Apple was too greedy and stupid to make a truly low-cost Macintosh? Here we go again: Apple refuses to make a genuinely affordable iPhone. There will be consequences — similar to what happened when the Mac refused to join netbooks circling the drain.

My first moments with the iPad back in April 2010 were mistaken attempts to use it as a Mac. Last year, it took a long overdue upgrade to my eyeglasses before I warmed to the nimbler iPad mini, never to go back to its older sibling.

With that in mind, I will withhold judgment on the new iPhone until I have a chance to play customer, buy the product (my better half seems to like the 5C while I pine for a 5S), and use it for about two weeks — the time required to go beyond my first and often wrong impressions.

While I wait to put my mitts on the new device, I’ll address the conventional hand-wringing over the 5C’s $549 price tag (“It’s Too Damned High!” cry the masses).

[Image: the iPhone 5C]

Henry Blodget, who pronounced the iPhone Dead In The Water in April 2011, is back sounding the alarm: Apple Is Being Shortsighted — And This Could Clobber The Company. His argument, which is echoed by a number of pundits and analysts, boils down to a deceptively simple equation:

Network Effect + Commoditization = Failure

The Network Effect posits that the power of a platform grows much faster than its number of users (Metcalfe’s law puts a network’s value at roughly the square of its size). Android, with 80% of the smartphone market, will (clearly) crush iOS by sucking all resources into its gravitational well.

Commoditization means that given an army of active, resourceful, thriving competitors, all smartphones will ultimately look and feel the same. Apple will quickly lose any qualitative advantage it now enjoys, and by having to compete on price it could easily fall behind.

Hence the preordained failure.

As proof of concept, the naysayers point to the personal computer battle back in the pre-mobile dark ages: Didn’t we see the same thing when the PC crushed the Mac? Microsoft owned the personal computer market; PC commoditization drove prices into the bargain basement…

Interpret history how you will, the facts show something different. Yes, the Redmond Death Star claimed 90% of the PC market, but it failed to capture all the resources in the ecosystem. There was more than enough room for the Mac to survive despite its small market share.

And, certainly, commoditization has been a great equalizer and price suppressant — within the PC clone market. Microsoft kept most of the money with the de facto monopoly enjoyed by its Windows + Office combo, while it let hardware manufacturers race to the bottom (netbooks come to mind). Last quarter, this left HP, the (still) largest PC maker, with a measly 3% operating margin for its Personal Systems Group. By contrast, Apple’s share of the PC market may be only 10% or less, but the Mac owns 90% of the $1000+ segment in the US and enjoys a 25% to 35% margin.

After surviving a difficult birth, a ruthlessly enforced Windows + Office platform, and competition from PC makers large and small, the Mac has ended up with a viable, profitable business. Why not look at iDevices in the same light and see a small but profitable market share in its future?

Or, better yet, why not look at more than one historical model for comparison? For example, how is it that BMW has remained so popular and profitable with its One Sausage, Three Lengths product line strategy? Aren’t all cars made of steel, aluminium (for Sir Jony), plastic, glass, and rubber? When the Bavarian company remade the Mini, were they simply in a race to the bottom with Tata’s Nano, or were they confidently addressing the logical and emotional needs of a more affluent — and lasting — clientèle?

Back to the colorful but “expensive” 5C, Philip Elmer-DeWitt puts its price into perspective: For most iPhone owners, trading up to the 5C is “free” due to Apple’s Reuse and Recycle program. We’ll have to see if The Mere Matter of Implementation supports the theory, and where these recycled iPhones end up. If the numbers work, these reborn iPhones could help Apple gain a modest foothold in currently underserved price segments.

Still thinking about prices, I just took a look at the T-Mobile site where, surprise, the 5C is “free”: no money down and 24 months at $22 — plus a $10 “SIM Kit” (read the small print). You can guess what AT&T offers: 24 months at $22/month (again, whip out your reading glasses). Verizon is more opaque, with a terrible website. Sprint also offers a no-money-down iPhone 5C, although with more expensive voice/data plans.
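The napkin math behind that “free” is straightforward: 24 months at $22 comes to $528; add the $10 SIM kit and you land at $538, within a few dollars of the 5C’s $549 unsubsidized price tag. “Free” simply means paid in monthly installments.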

This is an interesting development: Less than a week ago, Apple introduced the iPhone 5C with a “posted price” of $99 — “free” a few days later.

After much complaining to the media about “excessive” iPhone subsidies, carriers now appear to agree with Horace Dediu, who sees the iPhone as a great “salesman” for carriers because it generates higher average revenue per user (ARPU). As a result, the cell philanthropists offer lower prices to attract and keep users — and pay Apple more for the iPhone sales engine.

Of course, none of this will dispel the anticipation of the Cupertino company’s death. We could simply dismiss the Apple doomsayers as our industry’s nattering nabobs of negativism, but let’s take a closer look at what lurks under the surface. Put another way: What are the emotions that cause people to reason against established facts, to feel that the small market share that allowed the Mac to prosper at the higher end will inevitably spell failure for iDevices?

I had a distinct recollection that Asymco’s Horace Dediu had offered a sharp insight into the Apple-is-doomed mantra. Three searches later, first into my Evernote catchall, then to Google, then to The Guardian, I found a Juliette Garside article where Horace crisply states the problem [the passage quoted here is from a longer version that's no longer publicly available; emphasis and elision mine]:

“[There's a] perception that Apple is not going to survive as a going concern. At this point of time, as at all other points of time in the past, no activity by Apple has been seen as sufficient for its survival. Apple has always been priced as a company that is in a perpetual state of free-fall. It’s a consequence of being dependent on breakthrough products for its survival. No matter how many breakthroughs it makes, the assumption is (and has always been) that there will never be another. When Apple was the Apple II company, its end was imminent because the Apple II had an easily foreseen demise. When Apple was a Mac company its end was imminent because the Mac was predictably going to decline. Repeat for iPod, iPhone and iPad. It’s a wonder that the company is worth anything at all.”

This feels right, a legitimate analysis of the analysts’ fearmongering: Some folks can’t get past the “fact” that Apple needs hit products to survive because — unlike Amazon, as an example — it doesn’t own a lasting franchise.

In the meantime, we can expect to see more hoses attached to Apple’s money pump.

Next week, I plan to look at iOS and 64-bit processing.

JLG@mondaynote.com