Hardware

TV Done Right: Still A Dream

 

As the strong reactions to even the slightest Apple TV rumor demonstrate, there’s a vigorous appetite for a simple, modern Internet TV experience. The technology is ready but carriers aren’t.

Last week started with Big Apple TV News in an authoritative-sounding Wall Street Journal article:

“Apple Inc. is in talks with Comcast Corp. about teaming up for a streaming-television service that would use an Apple set-top box and get special treatment on Comcast’s cables to ensure it bypasses congestion on the Web, people familiar with the matter say.”

Search for “Comcast” in a news aggregator such as Feedly (there are many other good choices), and you’ll see a wide range of reactions to the Apple-Comcast rumor. Given the heat the article generated, it’s odd that there has been zero follow-up from the main players — nothing from Apple and Comcast, no additional information in the WSJ or any other journal. When a deal of such importance is in the works, “people familiar with the matter” have a strong incentive to keep talking, to add color, to spin their side of the story. Of course, no one expects Apple to do much leaking, but the radio silence from Comcast spinmeisters is another matter entirely.

Philip Elmer-DeWitt offers the most likely explanation: The Wall Street Journal got played by someone intent on throwing a wrench into Comcast’s plan to acquire Time Warner’s cable operations. (This wouldn’t be the first time: Cellphone carriers have repeatedly used the WSJ to air their perennial Poor Me complaints about excessive smartphone subsidies.)

Echoes of the WSJ non-story ricocheted around the blogosphere. Some, such as this BBC article, make painful points about the abuse that US consumers undergo at the hands of broadband carriers:

[Chart: Broadband Cost]

As a sharp-witted Be engineer liked to remark: “It costs more… But it does less.”

Carriers take too much money for a user-hostile experience simply because they can. In most locations, cable companies have little or no competition, so there’s no reason for them to do anything more than milk the most profit from a cheap infrastructure. As Apple Insider’s Neil Hughes reminds us, the user experience isn’t a priority for cable providers. Indeed, as I write this from Paris, I have to juggle set-top box restarts and malfunctioning secondary content subscriptions that the main provider allows only reluctantly.

It doesn’t have to be that way. No miracle is required to make our Cable TV experience easy and gratifying.

Consider today’s cable arrangement, simplified for our discussion. A coax cable is strung from the street into your basement or crawl space. You plug the coax into a signal splitter and connect one output to your cable modem for Internet access, while the others feed the TVs in your household.

Next, you run an Ethernet cable from your modem to your WiFi access point and maybe you also run a wire from the access point to your “most trusted” computer. Upstairs, we see a set-top box, an Internet TV streaming device (Roku, Apple TV, Boxee, or other), and, if your TV is of a certain age, a digital adaptor.

That’s four or five devices that you have to connect and, when things go wrong, disconnect, power down, and restart in the “proper” order.

It’s only too easy to imagine how a next-generation Apple TV could collapse this maze of impenetrable interfaces into one box: Coax in, WiFi and HDMI out and, miracle, one and only one remote! This is something that Apple seems to have the taste and resources to do well.

There are no technical obstacles, no new technology is required, no new software platform, just a careful integration job. I realize I’m veering dangerously close to the “mere matter of implementation” deception, but regardless of the amount of work it would take to integrate the various technologies, the benefit to the user would make the engineering effort worth it.

And there are many benefits: We can throw away our DVRs as content becomes an app that we can stream whenever we want — the 60 Minutes iPad app is an elegant, flexible exemplar of the type. Rather than paying for a “package” of channels selected by the cable provider, we’ll be able to buy à la carte shows, series, and channels through iTunes or a similar content vendor. We’ll be able to watch the free-with-ads version of a show, or we can pay for the ad-free edition.

Some day, the status quo will break, perhaps as the result of a patient encirclement and infrastructure buildup — a better, vertically integrated Content Delivery Network, both very much compatible with Apple’s playbook. As the reactions to the (possibly planted) Apple-Comcast rumor amply demonstrate, users are becoming increasingly aware of the disconnect between the experience that the cable companies offer and TV Done Right.

JLG@mondaynote.com

Wearables Fever

 

While Google, Motorola, and Samsung seem eager to jump into the wearables market, Apple characteristically keeps its counsel – and wisely so: Smartwatches and other wearables produce more pageviews than profits.

Wearables are a danger to your health – your mental health, that is. Smartwatches and sensor-laden bracelets aren’t so new anymore — see Microsoft’s 2004 SPOT Watch — but the vernal equinox seems to have triggered a bout of Wearables Fever the likes of which we haven’t seen since the Tablet Fever of January, 2011, when 76 tablets were announced at the Consumer Electronics Show in Las Vegas. As so often happens with pandemics, there was a smaller outbreak, called the Dawn of the Tablet PC, days before the January 2010 iPad launch.

In this year’s derangement, we are witnessing the birth of another epoch-making class of product — the Wearable. As Wired sees it, for example, Jawbone Is Now the Startup Apple Should Fear Most.

In one respect, Jawbone’s devices are a lot like Apple’s. The company admires minimalism…[b]ut Apple’s minimalism is cold — all brushed metal and glass — while Jawbone’s is warm, squishy, and textured… There’s a chance Apple has designed itself into a corner. But for Jawbone, the future is full of possibility.

Then there’s this analysis, quoted and mocked by John Gruber [emphasis mine]:

Cadie Thompson, writing for CNBC, “Time Is Ticking for Apple to Announce an iWatch, Say Analysts”. Apple needs an iWatch sooner rather than later, or the company will risk losing its innovative edge to rivals, analysts say.

“They only have 60 days left to either come up with something or they will disappear,” said Trip Chowdhry, managing director at Global Equities Research. “It will take years for Apple’s $130 billion in cash to vanish, but it will become an irrelevant company… it will become a zombie, if they don’t come up with an iWatch.”

I’m guessing the ellipsis denotes when he paused for another line of coke.

Parenthetically, it would be wrong to imply that Mr. Chowdhry might be “incentivized” to shout from the rooftops by rewards more satisfying than pageviews — no allegations of stock manipulation complicity here — but I wonder about the games that he and other anal-ists play. As Philip Elmer-DeWitt pointedly noted in a CNN Money column last year, Mr. Chowdhry urged his clients to unload Apple stock for eight months and then blamed the CEO and CFO “for destroying Apple’s shareholder value”.

If you’re curious enough to look at Mr. Chowdhry’s spartan Global Equities Research site, you’ll see he claims to have Commission Sharing Agreements with Goldman Sachs, Merrill Lynch, Barclays, Jefferies, Morgan Stanley and JP Morgan. As the Wikipedia article points out, such agreements “ask that broker to allocate a portion of the commission directly to an independent research provider.” Here, one wonders what the word independent really means…

Back to Wearables: The announcements pile on.

Samsung tells us they’re moving their smartwatches away from Android to a version of Tizen, itself based on a version of the ubiquitous Linux.

Google announces Android Wear, a version of Android for smartwatches.

Motorola, soon to be a Lenovo brand, announces that its Moto 360 smartwatch is “Coming Summer 2014 in a selection of styles” and provides these artful renderings:

[Images: Moto 360 renderings]

(I write renderings because, as the Android Wear intro video indicates, these are simulated pictures. This doesn’t mean that the final product won’t be better looking – but we’re clearly not there yet.)

Why the haste? Did Tim Cook succeed in misdirecting Apple’s competition when he pronounced wearables a “very key branch of the tree”? Or is there a giant business to be had?

We have many unanswered questions.

First, paraphrasing Horace Dediu, there are the twin questions of For What and By Whom: For what job is a smartwatch “hired”, and by whom? If we look at phones as a model, we see two “employers”: Carriers hire smartphones to increase their ARPU; normal consumers use them as small, ubiquitous, always-connected personal computers.

Will this model work for smartwatches? We can almost certainly eliminate carriers from the equation: Subsidies are out of the question because a watch is unlikely to generate carrier revenue.

For us users, a smartwatch collects sensor data, connects to our smartphone, displays alerts, responds to touch and voice commands… and even tells us the time. These are all worthwhile functions that make for neat promo videos, but to keep users interested after the novelty wears off, smartwatches will have to do more than log the miles we’ve run, give us weather updates, and show us the name of the person who’s ringing the smartphone in our pocket. Put another way: We’re willing to pay a premium for our smartphones (whether directly or by contract) because of the huge range of features they provide, the enormous number of apps in the app stores. Will we be as durably aroused – and willing to part with substantial amounts of money – by (yet another) pulse rate app?

Another batch of questions: Since we no longer need a dedicated timepiece to tell us the time — our smartphone does that — Who wears a (dumb) watch these days, How, When, and Why?

Simplifying a bit, younger people don’t wear watches at all and older generations use them as jewelry — and gender-specific jewelry, at that. Furthermore, how many veteran watch-wearers wear the same watch all the time? Many of us own more than one watch, and select the appropriate timepiece (or two — or none at all) for the occasion. These aren’t trivial issues, they’re uncharted territory for mobile device makers and marketers.

Next question: How will smartwatch makers handle the delicate equilibrium between computing power and battery power? As smartwatches evolve and offer more features, a better display, and a more responsive user interface, they’ll need more computing power — and more computing power means a quicker battery drain. Will we put up with watches that run out of power at the end of the day? Will designers retard functionality in order to extend battery life to 24 hours and beyond… or make a smartwatch so big it’ll look like a miniature phone?

The power equilibrium question is why Samsung moved to a dedicated (and pared down) version of Tizen, and why Google did the same for Android Wear. All without giving much information about battery life.

Finally: Is there a business there? Here in the Valley, Pebble CEO Eric Migicovsky claims to have sold 400,000 watches since January, 2013. At around $150 each, that’s $60M in revenue — a real tribute to Eric’s long-standing belief in wearables (he’s been working at it for six years).

But even if you multiplied this number by 10, it would barely nudge the needle for large companies such as Samsung, Motorola/Lenovo, or Apple, which means these devices will be confined to the role of smartphone companion. They’ll help make money by enhancing the main product; they’re not going to be a $10B business in themselves.

As Charles Arthur writes in The Guardian, there are fewer than half a million smartwatches in use in the UK: “Wearable computing faces an uphill battle breaking through to the mainstream…”. Similarly, the Register doesn’t see any good, large-scale answers to the question. It calls Google wearables “A solution looking for a rich nerd”.

These challenges might explain why Apple doesn’t seem to have caught this Spring’s Wearables Fever. Smartwatches are destined to be ecosystem extensions, not The Next Big Thing.

JLG@mondaynote.com

One last thought before we close: Not all Ecosystem Extensions are equal. The no-longer-a-hobby Apple TV now brings substantial revenue and growth:

“Sales of the Apple TV are estimated to have grown by 80 percent in 2013, reaching around 10 million units for the calendar year, or some $1 billion worth of set-top boxes sold to end users.”

Horace Dediu puts a “Fortune 130” label on iTunes. By itself, with yearly gross revenue of $23.5B and growing 34%, iTunes is large enough to rank #130 in the Fortune list of the 500 largest US companies:

“On a yearly basis iTunes/Software/Services is nearly half of Google’s core business and growing slightly faster.”

While music sales are on the wane, apps and video (mostly Apple TV) show healthy growth. Compared to an Apple TV, how much would an iWatch add to the iTunes business? Apps? Content?

Apple seems wise to stay out of the game until it can make something more lasting than a novelty.

CarPlay Thoughts

 

Who wouldn’t want an iPhone- or Android-like experience in their car instead of today’s misbegotten navigation and entertainment systems? CarPlay’s answer looks nice – until one looks at the details.

Apple’s CarPlay has an air of inevitability. Previously dubbed “iOS in the Car”, CarPlay brings the iPhone’s aesthetics, ease of use, consistency, and universe of apps to the ugly and dumbfounding world of car navigation and entertainment systems.

Seven years after the iPhone launched the Smartphone 2.0 wave, Apple kickstarts another mobile revolution…

It’s an enticing, simple vision. Instead of today’s disjointed systems — which often cost in the $1,000 range, plus $249 for a DVD of updated maps — you get a screen the size of a small tablet running iOS apps with voice and touch control (on-screen and armrest), off-air map updates, open-ended flexibility… We have arrived.

I’ve struggled with dashboard electronics from German, Japanese, and French car makers (no electronics on the old family Chevrolets), and I’ve seen what happened to Ford when it tried to use Microsoft software for its Sync system. Replacing these hairballs with an iOS system only makes sense.

But sense and reality are still living apart.

[Image: CarPlay]

To start, the “iOS in the Car” phrase is misleading. The iOS device “in your car” is the iPhone or iPad that you’ve brought with you — Apple isn’t about to license iOS to automakers (which may be part of the reason why Apple changed the name to “CarPlay”).

And Apple isn’t going to try to take the place of suppliers such as Delphi, VDO, and Aisin by making subsystems for carmakers — it’s not in Apple’s DNA. Not that it would matter if they tried: Automakers have made an art of pinching fractions of cents from their suppliers’ prices; they’d never tolerate Apple’s margins.

CarPlay replicates your iDevice’s screen as H.264 video spewed through an intelligent Lightning cable connected to your car’s USB port. The video format is widely accepted, so the in-car device either understands it already, or can be updated to do so.

So far, so good. As many observers have pointed out, the idea is a wired echo of Apple’s AirPlay, the technology that connects your iDevices (and other compliant products) to your television via the Apple TV black puck. Complications may arise when you consider the various in-dash screen sizes and resolutions, the actual uses of USB connections (my car’s USB connector is useless for anything other than charging my smartphone), and other mysterious incompatibilities that are beyond Apple’s control. Still, in general, screen replication demands little from the car maker. As with AirPlay and a dumb TV set, the intelligence stays inside the smartphone.

The CarPlay proposal is much more limited than the Open Automotive Alliance, a Google initiative that implants a customized version of Android into a car’s electronics. (“Audi connect” is available today; we can expect similar collaborations with Honda, GM and Hyundai.) But if the in-car system runs Android (or QNX, as is often the case today), so much the better, from the carmaker’s point of view: Let Google or one of its partners do all the work to create an Android-based all-in-one car system and let Apple hitch a ride after the work is done. Serving both Android and iOS users is a no-brainer.

It sounds good… but I can’t help but harbor uneasy feelings about this whole “scene”.

To begin with, we have a clash of cultures. To be sure, Eddy Cue, Apple’s Senior VP of Internet Software and Services, is a dealmaking expert and, as a member of the Board of Ferrari, he has serious automotive industry connections. But the spirit that drives Apple is far from that which motivates automakers.

The automotive industry expects to be in control of everything that gets into their cars. The coup that Apple pulled off with the iPhone and AT&T — taking full control of the content, no crapware, iTunes only for media — isn’t going to happen with Mercedes-Benz, or BMW, or even Hyundai. Cars aren’t phones. We’re not going to see aftermarket Toyota CarPlay kits (let alone entire cars) in Apple Stores. Apple won’t get what it always strives for: Controlled Distribution.

Then there’s the F-word: Fragmentation. In-car electronics are a mess, a new culture grafted onto an old one, Silicon Valley and Detroit in a loveless marriage. Actually, that’s unfair: Under the hood, embedded electronics do wonders to improve the reliability, safety, and economy of our cars. But where the union breaks down is in the User Experience domain. Competent combustion management engineers and the accountants watching over their shoulders have no empathy for smartphone-loving drivers.

The meanderings get more twisted when we consider a key difference between Google and Apple. Google could tell Audi that they’ll pay, in some form, for the user data collected by Audi connect — but Audi already makes a lot of money; they don’t want to open that can of worms. As they say in their privacy agreement:

“We will not share information about you or your Audi vehicle that is connected with your use of Audi connect’s in-car features with third parties for their own purposes without your consent.”

But what would a legally-troubled, profit-starved automaker such as GM say in response to Google’s offer to subsidize the in-car system?

Apple hasn’t played that game.

An all-in-one navigation/communications/entertainment system is a pleasant dream, it feels “right”. But the technical, business model, and cultural obstacles could make for a long, arduous march.

CarPlay could be a very smart way to hitch a ride on many in-car systems without having to struggle with their design and cost challenges, yet another ecosystem extension play.

JLG@mondaynote.com

 

The Apple Game: New Categories vs. Ecosystem Development

 

Putting hopes for Apple’s future on a magical New Category misunderstands the company’s real direction. Instead, Apple stays the course and continues to play its Ecosystem Game. Many interpret this as a company in death throes.

The iPhone is 7 years old. The iPad was introduced 4 years ago. Since then… nothing.
Apple’s growth is gone: a mere 9% revenue uptick in 2013; only 6% for the last quarter.

It’s time to acknowledge the painfully obvious truth: Innovation has deserted Apple. Something must have gone seriously wrong if the company can no longer break into new categories.

This is the refrain that echoes through the Web… and this is the toxic waste of success.

It started back in the early 80’s when all we wanted was a better Apple ][ (or a working Apple ///)… and we were stunned by the Macintosh. We were unable to imagine a mouse + a bit-mapped display + moveable windows + pull-down menus, all the things that came to define the next iteration of personal computers. We couldn’t imagine it because we didn’t have the right words to think with.

The Mac created a template for what we’ve come to expect from Apple: A breakthrough. Now we’ve had two epochal hits in rapid succession — the iPhone and iPad — and, once again, we’re hooked. We need another fix, another burst of excitement, something New and Different. We’re no longer satisfied with the boring New and Improved.

The desire for Something New is understandable, but breakthroughs don’t happen on demand… and a “breakthrough” isn’t always necessary.

Consider the PC. In rough numbers, the PC is 40 years old. In that time, it has matured into what we now enjoy through gradual improvements to hardware and software: Better graphics, sound, displays, user interface, network connectivity.

Why don’t we take as much pleasure in the continuous flow of improvements to our smartphones and tablets?

When we regard the new categories that Apple has “failed” to create (or join) we face a number of problems.

We’ll make short work of the fantasized Apple TV set. We all know what we want: à la carte content served to us as apps through an Apple-grade device. We’ve had enough of the Soviet-era set-top boxes and tiered pricing foisted on us by cable companies.

Even if Apple wanted to head in that direction — and step into the quagmire of exploitative relationships between content creators, distributors, and de facto carrier monopolies — the impact on the company’s top and bottom lines, revenue and profit, would be moderate. Ten million set-top boxes at (say) $299 makes “only” $3B — that’s less than 2% of last year’s revenue.

(A short digression on the curse of large numbers: At Apple’s size, which is approaching $200B in yearly revenue, a “breakthrough” product would need to generate at least 5% — $10B — in order to move the needle. That’s approximately one Facebook.)
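To make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python; the ~$171B fiscal 2013 revenue figure is my own approximation, the other numbers come from the paragraphs above.

```python
# Back-of-the-envelope check of the "curse of large numbers" argument.
# Assumption: Apple's fiscal 2013 revenue was roughly $171B (not stated above).

apple_fy2013_revenue = 171e9      # dollars, approximate
tv_sets_sold = 10e6               # hypothetical Apple TV set volume
tv_set_price = 299                # dollars per unit

tv_revenue = tv_sets_sold * tv_set_price
print(f"TV set revenue: ${tv_revenue / 1e9:.1f}B "
      f"({tv_revenue / apple_fy2013_revenue:.1%} of yearly revenue)")

# A new product "moves the needle" only at about 5% of a ~$200B run rate:
print(f"Needle-moving threshold: ${0.05 * 200e9 / 1e9:.0f}B per year")
```

Running it yields roughly $3B, or under 2% of yearly revenue, against a $10B needle-moving threshold.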

What about wearables? It’s a lively category and will get livelier as sensors get more sophisticated, consume less energy, and benefit from software refinements. Microsoft had one years ago; Pebble, Samsung, and many others offer “smartwatches”; Fitbit and Jawbone Up manufacture “fitness bracelets”.

My take on smartwatches and bracelets (The Next Apple TV: iWatch, More iWatch Fun and Apple’s Wearable Future) is that Apple is already in that category – with the “Always With Us” iPhone and motion sensing, and with apps such as Azumio’s Instant Heart Rate app.

We now turn to cars. The CarPlay announcement at the Geneva Motor Show follows the iOS in the Car presentation at last year’s WWDC (Apple’s Developers Conference).

We have to ask: How does the CarPlay money pump work, and in which direction? Does Mercedes Pay Apple, or the other way around? On the technical and User Interface levels, how does the tightly controlled and consistent Apple experience survive the wide range of UI cultures and cost constraints so painfully obvious in today’s cars? On its CarPlay webpage, Apple makes broad promises:

“To activate Siri voice control, just press and hold the voice control button on the steering wheel.”
“If your CarPlay-equipped vehicle has a touchscreen, you can use it to control CarPlay.”

and…

“CarPlay also works with the knobs, dials, or buttons in the car. If it controls your screen, it controls CarPlay.”

We’ll have to wait and see how “the apps you want to use in the car have been reimagined” and how automakers play one smartphone ecosystem against another.

In any case, CarPlay isn’t a product that’s meant to stand on its own. In Apple’s own words, it’s part of the iPhone ecosystem:

“CarPlay is a smarter, safer way to use your iPhone in the car. CarPlay takes the things you want to do with your iPhone while driving and puts them right on your car’s built-in display.”

A not-so-surprising pattern emerges: Rather than play the New Category game, Apple stays the Ecosystem Development course it’s been on since the advent of the truly epoch-making iTunes. Tim Cook may claim that the $99 Apple TV black puck “is no longer a hobby”, but its yearly revenue is in the $1B range, less than 1% of the company’s overall number. Apple may make a wearable someday, but will everyone want an iWatch the way they want an iPhone? And let’s not expect direct revenue upticks from CarPlay, just a more pleasant iPhone experience in “selected” vehicles.

This gets us to Apple’s deeply rooted fixation: From its April 1st, 1976 founding to this day, Apple has been in one and only one business: personal computers. Today, they come in three sizes: Macs, iPads, and iPhones. Everything else Apple creates — iTunes, the App Store, the physical Apple Stores, Apple TV pucks, CarPlay, the mythical iWatch — these are all part of the supporting cast, and they have a single mission: Prop up the volume and margins of the star products.

This isn’t to diminish the importance of the supporting cast. iTunes begat the iPod, a product once so successful it generated more revenue than the Mac in 2006. iTunes also introduced the distribution mechanism and micro-payment system that unleashed the iPhone’s full power, that made it an “app-phone”. As our friend Horace Dediu points out, iTunes by itself would rank #130 in the Fortune 500 list (yearly gross revenue of $23.5B, +34% growth in 2013).

Nonetheless, iTunes doesn’t have a separate P&L (Profit & Loss) statement in Apple’s financials because there’s only one P&L for the entire company. There are no “line of business” numbers because there’s only one business.

As the Macalope explains, ecosystems are put to very different uses by Amazon and Google on the one hand, and by Apple on the other:

“Amazon and Google sell tablets at cut rates in order to get people to use their ecosystems. It’s less crucial for them that people buy tablets; they just want people to use tablets to buy stuff and look at ads. Apple makes money off its ecosystem, too, but unlike Amazon and Google that’s not where it makes most of its money.”

Still, there is another possibility. Apple could get into an entirely new business, as opposed to one or more of the ecosystem-supporting products/services mentioned above. One such example is Amazon Web Services (AWS). Instead of opening another line of product sales, such as fresh produce or wine, Amazon got into the business of renting servers to companies large and small. It’s not Amazon’s biggest line, weighing less than 3% of total revenue, but it’s thought to be very profitable (unlike the main lines) and it grows very fast. Gartner thinks AWS is now larger than its next five competitors combined. An amazing success.

Amazon did it because it developed a superior computer services infrastructure for its core business and decided to offer similar services to any and all.

Back to Apple: Could one imagine the company using some of its hard-earned expertise to branch into a different, non-personal-computer business? For a not-too-serious example, Porsche makes cars and there is a Porsche Design business. But I don’t see Tim Cook telling Sir Jony to open a design studio business. And Porsche Design weighs nothing compared to the Porsche car maker. Such branching out sounds very remote.

So is it true that Apple has given up on developing its business? In 2013, Apple increased its R&D spending by 32% (you can download Apple’s entire 10-K report here); 2012 saw a 39% increase. Over the two fiscal years ending last September — and without entering any new category — Apple increased its R&D spending by 83%, to $4.5B, or 3% of revenue. And then it increased it by another 33% in the December 2013 quarter.
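A minimal sketch of how those annual increases compound, assuming the 83% figure is simply the product of the two yearly increases (the revenue figure is my approximation, not from the 10-K excerpt above):

```python
# Compounding the two annual R&D increases cited above.
fy2012_increase = 0.39
fy2013_increase = 0.32

two_year_increase = (1 + fy2012_increase) * (1 + fy2013_increase) - 1
print(f"Two-year R&D increase: {two_year_increase:.0%}")  # ~83%, matching the figure above

# R&D as a share of revenue (the revenue figure is an assumption)
rd_spending = 4.5e9
fy2013_revenue = 171e9
print(f"R&D share of revenue: {rd_spending / fy2013_revenue:.1%}")  # ~2.6%, i.e. roughly 3%
```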

Accelerated R&D spending can only mean one thing: Apple is widening the range of products under development. Let’s just hope company execs continue to be as good as they’ve been in the past at saying No, at not shipping everything they engineer.

JLG@mondaynote.com

 

The Apple Tesla Connection: Fun and Reason With Numbers

 

Apple acquiring Tesla would make for juicy headlines but would also be very dangerous. There are more sensible ways for the two companies to make money together.

Apple has never suffered from a lack of advice. As long as I can remember — 33 years in my case — words of wisdom have rained down upon the company, and yet the company stubbornly insists on following its own compass instead of treading a suggested path.

(Actually, that’s not entirely true. A rudderless, mid-nineties Apple yielded to the pressure of pundit opinion and licensed its Mac OS to Power Computing and Motorola… and promptly lost its profits to Mac clones. When Steve Jobs returned in 1997, he immediately canceled the licenses. This “harsh” decision was met with a howl of protest, but it stanched the bleeding and made room for the other life-saving maneuvers that saved the company.)

Now that Jobs is no longer with us and Apple’s growth has slowed, the advice rain is more intense than ever… and the pageviews netwalkers are begging for traffic. Suggestions range from the deranged (Tim Cook needs to buy a blazer), to – having forgotten what happened to netbooks – joining the race to the bottom (Apple needs a low-cost iPhone!), to the catchall “New Categories” such as Wearables, TV, Payment Systems, and on to free-floating hysteria: “For Apple’s sake, DO SOMETHING, ANYTHING!”

The visionary sheep point to the titanic deals that other tech giants can’t seem to resist: Google buys Nest for $3.3B; Facebook acquires WhatsApp for $16B (or $19B, depending on what and how you count). Why doesn’t Apple use its $160B of cash and make a big acquisition that will solidify its position and rekindle growth?

Lately, we’ve been hearing suggestions that Apple ought to buy Tesla. The company is eminently affordable: Even after TSLA’s recent run-up to $30B, the company is well within Apple’s means. (That Wall Street seems to be telling us that Tesla is worth about half of GM and Ford is another story entirely.)

Indeed, Adrian Perica, Apple’s head of acquisitions, met Tesla’s CEO Elon Musk last Spring. Musk, who later confirmed the meeting, called a deal “very unlikely”, but fans of both companies think it’s an ideal match: Tesla is the first Silicon Valley car company, great at design, robotics, and software. Like Apple, Tesla isn’t afraid to break with traditional distribution models. And, to top it off, Musk is a Steve Jobs-grade leader, innovator, and industry contrarian. I can see the headline: “Tesla, the Apple of the auto industry…”

But we can also picture the clash of cultures…to say nothing of egos.

I have vivid recollections of the clash of cultures after Exxon’s acquisition of high-tech companies to form Exxon Office Systems in the 1970s.

Still reeling from the OPEC oil crisis, Exxon’s management was hypnotized by the BCG (Boston Consulting Group), which insisted that “Information Is the Oil of the 21st Century.” The BCG was right: Tech has its Robber Barons. Apple, Oracle, Google, Facebook, Intel, and Microsoft all weigh more than Shell, Exxon, or BP.

But the BCG was also wrong: Exxon’s culture had no ability to understand what made the computer industry tick, and the tech folks thoroughly despised the Exxon people. The deal went nowhere and cost Exxon about $4B – a lot of money more than 30 years ago.

This history lesson isn’t lost on Apple. So: If Apple isn’t “interested” in Tesla, why are they interested in Tesla?

Could it be batteries?

A look at battery numbers for the two companies brings up an interesting parallel. Tesla plans to make 35,000 Tesla S cars this year. According to Car and Driver magazine, the battery weighs 1323 pounds (600 kilograms — we’ll stick to metric weights moving forward):

[Image: Tesla Model S battery pack, 1,323 lbs]

That’s 21,000 (metric) tons of batteries.

For Apple devices the computation is more complicated — and more speculative — because the company publicizes battery capacity (in watt-hours) rather than weight. But after some digging around, I found the weight information for an iPhone 4S on the iFixit site: 26 grams. From there, I estimated that the weight of the larger iPhone 5S battery is 30 grams.

I reasoned that the weight/capacity ratio is probably the same for all Apple batteries, so if a 26g iPhone battery provides 5.25 watt-hrs, the iPad Air battery that yields 32.4 watt-hrs must weigh approximately 160g. Sparing you the details of the mix of iPad minis and the approximations for the various Macs, we end up with these numbers for 2014 (I’m deliberately omitting the iPod):

100M iPads @ 130g = 13,000 tons
200M iPhones @ 30g = 6,000 tons
20M Macs @ 250g = 5,000 tons
Total Apple batteries = 24,000 metric tons

It’s a rough estimate, but close enough for today’s purpose: Apple and Tesla need about the same tonnage of batteries this year.
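For readers who want to check the arithmetic, here is a small Python sketch of the estimate; the per-device battery weights and 2014 unit volumes are the rough guesses described above, not official figures.

```python
# Rough 2014 battery-tonnage comparison (all inputs are the estimates made above).

# iPad Air battery weight, scaled from the iPhone 4S battery (26 g, 5.25 Wh)
ipad_air_battery_g = 26 * 32.4 / 5.25      # about 160 g

# Tesla: 35,000 Model S cars at roughly 600 kg of battery each
tesla_tons = 35_000 * 600 / 1000           # metric tons

# Apple: (2014 unit estimate, blended battery weight in grams) per product line
apple_lines = {
    "iPad":   (100e6, 130),   # blended Air/mini weight
    "iPhone": (200e6, 30),
    "Mac":    (20e6, 250),
}
apple_tons = sum(units * grams / 1e6 for units, grams in apple_lines.values())

print(f"iPad Air battery: about {ipad_air_battery_g:.0f} g")
print(f"Tesla: {tesla_tons:,.0f} metric tons of batteries")
print(f"Apple: {apple_tons:,.0f} metric tons of batteries")
```

Running it reproduces the 21,000-ton and 24,000-ton figures above.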

Now consider that Tesla just announced it will build a battery mill it calls the Gigafactory:

[Image: Tesla Gigafactory process chart]

According to Tesla, the plant’s capacity in 2020 will be higher than what the entire world produced in 2013.

A more likely explanation for Apple’s conversation with Tesla might be something Apple does all the time: Sit with a potential supplier and discuss payment in advance as a way to secure supply of a critical component.

Of course, neither Tesla nor Apple will comment. Why should they? But a partnership born of their comparable needs for battery volumes makes a lot more sense than for the two companies to become one.

–JLG@mondaynote.com


Postscript: Certainly, Apple is no stranger to acquisitions — the company recently disclosed that it purchased 23 small companies in the last 16 months — but the operative word, here, is “small”. Apple has made two large purchases in twenty years: The late-1996 acquisition of NeXT for $429M (and 1.5M Apple shares), and the $356M purchase of AuthenTec in July 2012. Other than that, Apple’s acquisitions have focused on talent and technologies rather than sheer size.

This seems wise. In small acquisitions, everyone knows who’s in charge, it’s the acquirer’s way or the highway, and failure rarely makes headlines. Bigger deals always involve explicit or implicit power sharing and, more important, a melding of cultures, of habits of the heart and mind, of “the ways we do things here”.

——–

Comcast and Us

 

Comcast tells us how much better our lives will be after they acquire Time Warner. Great, thanks! Perhaps this is an opportunity to look at other ways that we can “acquire” Cable TV and Internet access.

Comcast CEO Brian Roberts thinks we’re powerless idiots. This is what his company’s website says about the planned Time Warner acquisition:

“Transaction Creates Multiple Pro-Consumer and Pro-Competitive Benefits…”

Don’t read the full legal verbiage that purports to explain the maneuver. A more productive use of your time will be had by reading Counternotions’ pointed summary in Obfuscation by disclosure: a lawyerly design pattern:

(tl;dr: According to Comcast, the merger is “pro-sumer” if you “get past some of the hysteria,” it’s “approvable” by the regulators and won’t “reduce consumer choice at all”. Will it raise prices? “not promising that they will go down or even that they will increase less rapidly.” Given the historical record of the industry, it’s Comedy Central material.)

Let’s not loiter around Comcast’s lobbying operations, either — the $18.8M spent in 2013, the pictures of Mr. Roberts golfing with our President, the well-oiled revolving door between the FCC and the businesses they regulate. Feelings of powerlessness and anger may ensue, as trenchantly expressed in this lament from a former FCC Commissioner.

Instead, let’s use our agitation as an opportunity to rethink what we really want from Cable carriers. The wish list is long: TV à la carte instead of today’s stupid bundles, real cable competition vs. de facto local monopolies, metered Internet access in exchange for neutrality and lower prices for lighter usage, decent set-top boxes, 21st century cable modems, and, of course, lower prices.

These are all valid desires, but if there were just one thing that we could change about the carrier business, what would it be? What would really make a big, meaningful difference to our daily use of TV and the Internet?

Do you remember the Carterfone Decision? For a century (telephone service started in the US in 1877), AT&T reigned supreme in telecommunications networking. (I should say the former AT&T, not today’s company rebuilt from old body parts.) The company owned everything along its path, all the way down to your telephone handset — only Ma Bell’s could be used.

Then, in the late fifties, a company called Carterfone began to sell two-way radios that could be hooked up to a telephone. The device was invented by a Texan named Thomas Carter as a clumsy but clever way to allow oil field owners and managers sitting in their offices in Dallas to reach their workers out at the pumps.

AT&T was not amused.

“[AT&T] advised their subscribers that the Carterfone, when used in conjunction with the subscriber’s telephone, is a prohibited interconnecting device, the use of which would subject the user to the penalties provided in the tariff…”

Carterfone brought an antitrust suit against AT&T… and won. With its decision in favor of Thomas Carter’s company, the Federal Communications Commission ushered in a new era in which any device meeting the appropriate technical standards could connect to the phone network.

“…we hold, as did the examiner, that application of the tariff to bar the Carterfone in the future would be unreasonable and unduly discriminatory.”

The regulator — an impartial representative, in an ideal world — decides what can connect to the network. It’s not a decision that’s left to the phone company.

Back in the 21st century, we need a Carterfone Decision for cable boxes and modems. We need a set of rules that would allow Microsoft, Google, Roku, Samsung, Amazon, Apple — and companies that are yet to be founded — to provide true alternatives to Comcast’s set-top boxes.

Today, you have a cable modem that’s so dumb it forces you to restart everything in a particular sequence after a power outage. You have a WiFi base station stashed in among the wires. Your set-top box looks like it was made in the former Soviet Union (a fortuitous product introduction days before the merger announcement doesn’t improve things much). You have to find your TV’s remote in order to switch between broadcast TV, your game console, and your Roku/AppleTV/Chromecast… and you have to reach into your basket of remotes just to change channels.

Imagine what would happen if a real tech company were allowed to compete on equal terms with the cable providers.

Microsoft, for example, could offer an integrated Xbox that would provide Internet access, TV channels with a guide designed by Microsoft, WiFi, an optional telephone, games of course, and other apps as desired. One box, three connectors: power, coax from the street, and HDMI to the TV set. There would be dancing in the streets.

But, you’ll object, what about the technical challenges? Cable systems are antiquated and poorly standardized. The cables themselves carry all sorts of noisy signals. What tech giant would want to deal with this mess?

To which one can reply: Look at the smartphone. It’s the most complicated consumer device we’ve ever known. It contains radios (Wifi, Bluetooth, multi-band cellular), accelerometers/gyroscopes, displays, loudspeakers, cameras, batteries… And yet, smartphones are made in huge quantities and function across a wide range of network standards. There’s no dearth of engineering talent (and money) to overcome the challenges, especially when they’re tackled outside of the cable companies and their cost-before-everything cultures.

Skeptics are more likely to be correct about the regulatory environment or, to be more precise, regulatory capture, a phrase that…captures the way regulators now work for the industries they were supposed to control. Can we imagine the FCC telling Comcast: “Go ahead and buy Time Warner…just one little condition, make sure any and all of your connection protocols and services APIs are open to any and all that pass the technical tests listed in Appendix FU at the end of this ruling.”

That’s not going to happen. We must prepare ourselves for a sorry display of bad faith and financial muscle. Who knows, in the end, Comcast might give up, as AT&T did after telling us how pro-consumer the merger with T-Mobile would be.

JLG@mondaynote.com

@gassee

Apple Numbers For Normals: It’s The 5C, Stupid!

 

Today’s unscientific and friendly castigation of Apple’s costly iPhone 5C stumble: misdirected differentiation without enough regard for actual customer aspirations.

Here’s a quick snapshot of Apple’s numbers for the quarter ending December 2013, with percentage changes over the same quarter a year ago:

[Table: Apple’s numbers for the quarter ending December 2013, with year-over-year changes]

We can disregard the iPod’s “alarming” decrease. The iPod, which has become more of an iPhone ingredient, is no longer expected to be the star revenue maker that it was back in 2006 when it eclipsed the Mac ($7.6B vs. $7.4B for the full year).

For iPhones, iPads, and overall revenue, on the other hand, these are record numbers… and yet Apple shares promptly lost 8% of their value.

Why?

It couldn’t have been that the market was surprised. The numbers exactly match the guidance (a prophylactic legalese substitute for forecast) that was given to us by CFO Peter Oppenheimer last October:

“We expect revenue to be between $55 billion and $58 billion compared to $54.5 billion in the year ago quarter. We expect gross margins to be between 36.5% and 37.5%.”

(Non-normals can feast their eyes on Apple’s 10-Q filing and its lovingly detailed MD&A section. I’m sincere about the “lovingly” part — it’s great reading if you’re into it.)

Apple guidance be damned, Wall Street traders expected higher iPhone numbers. As Philip Elmer-DeWitt summarizes in an Apple 2.0 post, professional analysts expected about 55M iPhones, 4M more than the company actually sold. At $640 per iPhone, that’s about $2.5B in lost revenue and, assuming 60% margin, $1.5B in profit. The traders promptly dumped the shares they had bought on the hopes of higher revenues.
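A quick sketch of that shortfall arithmetic (the 60% margin is the assumption stated above; all figures are approximate):

```python
# Approximate revenue and profit impact of the 4M-unit iPhone shortfall.
shortfall_units = 4e6    # analysts expected ~55M, Apple sold ~51M
asp = 640                # average selling price per iPhone, in dollars
gross_margin = 0.60      # margin assumption used in the text

lost_revenue = shortfall_units * asp
lost_profit = lost_revenue * gross_margin
print(f"Lost revenue: ${lost_revenue / 1e9:.2f}B")  # about $2.56B
print(f"Lost profit:  ${lost_profit / 1e9:.2f}B")   # about $1.54B
```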

In Apple’s choreographed, one-hour Earnings Call last Monday (transcript here), company execs offered a number of explanations for the shortfall (one might say they offered a few too many). Discussing the mix of iPhone 5S vs. iPhone 5C sales, here’s what Tim Cook had to say [emphasis mine]:

“Our North American business contracted somewhat year over year. And if you look at the reason for this, one was that as we entered the quarter, and forecasted our iPhone sales, where we achieved what we thought, we actually sold more iPhone 5Ss than we projected.

And so the mix was stronger to the 5S, and it took us some amount of time in order to build the mix that customers were demanding. And as a result, we lost some sort of units for part of the quarter in North America and relative to the world, it took us the bulk of the quarter, almost all the quarter, to get the iPhone 5S into proper supply.

[…]

It was the first time we’d ever run that particular play before, and demand percentage turned out to be different than we thought.

In plainer English:

“Customers preferred the 5S to the 5C. We were caught short, we didn’t have enough 5Ss to meet the demand and so we missed out on at least 4 million iPhone sales.”

Or, reading between the lines:

“Customers failed to see the crystalline purity of the innovative 5C design and flocked instead to the more derivative — but flattering — 5S.”

Later, Cook concludes the 5S/5C discussion and offers rote congratulations all around:

“I think last quarter we did a tremendous job, particularly given the mix was something very different than we thought.”

… which means:

“Floggings will begin behind closed doors.”

How can a company that’s so precisely managed — and so tuned-in to its customers’ desires — make such a puzzling forecast error? This isn’t like the shortcoming in the December 2012 quarter when Apple couldn’t deliver the iMacs it had announced in October. This is a different kind of mistake, a bad marketing call, a deviation from the Apple game plan.

With previous iPhone releases, Apple stuck to a simple price ladder with $100 intervals. For example, when Apple launched the iPhone 5 in October 2012, US carriers offered the new device for $200 (with a two-year contract), the 2011 iPhone 4S was discounted to $100, and the 2010 iPhone 4 was “free”.

But when the iPhone 5S was unveiled last September, Apple didn’t deploy the 2012 iPhone 5 for $100 less than the new flagship device. Instead, Apple “market engineered” the plastic-clad 5C to take its place. Mostly built of iPhone 5 innards, the colorful 5C was meant to provide differentiation… and it did, but not in ways that helped Apple’s revenue — or their customers’ self-image.

Picture two iPhone users. One has a spanking new iPhone 5S, the other has an iPhone 5 that he bought last year. What do you see? Two smartphone users of equally discerning taste who, at different times, bought the top-of-the-line product. The iPhone 5 user isn’t déclassé, he’s just waiting for the upgrade window to open.

Now, replace the iPhone 5 with an iPhone 5C. We see two iPhones bought at the same time… but the 5C owner went for the cheaper, plastic model.

We might not like to hear psychologists say we build parts of our identity with objects we surround ourselves with, but they’re largely right. From cars, to Burberry garments and accessories, to smartphones, the objects we choose mean something about who we are — or who we want to appear to be.

I often hear people claim they’re not interested in cars, that they just buy “transportation”, but when I look at an office or shopping center parking lot, I don’t see cars that people bought simply because the wheels were round and black. When you park your two-year-old Audi S5 coupé (a vehicle once favored by a very senior Apple exec) next to the new and improved 2014 model, do you feel you’re of a lesser social station? Of course not. You both bought into what experts call the Affordable Luxury category. But your self-assessment would be different if you drove up in a Volkswagen Jetta. It’s made by the same German conglomerate, but now you’re in a different class. (This isn’t to say brand image trumps function. On the contrary, function can kill image; ask Nokia or Detroit.)

The misbegotten iPhone 5C is the Jetta next to the Audi S5 coupé. Both are fine cars and the 5C is a good smartphone – but customers, in numbers large enough to disrupt Apple’s forecast, didn’t like what the 5C would do to their image.

As always, it’ll be interesting to observe how the company steers out of this marketing mistake.

There is much more to watch in coming months: How Apple and its competitors adapt to a new era of slower growth; how carriers change their behavior (pricing and the all-important subsidies) in the new growth mode; and, of course, if and how “new categories” change Apple’s business. On this, one must be cautious and refrain from expecting another iPhone or iPad explosion, with new products yielding tens of billions of dollars in revenue. Fodder for future Monday Notes.

JLG@mondaynote.com

@gassee

 

Mac Pro: Seymour Cray Would Have Approved

 

As we celebrate 30 years of Macintosh struggles and triumphs, let’s start with a semiserious, unscientific comparison between the original 128K Mac and its dark, polished, brooding descendant, the Mac Pro.

[Images: the original Mac 128K and the 2013 Mac Pro]

The original 128K Mac was 13.6” high, 9.6” wide, 10.9” deep (34.5 x 24.4 x 27.7 cm) and 16.5 lb (7.5 kg). Today’s Mac Pro is 9.9″ by 6.6″ (25 by 17 cm) and weighs 11 lb (5 kg) — smaller, shorter, and lighter than its ancient progenitor. Open your hand and stretch your fingers wide: The distance from the tip of your pinky to the tip of your thumb is in the 9 to 10 inch range (for most males). This gives you an idea of how astonishingly small the Mac Pro is.
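To put a number on “astonishingly small”, here is a semiserious volume estimate, treating the original Mac as a rectangular box and the Mac Pro as a plain cylinder (both rough approximations of my own):

```python
# Semiserious volume comparison of the 128K Mac and the 2013 Mac Pro.
import math

# Original 128K Mac, treated as a rectangular box (dimensions in cm)
mac_128k_cm3 = 34.5 * 24.4 * 27.7

# 2013 Mac Pro, treated as a cylinder: about 25 cm tall, 17 cm in diameter
mac_pro_cm3 = math.pi * (17 / 2) ** 2 * 25

print(f"128K Mac: {mac_128k_cm3 / 1000:.1f} liters")
print(f"Mac Pro:  {mac_pro_cm3 / 1000:.1f} liters")
print(f"The Mac Pro occupies roughly 1/{mac_128k_cm3 / mac_pro_cm3:.0f} of the volume")
```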

At 7 teraflops, the new Pro’s performance specs are impressive…but what’s even more impressive is how all that computing power is stuffed into such a small package without everything melting down. Look inside the new Mac Pro and you’ll find a Xeon processor, twin AMD FirePro graphics engines, main memory, a solid-state “drive”, driven by 450W of maximum electric power… and all cooled by a single fan. The previous Mac Pro version, at only 2 teraflops, needed eight blowers to keep its GPU happy.

The Mac Pro achieves a level of “computing energy density” that Seymour Cray — the master of finding ways to cool high-performance, tightly packaged systems, and a Mac user himself — would have approved of.

(I’ve long been an admirer of Seymour Cray, ever since the introduction of his company’s first commercial supercomputer, the CDC 6600. In the early nineties, I was a Board member and investor at Cray Inc.  My memories of Seymour would fill an entire Monday Note. If you’re familiar with the name but not the supercomputer genius himself, I can recommend the Wikipedia article; it’s quite well-written.)

During Cray’s era of supercomputing — the 1960’s to early 90’s — processors were discrete, built from separate components. All of these building blocks had to be kept as close to each other as possible in order to stay in sync, to stay within the same “time horizon”. (Grace Hopper’s famous “one nanosecond equals a foot of wire” illustration comes to mind.) However, the faster the electronic module is, the more heat it generates, and when components are packed tightly together, it becomes increasingly difficult to pump out enough heat to avoid a meltdown.

That’s where Cray’s genius expressed itself. Not only could he plot impossibly tight circuit paths to guarantee the same propagation time for all logic signals, he designed these paths in ways that allowed adequate cooling. He sometimes referred to himself, half-seriously, as a good plumber.

(Seymour once told me he could fold a suit, a change of shirt, and underwear into his small Delsey briefcase, and thus speed through airports on the way to a fund-raising meeting while his investment bankers struggled with their unwieldy Hartmann garment bags…)

I finally met Seymour in December 1985 while I was head of Apple’s Product Development. The Mac Plus project was essentially done and the Mac II and Mac SE projects were also on their way (they would launch in 1987). Having catered to the most urgent tasks, we were looking at a more distant horizon, at ways to leap ahead of everyone else in the personal computer field. We concluded we had to design our own CPU chip, a quad-processor (today we’d call it a “four-core chip”). To do this, we needed a computer that could run the design and simulation software for such an ambitious project, a computer of commensurate capabilities, hence our choice of a Cray X-MP, and the visit to Seymour Cray.

For the design of the chip, the plan was to work with AT&T Microelectronics — not the AT&T we know now, but the home of Bell Labs, the birthplace of the transistor, Unix, the C language, cellular telephony and many other inventions. Our decision to create our own CPU wasn’t universally well-received. The harshest critics cast Apple as a “toy company” that had no business designing its own CPU chip. Others understood the idea but felt we vastly underestimated the technical challenges. Unfortunately, they turned out to be right. AT&T Microelectronics ultimately bailed out of the microprocessor business altogether.

(Given this history, I couldn’t help but be amused when critics scoffed at Apple’s decision to acquire P.A. Semi in 2008 and, once again, attempt to design its own microprocessors. Even if the chip could be built, Apple could never compete against the well-established experts in the field… and it would cost Apple a billion dollars, either way. The number was wildly off the mark – and, knowing Apple’s financials, it wouldn’t have mattered anyway. We know what happened: The 64-bit A7 device took the industry by surprise.)

Thirty years after the introduction of the original Mac, the Mac Pro is both different and consistent. It’s not a machine for everyone: If you mostly just use ordinary office productivity apps, an iMac will provide more bang for less buck (which means that, sadly, I don’t qualify as a Mac Pro user). But like the 128K Mac, the Mac Pro is dedicated to our creative side; it serves the folks who produce audio and video content, who run graphics-intensive simulations. As Steve put it so well, the Mac Pro is at the crossroads of technology and liberal arts:

[Image: the Technology and Liberal Arts crossroads slide]

Still, thirty years later, I find the Mac, Pro or “normal”, every bit as seductive, promising – and occasionally frustrating – as its now enshrined progenitor.

As a finishing touch, the Mac Pro, like its ancestor, is designed and assembled in the US.

JLG@mondaynote.com

————————–

Postscript. At the risk of spoiling the fun in the impressive Making the all-new Mac Pro video, I wonder about the contrast between the powerful manufacturing operation depicted in the video and the delivery constipation. When I ordered my iMac in early October 2013, I was promised delivery in 5-7 business days, a strange echo of the December 2012 quarter iMac shipments shortfall. The machine arrived five weeks later without explanation or updated forecast. Let’s hope this was due to higher than expected demand, and that Apple’s claim that Mac Pro orders will ship “in March” won’t leave media pros wanting.

The Hybrid Tablet Temptation

 

In no small part, the iPad’s success comes from its uncompromising Do Less To Do More philosophy. Now a reasonably mature product, can the iPad expand its uses without falling into the hybrid PC/tablet trap?

When the iPad came out, almost four years ago, it was immediately misunderstood by industry insiders – and joyously embraced by normal humans. Just Google iPad naysayer for a few nuggets of iPad negativism. Even Google’s CEO, Eric Schmidt, couldn’t avoid the derivative trap: He saw the new object as a mere evolution of an existing one and shrugged off the iPad as a bigger phone. Schmidt should have known better: He had been an Apple director in the days when Jobs believed the two companies were “natural allies”.

I was no wiser. I got my first iPad on launch day and was immediately disappointed. My new tablet wouldn’t let me do what I did on my MacBook Air – or on my tiny EeePC running Windows XP (not Vista!). For example, writing a Monday Note on an iPad was a practical impossibility – and still is.

I fully accept the personal nature of this view and, further, I don’t buy the media consumption vs. productivity dichotomy Microsoft and its shills (Gartner et al.) tried to foist on us. If by productivity we mean work, work product, earning one’s living, tablets in general and the iPad in particular have more than made the case for their being productivity tools as well as education and entertainment devices.

Still, preparing a mixed media document, even a moderately complex one, irresistibly throws most users back to a conventional PC or laptop. With multiple windows and folders, the PC lets us accumulate text, web pages, spreadsheets and graphics to be distilled, cut and pasted into the intended document.

Microsoft now comes to the rescue. Their hybrid Surface PC/Tablet lets you “consume” media and play games in purely tablet mode – and switch to the comfortable laptop facilities offered by Windows 8. The iPad restricts you to ersatz folders, preventing you from putting your document’s building blocks in one place? No problem: The Surface device features a conventional desktop User Interface, familiar folders, and comfy Office apps as well as a “modern” tile-based Touch UI. The best of both worlds, skillfully promoted in TV ads promising work and fun rolled into one device.

What’s not to like?

John Kirk, a self-described “recovering attorney”, whose tightly argued and fun columns are always worth reading, has answers. In a post on Tablets Metaphysics – unfortunately behind a paywall – he focuses on the Aristotelian differences between tablets and laptops. Having paid my due$$ to the Techpinions site, I will quote Kirk’s summation [emphasis mine]:

Touch is ACCIDENTAL to a Notebook computer. It’s plastic surgery. It may enhance the usefulness of a Notebook but it doesn’t change the essence of what a Notebook computer is. A keyboard is ACCIDENTAL to a Tablet. It’s plastic surgery. It may enhance the usefulness of a Tablet, but it doesn’t change the essence of what a Tablet is. Further — and this is key — a touch input metaphor and a pixel input metaphor must be wholly different and wholly incompatible with one another. It’s not just that they do not comfortably co-exist within one form factor. It’s also that they do not comfortably co-exist within our minds eye.

In plain words, it’s no accident that tablets and notebooks are distinctly different from one another. On the contrary, their differences — their incompatibilities — are the essence of what makes them what they are.

Microsoft, deeply set in the culture of backwards compatibility that had served it so well for so long, did the usual thing: it added a tablet layer on top of Windows 7. The result didn’t take the market by storm and appears to have caused the exit of Steven Sinofsky, the Windows czar now happily ensconced at Harvard Business School and a Board Partner with the Andreessen Horowitz venture firm. Many think the $900M Surface RT write-off also contributed to Ballmer’s August 2013 resignation.

With hindsight, Apple’s decision to stick to a “pure” tablet looks more inspired than lucky. If we remember that a tablet project preceded the iPhone, only to be set aside for a while, Apple’s “stubborn minimalism”, its refusal to hybridize the iPad, might be seen as the result of long experimentation – with more than a dash of Steve Jobs (and Scott Forstall) inflexibility.

Apple’s bet can be summed up thus: MacBooks and iPads have their respective best use cases, and both reap high customer satisfaction scores. Why ruin a good game?

Critics might add: Why sell one device when we can sell two? Apple would rather “force” us to buy two devices in order to maximize revenue. On this, Tim Cook often reminds Wall Street of Apple’s preference for self-cannibalization, for letting its new and less expensive products displace existing ones. Indeed, the iPad keeps cannibalizing laptops, PCs and Macs alike.

All this leaves one question unanswered: Is that it? Will the iPad fundamentals stay the way they have been from day one? Are we going to be thrown back to our notebooks when composing the moderately complex mixed-media documents I referred to earlier? Or will the iPad hardware/software combination become more adept at such uses?

To start, we can eliminate a mixed-mode iOS/Mac device. Flip a switch, it’s an iPad; flip it again, add a keyboard/touchpad and you have a Mac. No such contraption allowed. We know where to turn for that.

Next, imagine a new iOS version that allows multiple windows on the iPad screen, with folders no longer separately attached to each app as they are today but letting us store documents from multiple apps in one place. Add a blinking cursor for text and you have… a Mac, or something too close to a Mac but still different. Which is precisely why that won’t work.

(This might pose the question of an A7 or A8 processor replacing the Intel chip inside a MacBook Air. It can be done – a “mere matter of software” – but how much would it cut from the manufacturing cost? $30 to $50, perhaps. Nice but not game-changing; a question for another Monday Note.)

More modest, evolutionary changes might still be welcome. Earlier this year, Counternotions proposed a slotted clipboard as An interim solution for iOS ‘multitasking’:

[...] until Apple has a more general solution to multitasking and inter-app navigation, the four-slot clipboard with a visible UI should be announced at WWDC. I believe it would buy Ive another year for a more comprehensive architectural solution, as he’ll likely need it.

This year’s WWDC came and went with the strongest iOS update so far, but neither a general nor an interim solution to the multitasking and inter-app navigation problem discussed in the post. (Besides the Counternotions blog, this erudite and enigmatic author also edits counternotions.tumblr.com and can be followed on Twitter as @Kontra.)

A version of the above suggestion could be conceptualized as a floating drop box, invoked when needed and hovering above the document being worked on. This would not require recreating a PC-like windows-and-desktop UI. Needed components could be extracted from the floating store, then dragged and dropped onto the work in progress.
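To make the idea a bit more concrete, here is a minimal sketch — my own illustration, not anything Apple or Counternotions has specified — of how a small, slotted clip store could be modeled in Swift. The type and method names (ClipSlot, SlottedClipboard, store, peek) are hypothetical.

```swift
import Foundation

// Hypothetical model of a four-slot clipboard; not an Apple API.
struct ClipSlot {
    let content: String     // a text fragment, or a reference to richer media
    let sourceApp: String   // where the fragment was captured
    let capturedAt: Date
}

final class SlottedClipboard {
    private(set) var slots: [ClipSlot?]

    init(slotCount: Int = 4) {
        slots = Array(repeating: nil, count: slotCount)
    }

    // Store a fragment in the first free slot, or overwrite the oldest one.
    func store(_ content: String, from app: String) {
        let newSlot = ClipSlot(content: content, sourceApp: app, capturedAt: Date())
        if let free = slots.firstIndex(where: { $0 == nil }) {
            slots[free] = newSlot
        } else if let oldest = slots.indices.min(by: {
            (slots[$0]?.capturedAt ?? .distantPast) < (slots[$1]?.capturedAt ?? .distantPast)
        }) {
            slots[oldest] = newSlot
        }
    }

    // Look at a slot without clearing it, e.g. to drag its content onto a document.
    func peek(at index: Int) -> ClipSlot? {
        guard slots.indices.contains(index) else { return nil }
        return slots[index]
    }
}
```

A floating palette would simply render these slots and let the user drag a slot’s content onto the work in progress — no desktop metaphor required.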

We’ll have to wait and see if and how Apple evolves the iPad without falling into the hybrid trap.

On even more speculative ground, a recent iPad Air intro video offered a quick glimpse of the Pencil stylus by FiftyThree, the creators of the well-regarded Paper iPad app. So far, styli haven’t done well on the iPad: Apple only stocks child-oriented devices from Disney and Marvel, nothing else, in spite of the abundance of such devices offered on Amazon. Perhaps we’ll someday see Apple grant Bill Gates his wish, as recounted by Jobs’ biographer Walter Isaacson:

“I’ve been predicting a tablet with a stylus for many years,” he told me. “I will eventually turn out to be right or be dead.”

Someday, we might see an iPad, larger or not, Pro or not, featuring a screen with more degrees of pressure sensitivity. After seeing David Hockney’s work on iPads at San Francisco’s de Young museum, my hopes are high.

JLG@mondaynote.com

@gassee

Sound Holiday Thoughts

 

Nothing too serious this week. No Microsoft CEO succession, no Samsung $14B marketing budget exceeding Iceland’s GDP, no Apple Doom. Just Holiday – or Cyber Monday – audio talk.

I used to listen to sound. Now I enjoy music. It started with explosives. I was lucky to be born at a time and place (an arch-communist suburb of post-war Paris) where a 9-year-old kid could hopscotch to the drugstore around the corner and buy nitric, sulfuric, or hydrochloric acid, sulfur, potassium chlorate, hydrogen peroxide… and other fascinating wares – among them a flogger with short leather lashes with which I was also acquainted. Imagine this in today’s California…

After a minor eye-burn incident, I was firmly redirected towards electronics and started building crystal radios, rudimentary AM sets using a galena (lead sulfide) crystal.

My good fortune continued. In 1955, my parents decided to send their increasingly restive child to a Roman Catholic boarding school in Brittany. What awaited me there, besides a solid classical education, was a geeky Prefect of Discipline who had a passion for hobby electronics. After hours, I would go to his study to read Radio Plans and Le Haut-Parleur — the French equivalents of Nuts and Volts — and salivate over the first OC71 transistor that had just landed on his desk (amazingly, the transistor is still available). This was exciting: Fragile, noisy, power hungry vacuum tubes that required both high and low voltages were going to be replaced by transistors. Numerous, randomly successful projects followed: radios, mono and stereo amplifiers, hacked surplus walkie-talkies.

Years later, in June 1968, I landed a dream job launching HP’s first desktop computer, the 9100A, on the French market. I distinctly recall the exultant feeling: After years of the psycho-social moratorium referred to in an earlier Monday Note, I had entered the industry I love to this day.

With more money, I was able to afford better turntables, tape decks, receivers, amplifiers and, above all, speakers. For a while I started to listen more to the sound they produced than to the music itself. The Lacanians have a phrase for the disease: Regressive Fixation On Partial Objects…

HP had installed an über-geek, Barney Oliver, as head of its Research organization, HP Labs. Adored for his giant intellect and free spirit, Oliver decided the stereo amplifiers of the day (the early ’70s) were either expensive frauds or noisy trash. Or both. So he raided the HP parts bin and built us a real stereo amplifier. (The manual and schematics are lovingly preserved here.) Four hundred were built. I bought two, because you never know. This was a vastly “overbuilt” device that used high-precision gas chromatograph attenuators with 0.1 dB steps as volume controls. (Most of us have trouble perceiving a 1 dB difference.) The power supply had such enormous capacitors that the amplifier would keep “playing” for 25 seconds after it was turned off.

HP, the old, real HP, truly was technophile heaven.

As the years passed, I became uncomfortable with the audio arms race: the amps that pushed out hundreds or even thousands of watts, the claims of ever-vanishing 0.1%, nay, 0.01% distortion levels, the speakers that cost tens of thousands of dollars. (The Rolls-Royce of audio equipment of the time was… McIntosh.)

A chance encounter with The Audio Critic helped me on the road to recovery. Peter Aczel, the magazine’s publisher and main author, is a determined Objectivist Audiophile, a camp that believes that “audio components and systems must pass rigorously conducted double-blind tests and meet specified performance requirements in order to validate the claims made by their proponents”. Committed to debunking Subjectivists’ claims of “philosophic absolutes” and ethereal nuance, Aczel has attracted the ire of high-end equipment makers who hate it when he proves that their oxygen-free copper cables with carefully aligned grains are no better than 12-gauge zip wire at 30 cents per foot.

(A helpful insight from Aczel: In an A/B audio comparison, the louder gear inevitably wins, so loudness needs to be carefully equalized. This “sounds” like the reason why, over the last two or three decades, wines have increased their alcohol concentration to 14% or more: In tastings, the stronger wine is almost always preferred.)
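For what it’s worth, here is a tiny Swift sketch of the level-matching Aczel insists on: rescaling one clip so its root-mean-square (RMS) level matches the other’s before an A/B comparison. The helper names are my own, and real test rigs equalize loudness more carefully (weighting curves, calibrated playback), but the principle is the same.

```swift
// Compute the RMS level of a buffer of audio samples.
func rmsLevel(_ samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    let meanSquare = samples.map { $0 * $0 }.reduce(0, +) / Double(samples.count)
    return meanSquare.squareRoot()
}

// Rescale clip B so that its RMS level matches clip A's before the A/B test.
func levelMatched(_ b: [Double], toMatch a: [Double]) -> [Double] {
    let rmsA = rmsLevel(a)
    let rmsB = rmsLevel(b)
    guard rmsB > 0 else { return b }
    return b.map { $0 * (rmsA / rmsB) }
}
```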

The real turning point from sound fetishism to music appreciation came in early 2002 when I bought an iMac G4 that came with two small but surprisingly good external loudspeakers:

iMac G4 w Speakers

They won’t fill a concert hall and they can’t compete with my old JBL speakers, but coupled with iTunes, the iMac became a pleasant stereo. (Due, of course, to improvements in magnetic alloys such as neodymium compounds, more efficient Class D amplifiers, and… but I’ll stop before I relapse.)

A decade later — and skipping the politically incorrect jokes about married men experiencing premature hearing impairment in the high-frequency region of the spectrum — I’m now able to focus on music and expect the reproduction equipment to stay out of the way, in both practical and auditory terms.

Today’s “disk drives” are solid state and store hundreds of gigabytes; CDs and DVDs have all but disappeared; iPods, after a few years in the sun, have been absorbed into phones and tablets. (And we watch iTunes on the road to becoming Apple’s Windows Vista.)

After years of experimentation, I’ve come to a happy set of arrangements for enjoying music at home, at work, and on the go. Perhaps they’ll help with your own entertainment. (Needless to say, I bought all of the following – and many others – with my own money, and the Monday Note doesn’t receive compensation of any kind.)

At home, I use a Bose Companion V desktop set-up. It consists of two pods, one on each side of the screen, plus a bass module anywhere under the desk. Bose’s idea is to take your PC’s output from a USB port and process it to add an illusion of depth/breadth when sitting at your desk. For me, it works. And the output is strong enough for a family/kitchen/dining room.

That said, I’m not fond of all Bose products. I find the smaller Companion units too bass-heavy, and I didn’t like (and returned) their AirPlay speaker. As for the company’s design sensibility, Amar Bose gave me the evil eye more than 15 years ago when I dared suggest that the industrial design of his Wave System could use updating (I was visiting his Framingham Mountain office with a “noted Silicon Valley electronics retailer”). The design hasn’t changed and is still selling well.

At the office, I followed advice from my old friends at Logitech and bought two Ultimate Ears Bluetooth speakers. With a (recently improved) smartphone app, they provide very good stereo sound. At $360 a pair, the UE system costs about the same as the Companion V; what UE lacks in the Bose’s power, it makes up for in portability. The only knock is that the mini-USB charging port is on the speaker’s bottom — you have to turn it on its head to charge it.

Speaking of portability, Bose’s SoundLink Mini, another testament to modern speaker and amplifier technology, fits in a bag or roll-aboard and shocks unprepared listeners with its clean, powerful sound and tidy design. No discounts on Amazon, which we can attribute to Bose’s unwavering price control and to the system’s desirability.

I kept the best for last: noise-reducing earphones. The premise is simple: A microphone captures ambient sound, embedded circuitry flips the waveform and adds it to the music signal, thus canceling the background noise and allowing us to enjoy our music undisturbed. This is a consumer application of Bose’s first noise-canceling headphones, built for aviation and still considered that domain’s standard. A “pro” set costs about $1,000; consumer versions are $300 or less.
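As a toy illustration of that premise — my own numbers and function names, not Bose’s design, and real systems add filtering and latency compensation — the arithmetic looks like this in Swift:

```swift
// The earphone plays the program signal plus an inverted copy of the ambient
// noise picked up by the microphone; at the ear, the real ambient noise adds
// back in and (ideally) cancels out.
func driverSignal(program: [Double], ambient: [Double]) -> [Double] {
    zip(program, ambient).map { $0 - $1 }   // program + (-ambient)
}

let program: [Double] = [0.20, 0.50, -0.10]    // the music
let ambient: [Double] = [0.05, -0.02, 0.03]    // street noise at the mic
let driven = driverSignal(program: program, ambient: ambient)
let atTheEar = zip(driven, ambient).map { $0 + $1 }
print(atTheEar)   // ≈ [0.2, 0.5, -0.1]: the program, with the noise canceled
```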

To my ears, early models were disappointing: they introduced small amounts of parasitic noise and offered indifferent music reproduction. Nonetheless, sales were strong.

Later models, from Bose and others, improved both music playback and noise cancelation, but still felt big and unwieldy. Again, a matter of personal preference.

Yielding to the friendly bedside manner of an Apple Store gent, I recently bought a pair of Bose QC 20i “noiseless” earphones (about $300). The earbuds are comfortable and so “skin-friendly” that you forget you’re wearing them (I mention this because comfort will always trump quality). They’re also more secure, less prone to falling out of your ears than Apple’s own earbuds.

Now, as I take my evening walk in the streets of Palo Alto enjoying the Bach Partitas, the street noise is barely a whisper; cars seem to glide by as if they were all Teslas. For civility and safety, there’s a button to defeat noise reduction, and the mandatory Pause function for phone or street conversations. There are other nice details, such as a spring-loaded clip for your shirt or lapel, and a dead-battery mode that still lets music — and noise — come through.

Next week, we’ll return to more cosmic concerns.

JLG@mondaynote.com