Software

Turning Points

Once upon a time, Microsoft reigned supreme. They were IBM 2.0, having wrestled control of the PC from Big Blue. According to some critics, Microsoft took over the office application market through a combination of embrace, extend, and extinguish and tied sales. The MS M.O. followed this trajectory: First, Bill Gates and his troops would paint a bullseye on a product category—spreadsheets, word processors. Then they’d add features, creating a superset of those offered by Lotus 1-2-3 or WordPerfect. The next move involved the relationship with PC manufacturers: Microsoft (it’s said) tied the sale of the Microsoft Office suite to a Windows license. You want “favorable” terms on Windows? Install Office on the PCs that you sell.

MS Office applications became the Colossus of the business sector. And they established a de facto lingua franca. If you wanted to write a productivity app, you had to speak the MS language—you had to translate to and from Microsoft’s file formats. But these formats were under Microsoft’s sole control; they could “extend” the language at will, and—surprise!—these enhancements favored their own applications. For third-party developers, complete, full-featured compatibility was next to impossible to achieve. Ask the OpenOffice folks.

It worked. For more than 20 years, the Windows + Office combo has been Microsoft’s cash cow. In its 2010 Annual Report, MS reported $37B in revenue—60% of the company—for what is officially called Windows & Windows Live and the Business Division. With an operating profit of $24.7B and a beefy 67% operating margin, Windows + Office accounts for…102% of the company’s operating income?

(The percentage anomaly is due, in part, to the $2.3B lost by the Online Services business. With $2.2B in revenues, that’s more than $2 spent for every $1 in sales. For another part, we have an opaque category, Corporate Level activities, which adds another $4.5B of red ink.)

The mid-’90s, the browser wars. Microsoft seemed content to let Netscape’s Navigator own the scene… until the nascent Web wafted the aroma of monetization. Microsoft’s Internet Explorer began life as an add-on, almost an afterthought, a sop to “unproductive” Web surfers. But with the release of Windows 95, IE was wired into the operating system, for free (which would eventually land Microsoft in various kinds of legal trouble, now largely forgotten). And, true to the MS M.O., subsequent versions offered “enhancements” to HTML that only IE knew how to interpret. Embrace, extend, extinguish…that was the end of Netscape’s hopes for a revenue stream.

Those were the days. Since then, all of Microsoft’s efforts to control standards have gone nowhere.

Time to Rethink Word Processors — Seriously

Last Friday, at the Apple Store near the Paris Opera House, I paid my annual Microsoft tax: €140 ($194) for the 2011 edition of Microsoft Office. My hopes: more speed, fewer bugs, and smarter features. All in the service of producing all manner of texts and presentations required by my multiple jobs. So far, no mind-blowing features, nothing more than a superficial makeover.
To assess this new iteration of Word, I use a framework built on my experience of Microsoft’s R&D effort. A few months ago, I spent three days at the Microsoft Tech Fest in Redmond. At first, I felt like a kid in a candy store, chatting with some of Microsoft Research’s 900-plus PhDs who work in exotic fields such as Machine Learning or Epidemiology. But the amazement subsided and was replaced by doubt: How has this tremendous intellectual firepower actually made a difference in the Microsoft products I’ve been using for 15 years? In fact, Microsoft R&D has very little impact on everyday products. This is but one of Microsoft’s many problems: see the long piece I wrote in Le Monde Magazine.

Let’s go back to the subject of this column. Knowing what I know about Microsoft’s vision of computer science, I had envisioned a quantum leap for the applications I use the most, such as the very word processor I’m using “as we speak”. No joy. Let’s ignore the letdown and, instead, speculate a little bit about the next generation of text creation tools branded Microsoft Word, or Apple Pages (which comes with fewer bells and whistles, but is tidier).

First, text creation. One of the biggest challenges, and a growing one, is spelling, syntax, and grammar. In a country such as France, whose language is loaded with utmost (and sometimes absurd) complexity, the quality of writing is in steep decline. For the youngest part of the population, it is accelerated by the demise of a school system where teachers in effect gave up on written language. As for the 30-40 age bracket, the bombardment of daily interactions (email at work, SMS, chat on social networks) has made proper spelling and syntax secondary. Quite often, coming from a manager or even an attorney, you’ll receive a business document riddled with spelling errors well beyond the typos acceptable in a hastily written piece.

Unfortunately, today’s word processors do a very poor job when dealing with mangled spelling and grammar. All of us have in mind examples where the Word application becomes absurdly creative when dealing with the unknown: regardless of context, and with no learning capabilities whatsoever, Word will stubbornly keep suggesting an alternate spelling instead of simply skipping an unrecognized term.

Let’s dream for a moment; let’s picture what a text processing software could look like in the light of existing technologies.

When I install my 2013 version of MS Word or Apple Pages, it asks me to load a “reference corpus” of texts it will learn from. Since I write both in French and in English, I will feed the app with the final versions (edited, and proofread) of articles I published and am comfortable with. Grammar and syntax references will be helpful for English, and thesauruses will be used for both languages. Since I currently write about media and technology, the application dictionary will soon be filled with the names of people, places, and companies I mention, as well as with the technical jargon I allow myself to use. Alternately, if I don’t want to feed the word processor with my own writings, I can direct it to URLs of texts I find trustworthy: great newspapers, magazines, or academic papers…
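To make the dream a bit more concrete, here is a toy sketch of my own (not a description of any shipping product) of how such a reference corpus could seed a personal dictionary: harvest every word from texts the user trusts, then stop flagging those words as unknown.

```python
# Minimal "reference corpus" personal dictionary: words harvested
# from trusted texts are treated as known instead of being flagged
# as misspellings. A toy illustration, not a real spell-checker.
import re
from collections import Counter

WORD = re.compile(r"[a-zà-ÿ'-]+")

def build_dictionary(trusted_texts, min_count=1):
    """Collect every word seen at least min_count times in the corpus."""
    counts = Counter()
    for text in trusted_texts:
        counts.update(WORD.findall(text.lower()))
    return {w for w, n in counts.items() if n >= min_count}

def flag_unknown(sentence, dictionary):
    """Return the words in the sentence that the dictionary has never seen."""
    return [w for w in WORD.findall(sentence.lower()) if w not in dictionary]

corpus = ["Zhengzhou is a city I wrote about.",
          "WordPress powers this blog; WordPress themes use CSS."]
known = build_dictionary(corpus)

# "Zhengzhou" and "WordPress" pass silently; only truly unseen words remain.
print(flag_unknown("I posted it on WordPress from Zhengzhou.", known))
```

With a corpus this tiny, ordinary words get flagged too; the point is that the specialized proper nouns the author actually uses now sail through.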

Similarly, a lawyer or a doctor will feed the word processor with texts (from his own production, or found online) to be used as reference for professional vocabulary and turns of phrase. In my dream, third-party software vendors have seen a business opportunity: they sell industry- or occupation-specific plugins loaded with high-quality reference corpuses. This results in reliable auto-correct for Word and Pages. Some vendors even provide their corpuses as on-line subscriptions, constantly updated with state-of-the art content.
Then, as I write, the application watches my typing and matches it against the relevant corpus. Instead of relying on rigid hit-or-miss grammatical rules, it uses a statistical algorithm to analyze a word, or a group of words within their context of intended or inferred meaning. Take this gross mistake: “GM increased its sails by 10 percent”. The word is spelled correctly but, in this context, wrong. Because it lacks a context in which to detect the misspelling, the 1998-vintage word processor won’t change “sails” into “sales”. Conversely, the 2013 statistical-based language model flags the mistake by using the proper body of reference to see that “sails” is unlikely in an auto industry context.
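The “sails”/“sales” example can be reduced to a toy statistical model (an illustration of the idea, nowhere near production scale): count word triples in a reference corpus and, among confusable candidates, keep the one the corpus has actually seen in that context.

```python
# Toy context-sensitive correction: among confusable words, pick the
# one most often seen after the two preceding words in a reference
# corpus. Real systems use vastly larger n-gram or neural models.
from collections import Counter

corpus = """
gm increased its sales by ten percent .
the company reported record sales this quarter .
the schooner raised its sails in the morning wind .
""".split()

# Count trigrams (triples of adjacent words) in the corpus.
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))

def pick_likelier(w1, w2, candidates):
    """Return the candidate seen most often after the context (w1, w2)."""
    return max(candidates, key=lambda c: trigrams[(w1, w2, c)])

# "GM increased its ___": only "sales" appears in this context.
print(pick_likelier("increased", "its", ["sails", "sales"]))  # -> sales
```

The same call with the nautical context, `pick_likelier("raised", "its", ...)`, prefers “sails”: the context, not the spelling, decides.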

Just a year ago, Google introduced Wave, an ambitious reinvention of email, seemingly ahead of its time. Among other advances, Wave featured a spectacular implementation of Google’s huge statistical model of language. In this video (go to the 45th minute) you’ll see Google Wave’s product manager Lars Rasmussen type the following sentences: “Can I have some been soup? It has bean a long time. Icland is an icland”, etc. Each time, the software automagically corrects the mistakes as they are typed, confident in the power of its algorithm and of its immense body of reference.  This statistical approach works with gross, obvious mistakes, but also with more subtle ones.
Of course, I am aware of the difficulties in applying statistical language models to personal software: such algorithms are bandwidth and CPU intensive. This could explain why Google did not deploy the Wave spelling demonstrator on Gmail, or on Google docs. But the underlying algorithms do exist. A less sophisticated version, limited to professional dictionaries and thesauruses at first, could be fantastically helpful in properly spelling Zhengzhou, if you happen to write about Asia, or Neuroborreliosis, if you are a medical student.

Second, the use of texts. A significant proportion of writings goes to blogs and other social environments. As a serious user of the WordPress platform [today’s Word can’t even change Wordpress into the correct WordPress; I had to check on Google…], I would gladly pay for a Word or Pages plug-in allowing me to compose a clean post with text, images, tables, links, typographical enrichments and, when done, letting me click “publish on my blog” or “send it to the mailing list”. No more cut & paste surprises or image resizing headaches. The word processor plug-in could be provided by the same developer who designed the style sheet (CSS) for my WordPress (or Blogspot, or TypePad) site. Or I could go for the auto-settings by inserting the CSS code in the plug-in that will, in turn, adjust the word processor’s dials, from fonts and sizes, to background colors, etc.
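As a sketch of that auto-settings idea (the plug-in itself is imagined; only the CSS property names are real), a tool could read the blog theme’s stylesheet and derive the word processor’s dials:

```python
# Hypothetical sketch: parse a blog theme's CSS and extract the
# settings (font, size, colors) a word processor plug-in would need
# so that a draft matches the published look.
import re

css = """
body { font-family: Georgia, serif; font-size: 16px;
       background-color: #fdfdf8; color: #222; }
"""

def extract_style(css_text, selector="body"):
    """Return the declarations of the first rule matching the selector."""
    block = re.search(re.escape(selector) + r"\s*\{([^}]*)\}", css_text)
    style = {}
    if block:
        for decl in block.group(1).split(";"):
            if ":" in decl:
                prop, value = decl.split(":", 1)
                style[prop.strip()] = value.strip()
    return style

dials = extract_style(css)
print(dials["font-family"])  # Georgia, serif
print(dials["font-size"])    # 16px
```

A real plug-in would of course handle cascading, inheritance, and shorthand properties; this only shows that the mapping from stylesheet to “dials” is mechanical.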

You get my point: self-correcting spelling systems that guarantee (or at least vastly improve) decent grammar, syntax and the proper spelling of nouns and names can be a huge improvement for all professional writers – especially in a globalized economy where a greater number of us produce documents in a foreign language. Such auto-correct systems can even offer educational value in helping bloggers improve their basic writing skills.

I’m writing this on Word version 14 (yes, fourteen).  How long will I have to wait for this quantum leap, Mr. Ballmer? Or Mr. Jobs?

—frederic.filloux@mondaynote.com

Apple’s Next Macintosh OS

by Jean-Louis Gassée

Operating systems don’t age well. Some have better genes than others or they have more competent caretakers, but sooner or later they are stricken by a cancer of bug fixes upon bug fixes, upgrades upon upgrades. I know, I lived inside two OS sausage factories, Apple and Be, and was closely associated with a third, PalmSource. I can recall the smell.
The main cause of OS cancer is backwards compatibility, the need to stay compatible with existing application software. OS designers are caught between yesterday and tomorrow. Customers want the benefit of the future, new features, hardware and software, but without having to jettison their investment in the past, in their applications.

OS architects dream of a pure rebirth, a pristine architecture born of their hard won knowledge without having to accommodate the sins of their fathers. But, in the morning—and in the market—the dream vanishes and backwards compatibility wins.

Enter the iPhone.

The iPhone OS, iOS, is a Macintosh OS X derivative…but without having to support Macintosh applications. Pared down to run on a smaller hardware platform, cleaned up to be more secure and tuned for a Touch UI, iOS is the dream without the ugly past. Tens of millions of iPhones, hundreds of thousands of applications, and billions of downloads later, this is a new morning without the hangover.

And now we have the iPad, another iOS device. (I’ll omit the newer Apple TV for the time being.) 8.5 million iPads were shipped by September, a mere six months after its introduction. The installed base will reach 14 to 15 million units by the end of this year.
To paraphrase the always modest Apple PR boilerplate (“Apple ignited the personal computer revolution in the 1970s…”), the iPad re-ignited the marginal tablet category.

After more than 30 years of stalled attempts, the tablet genre has finally gelled. We see a flurry of tablet announcements from Asus, HP, Samsung, Dell, Archos, and many others, using Windows 7, WebOS, and Android. Surprisingly, we have yet to hear a pundit declare 2011 ‘The Year of The Tablet’. It’ll come.

On the other hand… Apple held a Back to the Mac event at its Cupertino HQ last week. As the name implies, Apple wants to make it clear that it’s still committed to personal computers. (You can see the full keynote here…but that’s 90 minutes. A tongue-in-cheek, adjective-laden 104 second montage gets to the essence here.) The iPhone may generate half of Apple’s revenue, but the event reminded us that Macintosh desktops and laptops are a $20B/yr business—a business that’s growing faster than the rest of the PC industry. Apple made a point of showing how the iPad, after taking its genes from the Mac, was feeding DNA back to its progenitor by way of the Touch UI that will appear in the release dubbed “Lion”, OS X 10.7.

During the Back to the Mac presentation, two prayers of mine were answered: A Macintosh App Store and a smaller laptop. The App Store has received the expected “walled garden” critique, but having seen how difficult it is for small Mac software developers to get retail shelf space or to make money selling their wares on line, I like the idea. A few days ago, I downloaded a neat little utility to silence the startup sound on my new 11” MacBook Air. How much did the developer make? Zero, it’s freeware; the programmer didn’t want to spend the time and money to set up a commercial site. How much would I have paid for it from a Mac App Store? Less than $5, more than 99 cents.

As for the 11” MacBook Air, Walt Mossberg, WSJ’s tech guru, penned an insightful review that’s neatly summed up in its title: “MacBook Air Has the Feel Of an iPad In a Laptop”.

So: A clean, fresh iOS; we’re not abandoning the Mac…What are we to make of these competing messages? My theory:

  • Today’s PC operating systems have advanced cancer
  • Personal computers as we know them are here to stay
  • Apple will move to something like an iOS Macintosh

Easier said than done. Steve Jobs remembers well the trouble Apple had getting apps for the first Macintosh, the painful failures of Lotus Jazz, the lame Mac software from Software Publishing Corp., creator of the best-selling PFS: series for the Apple ][. Ironically, some of the best software came from Microsoft—the word frenemy hadn’t been coined yet but retroactively fits. So, just like the iPhone App Store made the iPhone, the Macintosh needs a marketplace, an agora in preparation for the transition.

But a transition to what?

An evolution of the iPad? Certainly not something I saw at Il Fornaio, one of the local Valley watering holes. There, a very serious woman had her iPad standing on the official Apple keyboard dock, writing and, from time to time, raising her hand and touching something on the screen. As Jobs pointed out in the keynote above, it’s an ergonomic no-no.
Now, turn to the laptop. As one of my colleagues says: “It’s dark inside the box.” It’s what the machine does that matters, not what’s inside. Indeed. Imagine a port of OS X on an ARM, or A4, or AX processor, or even a Loongson CPU for that matter. If the right applications have been ported or adapted or, even better, created de novo for the platform—and made available through the App Store—would we object?

But, you’ll argue, “Aren’t these processors much less powerful than Intel’s?” Ask an iPad user: The machine feels swift and fluid, much more than a conventional PC.

Yes, there are no heavy-duty apps such as Photoshop or AutoCAD for the iPad. (AutoDesk publishes an AutoCAD companion app for the iPad and the iPhone.) But who knows? Adobe might be tempted to do for Photoshop what Apple has done for its OS: Scrap the past and build a modern Photoshop that’s written from the ground up.
Intel processors suffer the same type of cancer that afflicts operating systems. Their instruction sets and, therefore, their hardware, power consumption, and cost are beset by the tortuous need to stay compatible with existing code while offering an endless procession of new features. Intel has tried a fresh approach at least three times: the iAPX 432 in the early 80s, the Itanium (promptly renamed Itanic, a political compromise hammered out to keep HP’s PA architecture out of contention), and a brief fling with ARM called the XScale. Each time, the company (or the market) decided backwards compatibility was the way to go. Intel’s position is transparent: They believe that the might of their technology and manufacturing will bulldoze the cost and power consumption obstacles of the x86 architecture.

(We’ll note in passing that there is no Wintel in smartphones. For its Really Personal Computers, for its Windows Phone 7 devices, Microsoft is all ARM.)

Compare the bulldozer approach to what Apple did when it designed the A4, the “dark inside” of the iPad. Apple’s next Mac processor could be a multicore (or multi-chip) ARM derivative. The company has proven time and again that it knows how to port software, and its support of the Open Source LLVM and Clang projects gives it additional hardware independence. We all know the Apple Way: Integration. From bare metal to the flesh, from the processor to the Apple Store. Hardware, OS, applications, distribution… Apple knows how to control its own destiny.

Tomorrow’s MacBook Air might have even more of the “Feel of an iPad in a Laptop” that Walt Mossberg detected. The tablet and the laptop could run on the same “dark insides”, with the same software, and the same Touch UI interface. And, for a desktop machine, an iMac successor, we already have the Magic Trackpad for touch input.

(IMCO, the current Trackpad doesn’t feel magical enough: on the two devices I own, the touch input isn’t as reliable, pleasant, and “second nature” as it is with existing mice or laptop trackpads. I gave up after two weeks. I’m not the only one with that view, I’ve asked. And the local Apple Store doesn’t appear eager to push the device either.)

All this doesn’t mean the x86-based Macs would disappear overnight: high-end Mac Pros, for example, might continue for a while as they do today for applications such as Logic Studio or Final Cut.

If this sounds farfetched, one question and an observation.

The question: Would you bet the longer term future of your $20B Mac business on an endless series of painfully debugged x86-based OS X incremental releases? Or would you rather find a way to move that franchise to a fresh hardware/software platform fully under your control?

The observation: Last week, the other Steve, Ballmer, was on stage at the Gartner Symposium. There, he was asked about Microsoft’s “biggest gamble”. Without missing a beat, as this forceful public speaker never does, he answered: “The next revision of Windows.” Not Windows Phone 7, not the Kinect game device, all near and dear to his heart, but Windows 8. (See here and here.)

He, too, is thinking about the future of the PC business.

JLG@mondaynote.com

PS: As I edited this note, I found this TechCrunch post dealing with the same iPad-Mac convergence.

HP’s Board Gets No Respect


And rightly so.

You recall: Last August, HP’s Board of Directors dismissed its wunder-CEO, Mark Hurd. Well-loved by Wall Street, although not so much by employees, Hurd turned HP around after the lackluster Fiorina years. He made acquisitions, cut costs, and put the company at the very top of the IT industry. But HP’s fearless leader was accused of having entangled himself, carnally and emotionally, with a female “marketing contractor”, and of having engaged in a few financial peccadilloes in the process of covering up the relationship.

I’ll hasten to add that Hurd reached an amiable—and solid—settlement with the former soft-porn actress. By “solid settlement” I mean we’ve heard exactly nothing from the aggrieved woman, or from Gloria Allred, her highly expressive Hollywood attorney. (As a self-described “Fearless Advocate for Justice and Equality”, Ms. Allred appears to dig gold on behalf of the rejected/dejected paramours of media and sports celebrities.)

While Hurd tried to do the right thing after his alleged mistakes, HP’s Board and management repeatedly and needlessly pilloried him, barely stopping short of accusing their former CEO of fraud. (See more sorry details in this Monday Note.)

All this led Larry Ellison to publicly lambaste the HP Board for kicking Hurd to the curb—and to promptly hire him as co-president of Oracle.

Ignoring the “when you’re in a hole, stop digging” maxim, HP doubles down and sues Hurd. Their complaint? As Oracle co-president, Hurd will inevitably misuse HP’s confidential information and cause his ex-employer grievous harm.

Larry chuckles and lashes out again. He calls HP’s suit vindictive, which is true, and adds that it will make it impossible to continue as business partners, only somewhat true as each had already recently moved into the other’s business. Oracle bought Sun and HP got into software and services by acquiring EDS.

A few days later, on the eve of Oracle’s OpenWorld, the suit is settled. HP’s pain is salved by a few million dollars, and the threat of the misuse of confidential information is suddenly, mysteriously no longer an issue. One wonders about the damage HP’s Board did to the company’s reputation by treating this alleged sinner in such a bullying and ultimately lame way.

While Hurd stays out of the limelight plotting Oracle’s next moves, HP directors keep stoking the coals for their critics. In their quest for a new CEO, the Board rejects internal candidates for the third time and picks an outsider: Léo Apotheker, ex-CEO of Germany’s SAP. This leads to another salvo of Ellison jibes. (When Larry calls himself “speechless”, you know he’s having a good time.)

But wait, there’s more.

What does the Board do besides recruiting Apotheker? They hire Ray Lane as Chairman. As the link to his Kleiner Perkins bio proves, Lane is, without a doubt, an “industry figure”, the type Kleiner Perkins, one of the largest VC firms in the world, likes to co-opt. But the slick KPCB bio (there is, significantly, nothing on him on Wikipedia) omits an important episode: Ray’s acrimonious departure from Oracle. The more charitable souls among us hope that everything is forgiven and forgotten. But knowing the protagonists, Larry and Ray, a more realistic view is that HP’s Board brought Ray in with a specific intent: They want to strengthen the team for a fight against Oracle.

There are three problems with such a move.

First, we now have two muscular venture capitalists on HP’s BoD: Lane and Marc Andreessen, from Andreessen Horowitz (as an aside, admire the firm’s spartan site). While some argue that it’s great that HP has such connections in the VC world (as if any executive or Board member couldn’t get us VCs to return their calls), there’s a governance problem. There will be many situations in which Marc’s or Ray’s existing investments and connections will raise conflict of interest questions; they won’t be deemed independent directors.

The OS Doesn’t Matter…

by Jean-Louis Gassée

Once upon a time, operating systems used to matter a lot; they defined what a computer could and couldn’t do. The “old” OS orchestrated the use of resources: memory, processors, I/O (input/output) to external devices (screen, keyboard, disks, network, printers…). It’s a complicated set of tasks that requires delicate juggling of conflicting constraints, and every OS handled them differently—or not at all. In those days, the OS was married to the hardware and only a handful of “wizards” with Electrical Engineering degrees—and a deep understanding of circuitry—understood (and invented, and protected) the arcana of OS construction.

Over time, the secrets of these illuminati leaked out. About 20 years ago, the OS lost its mystery. We had learned enough about writing an OS kernel that it became a college-level topic and a Christmas break hack.

Today, there’s only one operating system: Unix. (Okay, there are two, but we’ll get to that.) This is why I contend that the OS doesn’t matter—or that we need to take another look at the word’s content, at what we mean when we say ‘Operating System’.

When RIM decides to go with QNX for its upcoming tablet, the PlayBook, tongues wag. After calling its Blackberry OS the “best of breed” (a tired markitecture phrase), RIM is surrendering to reality: The “proven OS” foundation proved to be unfixable. Because of the layers of software silt that had accumulated over the years, the edifice couldn’t be modernized. Better to quit and make a fresh start. QNX is based on a Unix OS for embedded applications that dates back to 1982(!) when it was first released for the Intel 8088 microprocessor.

The same thing happened at Apple…twice. The Apple ][ OS (or lack thereof, purists will say) couldn’t be brought up to modern standards, so the Macintosh had to be built on a fresh foundation. The original Mac OS foundered on its own beachhead and was replaced by OS X. Based on the Mach kernel, OS X is another Unix derivative, co-authored at CMU by Avie Tevanian. Mr Tevanian improved the system during his tenure as head of software at NeXT and was instrumental in convincing Apple that their purchase of NeXT would breathe new life into the company.

Open the Terminal application on a Mac and what do you see? A noble and worthy Unix “shell”, a program that geeks use to interact with the OS. Terminal uses the bash shell (for Bourne-Again Shell; created by Brian Fox, bash is based on the sh shell, which was invented by Stephen Bourne. Unix mavens love their word-play acronyms).

And now we have the Apple iOS, an OS X derivative that uses bits from the same kernel.

Regard Palm. The sine qua non of handset makers saw that their PalmOS couldn’t be fixed, so they pressed the restart button and created WebOS, a Linux derivative.

Android? It’s based on a Linux kernel. Nokia’s MeeGo? Ditto.

The list goes on. We have the spiritual children of Unix living inside the Cloud, powering the millions of Linux servers running at Google, Facebook, Amazon…

The only exception is Windows. Initially built on top of DOS, Microsoft painstakingly added version after version, always striving for backward compatibility while, at the same time, adding new features. It didn’t always work well (who wants to remember Windows Me and Vista?) but it worked well enough because Microsoft never gave up. They fixed mistakes that they claimed didn’t exist, and now we have the well-respected Windows 7. (Inevitably, critics will say that Microsoft wouldn’t have gotten away with such a tortuous path if it weren’t for its vigorously enforced monopoly.)

Windows will live on — in a PC industry now at a plateau. But otherwise, in the high-growth Cloud and smartphone segments, it’s a Unix/Linux world. We need to look elsewhere to find the differences that matter.

The technical challenges have migrated to two areas: UI (User Interface, or the more poetic—and more accurate—UX, for User Experience) and programming tools.

Now that all “system functions” are similar, the game for hardware and software makers is to convince the user that his/her experience will be smooth and intuitive. Your device will walk on water (with the programmer right under the surface), catch you as you fall, make sure you don’t get your feet wet.

For the developer, what we now call the OS must supply ever-growing expressive power—think a fife versus a twelve-keyboard organ. To wield that expressive power, the programmer needs software tools. The industry uses acronyms such as API (Application Programming Interface), IDE (Integrated Development Environment) or phrases such as Application Frameworks. They define the rules and conventions—which ideas are allowed and how to express them—and the software tools that programmers need to develop an application.

This is today’s OS. User experience. Development tools.

One last element that is and isn’t the OS: This new creature called an App Store (or Marketplace, depending upon the…OS). In my non-technical view, the App Store must be considered part of the OS totality, part of its gestalt. Applications have always been in a feedback loop with the OS. A program can only do as much as the OS allows it, so it played tricks to create multi-tasking, to allow smooth audio/video playback. These “tricks” were incorporated into the OS (and the hardware—think GPU), which then bred another generation of apps that wanted more, and so on.

The App Store genre, invented or not in Cupertino, is now part of that loop, a killer OS component, one that deserves a Monday Note of its own.

JLG@mondaynote.com

Science Fiction: An Apple-Curated App Store

In an alternate universe, Apple has announced the App Store Guide and Blog. Choice morsels from the PR material follow.

“We came to realize that a quarter million apps meant worse than nothing to Apple users”, said Apple’s CEO. “I get confused too! Reviews are often fake, lame, or downright incompetent. PR firms have been caught astroturfing reviews, publishers have resorted to flooding the App Store with shameful clones of successful applications. I won’t let one of Apple’s most important, most imitated innovations sink into anomie.”

[Remember, this is sci-fi.]

“So…Today we’re proud to introduce the Real App Store Guide, written and maintained by Apple experts. We’ll review new and existing iOS apps. We’ll tell you which ones we grok (and that grok us) and give you the straight dope on the offerings you shouldn’t touch, even if they’re free. In our Guide, you’ll find a series of paths: For the Traveler, the Gamer, the Music Lover, the Graphic Artist, the Oppressed Enterprise Windows User, Teachers, Parents, Doctors… The Guide will also feature a blog, a running commentary on the iOS App landscape with intelligent answers to cogent questions. And in keeping with our usual standards for decorum and IQ, the blog will be moderated…”

And so it is, the App Store is fully curated, at long last.

As always, this doesn’t please everyone…at least on the surface. In reality, the usual naysayers are thrilled: More pageviews! Ryan Tate jumps on the opportunity and frenetically fires at Steve Jobs’ inbox, trying to start another late night email séance. But this time the Emailer In Chief doesn’t bite.

Customers, on the other hand, like the Real App Store Guide. Users can finally find their way through the twisted and confusing maze of programs. They learn to adjust for a particular writer’s opinions, much as we’ve all learned to compensate for the biases of, say, movie reviewers. The blog gives civilians a forum where they can argue (politely) with the named authors of the reviews—there’s no anonymous corpospeak here.

App authors…some of them aren’t so keen on the idea. The ones that get tepid reviews are understandably furious and threaten lawsuits (in vain…their attorneys are told to re-read the App Store T&Cs). With a modicum of care with words, negative opinions are safe; that’s what the Guide’s editors are for.

Curious Summer

by Jean-Louis Gassée

Nothing much happens in August, we thought. Wrong. Our three-week break has been filled with a number of “interesting” events.

Curious Yellow

Let’s start with Mark Hurd’s exit from HP after five years of great financial performance as CEO. If you missed the fireworks, you can get a refresher in this Business Insider post by Henry Blodget, or this excellent NYT piece by ace columnist Joe Nocera.

In Twitter terms, it looks like this: A “marketing contractor” claims Hurd sexually harassed her; an inquiry fails to substantiate sexual harassment but finds “an inappropriate close relationship”; the investigation also reveals that expense reports were fudged in order to conceal a tête-à-tête with the woman. Mistakes were made, Hurd is fired. End of story.

Not quite.

When a CEO gets the boot, a modicum of decorum is usually observed. Not this time. From HP’s General Counsel we hear that “Mark demonstrated a profound lack of judgment that seriously undermined his credibility and damaged his effectiveness in leading HP”. And that’s on the record.

In her memo to the troops, Cathy Lesjak, HP’s CFO and now interim CEO, accuses Hurd of “misusing corporate assets,” referring to the illegitimate expense reports and alleged payments to the erstwhile soft-porn actress for work not performed.

But forget the salacious details; there’s always Google for that. What puzzles most of us is the exit package story. HP maligns Hurd, accuses him of what lay people call fraud… and then grants him an exit package worth tens of millions of dollars, $35M according to unverified estimates. Attorneys, less puzzled than supercilious, sue HP’s Board on behalf of despoiled shareholders.

In the next few weeks we’re certain to get a clearer picture of the inside animosity directed at the cost-cutting, Wall Street-pleasing CEO. His alleged misconduct may turn out to have been nothing more than a convenient pretext, a word that resonates in HP’s history.

Curiouser and Curiouser

This one’s harder to explain: Intel’s acquisition of McAfee. If you own a Windows PC with Intel Inside, there’s a good chance your computer came with bundled anti-virus/anti-spam/anti-spyware software from companies such as Symantec or McAfee. Microsoft entered the fray a few years ago and provides what they call Security Essentials—for free (Microsoft also offers a free safety scan here). PC Tools, AVG, Kaspersky Labs and many others provide the now customary combination of free and paid-for software security products.

In short, this is an active, thriving scene: Symantec’s revenues are at the high end of the $5B range and McAfee’s are close to $2B, despite competition from “free” products from Microsoft and others.

So what possessed Intel’s CEO Paul Otellini to risk his reputation—and more than $7B of his shareholders’ cash—by wading into such a complex, competitive sector? Seasoned Valley observers such as the WSJ’s Don Clark are politely puzzled (see here and here). Otellini intones a new mantra: Security Is Job One. This marks “Intel’s move from a PC company to a computing company”. Sonorous words, certainly, but without a story of higher revenue and profit for the combined companies, there’s not much to back them up.

More

Under the hood: Google Apps and Apple

With its Cloud Apps, Google tells a nice, simple story: All you need is a browser. Life is simple, we take care of everything, no more fighting with fat, expensive desktop bloatware.
You can access your data and our apps Anywhere, Anytime…if you have an Internet connection. If you don’t, as we’ll see in a moment, things become more complicated. More like yesterday.

Let’s start with a simple Web app. How does it work?

Somewhere, a computer runs a Web server. In turn, the Web server runs an application whose job is to pull the strings of the browser marionette hiding inside my computer at the other end of a Net connection. The app tells my browser to display ‘Monday Note’ at these coordinates inside such-and-such a window, using this font, that size, and this color. Or the Web app sends a file and tells the browser where and how to play it, and so on.
But what happens if I lose the Net connection? The server no longer pulls the strings, the marionette collapses, my Web application is dead.
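The marionette metaphor can be made concrete with a few lines of Python’s standard `http.server` module. This is only a sketch: the page content and styling are invented, and here we play both roles, server and “browser”, in one process.

```python
# A minimal "string puller": the server dictates what the browser displays.
# Text, font, and color all live on the server side; the client merely obeys.
# Page content below is illustrative, not from any real site.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

PAGE = (b"<html><body><h1 style='font-family:Georgia;color:#333'>"
        b"Monday Note</h1></body></html>")

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # silence per-request logging

# Start the server on an ephemeral port, then act as the browser.
server = HTTPServer(("localhost", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
html = urllib.request.urlopen(f"http://localhost:{server.server_port}/").read()
```

Kill the server (or the connection) and `urlopen` fails: the marionette collapses, exactly as described above.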

To achieve its strategic goal of displacing Microsoft Office, Google knew it had to provide an off-line version of Google Apps. Off-line capability is implemented by dropping a replica of the Cloud—a Web server, the application code running on that server, and a local cache of my data—into my computer. My work will be uploaded to the Cloud when the Net connection is restored. With today’s software technology, with abundant storage and computing power on desktops and laptops, Google’s goal isn’t unreachable.
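The write-locally, sync-later mechanism just described can be sketched in a few lines. Everything here is hypothetical (a dict stands in for the Cloud, class and method names are invented); it only illustrates the shape of the idea.

```python
# Hypothetical sketch of an off-line capable app: edits land in a local
# cache immediately; a queue of pending edits is replayed to the cloud
# when the Net connection is restored. All names are made up.
class OfflineDocStore:
    def __init__(self, cloud):
        self.cloud = cloud          # the remote store (a dict stands in here)
        self.cache = dict(cloud)    # local replica of my data
        self.pending = []           # edits made while off-line
        self.online = True

    def write(self, doc_id, text):
        self.cache[doc_id] = text   # the local "server" answers instantly
        if self.online:
            self.cloud[doc_id] = text
        else:
            self.pending.append((doc_id, text))

    def reconnect(self):
        # Connection restored: replay queued edits to the cloud.
        self.online = True
        for doc_id, text in self.pending:
            self.cloud[doc_id] = text
        self.pending.clear()

cloud = {"memo": "v1"}
store = OfflineDocStore(cloud)
store.online = False
store.write("memo", "v2 (edited on the plane)")
assert cloud["memo"] == "v1"   # the cloud hasn't seen the edit yet
store.reconnect()              # now it has
```

A real implementation must also resolve conflicts when the same document was edited elsewhere while we were off-line; the sketch sidesteps that entirely.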

But…the Cloud can be replicated inside my laptop?

It’s not as fantastic as it sounds. While the Cloud evokes images of Google server farms and Big Iron, even the flimsiest of netbooks now provide ample RAM space (at least 1Gbyte, often 2), plenty of disk space (160 Gb or more), and an Intel processor running at 1 GHz or faster. Recreating the server, storage, and applications is well within their power.

Furthermore, your PC/laptop/netbook already contains a Web server. Every Mac carries a copy of the Apache Web server (“the most popular HTTP server software in use” says the Wikipedia article), as do most Linux “distros” on netbooks and DVDs. Windows provides a Web server called IIS, Internet Information Services, the “second most popular web server in terms of overall websites…” (Wikipedia). If you want Apache on Windows, it’s free and easy, go here. The Windows Installer package (née MSI) weighs in at 6Mbytes, that’s all. More

The Adobe – Apple Flame War

The short version:

Who, in his right mind, expects Steve Jobs to let Adobe (and other) cross-platform application development tools control his (I mean the iPhone OS) future? Cross-platform tools dangle the old “write once, run everywhere” promise. But, by being cross-platform, they don’t use, they erase “uncommon” features. To Apple, this is anathema: it wants app developers to use, to promote, its differentiation. Losing differentiation is death by low margins. It’s that simple. It’s business. Apple is right to keep control of its platform’s future.
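The lowest-common-denominator argument can be made concrete with a toy sketch. The feature lists below are invented for illustration; no real toolkit or platform inventory is implied.

```python
# Toy illustration of why "write once, run everywhere" erases
# differentiation: a portable layer can only expose features that
# exist on every target platform. Feature names are made up.
IOS_FEATURES = {"multitouch", "accelerometer", "core_animation", "buttons"}
OTHER_PLATFORM_FEATURES = {"buttons", "accelerometer", "stylus"}

# The cross-platform API is the intersection of the targets...
portable_api = IOS_FEATURES & OTHER_PLATFORM_FEATURES

# ...so the platform's differentiating features simply disappear.
erased_on_ios = IOS_FEATURES - portable_api
```

In this toy model, `multitouch` and `core_animation` never reach the developer: exactly the erasure of “uncommon” features that Apple objects to.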

The longer version:

The upcoming 4.0 release of the iPhone OS will come with licensing language that prohibits the use of Adobe’s Flash-to-iPhone compiler. The compiler is a clever way around the absence of a Flash interpreter on Apple’s smartphone OS. It takes Flash code in and outputs iPhone OS code, allowing Flash content and apps to run on the iPhone (and iPad). Problem solved.

Not so fast, says Apple, we’ll only allow applications that are written “natively” with our tools. No cross-platform tools, no Flash-to-iPhone compiler, no Flash.

Less than 24 hours later, an Adobe employee, Lee Brimelow, posts a virulent critique of Apple’s latest prohibition, titled “Apple Slaps Developers In The Face”. He concludes with a vigorous ‘Go screw yourself Apple’ and then adds a postscript: ‘Comments disabled as I’m not interested in hearing from the Cupertino Comment SPAM bots.’ Ah, yes. The one-way mirror…
[What the irate gentleman fails to say is this: The only developers slapped in the face are those who don’t use Apple development tools because they want to write a cross-platform app that may or may not use the particular features of the iPhone OS.]

He’s not alone in condemning Apple. In his blog, sunnily called “Why does everything suck?”, Hank Williams asks if “Steve Jobs Has Just Gone Mad” and wonders about “Insane Restraint of Trade”.

Adobe appears to be worried. In its latest SEC (Securities and Exchange Commission, the stock market regulator) filing, the company admits that its ‘business could be harmed’. If Apple succeeds in turning developers away from Adobe’s tools, a new version of which, CS5, is about to be announced, well, the money pump will stutter.

There are calmer minds, however. In his highly-recommended blog, Daring Fireball, John Gruber explains why Apple changed the iPhone OS licensing agreement. It’s strategic, really: Apple doesn’t want anyone else to have control over which OS features the applications have or don’t have access to. I’ll explain in a moment why it’s rational for Apple to fend off cross-compilers, and why it’s not too rational for Adobe employees and others to criticize Apple for keeping control of its future.

But, first, a bit of history.

More

Software and Brakes — Part II

by Jean-Louis Gassée

This week, no iPad disquisition, no large companies engaged in contorted Kama Sutra embraces, no Google-Apple-Microsoft love triangle. We’ll revisit these topics in due course but, for the time being, let’s go back to a geeky topic unadulterated by geopolitics or markitecture: software and brakes. Last month, we looked at the software invasion of automotive braking systems. More specifically, we looked at the interplay between braking and kinetic energy recovery in the Toyota Prius.

Today, we’re going back to “soft brakes” for another set of applications: differentials and stability control.

Differentials. Fifty years ago, they wore out, they made noises, they needed careful periodic checks and special lubrication. Now, with progress in metal alloys, high-precision machining, and modern lubricants, they’re rarely seen or heard of. Yet they perform an important role, and their basic design suffers from a no less critical flaw.

(Here, we’ll assume a rear-wheel drive car. The concept applies to all drive configurations.)

The important role goes like this: when the car turns, the two wheels on the same axle trace circles of different radii, a smaller radius for the inner wheel, a larger one for the outer. As a result, the outer wheel must turn faster than the inner one. This is no problem for the front axle, whose wheels are “free”, not driven. But for the rear axle we’re in trouble: the drive shaft attached to the gearbox connects to the axle through a 90-degree angle gear, which drives both wheels at the same speed. That’s fine in a straight line, but it causes wheel slippage when we turn, as the wheels must rotate at different speeds.

This was the arrangement when the very early automobiles mimicked horse carts. On carts, wheels on the same solid axle did slip in a turn, but said wheels didn’t have to provide any traction, the horse did. In a car, the axle provides traction and wheel slippage works against stability and comfort, to say nothing of tyre wear.

So, the differential was invented. It’s a little counterintuitive at first but it works beautifully.

See this touchingly kitsch Chevrolet video. Or this learned Wikipedia article. Relatively simple, once you get the hang of the planet gear’s role. And universal.

But trouble starts right away.

More