Apple and Oracle Must Let Developers Have Their Say

It’s getting harder to be a monopoly these days. Microsoft owned the desktop for decades, milking its Windows platforms every step of the way. Apple, on the other hand, hadn’t even managed four years of iOS dominance before Google’s Android staked a serious claim to the mobile market.

This isn’t because Microsoft is somehow smarter than Apple, but rather because the underlying dynamics of the technology industry have fundamentally changed. In brief, the technology world is increasingly embracing “write” communities, as Jono Bacon calls them, not simply “read” communities. Open source may have kickstarted this trend, but open APIs and open data are taking it to new heights.

Read communities aren’t characterized by a dearth of developers, but rather by what those developers can do on a given platform. After all, few can claim to sing to developers as eloquently as Microsoft CEO Steve Ballmer does, but there’s a (big) difference between talking to developers and letting them talk back. In your code. On your platform.

As noted, it’s telling that the shelf life of Apple’s dominance is proving far shorter than Microsoft’s decades-long run. Microsoft, after all, never had to deal with competing write communities, as Apple does with Google Android. Major developers like Facebook find Android more flexible: It allows them to write into and draw from the platform the capabilities they need.

Hence Apple, once the no-brainer first choice for developers despite its heavy hand on the development process, is increasingly losing out to the more free-spirited Android, which analysts see claiming over 50 percent of the smartphone market in just a few short years. Apple has responded by loosening its grip on iOS application developers, but it may be too little, too late.

Android isn’t perfect, of course, and still suffers from a worsening fragmentation problem. But its comparatively open nature makes it an inviting alternative to Apple’s closed iOS development process. As but one example, try to get meaningful analytics data out of the iPhone. If you’re Apple, you can do that. If you’re anyone else, particularly Flurry, you’re out of luck.

Apple giveth, and Apple taketh away.

Contrast that with Google Android, which ships with Logcat, an open-source logging tool developers can use to collect analytics. And because Android itself is open source, Google can’t exert the same control over how developers gather analytics data on Android devices. One can argue that it’s good to have potentially sensitive analytics information guarded by a responsible party like Apple, but given Apple’s record of somewhat arbitrary and heavy-handed control over its platform, I’d vote for freedom on this one.
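
To make the contrast concrete, here is a minimal sketch of how an Android developer might emit analytics-style events through the log, assuming the standard android.util.Log API; the class name, tag, and event format below are hypothetical, for illustration only.

    // Hypothetical example: emitting analytics-style events via Android's shared log.
    import android.util.Log;

    public class CheckoutAnalytics {
        private static final String TAG = "CheckoutAnalytics"; // hypothetical tag

        public static void logPurchase(String sku, double price) {
            // Writes an event line into the device's log buffer,
            // which developers can read back with the logcat tool.
            Log.d(TAG, "purchase sku=" + sku + " price=" + price);
        }
    }

On a connected device, running adb logcat -s CheckoutAnalytics would then filter the log down to just those events, with no platform gatekeeper deciding who gets to see the data.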

This isn’t just an Apple vs. Google story, either. It’s just one example of how innovation happens generally, no matter the industry. As Steven Johnson points out in The Wall Street Journal:

[I]deas are works of bricolage. They are, almost inevitably, networks of other ideas. We take the ideas we’ve inherited or stumbled across, and we jigger them together into some new shape. We like to think of our ideas as a $40,000 incubator, shipped direct from the factory, but in reality they’ve been cobbled together with spare parts that happened to be sitting in the garage.

The problem, as Johnson goes on to highlight, is that for the past 100 years governments have largely pursued innovation by doing the exact opposite of what actually fosters it. The same is equally true of individual corporations like Apple or Microsoft:

[I]ntellectual property, trade secrets, proprietary technology, [and] top-secret R&D labs…share a founding assumption: that in the long run, innovation will increase if you put restrictions on the spread of new ideas, because those restrictions will allow the creators to collect large financial rewards from their inventions. And those rewards will then attract other innovators to follow in their path.

The problem with these closed environments is that they make it more difficult to explore the adjacent possible, because they reduce the overall network of minds that can potentially engage with a problem, and they reduce the unplanned collisions between ideas originating in different fields. This is why a growing number of large organizations—businesses, nonprofits, schools, government agencies—have begun experimenting with more open models of idea exchange.

It’s this sort of open exchange of ideas and code that led economic historian Eckhard Höffner to conclude that Germany closed the gap on England’s industrial revolution in a short span of time due to the wide-open nature of the country’s publishing market in the mid-1800s. Weak copyright enforcement sent innovation into overdrive in Germany, while a comparative monopoly on publishing in England stymied that country’s early industrial lead.

Eventually, Germany followed England’s lead, and innovation slowed there, too, but it ramped up in the United States, where “borrowing” the works of Dickens and other great European authors, not to mention technological inventions, was standard operating procedure. European creators didn’t like the Yankee “thieves,” but loose IP protection led to wider adoption of their works and to industrial and cultural progress, and the authors still managed to get paid.

Since then, the industrialized West, including the United States, has increasingly tightened intellectual property protection in the name of fostering innovation, but with the opposite effect. As numerous studies attest, patents and other intellectual property tools have slowed innovation, not accelerated it. Industrial innovation has accordingly moved to areas like Brazil and China, where IP protection is light.

This isn’t just a matter for economists, but also for business strategists. It’s possible, for example, that Oracle’s integrated approach to product development will prove successful, but likely not over the long term. Such an all-consuming, go-it-alone approach breeds powerful enemies, including within one’s own customer base. It certainly creates distrust within the developer ecosystem.

Oracle may profess not to care, but competitors like Microsoft increasingly recognize that they must care. Software developer Dave Newman declares that “The .Net community operates in a non-collaborative vacuum,” and then announces he’s abandoning .Net. Microsoft can’t afford to lose too many Dave Newmans.

Neither can Oracle.

In today’s market, companies need community. They need adoption of their APIs. No company is smart enough to come up with all innovation on its own, so the best companies will create read/write platforms through which third-party developers have the flexibility and distribution to reach customers.

Open source is an essential part of this, but it isn’t sufficient by itself to crown any particular vendor or technology king. Linux is rapidly taking over the mobile market, for instance, yet has barely made a dent on the general consumer desktop. Still, that’s no reason platform vendors in particular, and technology vendors in general, shouldn’t be making the most of open source to enhance their appeal to third-party developers.
