
Anybody who has spent any time around the technology industry knows that broad-based standardization is important, for many reasons. Likewise, openness in standardization processes is also important. Self-interested tech companies have pursued their own proprietary standards proposals and patent moats for years, and can often obstruct open standards, interoperability and more. At the same time, though, can standardization efforts be taken too far? Last week, a group of 19 technology executives and representatives from the ITU’s Standardization Bureau, including people from Cisco and Microsoft, met in Geneva to consider that very question. Their conclusion was that standards are necessary, but that the ecosystem that promotes them has become “too complicated and fragmented.”

According to a communiqué from the Geneva meeting:

“There are hundreds of industry forums and consortia in addition to national, regional and international SDOs [Standards Development Organizations] competing for business. It is becoming increasingly more challenging for the ICT industry to identify and prioritize the places to concentrate their standardization resources.”

The group made a series of recommendations, including “implementation of improvements to the present standards scenario so that SDOs complement rather than compete with one another.” The ITU has been selected to drive the development of a number of initiatives called for at the meeting, including disseminating standards-related recommendations through a web portal or handbook, and more. The group will reconvene in 2010 for a progress check, and Cisco will host the meeting in Silicon Valley.

As examples of why these standards issues matter, and how delicate they are, consider two very important connectivity and networking technologies that went through laborious, multiyear standardization and certification processes in which users ended up the losers: 802.11n (the next generation of Wi-Fi) and USB 3.0 (the much-improved new implementation of Universal Serial Bus technology).

While many people have been using Draft-N 802.11n Wi-Fi technology for years, there are also a lot of businesses that won’t switch to a new standard absent official ratification of the technology. In the case of 802.11n, that ratification occurred only a few weeks ago, even though standards proposals for 802.11n first came before the IEEE in 2002, seven years ago. If you’ve used 802.11n, you know that it has vastly better range and speed than 802.11g. Why should we wait nearly a decade for squabblers to agree on a new Wi-Fi standard that can improve work and play for everyone?

Likewise, the USB Implementers Forum has pushed for acceptance of the proposed new USB 3.0 technology for years, but only recently, at the Intel Developer Forum, did the technology have what is being dubbed its “coming out party.” It won’t arrive on a widespread basis in products until next year, even though it’s a vast improvement over USB 2.0 in terms of both performance and convenience. Again, too many people argued over the proposed standard for too long.

Standards are important, and multiple parties should be permitted to weigh in on them. I can understand, for example, the anger that Europe and other parts of the world have toward the U.S.’s borderline monopolistic control over Internet standards. At the same time, though, too much complexity and fragmentation in standardization works against users. In technology development, time is always of the essence.

  1. This is a highly uninformed article in terms of the standards it cites as examples.

    The process of reaching an industry consensus through standards committees, while standards are being deployed in pre-release form in the market, is a critical way to get a robust, final spec.

    “Why should we wait nearly a decade for squabblers to agree on a new Wi-Fi standard that can improve work and play for everyone?”

    Feh. That’s not what happened. 802.11n was proposed before the technology to make it feasible was available and tested. A couple years of ferment of early MIMO gear plus early deployments of pre-draft and Draft 1 hardware revealed many weaknesses, which were folded back into the IEEE group’s work.

    The delay to reach Draft 2.0 in 2007 (2 1/2 years ago) was entirely due to an industry pissing match over fine details, until an agreement that all manufacturers could live with came to the fore. Without this work, there would have been 2 or 3 incompatible WLAN flavors, and the Wi-Fi Alliance would have been in an untenable position to pick one for trademark and testing, given that its board members and general members didn’t agree on a single approach.

    Even in the last year, some critical final elements of 802.11n that are not yet deployed, but are crucial for enterprise and high-speed operation, were finally confirmed. These don’t affect consumer devices much.

    Yes, 802.11n could have been finished sooner, and it would have been far worse.

    I don’t know the history of USB 3.0 quite as well, but, again, the companies that developed the standard before it was handed to the USB-IF were trying to build something that simply couldn’t exist a few years ago: the right hardware, chips, and cost structures weren’t in place for what they achieved.

  2. One more thing: 802.11n has been accepted in the enterprise and academic institutions on a large scale already, once the biggest firms reached the right cost/feature approach and assured customers of free upgrade paths if incompatibility arose.

    It’s a very weird thing to say in 2008 or 2009 that a lack of ratification is what delayed IT spending on infrastructure upgrades that aren’t technically necessary.

  3. The great thing about standards is there are so many to choose from

  4. @Glenn: in the case of 802.11n, yes, many early adopters went to Draft-N well before ratification, but it’s still true that it’s policy at many companies and institutions to wait for ratification. In the case of 802.11n, that took seven years, which strikes me as just too long, especially when Draft-N was seen working well years ago. In the case of USB 3.0, again, products that worked well were around a year ago, but there have been long arguments over implementing the standard. I point to these because they are important technologies, and I think agreements could have been reached earlier.

    Best,
    Sebastian

  5. I’m sorry, but you don’t understand the history of 802.11n if you think seven years was too long. In a perfect world, yes. But four years ago, 802.11n would have been a disaster and miasma of incompatibility and stuff that worked unreliably.

    Two and a half years ago, the standard settled down, and then manufacturers worked diligently to produce ever better and cheaper products in the consumer space, while ramping up on the enterprise side. Enterprises want gear in the field for a year or two before they adopt, anyway, which is why you’re seeing large-scale adoption only now (and delayed again because of the economy).

    In some cases, sure, agreements could have come earlier, but that’s not a problem of standards. It’s a problem of the marketplace. Standards bodies typically reflect competitive marketplace disagreements. When the various firms involved decided they couldn’t win with separate standards in the market, they agreed and things went forward quickly.

    Look at 802.15.3a, which failed in the standards world. Intel and a host of others couldn’t break the lock held by what became a division of Freescale, which prevented agreement on UWB as the basis of a fast PAN standard. Intel et al. broke off to form what became the WiMedia group. UWB in that area has more or less failed (so far), too. The technology wasn’t ready, the market wasn’t ready, and the standards group fell apart. WiMedia has also disbanded (rolling its standards into the USB-IF and Bluetooth SIG).

  6. Sorry, one more thing.

    I believe you are making a non-engineer’s mistake of confusing intention with capability. Could a USB 3.0 standard designed to last 15 to 20 years, like USB 2.0 (which is 8 years into a long lifetime), have been created three or four years ago? I don’t believe the actual fundamental technology was in place at that point. There was no great breakthrough, but a lot of separate developments. (eSATA is a good case: it’s a multi-Gbps standard that has been out for a while, but it is seen as too specialized, and I read that it may disappear.)

    Ditto for 802.11n: the various elements needed were simply not available in the mature form necessary to produce today’s equipment three or four years ago. It took years of mass production, refinement, and engineering research to get to the point, a couple of years ago, when equipment that would stand the test of time was ready to ship.

    I think you also overlook the nature of collaboration and development. There are books that take 20 years to write because of the intellectual process that must be gone through. If you don’t believe there’s an intellectual aspect to standards, then you’re looking in the wrong place. Engineers have to take vague ideas about what a next-generation spec or an entirely new spec has to be and translate them from imagination (and marketing) into something that’s technically expressible. Rush this, and you get bad specs.

  7. First, as someone who has spent too many years at standards meetings, I had to share this Dilbert cartoon:
    http://www.dilbert.com/strips/comic/2009-09-02/

    Second, I would agree broadly with Glenn that standards take a long time to ferment. Oftentimes there is a situation where the perceived application demand is ahead of the technology curve; I know this was the case for 10GBASE-T, and I suspect, based on Glenn’s comments, that this was also the case for 802.11n. 10GBASE-T was kicked off in 2002, and I expect that we will begin to see the first large-port-count switches with 10GBASE-T in late 2009 or early 2010. The obstacle was the complexity of the problem of extending the performance of Cat5 cable and the limitations of CMOS from a power/performance viewpoint.

    Third, some standards groups have the opposite problem from the one the article describes: too many interested parties rather than too many standards. A good example of this is the debate within 802.3ba. There, the more “traditional” Ethernet-applications-oriented players wanted to see the data rate progress from 10GbE to 40GbE, whereas a number of router and telecom companies wished to see it progress to 100GbE. The resulting compromise has two data rates being standardized. This might end up being very useful, or it might just fragment the market. Sometimes having more ICT companies with different agendas results in a broader standard.

    primate

  8. Before delving into a rather serious discussion, allow me a digression. Does the Geneva meeting intend to come up with a new standard on how to organize efficient standardization? And just how many companies want to pitch their IP into it and draw up grand plans for licensing consortia, trademark regimes, certification processes, roadmaps into the future, etc., etc.?

    Let’s start with a spoonful of zen: standards are not meant to be done in a hurry. Heck, oftentimes one wonders if standards are meant to be done at all! That’s exactly the point of frustration people reach with standards that take forever. Yes, 7 years is forever in a world that moves at the speed of thought. Yes, tonnes of intellectual debate happen in standardization, a multitude of viewpoints need to be synthesized, and everyone needs to be convinced that there’s something in it for them. And yet, after all this, weaknesses of the standard stand exposed. Early adopters create practical variants that lead their own lives. Or, by the time a standardization effort is finally done and dusted, it’s time to create an evolved version.

    So, is there a cure? Maybe. Having fewer people in the game could help. Assigning priorities to the participants in the game helps too. But what we probably need most, or most urgently, is a definition of deadlines for any standardization activity. Either the SDO delivers by the deadline, or people move on. The SDOs need to be made accountable for timeliness of delivery, and not just for the quality of the standard. Make it clear to the SDOs that failure to deliver on time would mean a rival standard gets popular adoption.

    Also, participating companies need to define strict criteria for appraising their standards guys, and promote people who deliver, not merely people who can debate till the cows come home. Ultimately, the standard, like every other human endeavor, is only as good as the guys who created it. Doctor, heal thyself.

  9. [...] too bad that the necessary parts of the computing ecosystem aren’t coming together in unison for USB 3.0 to truly arrive in the short term. The technology is far faster than version [...]


Comments have been disabled for this post