Why I quit writing internet standards



My seven years on the Internet Engineering Task Force (IETF), from 2003 to 2010, definitely taught me interesting things, including how to get a group of people to deliver when you had no control over their jobs. As co-chair of the Network-based Mobility Management (NETLMM) working group, I led one of the rather contentious working groups at the IETF. We managed to deliver relevant standards and actually brought closure to the working group so we could move on. Overall, my experience with IETF has positively contributed to my skills in leadership, consensus building, design thoroughness and seeing the big picture. It also gave me the opportunity to interact with incredibly talented people from diverse organizations and to really understand how the Internet came to be what it is today.

And yet, several years ago, when I was nominated for the Internet Architecture Board, I decided it was not for me. Not long after, I took an indefinite leave of absence from the IETF and have not returned since. There are times I feel guilty about not giving as much to the Internet anymore. I take great pride, and consider it my good fortune, to have served on committees like the Security Directorate, reviewing contributions to ensure that they don’t break the security of the Internet. Still, I find myself less distraught as I try to serve the Internet through other practical contributions from outside the fences of the standards organizations. (I’ve also had my share of experiences at other standards organizations like the IEEE, 3GPP and 3GPP2.)

So, why did I actually stop contributing to standards definitions? The primary reason is that while the pace at which standards are written hasn’t changed in many years, the pace at which the real world adopts software has become orders of magnitude faster. Standards, unfortunately, have become the playground for hashing out conflicts and carrying out siloed agendas, and as a result they have degraded drastically.

Consider the Internet of Everything (IoE), one of the hottest topics of today. The Internet of Everything, you say? Surely, this must be built on interoperable standards! How can you possibly be talking to everything, from lightbulbs to toothbrushes to phones without interoperability? That sounds absurd!

And you would be right; there is a need for interoperability. But what is the minimum we need? Is it IP? Is it some link layer defined by IEEE, such as 802.15.4? Or Bluetooth 4.0? HTTP, perhaps? It is useful to remember that none of these is sufficient by itself to make the IoE work in a way that is meaningful to the user or the end consumer. And yet we act as though, once the inevitable PHY (physical layer) and MAC (link layer) protocols are defined by the IEEE, we will be ready to roll.

Rough consensus and running code, the motto of the IETF, was once realizable. Nowadays, it is as though Margaret Thatcher’s words, “consensus is the lack of leadership,” have come to life. In the name of consensus, we debate frivolous details forever. In the name of patents, we never finish. One recent case in point is the long and painful codec battles in the WebRTC working group.

I have tremendous respect for a good number of the people who participate at the IETF and other standards organizations and who continue to make the Internet functional and sound. I value interoperability and hope that we will get it together for the sake of the IoE, because that vision is going to be hard to realize without good interoperability.

But I look across the board at the IEEE, the IETF, SDN organizations and the like and feel that these organizations need a radical restructuring effort. They need to be shaken up, turned on their heads and revamped in order to have a monumental impact on the Internet once again. For one, we all need to agree that everyone gains from the Internet being unencumbered, and that interoperability only helps the Internet serve all our needs better. More critically, I believe it is time to revisit the tradeoffs between consensus and leadership; they absolutely should not be considered one and the same. This will be tricky and will require carefully balancing undesirable control against faster decisions. Most likely, a change like this will require a complete overhaul of the leadership selection process and structure. But without this rather drastic shake-up, I’m afraid we are widening the gap between standards and reality.

The world is marching towards fragmented islands of communication connected via fragile pathways. It is inevitable, as this is the fastest path to market. Unless these standards organizations make radical shifts towards practicality, their relevance will soon be questionable.

For now, some of the passionate people will go off and try to make real things happen elsewhere. I feel like a loser for saying “I quit writing standards”; kudos to the people who are sticking with it to make the Internet a better place. Some day, hopefully, we will all be better off because of it!

Vidya Narayanan is an engineer at Google. With a long history in mobile, she is obsessed with enabling amazing mobile experiences. She blogs at techbits.me and on Quora. Follow her on Twitter @hellovidya.


34 Comments

Anonymous Coward

Some of the specific problems can be dealt with by process leadership. Take, for example, endless debate over meaningless technical details. When debates arise over “syntactic sugar” that has no effect on functionality, it’s appropriate for the group to pick one option using an alternative consensus model (even the horror of voting). When debates arise over functionality, one approach is to drop the functionality from the current version and allow it to be added later. Especially when vested interests are at stake (such as with the WebRTC codec battles), one approach would have been either to pick the codec that no one said was impossible for them to implement, or to go with no mandatory codec for the first version (a mandatory codec could always be added in a second version). The key is to move forward, using simplification as a way around obstructions.
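
That last tie-break rule is mechanical enough to write down. Here is a hypothetical sketch; the function name, vendor names and objection sets are all invented for illustration, not WebRTC history:

```python
def pick_feasible(options, objections):
    """Return the first option that no participant declared
    impossible to implement, or None, meaning: ship version 1
    with no mandatory choice and revisit in version 2."""
    blocked = set().union(*objections.values()) if objections else set()
    for option in options:
        if option not in blocked:
            return option
    return None

# Invented positions, loosely shaped like a codec debate:
objections = {
    "vendor_a": {"codec_x"},  # claims patent exposure
    "vendor_b": {"codec_y"},  # claims no hardware support
    "vendor_c": set(),        # can implement anything
}
pick_feasible(["codec_x", "codec_y", "codec_z"], objections)
```

The point is not the ten lines of code; it is that a chair willing to adopt a rule like this converts an open-ended debate into a terminating procedure.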

not fit for purpose

Noelle, that’s a nice form of “market speak”: saying a lot and nothing substantial…

Where’s the IEEE’s GitHub-hosted open source alpha/beta code list that anyone can see and contribute to without paying a fee? And where’s its related IRC channel? (The IEEE has a pay-for PDF about adopting that open-source way of collaborative working.) Then the real open source people, like the x264/ffmpeg/avconv and Open Broadcast Encoder developers (http://wiki.obe.tv/index.php?title=OBE), could offer optimised, cross-compatible SIMD optimizations and their unique speeds… Talk is cheap, and the hardware is outstripping the slow progress of the IEEE and the other orgs…

End consumers (not your industrial customers) WANT affordable 10 Gbit Ethernet kit, ready for Rec. 2020 10-bit/12-bit-per-pixel creation and displays end to end, for instance. Where’s the standard that compels OEMs to comply and build at least some kit to the highest specs you publish?

Change is inevitable. Just look at what Linaro has done for the whole fledgling ARM Cortex ecosystem to date: code first, time it, test it, refactor for SIMD where possible, upstream all the resulting code.

Do you really think you’re fit for purpose today?

Noelle Humenick, IEEE-SA

Ms. Narayanan
Your candor is appreciated when discussing the pros and cons of being involved with standards development activity, and I compliment you for dedicating as much time as you did to the huge endeavor of leading a group in a standardization effort. It is good for those of us working at the SDOs to hear such comments.

The standardization process and, in particular, consensus will always take time, no matter where you are trying to forge it, whether at work or at home negotiating with your spouse over what to cook for dinner. And during every negotiation, a person naturally brings to the table all of their concerns and wishes.

Standardization, for some, is about “religion”; for others, it is about building a market. Early in my career at IEEE, my mentor told me that standards are about psychology. Years later, I have to agree with that assessment, although it is not, by far, a complete one.

When we become involved in group dynamics, there naturally comes a point when the group dynamic no longer serves our needs. That is the point when, as you did, we move on to a different method of expressing creativity and technical skill. Sometimes you can just move your contribution to a complementary venue, whether it is a conference or a pre/post-consensus activity–there is always a home for new ideas at most of the SDOs. I know that IEEE-SA has been attempting to be responsive to our community needs over the past decade by creating various frameworks for people to build consensus.

With respect to your comment about potential radical restructuring of the SDOs in existence today, we agree (well maybe not with the word “radical” :) ). The IEEE-SA recently has made stakeholder engagement an important focus. Our current activity includes analyzing the IEEE-SA participant ecosystem, engaging with stakeholders on why and how to improve upon it, and moving the organization into a better harmony with the emerging macro-societal trend. Whether you call the trend the emergence of the sharing economy, the relationship economy, the post-capitalist economy or any other name, it all moves in the same direction. Bi-lateral communications, real (as opposed to pro-forma) openness, dealing with the abundance of the internet, etc.!

A firm commitment to the Open Stand principles (open-stand.org) serves the IEEE as a guiding star. We don’t know where our activity relative to an update to the participant ecosystem will lead us, but sometimes, it is more about the process, what you learn along the way, and just having the conversation. I know I am enjoying it.

Carl Reed, PhD

Vidya –

Enjoyed your editorial. I work for a standards organization (the Open Geospatial Consortium). I also work on standards activities in a dozen or so other SDOs. Many of your observations concerning standards development are spot on. In response to these market forces, some SDOs, such as the OGC, are changing how we do standards development. For example, we are now using GitHub to provide public collaboration “sand boxes” to foster running code first, and then developing formal (quality) standards documents as the result of the implementation work. We also have an agile, rapid-engineering interoperability program to foster innovation and development prior to any standards discussion. Within that context, your comment on consensus and leadership is very germane and something we are still discussing and struggling with. I will say that our evolving approach to standards development is allowing us to develop standards in “small quick steps” while maintaining consistency and quality. Of course, one can argue what an SDO means by “quick” :-) Anyway, your editorial has generated considerable discussion in the OGC. Thanks!

Steven Adler

These are great points, and it is very important for standards groups to make sure their work is grounded in customer requirements and to be open and share evolving standards with the marketplace to ensure that what we create solves real problems and has a real market demand. I currently co-chair two standards working groups: XMILE TC at OASIS, and the Data on the Web Best Practices WG at W3C. I’ve built a dialog with customers through ongoing use case webinars that have attracted over 3000 participants in the last year. I host workshops with customers to share current thinking and get market feedback. And we focus our deliverables on implementation use case examples to explain to customers how these standards can be used. The standards process might be driven by consensus internally, but working directly with customers helps inject market relevance into the process.

Net: Don’t give up. Make it work better.

not fit for purpose

“The need now is to standardize “adaptability.””

It’s already been done: get an idea, code it, time it down to the picosecond (as per the x264 devs’ standard practice), test it. Did it pass? Yes? Add it to the list.

no code no list

Ken Krechmer

Dear Ms Narayanan,

As a technical person you recognize that if you don’t understand the theory, the practice often does not work. There is a lot of theory on standards and standardization that is not widely known or applied. Check out isology.com for academic papers (it may require some study). Basically, standardizing “compatibility” is not very productive anymore (as you recognize). The need now is to standardize “adaptability.”

Ken Krechmer

Wesley George

I find it interesting that this post makes no attempt to suggest solutions to the problems its author identified. As a current IETF participant, I’ll be the first to admit that the IETF has ossified badly and is losing the ability to keep up and stay relevant; forward progress is slow and quite frustrating. I’m quite sure that none of the IETF’s current or past participants *wants* that to be the way of things, but fixing it is easier said (via a blog post about how broken it is and why you’re quitting, perhaps…) than done.

Ok, so we’ve admitted we have a problem. Now let’s get to the “running code” part.
Vidya, if you have suggestions on how to fix it, I’m all ears, but the right place to fix the organization is from within it. You’re more than welcome to end your leave of absence from the IETF and be part of the solution by suggesting improvements.

Ana Vasiliu

Do you want anything in particular to have a “monumental impact on the Internet” as a whole…

JoAMoS

Will you help build a micro OS (from zero) whose only purpose is to facilitate inter-device communication and data transfer? It would have to have self-learning artificial intelligence algorithms, be real-time, and be an operating system upon which other apps can be built. It should be bootable with a small footprint, using minimal resources. Thus it could be an independent part of a larger device, like a fridge, or a base device upon which other apps are built to operate (no GUI), with the option to cluster.

openid exists bro

Why does the IETF feel like they’ve never used a consumer computer?

I ask because if you look in the SIP standard you can see all these assumptions that a consumer will have a free and open connection to the internet, unhindered by firewalls.

This totally ignorant view of reality pervades many IETF standards, and frankly we’re lucky that the web standards were written by smart people like Fielding et al. rather than by the normal IETF bunch, who often come off like totalitarian sysadmins with full root.

Jack McDerp

The IETF is purposely complex, with consensus purposely run off the rails so that bad protocols and standards/libraries can be dropped in, like HTTP/2.0, DNSSEC/DANE, and the nightmare that is TLS and OpenSSL.

Various governments and special-interest lobbyists should be kept away from it.

Resident Crank

Having been an Area Director in the IETF in the ’90s, I see one huge error in the model the author describes. She appears to believe that consensus requires unanimity. First and foremost, the IETF strives for *rough* consensus, which is explicitly not required to be unanimity! The reason for that choice and distinction is clearly illustrated by her examples of the failure modes of the unanimity version of consensus.
One of the biggest reasons I stopped attending the IETF is that I could no longer stomach the endless pontificating by people who were clearly either clueless or trolls for some employer’s agenda. Instead of ruling people out of order, we had to listen to a hijacked discussion or a trip down the rabbit hole yet again.
The awful truth is that the engineering work that makes the global Internet function can no longer be done in the IETF, precisely because of leaders seeking unanimity instead of an engineering rough consensus that will get something working so we can figure out what really needs to be done.
The IETF RFCs are now composed primarily of documents that “standardize” something before anyone really knows whether it will work in the real world, at global Internet scale. And that makes them, and the process that produced them, counterproductive.

markchahn

Corporations all model themselves as hawks, but consensus and de-jure standards are dove things. It’s a poisoned culture, so that there is no attempt to build shared vision (because the lawyer/MBA weasels running the corporation insist on an extractive, rentier model, rather than mutual benefit.)

Creadev.org

Pardon the off-subjectness: Internet islands are built by us, the contractors, not by “them,” the said example companies themselves. Why? Because we as humans are pushovers, both in reality and in pseudo digi-reality. There is nothing that says we have to “obey,” so why does everyone act like it’s some kind of cement wave forcing “US” to play the node-vs-node games that THEY make from back-room “treaties”?

The answer is quite simple: you have a fear that your cell phone might go away for a month or 2 while we hold their profits ransom and they hash it out to give us proper service. Prove me wrong :O

Just like in the real world, no one is coming after you when their islands begin to sink. Why? Because they will be desperate to get onboard the next biggest things that WE THE PEOPLE have created. This is where their profits originate. We build + control everything they do, yet we act like such helpless victims when it comes to locked down stagnant internet/mobile/devices. Hopefully people see the irony in this behavior.

ALL major providers, utilities, and watchdog/standards co-ops are built by us, using the open tools built by us, running under guidelines built by us, powered by networks manned by us. They obey us, so don’t act like we are being exploited here. If y’all are feeling lost and exploited, it’s because you’re giving in and making excuses.

Chris Saunders

I think the design by consensus issue should be explored in more detail. I’m not sure if it’s an issue of scale, or quality of resources but I see this approach choking the life out of innovation in many organizations.

Raeven Wood

We can count on proprietary infrastructures that don’t communicate with one another to govern the IoE. The quality of goods and services you obtain is always based on your income. Like the ‘smart panel’ introduction over a decade ago that required specialized installation and manual integration of interfaces, the connectivity of devices will be limited through tiered pricing and commercial contracts. Automakers link with technology companies; your car purchase may now depend upon whether you have an iPhone or an Android.

There will be expanding technological ‘islands’ established, where you move into a building or buy a house in a neighborhood based upon its infrastructure, which you will have to adhere to. Lose your job and can’t pay the premium in Appleton? Once you exhaust your credit and exceed any grace period, you’ll be lucky to get a two bedroom apartment in the 1600 Tracfone high-rise. Obviously, I’m using those names superfluously, but you get the point. Having entire areas limited to the partnership of [GE] and [Verizon] forms a new kind of empire, and the standards will be their specifications- not the other way around.

This is not accidental. You were wise to see which way the wind was blowing and go with Google. Anyone who doesn’t understand exactly how rapidly changing the world really is has no hope of thriving in it. I just read about a software company testing their ability to self-network in case of an ‘Internet apocalypse’. That sort of smart low-tech Frankensteining will be the poor man’s relay system, and made illegal like stealing cable. They were already unable to install their software on Apple devices, of course. Call it ‘enhanced protectionism’.

It sounds so Sci-Fi, but it’s already a fact of life. You probably are at the mercy of one cable provider and one power company. These are just the new utility giants, and the underwriters of laws and standards are increasingly beholden to them.

Danny Kommer

IETF also needs to remove any and all NSA employees from their groups. Until they do that, they can’t be trusted anymore:

http://arstechnica.com/security/2014/01/nsa-employee-will-continue-to-co-chair-influential-crypto-standards-group/

The fact that the NSA has a co-chair, no less, of an important crypto group in a post-Snowden, post-RSA-revelations world (and who knows how many others sit in other IETF groups) is absolutely ridiculous and completely unacceptable. You can’t welcome, with open arms, the people who have been working all these years to subvert crypto protocols and undermine (not protect, as they claim) “national security.”

Richard Bennett

If you look carefully at the history of crypto standards, I think you’ll find that the major disasters have been created by people who have no connection with NSA. Ars is simply trolling for clicks, like they always do.

Steve

I noticed that you did not actually respond to his complaint.

Richard Bennett

Actually I did. NSA didn’t write the sloppy code that enabled the Heartbleed exploit, volunteer, virtuous, open source, non-commercial clueless programmers did.

Frank

No, you did not. For some reason you did not (want to) get his point.
Greetings from mystic Austria, you daft noodle-eye!

Randall Gellens

Given the completely open process of the IETF, I’m at a loss to figure out what evil things the NSA can accomplish by being a co-chair. They can’t force a technical change in any standard. I suppose they could slow down the adoption of significantly better algorithms by appointing poor document editors and being slow to run the group process.

Richard Bennett

In my experience working on standards at IEEE 802, ISO, and the IETF, the IETF has always been the most dysfunctional of the standards organizations, by a huge margin. While they’re all dominated by interests that often conflict, the IETF is the only one driven by a religious agenda. The “founders’ vision of the Internet” is enforced by a small group of second-generation Internet fans who studiously ignore current and future technology and keep the flame for things (like end to end) that they mistakenly believe to be important but that really aren’t.

IETF would benefit from term limits.

M Ire

Heh, I agree. Without the end-to-end religion, we would have had high quality real-time on-demand TV on the global Internet a long time ago, using virtual circuit pathways in the ‘cloudy’ internet. It would have cost a premium but so what? There are things that are free in life and then there are things you have to pay for. The planet’s resources are limited.

The IETF has single-handedly retarded such mankind-level advances in technology thanks to its end-to-end cult. The datagram-cloud idea of IP is a ‘tragedy of the commons’. It simply forbids the possibility of high-quality real-time QoS applications, even when there is enough hardware and software power, and enough of a business and social case, in the world to create them.

dave täht

I have been working for three years now on trying to standardize several QoS/AQM/packet-scheduling algorithms to make (in part) high-quality realtime applications possible, and the Internet as a whole speedier and less jittery.

https://tools.ietf.org/html/draft-hoeiland-joergensen-aqm-fq-codel-00

I find the IETF processes painful, archaic, and expensive. My own biggest problems with the org are the lack of running code, and the poor interface with the IEEE 802 group, with much duplication of work and even less cross-understanding.
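
(For readers who haven’t met it: the drop decision at the heart of CoDel, the AQM underlying the fq_codel draft linked above, can be sketched roughly as follows. This is a drastically simplified illustration, not the draft’s normative pseudocode; the real algorithm also handles timestamping, re-entry into the drop state, and per-flow queueing.)

```python
import math

TARGET = 0.005    # 5 ms: acceptable standing queue delay
INTERVAL = 0.100  # 100 ms: delay must persist this long before we act

class CoDelSketch:
    """Greatly simplified CoDel drop decision, illustrative only."""

    def __init__(self):
        self.first_above = None  # deadline by which delay must recover
        self.count = 0           # drops since entering the drop state

    def should_drop(self, sojourn, now):
        """sojourn: time this packet spent queued; now: current time."""
        if sojourn < TARGET:
            # Queue delay is acceptable again: reset all state.
            self.first_above = None
            self.count = 0
            return False
        if self.first_above is None:
            # Delay just crossed TARGET: start the grace interval.
            self.first_above = now + INTERVAL
            return False
        if now < self.first_above:
            return False  # still within the grace interval
        # Persistent standing queue: drop, and schedule the next drop
        # sooner each time (the interval / sqrt(count) control law).
        self.count += 1
        self.first_above = now + INTERVAL / math.sqrt(self.count)
        return True
```

The two constants are the knobs: TARGET is the standing delay you are willing to tolerate, INTERVAL is how long it must persist before the queue is judged bad, and the sqrt(count) control law makes the drops gradually more aggressive until the delay recovers.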

I have no idea how Vidya lasted 7 years at this. I won’t. But if someone doesn’t keep at it, what other org can?

— dave taht

not fit for purpose

“I have been working for 3 years now on trying to standardize several QoS/AQM/Packet scheduling algorithms”

Great, so take note that NHK and BBC R&D are planning on UHD-1 at 10 bits per pixel and UHD-2 at 12 bits per pixel for 2017/2020 commercial broadcasting (“Defining the Future of Television”, posted by Andrew Cotton on 19 June 2013, http://www.bbc.co.uk/rd/blog/2013/06/defining-the-future-of-television). Make your algorithms adaptive to that Rec. 2020 UHD TV spec ASAP for IP transmission on LAN/WLAN/WAN/MAN etc.

Follow the x264 plan of action and code writing: break your routines down into the most basic routines, write the C99 code, test it for speed and accuracy, then write a fully SIMD-optimised C codebase and corresponding SIMD assembly for x86-64 AVX2 and ARM NEON, only falling back to the optimised C code when an SoC does not carry the required SIMD engines. Go talk to the x264/ffmpeg/avconv devs on their IRC channels; ask, and offer something they want or can use…

Madlyb

Some great points, Vidya, but I think you missed the biggest point of the IoE, which is not to over-engineer and hyper-optimize something that is still in the very early days of definition, but instead to deliver capabilities within the bounds of what we already have, accepting that some things are going to be less than optimal in the first iterations.

As far as I am concerned, there are plenty of tools already in the set and we should push ourselves to make the most of the toolset before going off and creating new ones.

If we can become focused on small, iterative efforts, we can improve the pace and ensure we do not over-invest in engineering standards that are out of date before they even see the light of day.

Vidya

Absolutely agree. Small steps are great! The standards orgs are just not set up for quick, small steps.

Vidya

I should also add that there is a downside to rapid standardization, so it is a delicate balance to achieve.

not fit for purpose

It’s perfectly obvious to all the tech blogs that all the orgs, especially the PCI-SIG and the Ethernet SIG, are in fact not fit for purpose today…

Just look at all the innovative ARM SoC IP relating to generic, super-fast NoC (network-on-chip) interconnects, available for pennies on the £/$ today at 1 Tbit/s, 2 Tbit/s (256 GBytes/s) and more, compared to the antiquated mass-consumer 1 Gbit/s Ethernet cards and slow PCIe interconnects (slow in comparison to Cortex IP) that mass consumers have had to put up with for decades.

It’s a travesty that low-power ARM cores have a massively faster generic interconnect than the desktop does today; no wonder there’s a “no sale, no profit” feeling among the masses as regards desktop products worth buying en masse…

M Ire

Call me a luddite or maybe just playing devil’s advocate, but why does my light bulb need to twitter your toaster over the Internets? Can’t we go green without all the needless inter-networking and the computerized control of our energy usage?
And do I want the added constant life-stress of worrying about who is hacking my connected home tonight?

Maybe it has some real critical uses, but for the most part, the Internet of Things seems to be pushed by growth-hungry corporations to an already gadget-infested info-overloaded world that is yearning for some serenity and peace of mind.
