
Summary:

As the former CEO of OpenTable, Jeff Jordan can attest that most CEOs believe they intuitively know which product developments will make the biggest impact. In this article, Jordan makes a compelling case for letting data — and not the CEO — drive product development.


Facts are simple and facts are straight

Facts are lazy and facts are late

Facts all come with points of view

Facts don’t do what I want them to

From “Crosseyed and Painless” by Talking Heads (written by Brian Eno and David Byrne)

Business growth at established companies tends to decline relentlessly over time in the absence of inspired innovation, a force I affectionately refer to as “gravity.” If CEOs want to fight this gravity and improve the long-term growth trajectory of their businesses, they need to take proactive, concrete steps to make it happen.

As CEO of OpenTable, I was always trying to develop and test a portfolio of potential new initiatives, targeted at both optimizing the core business and adding new layers of growth (see my last post on this topic, “Adding Layers to the Cake”). I wanted to identify completely new initiatives that would boost business growth — things we weren’t already doing. Things you are already doing are largely yesterday’s news, and their impact on future growth tends to wane over time. Implementing completely new innovations can help your business fight gravity.

At OpenTable, one of the most highly leveraged examples of an innovation to optimize our core business was developing a rigorous methodology to pursue and assess potential site improvements. A while after I became CEO, my predecessor Thomas Layton (who did a spectacular job positioning OpenTable for long-term success) sent me a fascinating video of Ron Kohavi (previously the director of data mining and personalization at Amazon) talking about Amazon’s approach to something Kohavi called data-driven product development. Here’s a link to one of his presentations: http://videolectures.net/kdd07_kohavi_pctce/. (FYI, it only works sporadically.)

The video details how Amazon rigorously deployed A/B testing to optimize website efficiency. Kohavi starts the presentation by showing a number of different executions of the same feature that they had tested over time. He then asks the audience to vote on which they thought had performed better. The folks in the audience — all website geeks — were unable to consistently pick the winning execution. That was mildly surprising. But what was astonishing was the delta between the results driven by different executions of the same feature: what often appeared to be a subtle change could drive huge improvements in performance.
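For readers who have never wired one of these up, here is a minimal sketch of the mechanics in TypeScript (Node.js). The hashing scheme, variant names, and metric are my own illustrative choices, not Amazon's implementation:

```typescript
import { createHash } from "crypto";

// Deterministically assign each user to a variant so they see the same
// execution on every visit, with no per-user state to store.
function assignVariant(userId: string, experiment: string, variants: string[]): string {
  const digest = createHash("sha256").update(`${experiment}:${userId}`).digest();
  return variants[digest.readUInt32BE(0) % variants.length];
}

// Tally impressions and conversions per variant as traffic flows through.
const stats: Record<string, { impressions: number; conversions: number }> = {
  A: { impressions: 0, conversions: 0 },
  B: { impressions: 0, conversions: 0 },
};

function conversionRate(variant: string): number {
  const s = stats[variant];
  return s.impressions === 0 ? 0 : s.conversions / s.impressions;
}

// Kohavi's point: let the traffic, not intuition, pick the winner.
const delta = conversionRate("B") - conversionRate("A");
console.log(`Observed conversion delta: ${delta.toFixed(4)}`);
```

The deterministic hash is the detail worth copying: it keeps each user's experience stable across visits while still splitting traffic evenly between variants.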

In the video, Kohavi also talks about the impact of the “HiPPO” (Highest Paid Person’s Opinion) in the product development process. Not surprisingly, most HiPPOs believe they know intuitively what will work best (spoken by the former HiPPO at OpenTable). But pretty much none of us have the product instincts of a Steve Jobs, and Kohavi makes a very compelling case for letting the data and not the HiPPO make the decision.

So we resolved to test data-driven product development as one of OpenTable’s potential core business optimization initiatives. We tasked a talented product manager, Julie Hall, to lead the effort, and we procured the necessary tools. For the art side, we found the low-cost and highly efficacious UserTesting.com service, which enabled us to get qualitative user feedback literally overnight to inform what we planned to test. And for the science side, we bought the overpriced but also highly effective Test&Target system from Omniture, which enabled robust quantitative measurement of a number of simultaneous A/B tests.

These two tools worked together marvelously. One time, we stumbled upon a big “improvement opportunity” (aka a nasty usability problem on the site). We used UserTesting.com to task a handful of unregistered users with making a reservation. Then we watched in horror as a significant minority of the test users got trapped in our “Sign-In” functionality and couldn’t complete their online reservation. In the real world, they probably would have gotten pissed off and simply picked up the phone (OpenTable’s biggest competitor for consumers) and called the restaurant, a disaster for our user acquisition, brand affinity and OpenTable economics.

The problem on the offending page was that non-registered users would select the radio button that said “I am a new OpenTable customer” and then enter their email address and password in fields that sat under the “I am an OpenTable member” section. That combination caused the page to return an error message.
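To make the failure mode concrete, here is a hypothetical reconstruction in TypeScript. The field names and validation rule are my guesses from the description above, not OpenTable's actual code:

```typescript
// Hypothetical reconstruction: one page, two competing account sections,
// and a validator that rejected the natural mixed-up input.
interface SignInPage {
  accountType: "member" | "new"; // the radio buttons
  email?: string;                // these fields sat under the
  password?: string;             // "I am an OpenTable member" section
}

function validate(page: SignInPage): string | null {
  // New customers who selected "new" and then, quite reasonably, typed an
  // email and password into the member fields landed in this branch.
  if (page.accountType === "new" && (page.email || page.password)) {
    return "Error: the fields you completed do not match your selection.";
  }
  return null; // no error
}
```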

In response, we developed alternative treatments of our sign-in functionality and tested them via Test&Target. The winning execution presented non-cookied users with a form tailored directly to new users, with a clearly visible link for existing members to sign in.

After the user hit the “Complete Free Registration” button, we then presented them with a popup that prompted them to become an OpenTable member.
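Reconstructed as a sketch (the cookie name, helper functions, and markup are illustrative inventions, not the production code), the winning flow looked roughly like this:

```typescript
// Sketch of the winning flow: visitors without a membership cookie are
// treated as new users and get a single-purpose form; known members get
// the sign-in form directly.
type Cookies = Record<string, string | undefined>;

function renderNewUserForm(): string {
  return [
    "<form action='/register'>",
    "  <input name='email' placeholder='Email address'>",
    "  <input name='password' type='password' placeholder='Choose a password'>",
    "  <a href='/signin'>Already an OpenTable member? Sign in</a>",
    "  <button>Complete Free Registration</button>",
    "</form>",
  ].join("\n");
}

function renderMemberSignIn(): string {
  return "<form action='/signin'>email and password for existing members</form>";
}

function renderReservationStep(cookies: Cookies): string {
  const knownMember = Boolean(cookies["ot_member"]); // assumed cookie name
  return knownMember ? renderMemberSignIn() : renderNewUserForm();
}

// Membership is pitched in a popup after the reservation succeeds, rather
// than standing between the user and the table they came for.
function onReservationComplete(showPopup: (message: string) => void): void {
  showPopup("Become an OpenTable member to manage your reservations");
}
```

The design choice to note: the default path now assumes a new user, and the member path becomes a clearly signposted exception instead of a competing equal on the same form.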

These simple changes boosted our reservation success rate by 10 percent over the prior implementation. A 10 percent improvement in a revenue stream that comprised well over half of the company’s total business works out to a lift of more than 5 percent in total company revenue, a monster win from one simple change.

Over time, the data-driven product development methodology at OpenTable matured into a highly disciplined testing regimen. Hundreds of tests have been run in the past few years. Not all were home runs like the change above, but lots of singles and doubles supplemented the occasional home run to have a highly material impact on the business. I can’t recommend a rigorous data-driven product development process enough to managers of website businesses — it’s extremely low-hanging fruit in the pursuit of growth.
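“Disciplined” in practice means calling winners with statistics rather than by eyeball. Here is a minimal sketch of the standard approach, a two-proportion z-test; the traffic numbers below are made up for illustration and are not OpenTable's data:

```typescript
// Two-proportion z-test for an A/B result: is variant B's conversion rate
// genuinely better than A's, or plausibly just noise?
function zScore(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / standardError;
}

// Illustrative numbers: 10,000 of 100,000 sessions converted on A,
// 10,500 of 100,000 on B.
const z = zScore(10_000, 100_000, 10_500, 100_000);

// |z| > 1.96 corresponds to roughly 95% confidence on a two-sided test.
console.log(z > 1.96 ? "B wins" : "not significant yet"); // prints "B wins"
```

One caveat that comes free with hundreds of tests: at 95% confidence, a few “winners” in any large testing program will be statistical noise, which is another argument for the singles-and-doubles mindset rather than betting the site on any one result.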

Other examples of core business optimization initiatives that helped OpenTable boost growth include:

  • Developing a new version of our enterprise software used by restaurants that improved search-to-reservation conversion by having the software better mimic the decision-making process of the person at the host stand
  • Redesigning key pages of the site to improve their efficiency, such as a complete overhaul of our search-results pages (tested thoroughly through A/B testing)
  • Focusing dedicated resources on optimizing the percentage of OpenTable restaurant owners who had “make an online reservation” links on their own websites, as well as improving the visibility of those links
  • Applying concerted product and engineering efforts to boost our ranking in search engine results
  • Hiring a lot more salespeople to accelerate the acquisition of restaurants using OpenTable (not a product change, but highly effective)

The key takeaway here is that all of the above were new, concrete initiatives that were not yet part of the company’s arsenal. Their successful deployment helped fight off the impact of gravity and led to accelerated growth for the company.

Jeff Jordan is a general partner at Andreessen Horowitz. Previously, he was president and CEO of OpenTable, president of PayPal and senior vice president/general manager of eBay North America. He blogs at http://jeff.a16z.com/.

  1. Awesome article. I have been following this exact process to drive revenue growth for B2B SaaS companies, but I started doing it as a consultant since people kept asking me to help them. I was one of UserTesting.com’s earliest customers – love that tool.

    A few things I’ve also discovered that work well:
    – KISSinsights lets me quickly test assumptions by asking targeted users a single question, which helps me design tests that are more likely to work.
    – I like Optimizely’s A/B testing platform. T&T is more powerful, but it’s expensive and really hard to use since everything you want to test needs to be wrapped in their mboxes. I’ve found that because the testing process involves engineering, sales, marketing, etc., it’s hard to get tests done. Optimizely’s approach allows me to cut most of the engineering tasks out of the test implementation process and therefore move more quickly.
    – In addition to testing usability and features, you can use these platforms to very specifically test high-value customer segments, such as those participating in a free trial, or coming from a paid click campaign or a partner co-marketing campaign.
    – By focusing on customers who are in a free trial, you can change the site to better meet their needs. This is incredibly important for SaaS businesses that don’t have an enterprise sales process (usually those that cost under $1000/mo) since the site itself has to do the selling. By changing the site to address the criteria on which in-trial customers are judging you, you can dramatically increase trial-to-paid conversion.

  2. Travis Collins, Sunday, February 19, 2012

    I’m sorry, but this article is basically an ad that provides only anecdotal evidence that OpenTable didn’t do their own basic user experience testing, and therefore benefited when any amount of testing was performed. Any company can benefit greatly from simply using their own product. They don’t have to pay for these expensive services.

  3. I’ve always thought Amazon’s site was very underwhelming – way too busy.

  4. Sorry, but I have to disagree with you, Travis. I worked in web analytics with two of the biggest commercial sites, and despite rigorous user experience testing, we still gained very valuable insights from A/B and multivariate testing with in-house tools and tools such as T&T. When put to the test with large volumes of actual user traffic, the least expected variant was frequently the most successful.
    Similarly, complex flows like checkout can often stymie a significant portion of your users due to issues like geo validation failures, card processing anomalies, and other problems that would take forever to find in testing. Why, right now I’m having trouble getting this text box to correctly retain input focus on my iPad when I tap back to correct typos. I imagine the page was tested, though. Given the poor experiences I see all over the web, I don’t think we’re in jeopardy of over-testing yet!
