Matt Cutts, a software engineer and an eloquent corporate spokesman for Google, spoke at PubCon earlier this month and later gave a video interview to Web Pro News, in which he said that the speed at which web pages load might become a factor in SEO moving into 2010. He said that because many within Google consider speed to be vital to the web, the company is considering making web site speed a factor in calculating page rankings. Those comments have left many folks confused and worried about how speed might impact their businesses.

To be sure, Cutts’ comments don’t offer any details, and it is not even clear if Google will go down that route. (Matt, can you offer clarifications please?) Still, some are worried that Google is going to turn PageRank into a country club for the rich, and penalize smaller sites because they don’t have high-end hosting facilities. Of course, there is the uneven distribution of backbone connectivity. Many parts of the world are not as well-connected as, say, Asia or the United States, so does that mean this new approach to PageRank could penalize sites hosted in places that don’t have abundant connectivity?

Ken Godskind, chief strategy officer of AlertSite, a web site traffic monitoring service, thinks such a move could have a serious impact on many web sites. “The potential changes at Google mean there will be a REAL business impact for poor web site performance, and conversely, in Google’s words, a bonus for good performance,” he writes on the company blog. “Online organizations will need to look closely at themselves and the other parties that participate in the Web application delivery supply chain to understand and manage this new development in 2010.”

On a personal level, I believe that a faster web is good for everyone. At some point in our web journeys we have all cursed slow-loading sites, a problem that is only going to increase as the web becomes more intricately intertwined, in the process becoming a patchwork quilt of diverse services. Performance hiccups at one service can send out ripples of disruption.

Increasingly popular widgets can slow down the performance of even the best web offerings, degrading the overall experience. The growing interdependency of various services can often cause disruption. Hours after the news of Michael Jackson’s death spread across the web, many news sites slowed down or became inaccessible. Those sites didn’t crash outright; instead they were hampered because they were pulling data from ad servers that weren’t prepared for the onslaught of traffic. On our own sites, we have seen things break down when one of our partners suffers an outage or has performance problems.

As a consumer of information, I would say Google giving preference to faster sites doesn’t seem like a bad idea. As a publisher, the high cost of speeding up my web offerings might hurt in the short term, but ultimately it means a better experience for my readers — and there’s nothing wrong with that.

Photo of Matt Cutts at Pubcon 2009 courtesy of PlanetC1 via Flickr

  1. I agree with your last paragraph.

    But what really bothers me, Om, is that you, Matt, and other top-shelf experts are so used to technology that you forget what it’s like for millions or billions of others.

    I have noticed this trend around high-tech giants. They fail to appreciate, or perhaps are dismissive of, the decent folks who will publish good content but have NO clue that SEO, site speed, quality linking, etc., even exist.

    Thank you for your post.

    1. Ed,

      I do point out the issues that will arise because of this, and I completely agree with you that we don’t know enough about the tech reality of the mainstream. But the caveat here is that this need for speed was, and is, a very personal opinion. I think in the end Google will find a middle ground.

      1. I would like to add to what Ed is saying: the tech media seems to be way out of the ordinary when it comes to the issue of broadband speed. While the average user likes higher speeds, it is very secondary to other factors such as coverage and price.

        As an example, take the tons of unlocked iPhone owners who never use the Wi-Fi link and use T-Mobile. The better coverage and pricing far outweigh the speed benefits of 3G for these users. Most people want good-quality basic services.

        Perhaps it would make sense for Google to lower the page rank of the very slowest sites. But it would be a disservice if they end up elevating sites for extreme speed when the ones further down the list are ‘fast enough’. There should be an acceptable ‘lag’ within which ratings are equal.

  2. Page load speed has been a factor in quality score for a long time. Google focuses on consumer user experience first and they have a big interest in people being happy with their search results. If a page doesn’t load, a less-savvy user attributes the failure to Google, not the particular site.

    1. You are NOT correct, sir. Very slow performance is a documented factor in AdWords quality ranking, not in SEO ranking.

      1. Slow performance needs to be defined: 1) is it the host serving the page, 2) how the page was written (HTML/CSS/JavaScript, etc.), or 3) both?

  3. It is simplistic to think that Google would be so binary in their decision. Speed already impacts results, since if your page takes too long to load, people will bounce back and click another result!

  4. Just passing the images on this page through Smush.it would reduce total data by 87K, though the biggest saving would come from resizing the picture of Matt to the dimensions you used.

    Google provides tools for this, as mentioned at PubCon, such as Page Speed, which you have most likely reported on in the past.

    A local business in the Philippines is not going to be harmed by this; in fact, they will probably gain from using local hosting, and I am sure Google also has something up their sleeve (a Google CDN?).
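    [Ed. note: Smush.it strips bytes losslessly, but as the comment notes, the bigger lever is serving images at the dimensions they are actually displayed, since downloaded bytes scale roughly with pixel count. A minimal sketch of that arithmetic; the dimensions below are hypothetical examples, not measured from this page.]

    ```python
    # Rough estimate of the fraction of image data wasted when the server
    # sends a full-size image and the browser merely scales it down in HTML.
    # Assumes bytes scale linearly with pixel count (a rough approximation).
    def wasted_fraction(served_wh, displayed_wh):
        served_px = served_wh[0] * served_wh[1]
        displayed_px = displayed_wh[0] * displayed_wh[1]
        return 1 - displayed_px / served_px

    # e.g. a 1024x768 photo displayed at 240x180 wastes ~95% of the bytes
    print(round(wasted_fraction((1024, 768), (240, 180)), 3))
    ```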

  5. Funny, I just read another blog post about real-time data not being included in the Google index. This is becoming a common criticism of Google search.

  6. Google focusing on page speed, if only in comments for now, will result in better awareness of design and coding practices, and how situating hosting resources, including partners’ resources, near a site’s geographic majority audience can affect page delivery. It can also enhance SEO awareness of the importance of Google’s country-specific deployments. These kinds of awarenesses would create a generalized pressure to bring down page weight, simplify and better focus delivered content, and better balance load throughout the web. If Google is banking on the cloud, as seems to be the case, these kinds of awarenesses have a lot of future value for Google.

    (Where infrastructure is dicey due to a late start, stressful natural conditions, or historically unstable economies and politics, server companies may run server locations outside their home countries in order to offer high site availability to customers. Offshore server location ensures higher usage of international pipelines and slower page serving when a site’s main audience is within the home country. But site availability is often more important than page delivery speed, and certainly more important than pipeline usage, from the user’s perspective, in places where infrastructure is unreliable. Google raising the bar on speed could create an incentive for stressed countries to shore up infrastructure.)

  7. The vast majority of web site optimisation can be done entirely on the server end, with no input from the web designer, contributor or end users whatsoever.

    For web servers like IIS, there are a number of tweaks that can be done in just a few minutes and will boost page download speed five-fold.

    There’s nothing wrong with optimising the Internet per se at the ISP level. That’s a job for professional network engineers to deal with. I do, however, have a problem with Google ranking people on their servers’ bandwidth, as poorer countries, individuals and self-hosters will suffer considerably. Their opinion will not be heard and their sites will go unseen.
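    [Ed. note: one of the quickest server-side wins of the kind described above is enabling HTTP compression (static/dynamic compression on IIS, mod_deflate on Apache). A stdlib-only Python sketch, using a made-up repetitive HTML payload, shows why it pays off; real pages typically shrink by well over half.]

    ```python
    import gzip

    # Hypothetical payload: markup is highly repetitive, so it compresses
    # extremely well. The page content here is invented for illustration.
    html = (b"<html><body>"
            + b"<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>" * 200
            + b"</body></html>")

    # Level 6 is the common default trade-off between CPU cost and ratio.
    compressed = gzip.compress(html, compresslevel=6)
    print(f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes")
    ```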

  8. No question that speed is important.

    This is a case again of those that have getting more, and those with less getting even less than that.

    In other words, the small business guy working out of his garage may be a genius in terms of what he does, but he is also going to have to be a net genius as well.

    As an example, perhaps a lame one: it used to be the case that an artisan of great quality could make glass Christmas ornaments and sell them to his community. Perhaps two such artisans could be supported by a mid-sized community. Now, with only two pages of search real estate for such ornaments, the people with the most money, usually big-box stores purchasing ornaments from China, get all the business.

    With a big-box store, naturally it is no big deal to hire really great programmers, folks who have the knowledge necessary to make their web sites load like lightning. On the other hand, the ornament genius doesn’t have the resources to hire an IT professional, and has no chance to be ranked high by Google.

    I’m not saying that it is possible to have both quality and speed in the ornament site business; I am saying that the very best artisans are generally going to be out of luck, as they already are.

    No matter if we are talking about backlinks or loading speed or any other criterion of that sort, there isn’t any way for Google to get the best product in front of its audience using the kind of IT tools available to it.

    I know I am not saying anything new. Unfortunately, Google puts Walmart number one for paintings and frames. Filipe Zecoro, probably the best painter in the world, currently can’t even be located on the internet. He depends on a very small clientele and starves in his garret, because Google’s programmers can’t find him.

  9. I would think that this could be a feature enabled via Google Labs. For example, the user could select in their search preferences to sense the connectivity bandwidth for that particular session and weight the PageRank of small, faster-loading sites more highly in situations where connectivity is low.

    I would find something like this useful for when I travel internationally, as some sites are not as speedy depending upon where I am in the world. (Of course, while international travelers are a niche demographic, I imagine that this could be useful for some connectivity-constrained localities.)

  10. Page load times have much more to do with site design: poorly written JavaScript, excessive use of third-party stat collectors, ad servers and other widgets. It doesn’t matter how much you pay for bandwidth if you have a poorly designed site, and while the very best designers don’t come cheap, you don’t need to be rich to pay for OK design.

    More power to Google. Unless this is just an indirect plug for their new protocol to fix HTTP, in which case I call shenanigans!
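    [Ed. note: a rough way to see the widget cost described above: if each third-party call blocks the page in turn, the delays add up, whereas loading them concurrently (roughly what async/deferred script loading buys you in a browser) costs about one round trip. A sketch with invented widget names and a made-up 0.2-second delay per call.]

    ```python
    import threading
    import time

    WIDGET_DELAY = 0.2  # invented per-widget response time
    WIDGETS = ("ad-server", "stat-collector", "social-widget")

    def fetch_widget(name):
        time.sleep(WIDGET_DELAY)  # stands in for a third-party network call

    # Blocking, one-after-another loading: the delays accumulate.
    start = time.perf_counter()
    for name in WIDGETS:
        fetch_widget(name)
    serial = time.perf_counter() - start

    # Concurrent loading: total cost is roughly a single round trip.
    start = time.perf_counter()
    threads = [threading.Thread(target=fetch_widget, args=(name,)) for name in WIDGETS]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    concurrent = time.perf_counter() - start

    print(f"serial: {serial:.2f}s, concurrent: {concurrent:.2f}s")
    ```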
