Load Balancing – SEO

Question: Can using a load balancing system be detrimental to your SEO efforts?

I have recently noticed a lot of daily fluctuations in the SERPs for a reasonably competitive keyword that my client is targeting. One day it’ll go down 18 spots, then jump by 9, then again by 10, then down by 12; it’s all over the shop. Weirder still, we’ve not been changing much recently either, yet the rankings just keep bouncing up and down. Then I spotted Flagfox was showing a Dutch IP, which, for a UK-focussed site, isn’t great news.

I brought this up with the technical lead during our weekly call, and he mentioned that it could have been due to the load balancing system provided by Akamai. Dubbed a “site accelerator” service, Akamai has its clients maintain one copy of their site on an “origin server”, which then pushes the data out onto the rest of the Akamai server network. When a user wants to access a site hosted by Akamai, the global load balancing system decides on the optimal server within the network to send the user to. All well and good if all the servers are hosted within the same country… not so well when a UK-facing site is served from the UK the first time a Google datacentre hits it, from .NL the next time, from .DE the time after that, and from .CO.CK the time after that (that’s the Cook Islands’ TLD, I swear).
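If you want to confirm this is what’s happening, a quick first check is to see which edge IPs the hostname actually resolves to from your network, then geolocate each one (e.g. via whois). A minimal Python sketch, where the hostname is a placeholder for the affected domain:

```python
# Resolve a CDN-fronted hostname and print every IPv4 address the local
# resolver hands back. "www.example.co.uk" is a placeholder domain.
import socket

def resolve_all(hostname):
    """Return the set of A-record IPs returned for the hostname."""
    infos = socket.getaddrinfo(hostname, 80,
                               family=socket.AF_INET,
                               proto=socket.IPPROTO_TCP)
    return {info[4][0] for info in infos}

if __name__ == "__main__":
    for ip in sorted(resolve_all("www.example.co.uk")):
        print(ip)  # geolocate each of these to see which country is serving you
```

Running this from different networks (or at different times) shows whether the edge servers you are handed drift between countries.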

  • http://knowledgehub.zeus.com FrintonBoy

    Ahh, so the problem is, Akamai is doing what it’s designed for!

    I guess the first question is, why is the client using Akamai in the first place? The service is designed with two (major) things in mind:

    1. move content closer to the client to improve performance.
    2. maintain the availability of the content.

    Performance issues can be handled in different ways. How much traffic is the site handling to warrant the use of a CDN? It may be possible to accelerate performance within the origin location, negating the requirement for the CDN. You could also add intelligence to handle the peak traffic (possibly bursting out to the CDN or even a cloud-based overflow area), minimising the effects caused by Akamai.

    Resilience (and performance) can be addressed in a number of ways, for example a second (active) origin location in a physical datacenter, hosted environment or cloud service (a rough sketch of this idea follows this comment).

    Obviously, without knowing all the reasons Akamai was chosen, this is pure guesswork, but hopefully my comments give you a starting point to work from.

    Nick
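
On Nick’s “second (active) origin” point, here is a minimal sketch of the idea, assuming placeholder origin URLs and a hypothetical /healthz probe endpoint: a lightweight health check steers traffic to whichever origin answers, so resilience needn’t come from a geographically scattered CDN.

```python
# Sketch: prefer the primary origin, fall back to a second active origin
# if the primary fails a lightweight probe. URLs and the /healthz path
# are placeholders, not anything Akamai- or Zeus-specific.
import urllib.request

ORIGINS = [
    "https://origin-a.example.co.uk",  # primary origin (placeholder)
    "https://origin-b.example.co.uk",  # second active origin (placeholder)
]

def pick_origin(timeout=2.0):
    """Return the first origin that responds to a health probe."""
    for origin in ORIGINS:
        try:
            with urllib.request.urlopen(origin + "/healthz",
                                        timeout=timeout) as resp:
                if resp.status == 200:
                    return origin
        except OSError:
            continue  # probe failed; try the next origin
    raise RuntimeError("no healthy origin available")
```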

  • http://knowledgehub.zeus.com FrintonBoy

    What you could also do (depending on what is doing the directing to the Akamai network) is to make exceptions for those connections identified as search engine robots.

    This could be based simply on source IP, or on a more sophisticated behaviour-based identification (see the sketch below).
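
For the identification step itself, one widely documented approach (it is Google’s own advice for verifying Googlebot, rather than anything tied to a particular load balancer) is a reverse DNS lookup followed by a forward confirmation. A minimal sketch:

```python
# Verify a claimed Googlebot by reverse DNS + forward confirmation:
# 1) reverse-resolve the client IP, 2) check the domain is Google's,
# 3) forward-resolve that hostname and confirm it maps back to the IP.
import socket

def is_verified_googlebot(client_ip):
    """Return True only if the IP reverse/forward-resolves to Google."""
    try:
        host = socket.gethostbyaddr(client_ip)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(host, None)}
    except OSError:
        return False
    return client_ip in forward_ips
```

A connection that passes this check could then be exempted from the geographic redirection; everything else gets balanced as normal.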

  • Pingback: Building Your Site For International Traffic | Shark SEO

  • http://www.apogeeresults.com Cory

    I’m going to be dealing with this question in about…2 hours (ha!) with a client. They have two international sites hosted in their respective countries and are asking about taking the plunge into Akamai. Any updates on what happened with this situation?

  • http://www.andrewblackburn.co.uk Andy

    Hi Cory,

    The issue we had was actually getting to speak to someone at Akamai, as the IT department/developers held that relationship.

    If the plunge is about to be taken, have a word with the Akamai techs BEFORE it happens… see if you can be guaranteed certain clusters of IPs… or certain blocks within the same geolocation. Unfortunately, Akamai’s load balancing is designed to do exactly that: balance load across servers depending on latency, traffic levels, outages, etc… so limiting which of their servers will host your sites limits the load balancing’s effectiveness (a monitoring sketch follows this comment).

    Keep me posted on how you get on!!
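
If Akamai (or any CDN) does agree to pin a site to particular blocks, it is worth verifying that the guarantee actually holds over time. A minimal monitoring sketch, with a placeholder hostname and a placeholder CIDR range standing in for whatever blocks are agreed:

```python
# Periodically resolve the hostname and warn when an edge IP falls
# outside the CIDR blocks agreed with the CDN. The hostname and the
# network range below are placeholders.
import ipaddress
import socket
import time

HOSTNAME = "www.example.co.uk"                     # placeholder domain
EXPECTED = [ipaddress.ip_network("192.0.2.0/24")]  # placeholder agreed block

def check_once():
    infos = socket.getaddrinfo(HOSTNAME, 80,
                               family=socket.AF_INET,
                               proto=socket.IPPROTO_TCP)
    for ip in {info[4][0] for info in infos}:
        addr = ipaddress.ip_address(ip)
        if not any(addr in net for net in EXPECTED):
            print(f"WARNING: {ip} is outside the agreed ranges")

if __name__ == "__main__":
    while True:
        check_once()
        time.sleep(3600)  # re-check hourly
```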

  • Pingback: Building Your Site For International Traffic | Smash Robot

  • http://www.directorysubmissionservices.net Nick

    That seems like a poor man’s load balancing solution to me. Load balance the backend, not the frontend; it shouldn’t be visible to end users or search engines.
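
To illustrate what Nick is describing: one public frontend address, with the rotation across private backends happening behind it, so search engines only ever see a single IP. A minimal sketch, with placeholder backend addresses:

```python
# Sketch of backend (server-side) load balancing: the frontend keeps
# one stable public address and round-robins requests across private
# backends internally. Backend addresses below are placeholders.
import itertools

BACKENDS = ["10.0.0.11:8080", "10.0.0.12:8080", "10.0.0.13:8080"]
_pool = itertools.cycle(BACKENDS)

def pick_backend():
    """Round-robin choice; real balancers also weigh health and load."""
    return next(_pool)

if __name__ == "__main__":
    for _ in range(5):
        print("proxying request to", pick_backend())
```

Because the rotation happens behind the single frontend, crawlers always resolve the same IP in the same geolocation, sidestepping the SERP bouncing described in the post.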