Publishers must regularly ask themselves what the value of their website is, simply to find out whether it fulfills all its functions and tasks, works well and responds to every user request immediately, as these factors determine its success. Slow response times and low availability, caused for example by a complex page layout or faulty links, annoy online visitors, so that over time conversion decreases and the brand image suffers. Slowness may even have adverse psychological effects on user behavior, and search engine rankings can suffer as well, since Google has taken these values into account since April 2010.

Modern websites are mostly complex structures assembled from different resources that are often no longer controlled by the company itself. The availability and total response time of a site therefore depend on a variety of factors that affect each other, which the original concepts of server uptime and server response time capture only very inadequately. Today's website operators are challenged to condense their offers in a way that is meaningful for the user and to optimize them technically, as slow websites simply frustrate the online user.

From 2003 to 2008, the size of an average site rose from 93.7 KB to 312 KB, with some pages reaching 4 to 10 MB, which can overwhelm even the best broadband connection. The number of objects displayed on a website doubled in the same period to an average of 49.9 items. Each object, from image to video, from advertisement to the integration of social platforms, needs to be loaded, and that takes time.

Website Performance: an important ranking factor in search engine marketing

Since April 2010, Google's algorithm has taken the loading time of websites into account, counting it among its roughly 200 ranking factors.

Google measures loading speed in two ways: on the one hand, it records the time that elapses between the Googlebot's (web crawler's) first request and the delivery of the last byte; on the other hand, it measures the time until the entire page has been delivered, including graphics, advertising banners and JavaScript. Both values are then incorporated into the Google algorithm.
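As a rough illustration of these two kinds of measurement (not Google's actual method), the following Python sketch first times the delivery of the HTML document itself and then the page together with its referenced resources. The URL, the use of the requests library and the simple regular expression for finding resources are assumptions made for the sake of the example.

import re
import time
from urllib.parse import urljoin

import requests

URL = "https://www.example.com/"  # hypothetical page to measure

# 1) Time from the first request until the last byte of the HTML arrives.
start = time.perf_counter()
response = requests.get(URL, timeout=10)
html_time = time.perf_counter() - start
print(f"HTML document delivered in {html_time * 1000:.0f} ms")

# 2) Time until the page *and* its referenced resources (images, scripts,
#    stylesheets) have been fetched -- a crude stand-in for "the entire page".
resource_urls = re.findall(
    r'(?:src|href)="([^"]+\.(?:png|jpe?g|gif|css|js))"', response.text)
start = time.perf_counter()
for resource in resource_urls:
    try:
        requests.get(urljoin(URL, resource), timeout=10)
    except requests.RequestException:
        pass  # a missing resource still counts toward the measurement
full_time = html_time + (time.perf_counter() - start)
print(f"Page plus {len(resource_urls)} resources delivered in {full_time * 1000:.0f} ms")

A real browser fetches resources in parallel and also executes scripts, so the second number here is only a pessimistic approximation of what a user actually experiences.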

If a website's load time is not fast enough, its operator risks that search engines will not crawl the site intensively enough. Google and Bing, for instance, set targets as low as 500 milliseconds for their own pages, raising the bar for website owners even higher.

Psychological effects of slow load times

The speed of a website strongly influences user satisfaction. The Internet offers an almost limitless range of information and entertainment, and those who are simply not fast enough lose prospective customers to a competitor. The brand image also suffers considerably, which has serious consequences in the long run: a persistently slow and unsatisfying online offer gives the impression of poor quality, and both the aesthetic perception and the credibility of the offer suffer. New users in particular, visiting a site for the first time, do not try their luck again after a first disappointment and prefer to take up the competition's offer instead. The conversion rate decreases, viral effects fail to materialize, and the user keeps only one memory alive: frustration.

Indeed, users' tolerance for waiting on a loading website is more than limited. A Google study, for example, found that milliseconds of load time often decide a website's future success and that even minor adjustments in response times led to significant changes. When, for instance, the Google Maps home page was reduced from 100 KB to 78 to 80 KB, traffic increased by 10 percent in the first week and by another 25 percent over the following three weeks. A test by the online retailer Amazon revealed similar results: every 100 millisecond increase in load time led to a drop in sales of one percent. Experiments by Microsoft with Live Search showed that slowing down the search results pages by one second meant that one percent fewer searches took place and 1.5 percent fewer ad clicks were recorded. When the display was slowed by two seconds, search traffic dropped by 2.5 percent and ad clicks by 4.4 percent.

Even a single second of unavailability can have a strong negative impact on user behavior. In particular, the visitor's long-term commitment to the site erodes.

Thus, whoever takes on the responsibility of measuring, evaluating and improving the performance of an online presence usually faces a rather complex optimization process at the beginning. It takes time, care and a watchful eye to really get to know all the different factors and to derive the respective user feedback from them.

The use of flexible and meaningful monitoring tools, or the calling in of performance experts, is recommended if you want to make sure that detailed site and performance reporting, with all its technical features, will be of uncompromised quality.
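As a minimal sketch of what such monitoring can look like in its simplest form (dedicated monitoring tools do far more), a small Python loop can periodically check availability and response time. The URL, check interval and threshold below are illustrative assumptions.

import time

import requests

URL = "https://www.example.com/"   # hypothetical page to monitor
INTERVAL_SECONDS = 60              # how often to check
SLOW_THRESHOLD_MS = 500            # warn above this response time

def check_once(url: str) -> None:
    # Fetch the page once and report availability and response time.
    start = time.perf_counter()
    try:
        response = requests.get(url, timeout=10)
        elapsed_ms = (time.perf_counter() - start) * 1000
        status = "SLOW" if elapsed_ms > SLOW_THRESHOLD_MS else "OK"
        print(f"{status}: HTTP {response.status_code} in {elapsed_ms:.0f} ms")
    except requests.RequestException as error:
        print(f"DOWN: {error}")

if __name__ == "__main__":
    while True:
        check_once(URL)
        time.sleep(INTERVAL_SECONDS)

Such a check only covers the delivery of the HTML document from a single location; commercial tools additionally measure full page rendering, individual third-party resources and availability from multiple geographic regions.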

By Daniela La Marca