Benchmarking Page Load Times

It is an unusual scenario to find yourself contemplating benchmarking the load times of your pages with JavaScript, yet it has happened to me more than once. Let me be frank about this method: it is not an exact science, nor does it give a “true” result.

If you have not been put off and are still reading, then take heart: the method can offer great insight into how your page is behaving for your users; it just shouldn’t be trusted to be 100% accurate for loading times. I have used this method several times when I needed to understand where a slowdown was occurring for my users, and each time the results have been enlightening and puzzling in equal measure.

The situation normally begins in the same way: I have done my cross-browser checks, and the project looks good and works great for me and my server. Having confirmed that it is complete, I pass the project to the end user for final inspection in situ in its final environment. This is about the time the end user comes back with ‘It looks/works great, but... it is kind of slow, can you not speed it up a little?’ instead of the much-expected ‘Brilliant work as always, here is a bonus’ (OK, the last part is fantasy).

The next few steps are crucial in checking whether you are about to enter a world of benchmarking and pain, or convincing and pain.

  1. Fire up Firebug in Firefox (don’t have it? check out here or here ).
  2. Open up the Net tab and make sure it is enabled.
  3. Clear your browser’s cache.
  4. Navigate to the project and watch the Net tab fill up with file requests.
  5. Once your page has completely loaded, check the final line; this will give you the time taken to create your page and the time taken for the onload event to fire. If the page loaded in an acceptable time for you, then it is worth taking note of these times for a later conversation with the end user.
  6. Work down the file requests from top to bottom looking for items that took a long time to download or process. A critical speed killer is a 404 error on a file request. A 404 error is a File Not Found server error, meaning the file you requested is not on the server or could not be located by the web service. Finding why this error has occurred can sometimes be the end of your hunting and the solution to the slowdown. Common reasons for 404 server errors are:
    1. The file was missed when the project was uploaded, so it really isn’t on the server.
    2. The server uses a case sensitive filesystem and your request does not perfectly match the filename case.
    3. Misconfigured server rules; simply put, your filename matched a previously defined rule that is now hijacking your request but can’t handle it.
  7. If you have solved all 404 errors (or didn’t have any) and the user still has a speed issue, then it is time to go back over the Net panel file requests, this time concentrating on those requests that took a heck of a lot longer than you expected. If you find any files that take longer than expected, try opening a dialogue with the end user about these slow files. Your best bet is to make it “a side note to your investigation” so it does not look like you are attacking their server to cast blame on anything but yourself; it is the last thing anybody wants to hear, but being asked to check the logs to help the investigation is generally OK.
  8. If you are still with me at this point, then you have passed out of the convincing and pain into the benchmarking and pain. Admittedly we did some benchmarking with the Net tab, but that tells us how your experience of the page progresses, not the user’s. Time now to look at getting the end user’s perspective…
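The case-sensitivity problem from the 404 list above (cause 2) is easy to check for by hand. Here is a minimal sketch in plain JavaScript of comparing the paths you requested against the files actually on the server; `findCaseMismatches` and both file lists are purely illustrative names of my own, not part of the method itself:

```javascript
// Hypothetical check for case-mismatch 404s: compare requested paths
// against the actual filenames on the server (both lists are examples).
function findCaseMismatches(requested, onServer) {
  // map of lowercased server path -> real path, for quick lookup
  var lookup = {};
  for (var i = 0; i < onServer.length; i++) {
    lookup[onServer[i].toLowerCase()] = onServer[i];
  }
  var mismatches = [];
  for (var j = 0; j < requested.length; j++) {
    var actual = lookup[requested[j].toLowerCase()];
    // the file is on the server, but only under a different case
    if (actual && actual !== requested[j]) {
      mismatches.push({ requested: requested[j], actual: actual });
    }
  }
  return mismatches;
}

var bad = findCaseMismatches(
  ['js/Main.js', 'css/style.css'],
  ['js/main.js', 'css/style.css']
);
// bad[0] flags js/Main.js versus js/main.js
```

On a case-insensitive filesystem (your Windows dev machine) both spellings work; on the case-sensitive server only the exact match does, which is why this class of 404 only shows up in situ.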

First things first, credit where it is due: the following method was inspired by this post. I have taken the idea and expanded on it to store the information in a database with a unique session ID, to be analysed at leisure.

The code is fairly simple but robust enough for all situations.

Add the following lines to the top of your page’s <head> tag, below your <meta> tags:

<script type="text/javascript">
var ptim=new Date(),muid=null;
</script>

Add the following code to your page’s <body onload> attribute so our code is called in the onload event:

//call to log function
log_time(ptim,'page onload','');

Then, at the bottom of your page, add the following code just before the closing </body> tag.

<script type="text/javascript">
//define our log function
log_time=function(){var d=null,c=null,a="";function b(){var f=null;try{f=new XMLHttpRequest()}catch(g){}if(!f){try{f=new ActiveXObject("Msxml2.XMLHTTP")}catch(g){}}if(!f){try{f=new ActiveXObject("Microsoft.XMLHTTP")}catch(g){}}return f}return function(h,g,e){var f=new Date().getTime();h=h.getTime();if(muid===null){muid=h+"_"+Math.floor(Math.random()*100000)}a=location.href.replace(/(file\.php[\w\d\W]*)/,"log_time.php")+'?url="'+location.href+'"&uid='+muid+"&pg="+g+"&ts="+h+"&te="+f+"&sect="+e;if(d===null){d=b()}if(d!==null&&(d.readyState===0||d.readyState===4)){"GET",a,true);d.send(null);return true}if(c===null){c=b()}if(c!==null){"GET",a,true);c.send(null);return true}return false}}();
</script>

This has been compressed using the YUI Compressor to minimise its impact on the page size and load. It creates an XMLHttpRequest object, then each time the function is called the data is passed as a GET request to our recording script on the server, log_time.php. On the first call a unique session ID is generated, which is passed with every subsequent call so you can differentiate between individual page loads.
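If you would rather read what the compressed function is doing than decode the minified version, its core boils down to building that GET URL. The sketch below is my own simplified, un-minified rendering of the idea: the XMLHttpRequest plumbing and URL rewriting are stripped out, and `buildLogUrl` and `callCount` are illustrative names that do not appear in the real code:

```javascript
// Simplified sketch of how each log call builds its GET request URL.
// ptim is the Date created in the <head>; muid is generated on the
// first call and reused so all calls from one page view group together.
var muid = null;
var callCount = 0; // stands in for the uniqueness suffix in the original

function buildLogUrl(baseUrl, pageUrl, ptim, label, section) {
  var startMs = ptim.getTime();     // page start, captured in the <head>
  var nowMs = new Date().getTime(); // the moment this checkpoint fires
  if (muid === null) {
    muid = startMs + '_' + callCount; // unique id for this page view
  }
  return baseUrl +
    '?url="' + pageUrl + '"' +
    '&uid=' + muid +
    '&pg=' + label +
    '&ts=' + startMs +
    '&te=' + nowMs +
    '&sect=' + section;
}

// e.g. buildLogUrl('log_time.php', 'http://example.com/page',
//                  ptimFromHead, 'page onload', '')
```

The recording script then only has to subtract ts from te to know how long after page start each labelled checkpoint fired.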

I chose to use PHP as my recording script, but you can use any script or language that can accept the GET request data and process it into your database. The contents of my script are:

//include my library for database coms REPLACE with your own library or application connection
include_once('mylibrary.php'); //@TODO - REPLACE with your own connection object
$dta=new stdClass(); //create an object to store the data for the insert call
//process all GET variables into the data object
foreach($_GET as $k=>$v){
    $dta->{$k}=$v;
}
$dta->project_name='MyProject'; //add a label to the data if we are running on more than one page
insert_record('log_time',$dta); //insert_record() comes from mylibrary.php - update for your own connector

As you can see, my script is not particularly robust or secure for long-term use, but it is only intended for short periods. If you intend to use this method for a longer time, please secure your server script from SQL injection and other nasties. I have included a reference to an external script, mylibrary.php; this needs to be replaced with your own choice of database connector, and the insert_record() call will need to be updated accordingly.
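As a concrete example of tightening the recording script up, here is a sketch (in JavaScript, since, as noted, any server-side language will do) of whitelisting and validating the incoming GET parameters before they go anywhere near the database. The field list is just the one this logger sends; `sanitiseParams` and `ALLOWED` are illustrative names of my own:

```javascript
// Sketch of validating the logger's GET parameters before the insert.
// Only the fields this logger actually sends are accepted, the two
// timestamps must be numeric, and everything else is dropped.
var ALLOWED = { url: 'string', uid: 'string', pg: 'string',
                ts: 'number', te: 'number', sect: 'string' };

function sanitiseParams(params) {
  var clean = {};
  for (var key in ALLOWED) {
    if (!(key in params)) continue; // ignore anything not whitelisted
    var value = params[key];
    if (ALLOWED[key] === 'number') {
      var n = parseInt(value, 10);
      if (isNaN(n)) continue; // reject non-numeric timestamps
      clean[key] = n;
    } else {
      clean[key] = String(value);
    }
  }
  return clean; // hand this to a parameterised insert, never string-built SQL
}
```

Combined with a parameterised or prepared-statement insert on the database side, this closes off the obvious injection route while keeping the logger as lightweight as before.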

With everything set up, a visit to the page/site will populate some records into your database; take note of your own visits and compare them to your end users’ records. Each visit to a page will generate a new unique identifier; it is deliberately not stored in a cookie, to maintain the low profile of this code. A small modification could be made to rectify this, but for most purposes you are interested in a bird’s-eye view of how the page performs, so it is not required.
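Once a few records are in, the analysis is just arithmetic on the ts and te columns. A hypothetical sketch of turning raw rows (mirroring the columns the logger sends; `elapsedByVisit` is my own illustrative name) into per-checkpoint elapsed times, grouped by the unique ID:

```javascript
// Group raw log rows by uid and compute the elapsed milliseconds from
// page start (ts) to each checkpoint (te) for every page view.
function elapsedByVisit(rows) {
  var visits = {};
  for (var i = 0; i < rows.length; i++) {
    var r = rows[i];
    if (!visits[r.uid]) visits[r.uid] = [];
    visits[r.uid].push({ pg: r.pg, elapsedMs: r.te - r.ts });
  }
  return visits;
}

var visits = elapsedByVisit([
  { uid: '1000_0', pg: 'page onload', ts: 1000, te: 1850 },
  { uid: '1000_0', pg: 'after massive js function', ts: 1000, te: 3200 }
]);
// visits['1000_0'][0].elapsedMs is 850; [1].elapsedMs is 2200
```

Comparing these per-visit numbers between your own records and the end user’s is where the gap, if there is one, becomes visible.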

If you find that there is a large lag between times, start dotting further calls to the log_time() function around the page, altering the 2nd parameter to identify each checkpoint. For example:

//extra calls to log function
log_time(ptim,'before massive js function','');
log_time(ptim,'after massive js function','');

This will tell you how long a function is taking to process. It is best to ‘fence’ code one portion at a time, or you will end up with hundreds of db records that may be of little use, not to mention the extra load added to your already “slow” page.
