How to improve the speed of your web page: the basics
Is saving a few seconds really important?
In an ideal world every web page would appear the instant we demanded it. However, achieving good performance has costs, and those costs need to be weighed against the benefits. Features and complexity sell, and performance is just one feature among many: easy to ignore and tough to sell, especially if it is, say, the difference between 10 seconds and 5 seconds.
The benefits of speed are often underrated and can be difficult to appreciate. A fast application is easier to use, not just because there is less frustrating waiting, but because it is easier to explore. If you click in the wrong place and the next piece of information appears instantly, you can cheaply try the other options on the page. If every option takes ages, you become more cautious in your decisions, each interaction takes longer to complete, and you stop exploring, so you never find new and better pathways through your task.
Speed does more than encourage exploration (it is quick to get back to a known pathway); it also builds confidence that the application is well made. That confidence has its own unseen benefits: a user who trusts the application is more likely to assume a problem is of their own making rather than the application's, and will try to resolve it themselves before raising a support call.
People also adapt their working practices to save themselves time: what the developer sees as the "correct" pathway is often abandoned simply because the "incorrect" pathway is faster.
Saving time, even a few seconds, can add up to far more than the sum of its parts.
What do I do next?
Well done, you have managed to secure resource to improve performance, but where do you start? There is only one place to start: measuring. Always begin by monitoring and measuring how long the page takes to display. Logging the time taken does itself slow the page down slightly, which may be a concern in a live environment; on the other hand, the bottlenecks you spot in a test environment may not fully reflect your live platform.
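As a minimal sketch of such a measurement, the Navigation Timing API exposed by most browsers reports how long the current page took to load. Where you send the numbers is up to you; here they just go to the console:

```javascript
// Minimal page-load measurement using the Navigation Timing API.
window.addEventListener('load', function () {
  // Timings are not final until the load event has completely fired,
  // so defer the reading to the next tick.
  setTimeout(function () {
    var t = window.performance.timing;
    var pageLoadMs = t.loadEventEnd - t.navigationStart;            // full load
    var domReadyMs = t.domContentLoadedEventEnd - t.navigationStart; // usable DOM
    console.log('Page load: ' + pageLoadMs + ' ms, DOM ready: ' + domReadyMs + ' ms');
  }, 0);
});
```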
For the remainder of this article I am going to focus purely on HTML, CSS and JavaScript and on page rendering performance. To truly improve performance you need to consider all factors, and some of the best gains come from a more holistic approach. For example, a great performance pattern is to anticipate the next request and cache its result. This works extremely well in a string-find UI: before the user clicks "Find Next", the system has already calculated where the next match is and can jump to it immediately. Simple approaches like this can sidestep the need for massive amounts of processing power or super-efficient algorithms. Most of my own best gains have come from "back end" work, but there is plenty that can be done at the "front end" too.
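As a rough sketch of that anticipate-and-cache pattern (the document text, search routine and function names here are invented for illustration, not taken from any real find UI):

```javascript
// Sketch of anticipatory caching for a "Find Next" feature.
var text = 'the full document text would live here, the whole of it';
var currentIndex = 0;
var nextMatch = -1; // cached position of the next match

function findNextMatch(fromIndex, needle) {
  // Stands in for the potentially slow search; here just an indexOf.
  return text.indexOf(needle, fromIndex);
}

function precomputeNext(needle) {
  // Run the expensive search ahead of time, while the user is reading.
  nextMatch = findNextMatch(currentIndex + 1, needle);
}

function onFindNextClicked(needle) {
  // The answer is usually already cached, so the jump feels instant.
  var match = (nextMatch !== -1) ? nextMatch
                                 : findNextMatch(currentIndex + 1, needle);
  currentIndex = match;
  // ...scroll the UI to the match here...
  precomputeNext(needle); // start working on the one after
}
```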
Minimise
It is not exactly rocket surgery to realise that reducing the amount of data sent to the browser will improve performance. This is often the simplest change to make, and most of the time it gives the biggest boost.

A new TCP/IP connection will initially send only around 14 kb of data before waiting for an acknowledgement, then send the rest. If the page is being accessed over cellular data, or perhaps a satellite connection, the latency of the connection becomes an issue. Latency can run to hundreds of milliseconds, so if you are aiming for great responsiveness that initial 14 kb matters. If at all possible, provide all of the data necessary to display the visible portion of the page within the first 14 kb. To achieve this it is vital to minimise everything that is being sent.
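One way to squeeze the visible page into that first window, sketched below with placeholder file names and styles, is to inline the small amount of CSS needed for above-the-fold content and let the full stylesheet load afterwards:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example</title>
  <!-- Inline only the CSS needed for the visible area, so the
       first ~14 kb can render something useful on its own. -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .masthead { background: #2a7f2a; color: #fff; padding: 1em; }
  </style>
  <!-- The full stylesheet (placeholder name) arrives later. -->
  <link rel="stylesheet" href="site.css">
</head>
<body>
  <div class="masthead">Visible content first</div>
</body>
</html>
```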
Let's take a look at a corporate website as an example. City Link is a large UK logistics company. On a decent broadband connection their website pops up pretty quickly, but it could be faster.
Images
The home page comes in at a total of 567 kb, of which 256 kb is images. Most of these are not compressed as well as they could be: it is possible to compress them further, without any loss of image quality, and achieve a reduction of around 153 kb. You could also argue that some of the images could be replaced with HTML and CSS for a much larger saving; if there is scope for that, it would obviously be a good place to investigate next.
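As a hedged sketch of that kind of replacement (the class name and colours are invented for illustration), a gradient button that might otherwise be a downloaded image can often be recreated in pure CSS:

```css
/* Illustrative only: a button rendered in CSS instead of an image. */
.cta-button {
  display: inline-block;
  padding: 8px 18px;
  color: #fff;
  background: linear-gradient(#6abf4b, #3a8f2a); /* replaces a gradient image */
  border: 1px solid #2a7f2a;
  border-radius: 4px;
}
```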
HTML
The HTML for the page is 29 kb; just removing carriage returns and unnecessary spaces drops it to 22 kb. The results are less impressive once you consider that the page is sent over the line compressed, but we still see a drop from 7 kb to 6 kb in what actually travels over the wire.
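As a trivial before-and-after illustration of that whitespace stripping (the markup itself is made up):

```html
<!-- Before: formatted for humans -->
<ul class="nav">
    <li>
        <a href="/track">Track a parcel</a>
    </li>
</ul>

<!-- After: the same markup minified for the wire -->
<ul class="nav"><li><a href="/track">Track a parcel</a></li></ul>
```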
CSS
Again, none of the CSS files are minified. The server's built-in compression hides some of the benefit, but it is still possible to save 3 kb off the 30.6 kb total without even trying to combine CSS rules or cull any unnecessary ones.
JavaScript
The page loads 17 JavaScript resources totalling 245 kb of data. Again, much of this is not minified, and it is easy to save 8 kb of data just by minifying the JS.
Requests
The page makes a total of 52 requests to fetch all of the required files. Each request carries a noticeable overhead, affecting both how quickly the data reaches the user and the amount of traffic needed to carry it. A number of the images could be combined into a single image and the individual images sliced back out with CSS (see the sketch below). As the images are almost entirely green, yellow and white, the palette is not diverse, so the combined image will compress well and can substantially reduce the overall image weight of the page.

Combining the CSS will have a similar effect. There are four CSS files; although one is served from Google, the other three can be combined quite easily to reduce the number of requests.
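A minimal sketch of that image-combining (sprite) technique, with invented file names and coordinates:

```css
/* All the small icons live in one combined image (placeholder name),
   so the browser makes one request instead of many. */
.icon {
  width: 16px;
  height: 16px;
  background-image: url('sprites.png');
}
/* Each icon is sliced out by shifting the background position. */
.icon-track { background-position:   0    0; }
.icon-quote { background-position: -16px  0; }
.icon-depot { background-position: -32px  0; }
```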
Load the JavaScript as late as possible
JavaScript blocks the rendering of the page; moving it from the header to the end of the body often gets the visuals up and running before the event handling in the JavaScript is needed.
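A minimal sketch of this, assuming a single application script (the file name is a placeholder):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example</title>
  <!-- No blocking <script> tags here: the page can start rendering. -->
</head>
<body>
  <p>Visible content renders first...</p>

  <!-- Loaded last, once the visuals are already on screen.
       Adding the defer attribute achieves a similar effect
       for scripts that must stay in the head. -->
  <script src="app.js"></script>
</body>
</html>
```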
Do not load what you do not need
I might be missing something in the code, but I cannot see where Jquery.mousewheel.min.js is used. If it is indeed unused, removing it gets rid of an unnecessary HTTP request and 1 kb of data, and stops the browser delaying rendering while it parses the file. Try to load JavaScript, or any other resource, as conditionally as possible. Yes, the browser will cache resources, so the overall effect might not be too bad, but getting rid of unnecessary parsing improves responsiveness, even if only a little.
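A hedged sketch of conditional loading (the element id and file name are invented): inject the script only when the page actually contains the feature that needs it:

```javascript
// Load a script on demand instead of on every page.
function loadScript(src, onload) {
  var s = document.createElement('script');
  s.src = src;
  s.onload = onload;
  document.body.appendChild(s);
}

// Only pages with a carousel pay the cost of the carousel code.
if (document.getElementById('carousel')) {
  loadScript('carousel.js', function () {
    initCarousel(); // assumed to be defined by carousel.js
  });
}
```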
Fix any broken HTML
When the HTML contains errors the browser triggers its error handling. This is only a fraction slower in itself, but it may trip the browser into quirks mode, which in turn tends to lead to extra CSS and HTML to "fix" the problem. Running a validator such as HTML Tidy is invaluable in preventing this. City Link only triggers 7 warnings, far fewer than the majority of pages I have looked at, and those 7 could quickly be addressed for a perfect score.

Sadly, I have used technologies which prevent you from getting a perfect score; for example, they force you to include lang or type='text/javascript' attributes even when you are working in HTML5, because the source will not compile otherwise. Fixing broken HTML is not always possible, but when it is, try to use valid HTML and rid your page of those warnings.
Do not use XHTML
Some websites, City Link among them, use XHTML. Browser support for this version of HTML is poor (browsers tend to just parse it as if it were HTML), it is not compatible with the XHTML 2 draft, and, perhaps more importantly, it requires more verbose markup than HTML5. Save precious bandwidth and choose HTML5.
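As a small illustration of that verbosity difference, compare just the opening boilerplate of the two:

```html
<!-- XHTML 1.0 Strict opening boilerplate -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">

<!-- The HTML5 equivalent -->
<!DOCTYPE html>
<html lang="en">
```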