How To Speed Up Your Website By 80% Or More

“Speed” is what they should have called a 3-page whitepaper. Instead they wrote a 100-page book and sold it for $30 ($20 on Amazon). I am talking about High Performance Websites. I don’t like to rant about books, since I believe you can never read too many, but in this case paying that much money for a 2-hour read tested even my patience. And still I would have been happy if it was 100 pages packed full of awesome content. But, you guessed it, if you cut out the filler you could really fit all the useful info into about 3 pages (which would have made those 3 pages a really awesome resource).

Still, I can’t be 100% critical: the book did teach me a few things I didn’t know before, and if you’re predominantly a back-end developer you will probably pick out a few useful tidbits as well. Even so, after you finish it you kind of wish you had stopped reading after the table of contents, which would have covered 80% of the useful info in the book (but of course you wouldn’t know this until you’ve read through the whole thing). Luckily, since I’ve already been through it, I can save other people the time and the money and create a summary – which is what this book should have been to start with.

The Summary

If you examine the HTTP requests for how a web page is loaded in a browser, you will see that at least 80% of the response time is spent loading the components on the page (scripts, images, CSS etc.) and only about 20% is spent downloading the actual HTML document (that includes all the back-end processing). It therefore behooves us to spend some time on front-end optimization if we want to significantly speed up our website loading times. There are 14 main points to look at when we’re trying to do this:

1. Try to make fewer HTTP requests

  • try using image maps instead of having separate images
  • you may also try using CSS sprites instead of separate images
  • it is also sometimes possible to inline the images in your HTML page (base64 encoded)
  • if you have multiple JavaScript or CSS files, get your build process to combine these into one master file (one for CSS one for JavaScript)
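
As a sketch of the CSS sprite approach (the class names and sprite file name here are made up for illustration): two icons are stored in a single image, and background-position selects which part of it is shown, so one HTTP request serves both.

```html
<!-- icons.png (hypothetical) stacks two 16x16 icons vertically -->
<style>
  .icon        { width: 16px; height: 16px; display: inline-block;
                 background: url(icons.png) no-repeat; }
  .icon-home   { background-position: 0 0; }     /* top icon */
  .icon-search { background-position: 0 -16px; } /* icon below it */
</style>
<span class="icon icon-home"></span>
<span class="icon icon-search"></span>
```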

2. Use a content delivery network (CDN)

  • a content delivery network is a collection of web servers distributed across multiple locations
  • this allows browsers to download from servers that are geographically closer, which can speed up download times
  • there are several CDNs that major websites use, e.g. Akamai, Mirror Image, Limelight etc.

3. Add a far future Expires header to all your resources

  • more specifically, add a far future expires header to allow the browser to cache resources for a long time
  • you can use Apache mod_expires to take care of this for you
  • you don’t get the savings the first time users visit (obviously), only on subsequent visits
  • you should add far future expires headers for images, scripts and CSS
  • you should introduce revision numbers for your scripts and CSS to allow you to modify these resources and not worry about having to expire what is already cached by the browser
  • you can hook creating revision numbers for your scripts and CSS into your build process
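
With Apache’s mod_expires, for example, this is a few lines of configuration (a sketch; adjust the MIME types and lifetimes to your site):

```apache
# Requires mod_expires to be enabled
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 1 year"
    ExpiresByType text/css               "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```

With a one-year lifetime, a changed stylesheet must be given a new URL (e.g. styles.v2.css – a made-up naming scheme), otherwise returning visitors keep the stale cached copy; this is what the revision numbers above are for.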

4. Gzip components

  • you should gzip your HTML pages, scripts and CSS when they are sent to the browser
  • you can use Apache mod_gzip (for 1.3) or mod_deflate (for 2.x) to handle all this for you
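
A minimal Apache 2.x mod_deflate sketch, compressing text responses only (images are already compressed, so gzipping them buys little):

```apache
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```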

5. Put stylesheets at the top (in the document HEAD using the LINK tag)

  • we want the website to render progressively in the browser (i.e. to show content as it becomes available), but many browsers will block rendering until all stylesheets have loaded, so loading stylesheets as soon as possible is preferable
  • having CSS at the top may actually make the total page load a little slower (the browser may download stylesheets it doesn’t strictly need yet), but the page will feel faster to users due to progressive rendering

6. Put scripts at the bottom

  • according to the HTTP/1.1 spec, a browser should open no more than two parallel connections per hostname, so splitting components across multiple hostnames can improve parallelism
  • scripts block parallel downloads; having scripts at the top will block all other components from downloading until the scripts have finished loading
  • having scripts at the bottom allows all other components to load and take advantage of parallel requests
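
Putting rules 5 and 6 together, a skeleton page looks like this (file names are placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- stylesheet in the HEAD so the page can render progressively -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <p>This content renders without waiting for the script below.</p>
  <!-- script last, so it doesn't block the downloads above -->
  <script src="app.js"></script>
</body>
</html>
```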

7. Avoid CSS expressions

  • CSS expressions are re-evaluated very frequently (e.g. on mouse moves and window resizes), so they can degrade performance even after the page has loaded
  • instead use one-time expressions or better yet use event handlers
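
For reference, a CSS expression looked like this (an IE-only feature; the selector and values here are made up):

```css
/* Re-evaluated constantly: on mouse moves, resizes, scrolls... */
#sidebar {
  width: expression(document.body.clientWidth > 800 ? "200px" : "100px");
}
```

A plain window.onresize handler that sets the width from JavaScript achieves the same effect and runs only when the window actually resizes.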

8. Make JavaScript and CSS external

  • if users visit infrequently, you’re better off inlining your CSS and JavaScript in your HTML, since the page is unlikely to be in the browser cache anyway and inlining minimizes requests
  • if users visit frequently, you’re better off having separate files for your CSS and JavaScript, as this allows the browser to cache these components; the browser then only needs to fetch the HTML page, which is smaller because the CSS and JavaScript are externalized

9. Reduce DNS lookups

  • if the browser or OS has a DNS record in its cache, no DNS lookup is necessary, which saves time
  • you can use Keep-Alive to avoid DNS lookups: if there is an existing connection, no DNS lookup is needed
  • if there are fewer hostnames, fewer DNS lookups are needed, but more hostnames allow more parallel requests

10. Minify your JavaScript

  • this means removing unnecessary characters from your scripts, such as whitespace and comments, which makes the scripts much smaller
  • you can also obfuscate, but the extra savings compared to minification are not worth it, especially if gzip is used
  • you can use JSMin to minify your JavaScript
  • you can also minify inline scripts
  • minifying CSS is possible but usually not worth it

11. Avoid redirects

  • a redirect means an extra request, and all other components are prevented from loading in the meantime, which hurts performance
  • don’t use redirects to fix trivialities such as a missing trailing slash; this can be handled through Apache configuration
  • you don’t need to use redirects for tracking internal traffic, you can instead parse Referer logs
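
For the trailing-slash case, one possible Apache sketch using mod_rewrite (the path is hypothetical, and the exact pattern depends on whether this sits in server config or .htaccess, so verify it for your layout):

```apache
# Internally serve the directory index for /blog instead of letting
# mod_dir answer with a 301 redirect to /blog/
RewriteEngine On
RewriteRule ^/blog$ /blog/index.html [L]
```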

12. Remove duplicate scripts

  • duplicate scripts often creep in; this makes web pages larger and requires more requests, which hurts performance
  • implement processes to make sure scripts are included only once
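
One way to implement such a process in a templating layer is a single include function that remembers what it has already emitted (a toy sketch; the function and variable names are invented here):

```javascript
// Track which scripts have been emitted so each is included only once
var includedScripts = {};

function insertScriptOnce(src, out) {
  if (includedScripts[src]) return;  // already on the page: skip it
  includedScripts[src] = true;
  out.push('<script src="' + src + '"><\/script>');
}

var pageHtml = [];
insertScriptOnce("menu.js", pageHtml);
insertScriptOnce("menu.js", pageHtml);  // duplicate include is ignored
```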

13. Configure or remove ETags

  • ETags are used by servers and browsers to validate cached components
  • if you’re using Expires headers, the last modified date may be used by browsers to check whether a component needs to be re-fetched; ETags are an alternative to the last modified date
  • the problem is that ETags are typically constructed to be specific to a single server, which is an issue when a site is distributed across several servers
  • there are apache modules that can customize ETags to not be server specific
  • if the last modified date is good enough, it is best to remove ETags
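
Removing ETags in Apache is two lines (the Header directive assumes mod_headers is enabled):

```apache
# Stop generating ETags and strip any that other modules set;
# browsers then fall back to If-Modified-Since validation
FileETag None
Header unset ETag
```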

14. Make AJAX cacheable

  • the same rules as above apply to AJAX requests as to all other requests; it is especially important to gzip components, reduce DNS lookups, minify JavaScript, avoid redirects and configure ETags
  • try to make the response to AJAX requests cacheable
  • add a far future Expires header for your cacheable AJAX requests
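
One common way to make such responses safely cacheable is to embed a version or last-modified timestamp in the request URL, so the URL changes whenever the underlying data changes (a sketch; the function name and endpoint are made up):

```javascript
// Build an AJAX URL keyed by the data's last-modified timestamp: the
// response can then carry a far future Expires header, and any change
// in the data produces a new URL that bypasses the stale cached copy.
function cacheableUrl(endpoint, lastModified) {
  return endpoint + "?t=" + lastModified;
}

// e.g. pass the result to XMLHttpRequest:
var url = cacheableUrl("/mail/headers", 1189377351);
```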

That’s it, that was the whole book without having to spend $20–30. There are so many things that could have been expanded on in this book: examples, step-by-step instructions, configuration snippets etc. With a bit more effort it could have been made into a valuable resource, worthy of its price tag. Alternatively, it could have been priced in a fashion commensurate with the level/amount of content. The way it stands, though, it was just a little annoying. Enjoy the summary.

For more tips and opinions on software development, process and people, subscribe today.

Image by Alex C Jones

  • Korny

    Or you could just install YSlow – which would tell you most of those things, for free.
    I’d also say, it’s worth taking some of these things with a pinch of salt – sure, minifying your JS might speed things up – but it might be such a tiny speedup, that it’s not worth the extra pain and possible strange bugs – Javascript can be sensitive to line endings if you don’t have semicolons everywhere, so minifying Javascript can change behaviour!
    If in doubt, test the speed before and after – as always, beware premature optimisation.

    • The guy who wrote it is (or was) a yahoo guy, and yslow is a yahoo tool, so it is interesting that yahoo has released all this info for free, and yet the book was still published.

      Taking it all with a grain of salt is definitely good advice. Never a good idea to wholeheartedly jump into something without thinking.

  • John

    Sounds like they just published the YSlow guide [] in dead tree form.

    • As I mentioned above, yahoo guy – yahoo tool. It looks like a fail on the part of the publisher to allow this book to be printed.

  • mat

    not only was the book short but it was really thin too! cheated in two dimensions!!!

    • Ehehe, if only it hadn’t been wide enough, we could have made it three :).

  • Google Page Speed is clearer than YSlow IMO.

    • Also a good resource, just another reason to ask why the book got published in the first place.

  • Haha, right when I started reading your post, and saw the fact that it was talking more about front end, YSlow was the first thing I thought about, but it looks like you’ve gotten more than enough comments on it :) Anywho, regardless, thanks for the good refresher of the information, and for saving me the trouble of having buyer’s remorse, heh heh.

    • Hehe, yeah people were right on to it with Yslow, good to see that so many people are ‘on the ball’ so to speak :).

  • Good work Alan! You saved several hours for several people.

  • Thanks for sharing the information.It is valuable.

  • Do you use a content delivery network?
    I’m curious to see how it scales with WordPress since the content is dynamic and all.

    • I think a CDN is one of those things you would only consider if you’re a pretty big website (i.e. top few thousand); otherwise it is not worth the hassle and cost. That’s what it seems like to me anyway, I would be happy to hear other opinions though.

      • yeah, that makes sense.
        I was just curious if it would even work with wordpress.

  • Dan

    Great article. I would also try Google Page Speed as an alternative to YSlow.

  • I kinda see this from both perspectives…

    I buy a lot of books for myself as well as for the developers in my organization for the development library. I try to get books that I have read or that are reviewed well. $30–50 for a book that has poor value can be painful.

    However, as a first-time author (I just finished my book: “Next Generation Business Intelligence Software using Silverlight 3”), I knew beforehand I can’t please everyone. If you are familiar with either technology, Silverlight or BI, this can mean that you gain less value from investing in my book. It’s difficult to pack everything in, even in a 400-page book. Some people will feel it is light on content in some areas.

    Most “intro” technology books are just well organized/researched information from the Internet. How many books have you read where 3 pages are taken up listing all the properties/definitions of a particular class from an API?

    I am not defending that book (because I have not read it), but I have in the past valued books as resources immensely even though they were “re-hashes”…because they shed light on the subject in a unique way or explained things clearer.

    • There is nothing wrong with presenting existing info in a unique way or providing introductory content as long as you:
      a) set expectations correctly, i.e. present it as introductory rather than as reference or exhaustive resource
      b) price it correctly
      It would even be ok to present info which is mostly available from one source, but you need to make sure you bring perspective and insight, if you’re going to be charging people money.

  • Any of you people who are badmouthing the book and/or publisher clearly have not read the book.

    Yes, it is based on the YSlow rules. Steve Souders wrote the rules and also the book. The book is so much more than a single sentence saying “Do this” or “Don’t do this”.

    It explains the rules and dives deep into the reasons behind certain browser and server behaviors.

    Surely everyone here doesn’t know everything about the Max-Age header or how proxies obey caching and gzipping rules.

    This book is awesome for everyone, even the most seasoned web developers. Steve’s second book, Even Faster Websites, improves upon the first and really explains more things even further.

    • I certainly have read the book, and as I mentioned, it did teach me a few things I didn’t know. My gripe was with the fact that the amount of content and useful information was not commensurate with the price of the book. And let’s face it, aside from a few things (a couple of which you mentioned), there was quite a bit of filler in the book. Which is normally fine, but for a 100-page book worth that much money it is not acceptable in my opinion. I was able to summarize 95% of the book’s content in one page; this should not really be possible with any technical book.


  • My suggestion is to host your media files (including images/CSS/JS) on another sub-domain instead of putting them on the main domain. It helps to improve download parallelism, and will improve your website loading speed for sure.

    Google Maps takes advantage of this technique, so I think this should be our first and major step in order to improve speed.

    • That’s a great suggestion, cheers for that.

  • Ruben Zevallos Jr.

    The problem is when you have a dynamic app where you cannot use the gzip cache, but you are right… minimize the code…

    I think you can add: tell users to go tableless, focus on HTML semantics and validate their HTML code… Tableless, semantic HTML is faster than the usual markup based on tables or nested <div> and <span> elements.

    • I wish more people thought like this, I really do. Having to deal with some heavy-weight frame-based, infinitely-nested website fails, is just not fun. No developer should put another developer through that kind of pain :).

  • Igoris Azanovas

    All those that “put another developer through that kind of pain” are not worthy of calling themselves developers, I think ;) Thank you for the post, I already knew many of these techniques from YSlow.

  • One of the great tools I used is

    it looks like you did a great job with optimization.

    • That actually looks pretty cool, thanks for sharing it.

  • The last step is to add a favicon.ico to your site….

  • This is an awesome post thanks for the info!

  • Johar Just Translate

    Thanks for this info :)