Myself, Coding, Ranting, and Madness

The Consciousness Stream Continues…

Keeping the Blog Fast

23 Sep 2013 8:00 Tags: Blog, Programming, Web Design

Although I’ve dropped off the front page for any useful set of keywords in Google[1] (now 11th for “C++ for Physicists”[2]), I still take some amount of pride in this blog. Some of the content is even readable, and the underlying tech does seem to work.

One of my guiding principles in the design of the blog has been to keep it really fast and really technically simple. I’ve discussed some of the server-side work in previous posts. What I have been working on more recently is reducing what is sent to the browser.

The site no longer uses any form of cookie, nor any form of client-side scripting (JavaScript, etc.). There have never been many images which were not part of a post; the icons you see are encoded straight into the style sheet using the data-URI standard[3]. Whether this is more efficient than using a sprite system is something I have not tested; however, a sprite requested from within the style sheet has the disadvantage that its HTTP request cannot begin before the CSS has been parsed (and, depending on how the CSS engine is implemented, applied to the page).
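
As a rough sketch, a data-URI icon in the style sheet looks something like this (the class name is illustrative and the base64 payload is truncated):

    .icon-feed {
        /* The image bytes are base64-encoded straight into the rule,
           so no separate HTTP request is needed for the icon. */
        background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUg...");
        background-repeat: no-repeat;
    }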

I’ve also stripped the styling down to something simpler, no longer reliant on web fonts; the currently available font APIs don’t leverage caching in a useful way, and the size of a font file outweighs the advantage of slightly nicer type. Selecting a list of fonts such that almost all browsers render in a similar fashion is relatively simple these days[4].
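
A fallback stack along these lines (the exact list is illustrative, not necessarily what the site uses) renders near-identically almost everywhere without shipping a font file:

    body {
        /* The browser walks the list and uses the first font it has;
           the generic sans-serif at the end is the safety net. */
        font-family: Helvetica, Arial, sans-serif;
    }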

The remainder – content and style sheet – is publicly cacheable, as it appears the same for all users of the site; there are no cookies to get in the way, and no user-specific content. The page itself is also quite simple – there are no embedded iframes for the social sharing code (a methodology I don’t approve of; I was recently asked at work if we could build something similar, and I have yet to work out why other sites decided to do it this way). I have in the last few weeks added the metadata to make social sharing work better – you should try it and share your favourite post with all your friends.
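
As a sketch, marking a PHP response as publicly cacheable takes only a header or two; the one-day lifetime here is an arbitrary choice:

    <?php
    // Any shared cache (proxy, ISP, etc.) may store this response,
    // and clients may reuse it for 24 hours before asking again.
    header('Cache-Control: public, max-age=86400');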

Where can I go from here? Quite a few places, as it happens. Currently, there is still a noticeable delay when you load a landing page; the script takes ~20–30ms to execute and generate a response, and there are the overheads of connecting to my server and of the browser performing the initial layout and render[5]. This request also feels longer because of the rendering itself – when you’re already on the site, the page changes in a relatively subtle way, and your brain doesn’t match up the times of click and load quite as well.

There are a lot of little tricks I could perform here, which would also speed up the site in general. Firstly, I could jump from PHP to another CGI platform; for a noticeable increase in speed, I would look to use something like C, with some optimisations to the string handling to avoid large numbers of string allocations and de-allocations. In FCGI mode, this should be extremely fast, as both the string buffers and the database connections used by the native code would stay in RAM between requests.
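
For illustration, a minimal FastCGI program in C using the fcgi_stdio wrapper from the FastCGI development kit might look like this; everything set up outside the accept loop persists between requests:

    #include "fcgi_stdio.h"  /* wraps stdio so printf() writes to the web server */

    int main(void) {
        /* Buffers and database handles created here live for the whole
           process, not just for one request. */
        while (FCGI_Accept() >= 0) {
            printf("Content-Type: text/html\r\n\r\n");
            printf("<p>Served by a persistent native process.</p>");
        }
        return 0;
    }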

However, there are improvements that could be made even on this: any operation on the database is going to take some amount of time, and the overhead of either spawning a CGI process or communicating with an FCGI daemon is problematic. What we can do instead, for a site of this size, is build the entire thing into RAM and let the web server itself handle everything else. Doing this isn’t actually that hard with my current Apache setup, although it would require me to add mod_rewrite to the configuration, along with the overheads it incurs.

The principle after that, however, is simple: if the requested file exists in the specified directory (for this, I would probably use /dev/shm/blog), Apache serves it directly to the user; otherwise, it falls back to passing the request on to PHP. For all ‘cacheable’ resources, PHP writes the output both back to Apache and into the specified folder.
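
A hedged sketch of the Apache side (Apache 2.4 syntax; the /cache alias and /index.php front controller are assumptions about the setup, not the real configuration):

    # Expose the RAM disk to Apache under a URL prefix.
    Alias /cache /dev/shm/blog
    <Directory "/dev/shm/blog">
        Require all granted
    </Directory>

    RewriteEngine On
    # If a pre-rendered copy exists on the RAM disk, serve it via the alias...
    RewriteCond %{REQUEST_URI} !^/cache/
    RewriteCond /dev/shm/blog%{REQUEST_URI} -f
    RewriteRule ^(.*)$ /cache$1 [PT,L]
    # ...otherwise hand the request to the PHP front controller.
    RewriteCond %{REQUEST_URI} !^/cache/
    RewriteCond %{REQUEST_URI} !^/index\.php$
    RewriteRule ^ /index.php [PT,L]

On a miss, the PHP script renders the page as normal and, for cacheable responses, also writes the generated HTML into /dev/shm/blog, so the next request never reaches PHP at all.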

For non-GET requests, such as posting a comment, you have to add some extra logic to the backend code. If someone posts a comment, you have to invalidate (delete) the cached copies of every index page the post appears on, and of the post’s page itself, so that they are regenerated when next requested. A similar effect can be achieved with a server-side time-based cache, at the cost of speedy updates.
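
In PHP, both the write-back and the invalidation are just file operations; this sketch assumes the /dev/shm/blog layout from above, and the $html, $postSlug, and $tags variables are hypothetical names:

    <?php
    $cacheRoot = '/dev/shm/blog';

    // On a cacheable GET that missed: store the rendered page so Apache
    // serves it directly next time (assumes the sub-directory exists and
    // the URI is a clean path with no query string).
    file_put_contents($cacheRoot . $_SERVER['REQUEST_URI'], $html);

    // On an accepted comment: delete every cached page the post appears
    // on, so each one is regenerated on its next request.
    $stale = array($cacheRoot . '/posts/' . $postSlug . '.html',
                   $cacheRoot . '/index.html');
    foreach ($tags as $tag) {
        $stale[] = $cacheRoot . '/tags/' . $tag . '.html';
    }
    foreach ($stale as $file) {
        if (file_exists($file)) {
            unlink($file);
        }
    }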

Both of these changes are currently more effort than I care to go to. There is a third option that I did actually consider: analysing the rendering cycle of the page. A poorly designed page can be quite computationally expensive to composite and render, due to multiple translucent layers, a large number of complicated sizing or cropping rules, poorly sized images, and other visual ‘enhancements’. You can get a reasonable amount of information about page rendering from the developer tools in most major browsers; the conclusion I quickly came to, however, is that I would only see a noticeable improvement by dropping the layout and/or the colours entirely.
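
For what it’s worth, the kind of rule that tends to show up as expensive in those tools looks something like this (an illustrative pattern, not this site’s actual CSS):

    /* Stacked translucent layers force the browser to blend every
       overlapping pixel on each repaint. */
    .panel {
        background-color: rgba(255, 255, 255, 0.6);
        box-shadow: 0 0 30px rgba(0, 0, 0, 0.5);
        opacity: 0.9;
    }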

So, the site is fast, but out of the search rankings. The site will continue to be fast, and it might one day undergo some form of SEO. In the meantime, keep reading, send suggestions.

  1. https://google.com/search?q=c++ for physicists
  2. I’m not entirely sure how I feel about this. It was never a post worthy of the front page, but it is still a shame to not be there anymore.
  3. A thing of which I highly approve.
  4. All hail the mighty Helvetica.
  5. Whether modern browsers perform some kind of layout caching is something I’m not sure of, but watching how some page loads happen suggests they may.