During 2016 I decided that I wanted to blog again. Long ago this website was a WordPress install with my front page, some blog content, and the portfolio for my master's in Library Science. I did not use the site much, and I paid for a MySQL database for no real reason except that WordPress needed it. I eventually got to the point where I could tweak some settings, though it was a challenge at the time, and I did not have much to say then.
I did bring my website back, and for most of its current iteration it stayed a static single page: just my contact details, a short biography, and not much else. I liked the idea of a static site generator, but learning how a specific generator works stayed on the back burner for a while. I ended up choosing Pelican, a Python static site generator, though ultimately it could have been any of them provided I did not mind maintaining it. Until recently it was not so common to have Node, Ruby, PHP, and Python all available on the same cheap, no-support web host. Now I can keep all my content on the server and describe a publishing pipeline.
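Since everything lives on one host, that pipeline can be sketched in a few commands. This is only an illustration, assuming Pelican's usual `pelican content -s publishconf.py` invocation; the output directory and web root path are hypothetical:

```python
# Hypothetical publish pipeline: regenerate the site with Pelican, then
# sync the rendered output into the web root. The settings file name,
# output directory, and web root here are assumptions, not from the post.
import subprocess

def build_publish_commands(output_dir="output", web_root="/var/www/site/"):
    """Return the commands for a minimal regenerate-and-deploy pipeline."""
    return [
        # Pelican's standard invocation: render content with publish settings
        ["pelican", "content", "-o", output_dir, "-s", "publishconf.py"],
        # Copy the rendered site into place, removing stale files
        ["rsync", "-a", "--delete", output_dir + "/", web_root],
    ]

def publish():
    """Run each pipeline step, stopping if any command fails."""
    for cmd in build_publish_commands():
        subprocess.run(cmd, check=True)
```

A cron job or git post-receive hook could call `publish()` so the site rebuilds whenever content changes.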
Once I started to write blog posts again, I chose a theme, set it up, and began to look at performance. I recently removed the Bootstrap JavaScript and jQuery to lighten the load, though I probably broke some mobile functionality in the process. I used Google's PageSpeed Insights to measure the site's performance and started to tweak things so it would load faster.
At the beginning of this year I found this amazing article about performance, and I want all of my future projects to take it into serious consideration. Performance and security are a tough balance against developer productivity. Right now, most web technologies seem to be chosen for developer productivity and shininess, not always for the users' best interests. Online news sites and magazines show how far this can go: many people have reported janky experiences reading a simple news article loaded down with analytics trackers, external libraries that monitor everything, piles of extra ads and injected content, and modals nagging us to add/subscribe/sign up.
There are technical challenges and tradeoffs in aiming for performance. Ideally, many of the measures in the checklist can be adopted on their own rather than requiring everything up front.
First, look at your web server and make sure you are returning responses quickly. It might take a lot of troubleshooting and debugging to find out why responses are slow. Does your page query the database a lot to render one page? Can caching be used at all?
It is OK to send a lot of assets to the user, especially when they expect it, such as on a marketing website for products. However, what can be done for that initial paint? According to the checklist, an initial paint under 1250 milliseconds on desktop and 3 seconds on mobile is a big goal. The ideal experience is that the end user begins to see content quickly and can do some basic interaction while all the nice stuff loads in over time. This might not be an easy task for an existing website.
Minifying the generated HTML helps here too. These are the options I use (the names match the html-minifier package's API):

removeAttributeQuotes: true,
removeRedundantAttributes: true,
removeComments: true,
removeCommentsFromCDATA: true,
removeCDATASectionsFromCDATA: true,
collapseBooleanAttributes: true,
useShortDoctype: true,
I will be using this performance checklist in my future applications, but I wouldn't expect every project to adopt every single item.