I’ve had this thought for quite some time, and it’s that most websites don’t need to be served dynamically. For example, most blogs powered by WordPress or Ghost will dynamically fetch the relevant content and build the page every time a visitor requests a URL¹.
There’s nothing stopping sites from being built dynamically, using centrally stored content and various templates that can be put together to build a complex website. But it should just happen once: the generated static content can then be efficiently served again and again, until the source content changes and triggers a rebuild.
This is much more relevant for blogs, since the content on the page doesn’t change, except for possibly a web font, or a JavaScript snippet for analytics or an advert. However, these are usually externally sourced, and won’t affect the static HTML that can be served to your users².
This may sound a bit ironic, since my blog currently runs on Ghost and serves content dynamically³. Although I am working towards a solution for that, by building my own static site generator, Arbok.
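To make the build-once idea concrete, here’s a minimal sketch in Swift. It assumes a folder of plain-text content files and a single hard-coded template, all of which are made up for illustration; this is just the idea, not how Arbok actually works.

```swift
import Foundation

// Build once: render every content file through a template a single
// time, then let any web server serve the resulting .html files.
// The "content" and "public" paths and the template are hypothetical.
let contentDir = URL(fileURLWithPath: "content")
let outputDir = URL(fileURLWithPath: "public")
let template = "<html><body><article>{{body}}</article></body></html>"

let fm = FileManager.default
try fm.createDirectory(at: outputDir, withIntermediateDirectories: true)

for file in try fm.contentsOfDirectory(at: contentDir, includingPropertiesForKeys: nil)
    where file.pathExtension == "txt" {
    let body = try String(contentsOf: file, encoding: .utf8)
    let page = template.replacingOccurrences(of: "{{body}}", with: body)
    let dest = outputDir
        .appendingPathComponent(file.deletingPathExtension().lastPathComponent)
        .appendingPathExtension("html")
    try page.write(to: dest, atomically: true, encoding: .utf8)
}
```

Re-running this whenever the source content changes is then the only “dynamic” step left.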
1. Yes, I’m sure some people have a caching mechanism installed, but I wouldn’t say it’s everyone, and it really masks a problem rather than fixing one. ↩︎
2. Another benefit of this is that you can bundle resources, such as any CSS styles, into the final .html file, which reduces the number of requests the browser needs to make when visiting your page. ↩︎
3. Although if you have a look at your browser’s web inspector, you’ll find that I’ve already done some work to reduce the size of my website. ↩︎
Random Prnt.sc #
Okay, so I was bored today, and that led to me building a website. Specifically one that lets you find random images that are hosted on Prnt.sc.
Background Information
Basically, there’s a screenshot utility called Lightshot, which allows you to quickly customise screenshots and upload them to the internet. These uploaded images can be found on a website called Prnt.sc, and they’re publicly available.
In fact, all you need in order to find an image on Prnt.sc is its six-character alphanumeric identifier, which is easily generated.
This afternoon I was playing around with random combinations, trying to find anything amusing. But I’m a lazy person, so I try to make any manual process easier.
My first idea was to somehow build a simple website that could actually fetch images from Prnt.sc and display them inline. However, due to cross-origin resource sharing, it seemed way too complex for a fun afternoon project. So I settled on simply generating random identifiers, and opening them in new tabs.
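The identifier generation itself is trivial. Here’s roughly what it boils down to, sketched in Swift (I’m assuming lowercase letters and digits; the exact character set Prnt.sc uses may differ):

```swift
// Generate a random six-character alphanumeric identifier and
// turn it into a Prnt.sc URL that can be opened in a new tab.
let alphabet = "abcdefghijklmnopqrstuvwxyz0123456789"

func randomIdentifier(length: Int = 6) -> String {
    String((0..<length).map { _ in alphabet.randomElement()! })
}

print("https://prnt.sc/\(randomIdentifier())")
```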
The website is now live, and you can view it at chrishannah.me/prntsc. As I mention in the footer of the page, it comes “built with minimal style”.
I’m not sure what type of aesthetic this is, but it always reminds me of the purity of the web. I much prefer a website that has well-structured HTML, and little to no CSS. I mean, I didn’t even add any styles to links, and it still looks good!
Sometimes I want to just change my blog completely to a static site, with a super basic design. But I’ll leave that to another day.
A Few Initial Notes on My Website Analytics Project #
I’ve done some minor research into this idea of mine, which really became a thing when I started making my blog super lightweight. And I really want to carry that over into whatever this project becomes.
Whatever I do will, of course, be personally oriented, and it will be packed full of decisions that wouldn’t work best for most people. But it’s a personal project first, and if it becomes more flexible and open in the future, that’s just a bonus.
What To Track
With Google Analytics, you get a whole bunch of stats. This can be really handy for someone trying to deeply understand interactions with a website, but it’s a bit over the top for the minimal use I want out of it. There’s also the added fact that you’re tracking your users; it’s not a big deal, but I’d rather not invade people’s privacy.
There are only a few metrics I want to capture: page views, referring websites, and possibly the number of sessions – although I don’t care about that too much. The first two can be captured by simply telling something which page has been loaded, and what referred it. Luckily for me, it’s all in the HTML DOM, and I think I’ll be able to do this super minimally.
The ways this basic data could be used are also rather interesting to me. When the message is received (by a server, or whatever), a date can be applied, which means the data can be sorted by date, collated into individual pages, and turned into some pretty cool graphs.
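As a sketch of what each record might look like (the field names are mine, not a final design), the client supplies the page and referrer, and the server stamps the date on arrival:

```swift
import Foundation

// One record per hit: the page and referrer come from the DOM
// (document.location and document.referrer); the date is applied
// by whatever receives the message, not by the client.
struct PageView: Codable {
    let path: String
    let referrer: String
    let receivedAt: Date
}

// Once dated, hits can be collated per page, per day, and so on.
let hits = [PageView(path: "/some-post/", referrer: "https://twitter.com/", receivedAt: Date())]
let viewsPerPage = Dictionary(grouping: hits, by: { $0.path }).mapValues { $0.count }
```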
How To Do It
As for the front-end implementation, I plan for this to be a simple PUT request, which will send the data mentioned above to whatever server is in control of the analytics. From there, it will require no more work from the client.
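In practice this would be a tiny JavaScript snippet in the page, but the shape of the request is simple enough to sketch with URLSession (the /track endpoint and payload are made up):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// A single PUT carrying the page and referrer; the client expects
// nothing back, so this is fire-and-forget.
var request = URLRequest(url: URL(string: "https://example.com/track")!)
request.httpMethod = "PUT"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = #"{"path": "/some-post/", "referrer": "https://twitter.com/"}"#
    .data(using: .utf8)

URLSession.shared.dataTask(with: request) { _, _, _ in }.resume()
```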
For the back-end, the speed and “heaviness” of the implementation isn’t super important at this beginning stage, because initially it will only serve myself, so there won’t be a big load put on it. But my first idea is to use a cloud server on Digital Ocean to host a Swift server app! Built using Perfect, because I had a great experience with it when I experimented with a text formatting API. There’s also the fact that I am mainly a Swift developer, so it’s more likely to get finished if I make use of that.
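I haven’t written it yet, but going from my experience with Perfect, the receiving end might look something like this rough sketch (route name and port are placeholders, and persistence is left out entirely):

```swift
import PerfectHTTP
import PerfectHTTPServer

var routes = Routes()
routes.add(method: .put, uri: "/track") { request, response in
    // The raw JSON body sent by the page; parse and store it here.
    let body = request.postBodyString ?? ""
    print("Received hit: \(body)")
    response.status = .ok
    response.completed()
}

do {
    // Launch a basic HTTP server with just the one route.
    try HTTPServer.launch(name: "analytics", port: 8181, routes: routes)
} catch {
    fatalError("Could not launch server: \(error)")
}
```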
Progress
As with all my other projects, I’ll be pretty vocal with the progress, and try to share as much as possible. This will be done mainly on Twitter, where you can follow me at @chrishannah, and if you want to know something I haven’t shared yet, just ask!
Small File Sizes and Quick Load Times #
I’ve been getting more obsessed with these two things recently, and you may have already noticed it with the recent “redesign”, if you can call it that. Basically, the design has been simplified even more, and a higher focus (like everyone always says) has been put on the content.
I’ve had this mindset towards website sizes, and how fast they should load, for quite a while. But it’s only in the past week or so that I’ve put effort into sorting out my website.
It started with optimising the images on the website, which consisted of resizing every image so that the width didn’t exceed 1400px and the height didn’t exceed 1200px. They’re not exactly small sizes, but we live in a Retina world, and I have to put up with that. On top of that, all PNGs were put through the highest compression in Squash 2, and any image currently on the front page (I really couldn’t be bothered to do this for every post) was converted to JPG.
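The resizing itself was done in an app, but the arithmetic is just a single scale factor that satisfies both caps at once while preserving the aspect ratio. A quick sketch:

```swift
// Scale (width, height) so width ≤ 1400 and height ≤ 1200,
// preserving the aspect ratio; never upscale.
func constrainedSize(width: Double, height: Double,
                     maxWidth: Double = 1400,
                     maxHeight: Double = 1200) -> (width: Double, height: Double) {
    let scale = min(1, maxWidth / width, maxHeight / height)
    return (width * scale, height * scale)
}

print(constrainedSize(width: 2880, height: 1800))  // (width: 1400.0, height: 875.0)
```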
It was a decent start, and it certainly made a noticeable change in size, taking my home page from 6MB to 1.2MB. It’s a relatively big difference, but I still felt it wasn’t anywhere near what I was aiming for.
My desire is to have my website show off the content really nicely, be measured in mere kilobytes, and load so fast you barely notice it.
Fast forward to today, where both the size and load speed have improved a lot. I’ve been playing around with a few static site generators, and thinking about taking a more custom approach to the website, but I realised that Ghost (which this site runs on) can be tuned quite a bit itself. So for now, nothing major has changed with the underlying blog engine.
I have done a few things though:
- Removed Prism – this was the already small library that I used to style any embedded code, but it’s not really needed any more.
- Cleaned up and minified my CSS file (yes, I write basic CSS).
- Removed all JavaScript, including Google Analytics!
Google Analytics was the hardest to remove, but I had got to a point where a page showing a single text-only post would be roughly 50KB, and 29KB of that was Google Analytics. That didn’t seem like a nice ratio to me, so for now it is gone. Hopefully in the future I can write something minimal myself to track basic page views, but I’m not worried about that just yet.
Here are a few examples of the website size and load speeds:
- “5K Versions of Every Default macOS Wallpaper →” (<30 words, 0 images) – 7.36KB (70-130ms)
- “Emergency SOS on Apple Watch” (<250 words, 0 images) – 8.39KB (85-105ms)
- “Snapchat Can Now Share Your Location” (<500 words, 3 images) – 427KB (200-300ms)
I’m really happy with the low page sizes, and it appears the only thing truly adding to the size now is the images, which I can deal with. I’ll just use them where they need to be used, and nowhere else. The load speeds varied across multiple attempts, which is why a range is given (caches were disabled).
My next step will be to try and further optimise the actual Ghost engine itself, to see if any speed improvements can be made there. And I guess maybe an improved cloud server would help too? Then there is the dream goal of custom web analytics.
So rest assured, for now, nothing is being tracked on this website.
I’d be very interested in hearing everyone else’s opinions on website sizes, and all the rubbish I’ve written here in this post. Because while I really want my blog to be under 10KB in most scenarios, it probably doesn’t make a huge difference to the reader.