Nerdy Stuff

A “few” words – yes, I know, but the previous post was short, right? – about the migration progress, and about the tools and the infrastructure for this new website.


The posts of 2018 and 2017 are migrated over. In the process, I needed to tweak the CSS files here and there, not least to make things work in the different browsers I care for, but also to correct my own mistakes resulting from the fact that I don’t – or didn’t – know a lot about modern CSS. Luckily, the CSS is pretty straightforward and simple. Check it out, if you’re so inclined, but use the browser’s inspector, never read the raw CSS files, as they are generated from SCSS, and a mess to look at. Of course I’d be happy to share the source SCSS files, they look pretty neat.


These days, it’s pretty easy to use a specific typeface on your site. A page on the site will download it from the server, either your own or a provider’s, which adds some load time initially, but then the browser caches it (stores it locally) for subsequent pages. Of course, the user can either block such downloads, or override the typeface in the browser’s settings, in which case they might end up with something as dead-ugly as Arial on their display. Nothing I can do about that, aside from also testing the looks accordingly, and making sure it’s not a total mess in such a worst case.

Anyway, being something like a typeface nerd, I checked things out. Google has a wide selection of freely available fonts. As I am an Adobe Creative Cloud subscriber for using Lightroom, I also evaluated Adobe’s offering, as I know they have a vast typeface library. Thousands of them, literally. And yes, Adobe offers their fonts also as webfonts, as part of the subscription. Yay! I could use my beloved Myriad Pro! Adobe even has a web application, where I can select the typefaces I want for a specific site, and they will prepare a customised CSS file that I can link to. To change my choice, I only need to go there and edit my specification – no need to change my set-up here locally. A neat service.

Or so you would think. In the process of testing, Ghostery flagged Adobe’s CSS files as trackers. What?! Some digging then revealed that yes, Adobe actually uses the font selection and download files to track the website users – not only to check compliance with the fonts’ terms of use, which would be totally OK, but they also reserve the right to use the tracking data for marketing and whatever purposes, even sharing it with third parties. Yuck! No, sorry, Adobe, I will not include this kind of sneaky tracker on any of my websites. Bye, bye, Adobe Typekit, as neat a service as you are, technically speaking.

So back to Google fonts. I now need to specify directly in the <head> section of my pages which typefaces, including which variants (eg. italic), weights (eg. normal, semi-bold) and language subsets (eg. Latin, Greek, Cyrillic) I want, but luckily I can automate that, and only need to edit my site configuration file. As far as I can tell, Google cannot track the website users beyond what’s possible with any webpage anyway. Of course, they could still aggregate all the data from the HTTP request headers… So a final solution might be to find an open source font that I can download to my server, and serve it from there as well, together with all my other website files.
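For illustration, such a specification ends up as a single line in the <head>. The family, weights and subset here are just an example following Google Fonts’ documented URL format, not necessarily what this site requests:

```html
<!-- Example only: Source Sans Pro in normal and italic, weights 400 and 600, Latin subset -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css?family=Source+Sans+Pro:400,400i,600&subset=latin">
```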

The typeface you see right now – unless you have blocked it! – is Source Sans Pro, both for body text and headings. I considered using a different font for headings, as I certainly would for print, but this would add more resources to download. Maybe later. Source Sans Pro is pretty similar to Myriad Pro; it lacks ligatures, but I doubt many people care for, or even see, the difference.

The availability of real italics is important, as just using the “slanted” normal version really looks ugly. Check out the difference:

Source Sans Pro:

great great

Arial:

great great

The real italic typeface actually uses different glyphs – easily seen with “a” and “g”, which are different for Source Sans Pro, while Arial just slants the normal font. It’s gross.


Talking about tracking, at least in the initial phase of a new website design it’s nice to have some information about the devices used to access the site, simply because we have so many different screen sizes out there, from desktop computers with large screens to laptops to tablets to phones. The design has to make sure that the content is accessible and readable on all screen sizes, the buzzword being Responsive Design. The CSS code has to cater for that. I am sure I can improve on that front.
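The responsive part boils down to CSS media queries. A minimal sketch – the breakpoint value and the class name are invented for illustration, not taken from this site’s actual stylesheet:

```css
/* Example breakpoint only: narrow screens get a smaller base font, no sidebar */
@media (max-width: 600px) {
  body { font-size: 0.9rem; }
  .sidebar { display: none; }  /* hypothetical class name */
}
```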

Tracking website visits gives you that information. Google offers their free analytics toolkit. It’s easy to incorporate into your website, and yields tons of information about the visitors, how they use the site, which sites they come from, and so on. But… it’s also free information for Google themselves. All tracking data is available to them as well, oiling their massive surveillance machinery.

So while playing with it, it just creeped me out. That’s not what I want. I then found Matomo, which is two things: an open source web analytics tool that I could run myself, and a hosted service that runs it on their servers. Their privacy policy is clear: they will not use my data themselves; that data belongs to me and me only. In fact, it’s used by the European Commission for exactly that reason. Think GDPR.

I have opened a test account, and I am trying it now for free. Thereafter, it’s a paid service – as it should be, given they don’t (mis-)use my data to compensate for being “free”. I don’t know yet what to do; maybe the 30-day trial will give me sufficient information. Just be aware that you leave a trace when you visit this site, but then again, you leave that anyway in my webserver’s logs, as you do everywhere on the web. And many websites not only use Google Analytics, but ten, fifteen or more trackers.


I need two main tools: a text editor, and the website generator. A third if you also consider version control.

As text editor, I currently use Atom, simply as it’s the same on the Mac and on Windows, where I do my Lua programming. It has its quirks, and can be somewhat sluggish at times (it’s an Electron application), but it has many neat features, with many packages available for writing and programming (code and text highlighting, code linters, GitHub integration, and so on). However, it’s replaceable with any text editor, such as BBEdit or Sublime Text. I could even use all these editors in parallel. It’s pure text, bottom line.

I have chosen Jekyll as site generator. A contender was Hugo, and I might still try it in parallel. At some point in my evaluation, I had to make a decision, and not having any previous experience with recent tools of this kind (in the mid-nineties, I had used UserLand Frontier for this purpose), it was difficult to really compare the two, as they do basically the same. So to understand what it means to work with such a site generator these days, I had to jump in at the deep end, and just use it. I decided on Jekyll as it’s supported and used by GitHub, so I knew there would be a big community available in case of questions and troubles.

Jekyll generates all the files that make up the website on the server (or copies them over, such as JavaScript files). Jekyll complements my post content with all the different sections a complete HTML page requires, such as the <head> and <body> sections and tags, adds the navigation on the top, and the footer, and so on. For this, Jekyll makes use of templates, which are again pure text files which I also need to provide. Templates also automatically collect archive lists of posts per year, or create the sitemap.xml and feed.xml files.
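To give an idea, a Jekyll template is ordinary HTML with Liquid placeholders. This is a stripped-down sketch, not my actual template:

```html
<!-- default.html – hypothetical, minimal layout template -->
<!DOCTYPE html>
<html>
  <head><title>{{ page.title }}</title></head>
  <body>
    {{ content }}  <!-- the rendered post lands here -->
    <ul>
    {% for post in site.posts limit:5 %}
      <li><a href="{{ post.url }}">{{ post.title }}</a></li>
    {% endfor %}
    </ul>
  </body>
</html>
```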

Soooo – did I replace the lock-in of Wordpress with a new one, Jekyll? Not really. First, all files, for contents and configuration, are local on my Mac. Second, all configuration data is stored in pure text files, very Unix-y. Jekyll itself does not hide any configuration elsewhere; it’s simply a command-line programme that builds the site based on all these files it finds within a defined directory structure when run.
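A sketch of what that directory structure looks like – the directory and file names follow Jekyll’s conventions, the contents are invented:

```shell
# Minimal Jekyll-style source tree (names per Jekyll's conventions; contents invented)
mkdir -p mysite/_posts mysite/_layouts
cat > mysite/_config.yml <<'EOF'
title: Example Site
url: https://example.com
EOF
cat > mysite/_posts/2018-06-01-hello.md <<'EOF'
---
layout: default
title: Hello
---
Hello, world.
EOF
# Running `jekyll build` inside mysite/ would then render everything into mysite/_site/
```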

Moving from Jekyll to, say, Hugo, would be pretty straightforward. The templates use a different language, so converting these would be the main work. All contents, ie. mainly the posts, the site structure, and the looks as defined in the CSS files, would stay the same.

For version control, I use Git and GitHub, where I host all my programming projects anyway. Currently, I use SmartGit as front-end tool, which is also replaceable at any time by a different Git client. In fact, I have tried quite a few of them, both on Mac and Windows. For strictly non-commercial purposes, SmartGit is free. Nice of them.

So all of the above tools are free, the first two are open source projects, the Git client is a commercial product, but free for me.

Ah, yes. I should not forget all the command line tools I use with Unix (Mac) and Linux (Server), such as the aforementioned rsync, the nano editor on the server to edit the webserver’s config files (no, I don’t use vi), or certbot to obtain and manage the server certificates.

OK, OK. I was cheating. There is one tool that deserves to be mentioned as well. Actually, praised. The web browser. It’s amazing what possibilities modern browsers offer to analyse websites and their pages, if you dive into the developer tools that are built right into each and every browser. Amazing. From simply checking out the HTML and CSS of single page elements, and seeing exactly which CSS rules have fired, to evaluating server responses and load sequences and timings, and more. A browser is an analysis and debugging tool, right out of the box. From Safari on my Mac, I can even remotely analyse what’s going on on iPad and iPhone, and I must assume this is also possible with Chrome for Android devices.


Which brings me to the backend, or server, infrastructure. Not a lot to say here. After Jekyll has generated all the website files, I upload them to the server using rsync as described. From there, the files – ie. the web pages you see right now – are dished out by an nginx webserver, which runs on a Linux box I rent from Linode. My specific server is located in Frankfurt, Germany.

Linode have good documentation on how to set up the server itself as well as the webserver. I decided to use Debian Linux, but could have chosen another distro as well, such as Ubuntu or Fedora. As I also use Debian – or a derivative thereof – on the Raspberry Pi, I could re-use my limited knowledge. I also followed Linode’s instructions to clamp down on access to the server (key only, no password login, firewall, and such), and to obtain and install certificates for the website.
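The “key only, no password login” part comes down to two lines in the SSH daemon’s config – a sketch only; Linode’s guide covers the full hardening:

```
# /etc/ssh/sshd_config – the relevant lines (reload sshd afterwards)
PasswordAuthentication no
PermitRootLogin no
```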

I chose nginx over Apache as I prefer its structured config files, which are easier to understand and maintain, I think. It’s also supposed to serve static files better. But I guess I could have chosen either of them for my purposes.
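For a purely static site, the nginx configuration is indeed compact. A hypothetical, minimal server block – domain and paths invented, and the real one would also carry the TLS settings:

```nginx
# Hypothetical minimal server block for a static site
server {
    listen 80;
    server_name example.com;
    root /var/www/example.com;
    index index.html;

    location / {
        try_files $uri $uri/ =404;
    }
}
```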

For a complete picture: I host all the domains I actually use with Fastmail, who make sure all my DNS settings are top-notch, which is especially important – even tricky – for e-mail accounts. I just had to edit the A-records for this very site to point here, which even I could do without risking effing things up. Fastmail makes this easy.


Aside from the basic Internet connection, which I need anyway, the only cost in terms of actual money to shell out right now is for the Linux server. It’s a small server, with only one processing core available – albeit a pretty strong one, an Intel Xeon E5 –, 1GB of RAM, and 25GB of storage. I have yet to see any memory swapping, or the CPU usage spike, and I am far away from the 25GB storage limit. So for now, I am good. It’s sufficient for all my seven users…

This server costs USD 7 per month.

All other tools and components are free. Even the certificates are free of cost using Let’s Encrypt, with certbot automating the timely renewal.

To compare, my old website, to be replaced by this one, costs USD 16 per month for the server, USD 50 per year for the backups, and USD 30 per year for the certificate. The server is shared, so I am not free to install anything I want. To be fair, I ran several Wordpress installations there, and had some e-mail accounts. But I have now moved all e-mail away to Fastmail, so I don’t have any need for that functionality anymore, and only one Wordpress installation is active right now. Soon, I can cancel the subscriptions there.


The beauty of this set-up is its simplicity and robustness. Everything that is important, and that I have invested time into creating, is on my Mac and included in my normal backup scheme. The same holds for version control on GitHub. I can upload the website files at any time anew. Even if the server runs into a problem, or I misconfigure something beyond repair, I can simply wipe the server, re-install Linux and the few software components I actually use, upload my files, and I am off to the races again.