Minifying and compressing text files to reduce load

Post #5 published on by Tobias Fedder

So far this website consists of HTML files almost exclusively. These are generated by 11ty from the prose written in Markdown and the Nunjucks templates for layout and snippets. Unfortunately — probably due to my sub‐par Nunjucks skills — some parts of the HTML contain a good amount of whitespace. The generated HTML for the site map, for example, is 293 lines long, somehow 75 of those contain only whitespace. Oops.

Luckily there are tools to get this under control. One of them is html-minifier, which the excellent 11ty documentation mentions as an example in its section on the transform function. I use that example almost verbatim in my eleventy.config.js. That shaves off a few dozen bytes per page — or, in the case of the site map, 25%. All in all it's just a small improvement. Nonetheless, not sending pointless data is a benefit for all.

The difference between uncompressed and compressed files should be way more significant. The best compression algorithm available in all up‐to‐date browsers is Brotli. To save some work and compute, I'll limit my effort to this one compression algorithm, leaving out the well‐established gzip.

Compression and static resources are a perfect match, because the compression can be done upfront in a build step. In contrast to compressing responses on-the-fly, the webserver can send content that is already compressed, thereby avoiding calculations and saving time. Caddy has a convenience subdirective for serving precompressed files, called sidecar files, matching the requested URI. Setting the subdirective to the value br instructs Caddy to look for a matching file that additionally ends in .br. A request to / by a client that can decompress Brotli, indicated by the request header Accept-Encoding: br, will lead to Caddy serving /index.html.br if available, otherwise /index.html. The part in the Caddyfile looks like this:

tfedder.{$TLDOMAIN:de} {
  root * /srv/tfedder
  file_server {
    precompressed br
  }
}

The webserver configuration is fairly easy. Now I need to make myself some Brotli-compressed sidecar files. My first impulse was to use the transform function from 11ty again, but that only applies to files created by 11ty; assets copied into the output directory wouldn't be affected. I assume it is easier to look for file extensions that indicate text files after 11ty has written the output. The eleventy.after event in eleventy.config.js allows me to run that code automatically right when 11ty is done generating the output.

const fs = require("fs")
const zlib = require("zlib")

module.exports = function(conf) {

	const outputDir = "_site"
	conf.on("eleventy.after", brotli_compress_text_files)

	return {
		dir: {
			input: "src",
			output: outputDir
		},
		markdownTemplateEngine: "njk"
	}

	// Function declarations are hoisted, so they may follow the return
	function brotli_compress_text_files(_) {
		const textFileEndings = ["html", "css", "svg", "js", "json", "xml"]
		const files = fs.readdirSync(outputDir, {recursive: true})
		const textFiles = files.filter(f => textFileEndings.some(ending => f.endsWith("." + ending)))
		textFiles.forEach(f => write_brotli_compressed_file(f))
	}

	function write_brotli_compressed_file(uncompressedFilePath) {
		const outputPath = (outputDir.endsWith("/") ? outputDir : outputDir + "/") + uncompressedFilePath
		const uncompressed = fs.readFileSync(outputPath)
		// Brotli tuning goes into the params object, keyed by zlib constants
		const compressed = zlib.brotliCompressSync(uncompressed, {
			params: {
				[zlib.constants.BROTLI_PARAM_MODE]: zlib.constants.BROTLI_MODE_TEXT,
				[zlib.constants.BROTLI_PARAM_QUALITY]: zlib.constants.BROTLI_MAX_QUALITY
			}
		})
		fs.writeFileSync(outputPath + ".br", compressed)
	}
}

I'm sure that could be made even more performant by leveraging asynchronous functions, but it's quite fast the way it is, good enough for now. Have a look at the reduction of file sizes in bytes.

HTML file sizes for this website's pages
page | generated | + mini | + br

Using Brotli for compressing HTTP responses is very common. Yet, serving files precompressed at maximum compression rate is not that widespread. It's another quick win for this site.

I might have to reassess the decision to leave out gzip, depending on the Accept-Encoding headers I'll find in the access logs when taking a closer look. I have a hunch that some RSS aggregators might only accept gzip. Maybe I'll make an exception for the feed XML. Speaking of the access log, it seems that at least three people already subscribed to my RSS feed. Thank you.

Also a big thank you to the mysterious visitor(s) who managed to access this website using HTTP/3. It's good to know that it works. Caddy supports that protocol out-of-the-box, and I was sure that I had configured the machine correctly, but I couldn't get any web browser to use HTTP/3 myself — thanks a lot.