Audit web performance in development environment

2020-04-05

I've added this site to the Eleventy Performance Leaderboard. When doing so, I figured it'd be good to tighten the feedback loop for tracking the effect of changes. Shipping to production and then measuring is a long roundtrip to find out a change made performance worse.

Production-like performance audits

With a static site generator like Eleventy, serving its output with a web server enables running local performance audits. Eleventy's built-in development server doesn't quite cut it, as it injects assets (such as its live-reload script) you don't want to account for in an analysis. Another hurdle to achieving production-like audits is that tools like Lighthouse expect your assets to be served over HTTP/2.

Serving your static site over HTTP/2 can be achieved with simplehttp2server. Via an npm script, you can easily run the web server against Eleventy's output.

Run simplehttp2server

```json
"scripts": {
  "build": "eleventy --input=./src/ --output=_site",
  "start": "eleventy --input=./src/ --serve",
  "serve": "yarn build && cd ./_site && simplehttp2server"
},
```

After executing yarn serve, you'll likely get a warning about the certificate not being trusted when browsing your site. Instead of getting your browser to trust simplehttp2server's self-signed certificate, I recommend generating a development certificate with mkcert. Use mkcert (by running mkcert -install) to configure a local certificate authority. Once done, replace simplehttp2server's self-signed certificate with a trusted development certificate.

Replace certificate

```sh
# Static site root
> cd ./_site

# Generate development certificate
> mkcert localhost

# Replace certificate
> mv localhost.pem cert.pem

# Replace private key
> mv localhost-key.pem key.pem
```

Now run yarn serve again. If you browse to your site, you'll notice it's served over HTTP/2 with a valid development certificate issued by the mkcert development CA. With that in place, run a Lighthouse audit to see what it recommends you adjust.

Development vs Production

As we now have production-like audits close at hand, we can track how changes affect performance. To further extend this capability, install the Node.js module performance-leaderboard. Create a perf-test.js module in your repository, specifying which sites to test. I'm running tests against both development and production to determine optimization efficacy.

Automated Lighthouse audit

```js
// perf-test.js
const leaderboard = require("performance-leaderboard");

(async function () {
  let urls = [
    "https://localhost:5000",
    "https://jouni.kantola.se"
  ];

  console.dir(await leaderboard(urls));
})();
```

In practice, I run automated tests in two terminal windows: one with yarn serve and another running yarn perf-test. The audit provides you with a comparison between development and production. performance-leaderboard also keeps a history of executed runs in a ./log folder.

Perform audit

```json
"scripts": {
  "build": "eleventy --input=./src/ --output=_site",
  "start": "eleventy --input=./src/ --serve",
  "serve": "yarn build && cd ./_site && simplehttp2server",
  "perf-test": "node ./perf-test"
},
```

Worth noting, comparisons against production are tricky. Infrastructure, like hosting static assets on CDNs, affects performance a lot. With the logs you can detect which patterns work and which don't.
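To make such a comparison concrete, here's a minimal sketch of diffing a single score between environments. The input shape ({ performance: … }) is an assumption for illustration, not the actual performance-leaderboard output format:

```js
// Hypothetical comparison of one Lighthouse category score between a
// development run and a production run. The input shape is assumed.
function scoreDelta(dev, prod, tolerance = 5) {
  const delta = prod.performance - dev.performance;
  return {
    delta,
    // Within tolerance, the difference likely stems from infrastructure
    // (CDN, compression) rather than the change under test.
    withinTolerance: Math.abs(delta) <= tolerance,
  };
}
```

For example, scoreDelta({ performance: 98 }, { performance: 95 }) yields a delta of -3, small enough to attribute to hosting differences rather than your change.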

Once you've covered common well-known practices, you'll have to tailor performance optimizations to your site's specific setup, with UX top of mind. This is when your web performance audits will really shine.