The Space Jam website has been a time capsule. You’d head to spacejam.com and suddenly be teleported back to 1996: a time before CSS, a time when using
`<table>` elements to build grids and layouts on a page was cool, a time before DevTools and Firebug showed us what the heck we were even building.
The Space Jam website is special because it’s a piece of web design history. It shows us how far we’ve come.
But if you head to the site today, it shows a completely redesigned website for the new movie starring LeBron James. You can still access the old site by clicking a logo in the top right, which is neat, but there’s something a tiny bit sad to me about the old website no longer being attached to this URL.
While I was lamenting that the URL had changed and been lost to the sands of time, Max Böck had other concerns. He wrote this great piece about performance, comparing the original 1996 website with the new version. After running both sites through WebPageTest, Max comes to this conclusion:
[…] after 25 years of technological progress, after bringing 4.7 billion people in the world online, after we just landed a fifth robot on Mars, visiting the Space Jam website is now 1.3 seconds faster. That seems…underwhelming.
So if the old Space Jam website shows us how far we’ve come, then the replacement shows us how far we still have to go.
And despite all the tools we have now — not to mention the conferences, books, and websites that are dedicated to the subject — we’ve sort of stagnated. Why, after all these years, are we stuck with painfully slow websites? Why haven’t things improved much at all and, in many cases, actually gotten a whole lot worse?
There are probably a lot of reasons, but I think it’s ultimately a cultural problem. Many folks, and not all of them developers, tend to believe performance is a nice-to-have: an additional feature we can get to later, rather than something baked into our day-to-day work.
Kealan Parr wrote about bad web performance and how to improve it the other day. He argued that a slow website isn’t just an annoying experience for users but a huge detriment to a business as well. Or, to put it another way, bad performance is bad for business:
[…] Firefox made their webpages load 2.2 seconds faster on average and it drove 60 million more Firefox downloads per year. Speed is also something Google considers when ranking your website placement on mobile. Having a slow site might leave you on page 452 of search results, regardless of any other metric.
How do we make fast websites though? Kealan explains:
Here’s the thing: performance is more than a one-off task. It’s inherently tied to everything we build and develop. So, while it’s tempting to solve everything in one fell swoop, the best approach to improving performance might be an iterative one. Determine if there’s any low-hanging fruit, and figure out what might be bigger or long-term efforts. In other words, incremental improvements are a great way to score performance wins. Again, every millisecond counts.
Incremental wins, I like that. But before we dive in and try to make our websites fast, we need to understand why they’re slow in the first place. That’s why I enjoyed this other performance-related blog post by Jake Archibald, where he looks into who has the fastest F1 racing website in 2021.
What’s great about this post is that all of Jake’s recommendations are tiny improvements that would shave whole seconds off each website. And, thankfully, they’re all small enough that they don’t require restructuring your whole organization.
Also, there are so many great bits of advice that can be applied to the websites we’re all building right now, like this one:
[…] it’s important to avoid hosting render-blocking content on other servers.
That’s always a good reminder and it’s why I now tend to avoid hosting images, fonts, and CSS on other people’s servers.
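As a minimal sketch of what that looks like in practice (the font name and file path here are hypothetical), compare a render-blocking stylesheet fetched from a third-party server with a self-hosted equivalent:

```html
<!-- Before: render-blocking CSS on someone else's server.
     If that server is slow, rendering your whole page waits on it. -->
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter">

<!-- After: the same font served from your own origin. One less DNS
     lookup and TLS handshake, and no third party in the critical path. -->
<style>
  @font-face {
    font-family: "Inter";
    src: url("/fonts/inter.woff2") format("woff2");
    font-display: swap; /* show fallback text while the font loads */
  }
</style>
```

The same idea applies to images and CSS: copying the file onto your own server trades a bit of setup for full control over how fast it arrives.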