
Jeremy Keith  00:00

In short, yes, we have absolutely ruined the web for a lot of people.

Gerry McGovern  00:11

Welcome to World Wide Waste, a podcast about how digital is killing the planet, and what to do about it. In this session, I’m chatting with Jeremy Keith. Jeremy is a philosopher of the internet. Every time I see him speak, I’m struck by his calming presence, his brilliant mind and his deep humanity. Jeremy makes websites with Clearleft. His books include DOM Scripting, Bulletproof Ajax, HTML5 for Web Designers, Resilient Web Design, and, most recently, Going Offline. Hailing from Erin’s green shores, Jeremy maintains his link with Irish traditional music, running the community site The Session. He also indulges a darker side of his bouzouki playing in the band Salter Cane. You can find out more about Jeremy at adactio.com. 

According to the WHO, mobile traffic went up from about 55% pre-pandemic to 70% once the coronavirus crisis hit. And I’ve seen similar figures come from other health environments in Canada and other areas – some of the Canadian government’s coronavirus pages were reaching over 80%. Yet we’re looking at situations where an average web page has gone from about 400KB in 2005 to about 4 megabytes some 15 years later. A major study of about 5 million web pages last year by Backlinko found that the average time it takes to fully load a web page is 10.3 seconds on desktop, and it can be as long as 27 seconds on mobile. And the broad question I had – or a couple of questions – was: have we, in a way, ruined the web for a lot of people? And if so, how do we fix it? Now, they are huge questions, but maybe that’s a starter.

Jeremy  02:37

In short, yes, we have absolutely ruined the web for a lot of people. And I find it so exasperating, it’s flabbergasting, in that developers, designers, researchers – we spend all this time trying to figure out what our customers want and what they respond to well, and we’ll do A/B testing and we’ll figure out whether this color button does better than that color button. And yet all the evidence staring us in the face is that faster websites will make you more money, give you happier customers, please everyone – and yet for some reason, that’s just ignored in favor of weirdly prioritized stuff like, ‘Oh, it’s so much more important that we have these great big images’ or this particular number of fonts or third-party things serving up ads. 

It’s really bizarre to me how we kind of almost collectively choose to ignore the obvious, obvious way to improve the experience for everyone and improve a business’s bottom line in favor of fiddling with the details. That might make a slight difference, sure. You know, maybe this particular design is slightly better than that particular design, but the elephant in the room is just how long something is going to take to load. I mean, there’s a direct correlation with frustration. You know, faster websites mean happier users. There’s absolutely no doubt about it. And I just don’t get it, to be honest. I don’t get how people can be working on the web claiming to be doing user-centered design and yet, at the same time, ignoring this huge factor. 

I think maybe what happens is that it’s not clear whose job it is to fix this. Like, maybe designers think it’s a technical issue and they don’t think about these sorts of things when they’re doing their designs – usually in a medium that isn’t the final medium of HTML, CSS and JavaScript. Usually designers, we work in some kind of graphic design tool, wireframing, whatever it might be. And they might not be considering performance effects of design decisions they’re making because they think, ‘Well, that’s for the developers to figure out.’ Meanwhile, the developers may be thinking, ‘Well, my job is to do what the designers have designed and get that out the door.’ And so they can say, ‘Well, I would have loved to have made this a fast website, but the designers designed it this way, so what can I do?’ So maybe the issue is that it isn’t clear who’s responsible for this. We should all be responsible for this, no matter what your job is. This is such a crucial factor in making a good website that everyone should be responsible. But maybe it’s one of those situations where, when everyone is responsible, nobody is. 

Lara Hogan wrote a whole book on web performance a few years back, and a lot of what she talked about was – I mean, it’s a lot about the technical side, but she really talked about creating a culture of performance, making sure everyone understood that, no matter what you’re doing to get a website out the door, you are in some way responsible for the performance. It’s absolutely flabbergasting to me and I just don’t get it.

Gerry  06:06

Yeah. When I saw these stats coming from WHO, the images that started coming into my mind were, you know, a doctor or a nurse or a mother or a father using their mobile phone to try and access critical health information and waiting. And in some cases, if these people are not on big incomes, they’ve got older phones. They’ve got expensive data deals, because the poor always pay more in these processes. So this here is affecting people’s health and is potentially impacting who will live and who will die. Maybe that’s too extreme, in some ways, to look at it. But I’m sure there’s certain situations where people cannot access this information because the pages are too badly designed – they’re too big, they’re too heavy, they’re taking too long to download – and that impacts people’s lives. And yet, we blithely – no matter what we say, it doesn’t seem to make a difference.

Jeremy  07:34

Yeah. And if nothing else, mental health is affected by this. There have been studies to show that the experience of waiting for a slow website to load when you need to get that information is comparable to watching a horror movie in terms of how your body is reacting to the stress of the situation.

Gerry  07:52

Yeah, I saw that. Think of how that is doubled or trebled when you’re looking for symptomatic information for the coronavirus or you’re looking for other vital stuff that you need. Or you’re trying to sign up for unemployment benefit, or a whole panoply of other types of services that you need. You know, why?

Jeremy  08:22

Here’s one way of looking at it. What we’re going through right now – the entire world, basically – every country individually is one giant edge case. And edge cases are the kind of thing where, when they come up in the process of designing or building a website – somebody brings up ‘Oh, but what about this situation? What about that situation? This person on a poor internet connection, this person who’s low income’ and stuff – generally the reaction will be something like, ‘Oh, that’s an edge case.’ And there’s an unsaid follow-on to that sentence ‘that’s an edge case’, which is ‘and so we won’t deal with it’. That’s an edge case, so we won’t deal with it. When, really, we could be saying, ‘That’s an edge case, so how will we fix it?’ Because what we’ve learned from the world of inclusive design over the years is that if you take care of the extremes, the middle takes care of itself, right? If you take care of what Eric Meyer calls the stress cases – rather than edge cases – take care of the stress cases, and you’ll make something that’s better for everyone. And I guess what happens is when suddenly everyone gets to experience what the stress case, what the edge case is like, then maybe finally people start thinking about this. And lo and behold, suddenly websites find that they can create light versions or static versions, strip away those images, strip away those scripts and those web fonts and stuff. And it turns out that what people really need is the content, which is usually text, right? It is normally some text on a screen, not technically difficult to get out there. But it seems to require us to all collectively experience an edge case in order to have the empathy, I guess, to deal with it.

Gerry  10:00

Maybe. But another argument to that would be in the sense that is it really an edge case? Weren’t there just millions of people on crappy mobile phones trying to do stuff that were being excluded before this crisis?

Jeremy  10:18

Absolutely, yeah. And this is something I think as an industry we’ve done for years with many things, which is we will come to a collective agreement – I like to use the term ‘collective consensual hallucination’ basically – that let’s all agree that this is the way things are and ignore any data that shows otherwise. So a good example was – this is a much simpler example, from simpler times, but back in the 90s into the early 2000s, we were designing layouts on the web and said, ‘Well, let’s all assume everyone has a monitor that’s 640 pixels wide.’ And then at some point we came to the collective agreement that, no, 800 pixels wide is the monitor everybody has. And then it became 1024 pixels wide. We all settled on like 960 pixels as this ideal width. And that was all based on just this collective hallucination. Yeah, let’s just all agree that that’s the truth, even though it was never the truth. People were always on different sized screens. 

And what comes along is some kind of event that seems to shake things up and seems to show that, oh, we’re in a new world now. But actually, all it’s doing is shining a light on the existing situation. That happened when mobile suddenly burst on the scene in 2007 or so with the iPhone and other fully featured smartphones. ‘Oh no, now suddenly people have different size devices, and they’re on different network speeds.’ And actually, no, people weren’t suddenly in those situations. It’s just you started paying attention to those situations more, right? So it was shining a light on something that was already there. And so yeah, you’re absolutely right that what’s happening now with the systems being stressed by things like the coronavirus is it’s shining a light on a situation that was already there. People are already very unevenly distributed with things like not just network speed but processing speed on their devices. We like to think smartphones are very fast and stuff. Well, maybe yours is, but that is definitely not the case for everyone. So what’s happening is, hopefully, things are coming to the surface that aren’t new. What’s coming to the surface is that this is the way things always were.

Gerry  12:37

Right, right. You joked when we were swapping emails about this that we should stop talking about images and videos etc as assets and start calling them liabilities. But what are the underlying drivers here that make so many of us want – and I include myself, throughout my career – the highest resolution and the highest quantity of images and videos possible, even at a resolution where we can’t actually see the difference, and we can’t actually hear the difference, but we want that. What is it? What’s the human instinct? And how do we change that mindset?

Jeremy  13:19

I think again, there’s a disconnect in the process we go through when we’re making something, and then how that thing is experienced when it’s actually on the web, which is dependent on network speeds and processing speeds and stuff. So we’ll use things like graphic design tools, and in Photoshop or Figma or Sketch – whatever tool you’re using – there’s no difference whether you’re pulling in a large image or a small image, a high resolution or low resolution. You don’t feel any lag, right? So if you’re moving things around in a graphic design tool, the graphic design tool doesn’t respond more slowly or more quickly depending on the weight of those assets or liabilities. Now maybe if it did, maybe if you literally found it harder to design when you were using those high-resolution, high-bandwidth things, maybe we would start to behave in a more lean way and only reach for those things when it really counts. 

I think that, in any design medium, you need to have an understanding of what’s cheap and what’s expensive. To explain what I mean: if you’re a print designer, then you’re used to the idea that you can have as many different fonts as you want, as many different weights of a font. It’s cheap, it doesn’t cost anything more to do that. Whereas – and presuming this is for, let’s say, a poster or a flyer, something that’s going to be printed out – the number of colors you use might be expensive, right? That might be something where you might have to constrain yourself to a two-color or four-color palette. Now that’s different on the web. In fact, it’s exactly the opposite way around. On the web, use as many colors as you want, because colors are free, basically. But every time you add a new font or an extra weight, you are increasing the size. So having an understanding of what’s cheap and what’s expensive is important, but I think you kind of have to feel it to understand it and we don’t feel it when we’re designing. 

There’s a real disconnect between the process of design and what’s actually experienced by people. Even when we are loading websites, we’re loading the local copies that we have on our machines, and we’re evaluating how things look and how things behave, but not in terms of the arrow of time, right? We don’t throttle our connections to simulate what it will actually be like for different people loading that. And yet, as I said, the number one factor in user experience, in my opinion, is speed, is time, the arrow of time. It doesn’t matter how beautiful the thing is if it’s going to take 30 seconds for it to load, right? So there’s this real disconnect between our experience as we’re designing and building something and the end user’s experience.

Gerry  16:13

I wonder, Jeremy, you’ve touched on something there, a design challenge. Can we design for feel in some way in the design process itself so that designers feel a bit of the pain as they’re making a decision? Is that something that the designers of the tools of designers can be thinking about?

Jeremy  16:40

I think it’s maybe more of a cultural thing. There are ways of getting people to get it. And I’ve seen some people share these ideas – like every day of the week, they have a different kind of exercise they’ll do. On one day, they’ll deliberately throttle their connection. On another day, they’ll deliberately use a different browser than they’re used to. And on another day, they’ll switch to a monochrome display instead of full color, things like this. I know that the New York Times, I think they had something like a low-bandwidth Friday or something where, for literally everyone in the office, their internet connection was throttled to try and get that feeling for what it’s like to experience that. So I think those kinds of exercises can be good. I mean, as I said, it’s a shame that the design tools don’t enforce that feeling. And it would be wonderful if they did make it more painful the more expensive your assets were. But I don’t see many people signing up to buy that software then.

Gerry  17:42

So we started talking about digital waste, about files and formats and storing stuff and keeping copies, and about how the waste dynamic is changing because so much now is stored in the cloud. And the cloud is hugely more energy-intensive – and thus more wasteful – than your hard drive, than what you do locally.

Jeremy  18:04

And here’s something interesting where it does tie into the waste question: in terms of digital preservation – which is something I actually think is really important, as is preservation of our culture, preservation of what we create – there’s a term in digital preservation called LOCKSS, which is ‘lots of copies keep stuff safe’. And I think broadly it’s true, that if you have just one copy of something, the chances of it surviving a long time are slim, but as soon as it’s distributed and it’s got a reasonable license attached to it, then it stands a greater chance of surviving. But, you see, the downside is that now you’re distributing, and again, that feels like a victimless thing, right? I’m just making multiple copies of something. But what you’re actually doing is using up more energy. So I am a little conflicted here, because I’m a great believer in digital preservation and I think, broadly speaking, the principle of LOCKSS – lots of copies keep stuff safe – is true. But I’m also aware then of the cost of having too many copies of things.

Gerry  19:10

I then brought up what I term the Zettabyte Armageddon, about how we are creating and storing so much data. In the last two years, we have created more data than was created in all of previous history. And 90% of this data is crap.

Jeremy  19:32

I am going to fundamentally disagree with you there. I don’t think the issue is with cleaning up, because once the thing has been created and it’s just sitting on a hard drive somewhere, it isn’t causing any harm. Let’s say you’ve got a hard drive that has a certain storage capacity. Let’s say it’s a terabyte hard drive. The arrangement of the ones and zeros on that terabyte hard drive doesn’t waste energy once they’ve been arranged. Now, the act of making something uses energy. The act of putting it onto a hard drive – yes, that uses energy, absolutely. But once it’s on there, this idea that it needs to be cleaned up in order to save energy is not true. You know, a hard drive weighs the same whether it’s full of ones or whether it’s full of zeros. It’s not like a bin bag getting full of stuff. The issue is actually the opposite. The issue is not that, ‘Oh, no, people aren’t even accessing this stuff after three months.’ No, that’s great. If nobody’s accessing it, that means it’s not consuming energy. That is a good thing. The waste comes when people do access things that they don’t need, right? So millions of people every day pulling down JavaScript files that don’t really add anything to a website or pulling down images that don’t add anything. That’s waste, because there’s no need for it. If we’re going to clean up, let’s clean up in the right area. Let’s clean up the stuff that doesn’t add anything. 

Now, to say that 90% of the stuff being created is crap – sure. Okay, that’s probably objectively true. Most of it is crap. But I’ll also say this: you don’t know the future value of something being created today. So let’s say our definition of crap is going to include some teenager posting a blog post about something we don’t care about, or some YouTube videos – something that is just objectively not important. Fine. But that teenager may turn out to be the first person to walk on Mars. That teenager may turn out to be a future president of the United States. But we don’t know the value of something created today to the future. You talked about cuneiform tablets, which are hugely valuable sources of information to us, and most of them are about accounting and porn. Those are the bits of everyday life. So I actually think we should be preserving these useless bits of self-expression that people do all the time. As Patrick Kavanagh would have said, ‘wherever life pours forth ordinary plenty’. Because once they’ve been created, there isn’t a cost. The cost came at the point of creation. Now, we could encourage people to be circumspect in what they create, and maybe don’t upload everything and maybe self-edit a bit. But once something is on a hard drive, unless somebody requests that file, it isn’t doing any harm.

Gerry  22:30

Okay, two things there. The hard drive, as we said, has a significant energy cost and creates pollution. And the hard drive will not last forever.

Jeremy  22:41

Right. But that energy cost came when it was created. 80% of the cost was when it was created regardless of what’s going to end up on that hard drive.

Gerry  22:49

Exactly. So buy fewer hard drives.

Jeremy  22:51

Oh yeah. 

Gerry  22:51

You know, we are buying so much storage. That storage costs the earth in materials and in energy. There’s much higher manufacturing energy for a digital product than for a physical product, because of the complexity of the materials and the manufacturing process. So that hard drive costs money and costs energy and creates waste. Just because it’s all created, all the waste has been generated, does not mean it’s not waste. And if we now buy 100 hard drives instead of 50 hard drives, that’s 50 extra hard drives that are causing the deforestation of the planet, etc. And they’ll need to be replaced in five years or 10 years. Where’s the data going to go when the hard drive corrodes?

Jeremy  23:48

No, that is a fair point. And Moore’s Law does come into this that, yes, it is expensive to produce hard drives these days, but 10 years ago it would have been 1000 times more expensive.

Gerry  24:01

Oh, a million times. But see, this is the problem, Jeremy. I think this is where it becomes core. You said it earlier about the cost: a close-to-zero cost is not a zero cost. We all have the impression that it’s nothing. And another thing that’s happened with hard disks is that their prices have stabilized in the last three or four years. They’re not coming down. They’re not dropping at the exponential rates they were dropping at in previous times. So they seem to have stabilized in their pricing structure in the last four years. But we move on. But the idea is that this storage, even local – which I totally agree with you; it’s much, much better than the cloud – still has a cost, because we had to manufacture that disk, and that disk will not last as long as a cuneiform tablet. So if we have to replace a million hard drives in 10 years–

Jeremy  25:06

There’s this great blog post, actually, by a developer – Danny van Kooten is his name, and he’s got a blog post called ‘CO2 emissions on the web’. And he tried to trace his own contributions – he made a WordPress plugin. So it’s one file that gets distributed very broadly to lots of people. And he did the back-of-the-napkin calculations for how much energy is being wasted, effectively, by what he’s put out there into the world because of multiple copies of this one thing. And I think that’s the area to focus our energy, the kind of unnecessary duplication.
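[Editor’s note: the kind of back-of-the-napkin calculation described here can be sketched in a few lines. Every constant below is an illustrative assumption, not a figure from Danny van Kooten’s post.]

```javascript
// Back-of-the-napkin transfer-cost sketch: how much energy does one
// widely distributed file consume per month? All figures are assumptions.
const fileSizeKB = 10;              // assumed size of the distributed file
const requestsPerMonth = 2_000_000; // assumed number of downloads
const kWhPerGB = 0.5;               // assumed network energy intensity

const gbPerMonth = (fileSizeKB * requestsPerMonth) / (1024 * 1024);
const kWhPerMonth = gbPerMonth * kWhPerGB;

console.log(`${gbPerMonth.toFixed(1)} GB -> ${kWhPerMonth.toFixed(1)} kWh per month`);
```

The point of the sketch is the shape of the arithmetic, not the numbers: halving the file size halves the ongoing energy cost of every future request.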

Gerry  25:43

But I think we need to think about the machines as well. You know, we’ve made 10 billion smartphones since 2007, and they don’t get recycled – from the data, only about 10% of them actually get recycled. We create about 50 million tonnes of e-waste every year, which is basically the same tonnage as all the commercial aircraft ever built. The machines that we use to access digital have a very short lifecycle. They last three to five years.

Jeremy  26:16

This is true, and this is something that I’m not keen on at all. I’ve had my phone now for many years. I don’t want to upgrade it. I’m happy with it; it does everything I want. So I agree, and it is kind of shocking. And we’re seeing right now the European Union stepping in with right to repair laws, which I think are super important so that people can make one device last a long time. Just to be a bit of an asshole here, though, I’m going to point to a little counter argument, because this is something that made me think as well. I saw an article a while back about this. Yes, there’s this wastage now where somebody buys a phone, uses it for a couple of years and then it ends up going to landfill. It’s not being recycled. That’s bad. That’s terrible. But what we’re not seeing is what would have been created and what would have been going to landfill had the phone not existed. So we’re not seeing landfills full of cameras, camcorders, dictaphones, photo albums – all the stuff that is concentrated into one device. We can’t A/B test the universe, so we can’t compare the wastage. But you could imagine how much physical device wastage would be happening if phones hadn’t come along and kind of wiped out entire product lines of cameras and camcorders and dictaphones and all this stuff. But I’m kind of just being the asshole there, playing the devil’s advocate.

Gerry  27:41

No, I think that’s a very interesting argument. And I mean, I love my smartphone. But I’ve started reducing my use of things. I used to always buy the biggest screens I could get and now I’m beginning to think, ‘Do I need this thing? Can I do it on my laptop?’ So I’m trying to do far more work on my laptop – just being conscious, not stopping. These devices are amazing. And that’s a brilliant point you brought up – the benefits. It’s not that this is all bad, but that we just become a bit more conscious of the cost of all this stuff.

Jeremy  28:24

I would say there’s an opportunity there as well. Let’s say you’re a device manufacturer trying to break into the smartphone field, which would be a very tough field to try and break into because you’ve got these large companies that dominate it. Well, as Marty Neumeier is always saying, when someone else zigs, you should zag. So if all the advertising around smartphones is like, ‘Oh, you need to get the latest and greatest one. It’s got the best features,’ blah, blah – and so the general consensus is you upgrade a phone every couple of years; you get rid of your old one, you get a new one – can you find a way to market the exact opposite, which is: buy our phone and it’ll last for a decade? Right? That would be a really interesting marketing pitch, and the larger companies just couldn’t compete with that because that is very much against how they sell things and what their business model is. Their business model relies on that upgrading every couple of years. So there’s actually an opportunity here for smart companies, I think, to get in and market to people who are beginning more and more to think, ‘Not only do I not need to upgrade, but I actively don’t want to.’ I’m starting to feel almost like I’m being very encouraged to upgrade my phone, right? Almost shamed for having a very old phone at this point. Like you’re made to feel bad about it. And I would love it if there was some company that said, ‘Hey, we’re going to cater to you. We’re going to make a phone that will last 10 years guaranteed.’

Gerry  29:52

Totally. I mean, these are the ideas we need. These are the type of – ‘it’s cool to be old’. It’s cool to have the oldest thing, rather than – we somehow shift the cultural Zeitgeist that has set in with fast fashion. We buy five times more clothes than we did 20 years ago, and we wear them for half as long – that we shift it. It’s cool to have old clothes. It’s cool to have old – but that’s a different cultural challenge. Let’s get back to some of the stuff, the areas that you’ve been writing about or thinking about. In particular, you’ve separated first-time visitors to a website from repeat visitors. And you think that there are very energy-sensible strategies for treating those types of visitors differently. Can you talk a little bit about that?

Jeremy  30:54

Well, I mean, first of all, you have to figure out what kind of service you offer. Are you offering the kind of service where people come once and then probably won’t come again? And if the answer is yes, then, okay, you should focus your energy on making that initial visit as lightweight as possible, right? I mean, I think you should focus your energy in that area no matter what. But if you are offering something where people are going to come back again and again, then what you don’t want to do is have their second visit, their third visit and their fourth visit cost as much as their first visit, right? If there’s some way to say, ‘Okay, well, they’ve already visited once. So how can I use that so that they don’t have to download all the same stuff again – so that they don’t have to download that CSS file and that JavaScript file and those logos or icons or whatever – every single time they return?’ And put steps into place accordingly. Now, I have seen people take this too far, where they architect an application – a web application, usually – because they think, ‘Yeah, everyone’s going to come back at least twice.’ And they really frontload that first experience; they dump everything into that first visit. And you can’t even use the app till 30, 40 seconds have passed. But then the thinking is, ‘Ah, but when they come back again, it’ll be faster than that.’ Yeah, but maybe you’ve taken it too far when you’ve done that. So I think there’s a needle to be threaded between trying to make the initial visit as lightweight as possible and, once somebody has visited for the first time, ensuring that the next visit will be much, much more lightweight. And this is why I got excited about technologies like service workers. 
And there’s the Cache API in browsers now, which allows you to have more control over what gets stored locally and to say, ‘Yeah, next time you visit, it’s not going to take as long as the first time.’ So I think it’s important to just have an understanding of what kind of service you’re offering and where you should be pulling the levers in terms of prioritization. Are you prioritizing first-time visits or repeat visits? 
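[Editor’s note: the repeat-visit idea Jeremy describes is usually implemented as a cache-first strategy in a service worker’s fetch handler, using the real Cache API (`caches.open()`, `event.respondWith()`). This minimal sketch uses a plain Map as the cache and a stand-in network fetch so only the logic is shown; the URL and contents are hypothetical.]

```javascript
// Cache-first strategy sketch: serve from the local cache when we can,
// and only go to the network (and populate the cache) on a miss.
const cache = new Map();

async function fetchFromNetwork(url) {
  // Stand-in for a real network request – the expensive first-visit cost.
  return `contents of ${url}`;
}

async function cacheFirst(url) {
  if (cache.has(url)) {
    // Repeat visit: no network transfer, no extra energy cost.
    return { body: cache.get(url), from: "cache" };
  }
  const body = await fetchFromNetwork(url);
  cache.set(url, body); // store it so the next visit is cheap
  return { body, from: "network" };
}

(async () => {
  const first = await cacheFirst("/styles.css");
  const second = await cacheFirst("/styles.css");
  console.log(first.from, second.from); // network cache
})();
```

In a real service worker, the same branch structure appears inside the `fetch` event listener, with the Map replaced by a named cache opened via `caches.open()`.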

Gerry  33:03

Right. And, of course, that comes from that model of thinking about making it easier for people and faster for people to do what they need to do. In relation to that, you’ve talked a good bit about concepts like progressive enhancement. Certainly my understanding of that is that it seems like a really positive mental model for developing what I would be thinking about – you know, not just a human experience, but an earth experience. It seems like an approach that is more economical with its philosophy and its use of materials. Could you talk a little bit about it?

Jeremy  33:47

Sure. Because I think there are some misunderstandings about progressive enhancement. I think some people think progressive enhancement limits you in terms of what you can do – ‘Oh, I can’t use the latest and greatest technologies if I’m using progressive enhancement’ – but actually that’s not true. You can use all the latest and greatest technologies, it’s just how you go about using them. So the idea with progressive enhancement is – again, you have to do a bit of a prioritization exercise to begin – you have to decide what’s fundamental, what is the one thing that my service offers that people need to be able to do. So that might be they need to be able to read this article, this piece of information, or they need to be able to fill in this form, they need to be able to click on this button to check out an item of clothing. 

Once you’ve identified the core functionality – not all the functionality, the core functionality – then you say, ‘Okay, what technology can I use to make that as widely available to a vast number of people as possible?’ Now, that usually means boring technology. That usually means using the simplest possible technology – on the web, that’s probably going to be HTML, if you can get away with it. You know, an article just structured in paragraph elements, a form that’s just using straight-up input tags, right? Now this is where some people, I think, misunderstand progressive enhancement, because they kind of stop there. And they think, ‘Well, yeah, we could all build like that. But that would be a very boring website.’ And I agree, that would be a very boring website. 

So the third step, which is hugely important, is you then enhance. You then think, ‘Okay, now that I’ve got a baseline, I’ve created something that I know works for the most amount of people – this foundational part of what I’m building. Now, how can I improve the experience using technology? How can I make it a nicer experience?’ And that’s when you can start layering things on. You can start adding in functionality using JavaScript. You can start using browser APIs, if the browser supports geolocation or the accelerometer or whatever kinds of things. Yeah, go ahead, start using that stuff. And with each one of these things, you can usually test for support and say, ‘Okay, does the browser support this? If it does, great, download this JavaScript file, and do this.’ You can also apply this on a macro scale to, let’s say, images. Okay, it’s super important that we have an image – an informational image on this page. Okay, fine. But then to begin with, send that image at the lowest possible size, right? The smallest file size. And then you can say, ‘Okay, now am I on a wider screen? Can I afford to send a larger image? Am I on a faster connection? Can I afford that?’ Then you can augment and enhance. 
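[Editor’s note: the “test for support, then enhance” pattern can be sketched generically. In a browser the support tests would probe real APIs – e.g. `'geolocation' in navigator` or `'serviceWorker' in navigator` – but here they are plain functions over a hypothetical `env` object, so the layering itself is visible. The enhancement names and environment flags are illustrative.]

```javascript
// Progressive enhancement sketch: start from a core baseline, then layer
// on each enhancement only when its support test passes. Unsupported
// enhancements are simply skipped – the baseline always works.
function applyEnhancements(baseline, enhancements, env) {
  const experience = [baseline];
  for (const { supported, apply } of enhancements) {
    if (supported(env)) {
      experience.push(apply());
    }
  }
  return experience;
}

const enhancements = [
  {
    name: "webfonts",
    supported: (env) => env.fastConnection, // only worth the bytes when fast
    apply: () => "custom web fonts",
  },
  {
    name: "geolocation",
    supported: (env) => env.hasGeolocation,
    apply: () => "nearby-store finder",
  },
];

// A constrained environment still gets the core content...
console.log(applyEnhancements("readable article", enhancements,
  { fastConnection: false, hasGeolocation: false }));
// ...while a capable one gets the full experience layered on top.
console.log(applyEnhancements("readable article", enhancements,
  { fastConnection: true, hasGeolocation: true }));
```

The same shape applies to the image example: the smallest image is the baseline, and `srcset`-style upgrades are enhancements gated on screen size or connection speed.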

So the idea with progressive enhancement is you think about the core functionality, you provide that core functionality with the simplest possible technology, and then you go crazy. Then you add in all your animations and your web fonts and all the stuff that delights people – you add that stuff on top. And to be clear, I think some people think that progressive enhancement means, ‘I have to make all the functionality available to everyone regardless of their technology stack, even stuff that requires JavaScript.’ Now, it’s not about all functionality; it’s your core functionality. So I remember, for example, a friend of mine who worked on The Boston Globe redesign years ago, which was one of the first big responsive sites. He said, ‘You know, there are lots of things in The Boston Globe site that require JavaScript to work, but reading the news is not one of them.’ Right? So there’s this distinction between what your core offering is, and everything else around that.

And I’m not dismissing the ‘everything else around that’. That ‘everything else’ stuff is usually how companies differentiate, and that’s where the user experience and the delighters and the really gorgeous little touches happen. That stuff is not to be diminished. I’m not saying don’t do that stuff. I’m just saying that stuff needs to be layered on top of a solid baseline of providing the bare minimum. Now what we’re seeing, interestingly enough, is in these emergency situations like the coronavirus, when sites start getting slammed with traffic, if they built things the right way, they can then peel back some of those layers, they can peel away some of those enhancements and just provide the core content – that base-level stuff. So if you build with progressive enhancement, it means you can layer stuff on top. It also means you can then strip those layers away, which is hugely important in these kinds of times.

Gerry  38:21

Wow, yeah. That sounds like a really important thing to be able to do. Just building on that or connected with that, Eric Meyer recently wrote an article about static – that pages should be delivered as static pages where possible, rather than dynamically driven from a database. What would be your opinions there?

Jeremy  38:50

I’ll just step back a bit and try and define some terms in terms of how pages get served up. From a user’s point of view, it’s the same experience no matter what website you go to, as in you type a URL or you click on a link – so you’re making a request – and you get back a page. Right? Now, from a developer’s point of view, how you respond to that request, how you build that page, there are a couple of different ways to do it. One way is you’re waiting for the request on the server and, when it comes, you assemble the page on the fly. Okay, they’ve requested this particular news article. Right, grab the header file over here, go to the database, pull out that news article. Now we’ve got the content. And then grab the footer from over there. Okay, now that assembled page, we send down the pipe. So I compare that to kind of like a short order cook who’s frying something up when the order comes in, right? The other way of doing it is you assume you’re going to have orders coming in. And like a TV chef, ‘Here’s one I made earlier.’ So you’ve preassembled a page with the header, the article and the footer, and it’s literally sitting there as an HTML file on the server. And then when the request comes in for that file, you serve up that HTML file. Now, I don’t want to speak in absolutes, but I think it’s safe to say that there’s very little that’s going to be faster than serving up a static HTML file. Even if you’ve got a super optimized database, it’s always going to be a little longer to go in there, grab that result and bring it back. So serving up static files can give you a huge boost. Now, there are some kinds of operations that simply can’t do that. If you’re performing a search on a website, you need to do some kind of lookup, there’s probably some kind of relational database needed. But for informational pages, yeah, absolutely. If you can pre-bake, if you like, the pages, then yes, do it. 
Because the other thing is that if you have a large influx of traffic, a database is going to struggle to handle all those connections. Whereas web servers, going back to the birth of the web, which are just responsible for serving up HTML files – they can actually scale up pretty well to doing that. And then you’ve got other things like adding in CDNs – content delivery networks – which can really, really help there, and they work especially well with static files just waiting to be served. So this idea of sort of pre-baking as much as possible is catching on. And what I like is that it’s becoming one of the cool things to do in the dev community, because quite often, a lot of the cool things to do are not very good for the end user, right? ‘Oh, it’s now cool to use this giant JavaScript library’, when it makes the experience worse for the end user. But there’s this whole term now, Jamstack, which is just a buzzword really, but what’s behind it is this idea of serving up static files. And to be honest, it’s not sold on the user experience benefit. It’s sold more as, ‘Oh, your developer experience will be better because you just have to deal with generating this stuff once and not have to worry about databases, and you don’t have to maintain a server with a database on it.’ So they’re selling it in terms of the developer experience. But actually, as Eric points out, the user experience is always going to be better, and it’s more robust, more resilient to these stress cases – to the sudden influx when, in an emergency situation, lots of people are trying to access this. Yeah, this idea of static files trumping databases is a pretty solid concept.
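Jeremy’s “here’s one I made earlier” idea – assembling header, article and footer once at build time instead of on every request – can be sketched like this. The template strings and the articles object are hypothetical stand-ins for real templates and a real content source; the point is that the assembly happens once, producing plain HTML that a web server or CDN can hand out without touching a database.

```javascript
// Hypothetical site chrome: in a real static-site generator these
// would come from template files.
const header = '<header>Site name</header>';
const footer = '<footer>Contact us</footer>';

// "Pre-bake" every article into a complete HTML page, once, at build
// time. The returned object maps filenames to file contents; a build
// script would write these out to disk for the web server to serve.
function bakePages(articles) {
  const pages = {};
  for (const [slug, body] of Object.entries(articles)) {
    pages[`${slug}.html`] = `<!DOCTYPE html>\n${header}\n${body}\n${footer}`;
  }
  return pages;
}

// Example content source (illustrative only):
const pages = bakePages({
  'coronavirus-update': '<article>Vital information...</article>',
});
```

When a request for `coronavirus-update.html` comes in, the server just streams the file – no database lookup, no per-request assembly – which is why this approach holds up under a sudden influx of traffic.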

Gerry  42:36

Right. I tested this about a year ago on a site and found that a typical page coming from the database was taking about five seconds to fully load, and that if it was done as static, it was down to about two-and-a-half, three seconds. Would that seem like a reasonable proportion – that in many situations you could get that sort of rough difference in performance in the download if you went static with, as you said, the right type of pages, the informational type of pages?

Jeremy  43:13

Yeah, I mean, it’s going to vary. And some databases are much more optimized than others, but even with the best-optimized database, it’s always going to be a little bit more of an expensive operation to go in there, grab something and return the result than just serving a premade file. But I’ve seen this myself. It works really well when you’ve got just the static pages, the informational pages. But what do you do about the dynamic pages – things that are updated very frequently, or that rely on data coming from multiple sources and are generated at runtime? Well, even there, there might be opportunities to pre-make or pre-bake bits of a page, right? Have these files sort of ready to go, and then you still do some assembly, but maybe that assembly doesn’t involve going to a database. Or you can send down as much static as possible and then maybe use JavaScript on the client side to pull in the more dynamic stuff. As long as it’s not core functionality, I would say that’s reasonable to do. You can pull in the weather widget using JavaScript, sure, or pull in the extra stuff in the sidebar using JavaScript so that the core stuff comes down quickly. But yeah, in general static files are just going to be faster. And if you can find a way to use that – to use some kind of caching and pre-baking – it’s worth doing.
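The weather-widget example Jeremy mentions – static core content first, non-essential dynamic bits pulled in afterwards – might look like the sketch below. Everything here is illustrative: `renderWeatherWidget`, the `/api/weather` endpoint and the `#sidebar` element are hypothetical, and the crucial property is that a failed fetch costs the reader nothing essential, because the article is already on the page as static HTML.

```javascript
// Turn weather data into a small piece of markup for the sidebar.
// Pure function, so it is easy to test outside a browser.
function renderWeatherWidget(data) {
  return `<aside>${data.city}: ${data.tempC}°C</aside>`;
}

// In the browser, after the static page has already loaded:
//   fetch('/api/weather')
//     .then((res) => res.json())
//     .then((data) => {
//       document.querySelector('#sidebar').innerHTML =
//         renderWeatherWidget(data);
//     })
//     .catch(() => {
//       // Network or API failure: the core content is still readable.
//     });
```

This is progressive enhancement applied to page assembly: the database-backed part is isolated in a sidebar enhancement rather than sitting between the user and the content.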

Gerry  44:36

Final question: from a web development and design perspective, what’s the biggest thing you’ve learned from this coronavirus pandemic so far? 

Jeremy  44:48

It’s interesting, because I’m well aware of confirmation bias. I see people on social media and their interpretations of what’s happening now and what the world might look like afterwards. And funnily enough, for everyone, it’s confirming what they already believed. Right? It’s confirming their pre-existing political affiliations, it’s confirming their ideas about how the world should be run. I don’t see anybody having Damascene conversions because of this. So I am well aware that, just looking at how the web is responding to this in terms of performance, I’m probably just going to see what I already believe getting confirmed. So I already thought that web pages were too big, that there was too much JavaScript and too many unnecessary images. And what I’m seeing now, with vital information trying to get out there fast, is, ‘A-ha, I was right all along. There was too much JavaScript, there were too many unnecessary images.’ So frankly, I’m seeing confirmation of what I already believed. And I’m well aware that that’s a human bias, so take it with a huge pinch of salt.

Gerry  46:09

Okay, so just a follow-on to that. Is there anything that has disturbed that preset set of knowledge or perspectives or attitudes that you have? Is there anything – whether from web development or design or in general – that has made you begin to rethink something you really believed before, where there’s now a little seed of doubt?

Jeremy  46:41

I would say not on the performance front – not on the idea that web pages should be leaner. I haven’t seen anything to make me change my mind on that. In broader terms, though, maybe I’m rethinking some things. Over the past few years there have been lots of examples of how the internet’s terrible and all the negative consequences of what the internet does. And I have to say, over the past few weeks, I’m seeing a lot more of how the internet can be a great place. And that’s, in a good way, making me re-evaluate. I’ve even seen some people have quite large shifts – not changes of heart, but a rebalancing – like my friend Maciej. He runs idlewords.com. For years, he’s been lobbying and working politically against surveillance capitalism, and the fact that these large companies on the internet are tracking us, tracking our movements and invading our privacy. And he’s not changing his mind about that. But now he’s asking, well, could we use that? Can we take this existing apparatus that’s in place and use it for better tracking of people who have the coronavirus, better tracking down of people who are one degree of separation away from someone who has the coronavirus? I mean, if we can do that for people who looked at an advertisement for bunny slippers, then if there’s a way to use that same technology to slow the spread of this virus, should we be doing that? It brings up interesting questions about liberty and freedom and security and all that. But that’s been an interesting one to observe. And I haven’t made my mind up on that one way or the other, but it is an example of maybe re-evaluating pre-existing ideas about technology in light of this new situation.

Gerry  48:47

Yeah, and maybe that’s a good way to end. We can’t help reinforcing our own ideas or looking for confirmation bias, but we should step back a little bit and see: are there things that we need to re-evaluate or rethink or come at from a different angle in this crucial moment in the history of the world? So I think you’ve given us loads of ideas, loads of thoughts and practical things to do, Jeremy. So I’d like to really thank you very much for your time. Just really appreciate you doing this.

Jeremy  49:33

Thank you for having me. Thanks for letting me vent. I tend to rant on and on once you wind me up and let me go.

Gerry  49:40

Oh, that’s what we need. If you’re interested in these sorts of ideas, I published a book called World Wide Waste. You can find out more at gerrymcgovern.com/www. I hope you enjoyed this episode. If you’d like to be part of the conversation or community, hop on over to ThisisHCD.com, where you can join the Slack community and help shape future episodes and connect with other designers around the world. Or join the HCD newsletter, where you can win books and get updates. Subscribe to our content on Apple Podcasts or Spotify, and listen to any of our design podcasts, such as Getting Started in Design, Bringing Design Closer with Gerry Scullion; or Power of Ten with Andy Polaine; or Decoding Culture with Dr John Curran; ProdPod with Adrienne Tan; and EthnoPod with Jay Hasbrouck. Thanks for listening and see you next time.

Posted by Gerry McGovern
