

Garbage In, Garbage Out: The Data Problem Hiding Inside Your AI Strategy - with Caroline Jarrett
Forms expert Caroline Jarrett joins Gerry to talk about why the rush to AI is colliding with something most organisations haven't fixed - their own data. A conversation about errors, forms, and the unglamorous work that actually moves the dial.
Guest

Caroline Jarrett
England, United Kingdom
Transcript
[00:00:00] Hey folks, and welcome back to another episode of This is HCD. My name is Gerry Scullion [00:00:05] and I'm a human-centered service designer based in the beautiful city of Dublin, Ireland. [00:00:10] Uh, today in the show, I'm delighted to welcome Caroline Jarrett. Caroline has [00:00:15] been one of the people that I wanted to have on this podcast for a very long time.
[00:00:19] I was first [00:00:20] introduced to Caroline's work by the brilliant Gerry Gaffney in Melbourne, whom I [00:00:25] worked with way back, uh, in another lifetime when I lived in Australia. [00:00:30] We caught up when we were both speaking at SDinGov in [00:00:35] Edinburgh last year. And truthfully, we actually did an episode there, but the [00:00:40] noise of the, uh, the crowd and people walking by was just too distracting to publish.
[00:00:44] So [00:00:45] we redid it again and I'm delighted to have her on the show to share with you today. Now, [00:00:50] here's the thing. You know, it's a question that I've been sitting with for quite a while. What's the point of [00:00:55] plugging your organization into AI if the data you're feeding it is already broken? [00:01:00] And that's the thread that we spoke about, myself and Caroline, a number of weeks ago.
[00:01:04] [00:01:05] For anyone who doesn't know Caroline, Caroline is a forms expert, a surveys expert, and one of the [00:01:10] sharpest observers I've ever come across in the field. Caroline has been quietly [00:01:15] working on the unglamorous bits of design over the last 30 years, forms, [00:01:20] errors, data quality, the stuff that doesn't tend to trend on LinkedIn, but [00:01:25] Caroline is there when you've got a question about this stuff.
[00:01:27] She has done the hard work. [00:01:30] Now, in this conversation, we talk about why forms are the only [00:01:35] compulsory part of any service, why so many teams can't tell you their own error [00:01:40] rates, and what designers should be doing right now before the AI conversation [00:01:45] outpaces the reality on the ground. Before we get into it, I just want to give a shout out to my own [00:01:50] newsletter and This is HCD.
[00:01:51] We have totally redesigned thisishcd.com. [00:01:55] There's now a live directory where you can join and connect with other designers. We're very [00:02:00] proud of it. It's not something we've bought off the shelf; it's something that we have built through research. We are [00:02:05] walking the walk and talking the talk. Really, this conversation is about garbage in, garbage out, [00:02:10] uh, the data problem hiding inside your AI strategy.
[00:02:14] Caroline is [00:02:15] brilliant. If you haven't connected with Caroline on LinkedIn, go ahead and do it. She's absolutely [00:02:20] fantastic just to learn from asynchronously. But also, if you want to engage with Caroline, go [00:02:25] direct to her. She is fantastic. She's got an absolute wealth of information that she can [00:02:30] amplify any team on this planet with.
[00:02:33] Let's jump straight into [00:02:35] this episode.[00:02:40] [00:02:45]
[00:02:45] Caroline, I'm delighted to have you on the podcast. Long-time [00:02:50] admirer. Um, second time speaking, uh, the first time we, we met in [00:02:55] person and connected was at Service Design in Government in Edinburgh in [00:03:00] 2025. And we're doing this podcast again because we weren't really too ... Well, I wasn't too happy with the [00:03:05] audio quality.
[00:03:06] Um, we recorded it in, in, uh, one of the lobbies. [00:03:10] But maybe we'll start off, tell us a little bit about yourself, where [00:03:15] you're from, and what you do.
[00:03:17] Well, I'm ... Right here, I'm in Leighton Buzzard, [00:03:20] uh, which is a little town in the UK. Um, [00:03:25] probably no one's ever heard of it, and, and the name is as silly as it sounds.
[00:03:28] I mean, the Buzzard is spelt [00:03:30] like the bird. Yeah. Uh, apparently it's an, an ancient Norman name. It's a [00:03:35] corruption of Bossard, but I prefer to think of, you know, raptors flying around.
[00:03:39] [00:03:40] That's it. And, and actually these days, we don't have any buzzards, but we do have a lot of red kites, which are [00:03:45] very beautiful birds.
[00:03:46] They've become very common in my area of the UK. [00:03:50] Um, for people who don't know the country that well, um, it's about 40 or 50 [00:03:55] miles, um, northwest of London, or another way of looking at it is halfway between Oxford [00:04:00] and Cambridge. Uh, I'm not really from here, um, but for [00:04:05] complicated reasons, my husband and myself, um, ended up buying a house in [00:04:10] this town in 1985 and heavens above.
[00:04:12] More than 40 years later, we're still [00:04:15] here. And that's, that's my background. And so, um, [00:04:20] what do I do? Um, through a combination of [00:04:25] sort of random things, in, um, [00:04:30] 1992 I ended up getting a job with, uh, what was [00:04:35] then called Bull Information Systems. It's now called Steria.
[00:04:38] Um, and they delivered, [00:04:40] um, PCs to our tax authorities, um, now called HMRC, [00:04:45] His Majesty's Revenue and Customs, but in those days it was the Inland Revenue. [00:04:50] And, um, they have about 26,000 [00:04:55] staff. So it was a lot of PCs. Yeah. Uh, and they also delivered the Unix systems they [00:05:00] connected to. And they decided, the Revenue decided, that they would try and, [00:05:05] uh, scan forms.
[00:05:06] Um, so instead of typing in the tax forms, they wanted to [00:05:10] scan them in and have what we now call artificial intelligence, so neural network [00:05:15] image recognition, um, to, uh, process the forms. [00:05:20] Um, and they'd actually delivered a couple of systems and it wasn't going well. And they hired me. I was a project [00:05:25] manager.
[00:05:25] They hired me to try and turn it around. And I, I did a couple of visits [00:05:30] to the sites where these things were working and not working.
[00:05:32] Yeah.
[00:05:33] And one of them was, um, [00:05:35] for a, a form where people would, um, have the [00:05:40] opportunity ... at the time, any sort of bank interest was taxable at source, [00:05:45] and, um, people could reclaim the small amounts of tax if they weren't [00:05:50] taxpayers.
[00:05:52] So, um, basically, [00:05:55] um, it didn't work and it didn't work because people would write things on [00:06:00] the form like, "Please see letter attached."
[00:06:02] Ah.
[00:06:02] Right. And, um, [00:06:05] roll forward. I mean, I'm talking about 1992 and we're now [00:06:10] 2026, I think. Yeah. And, um, where artificial intelligence is [00:06:15] everywhere, isn't it?
[00:06:16] Yeah.
[00:06:16] But I can tell you for sure that your artificial intelligence today [00:06:20] still can't deal with a form where someone's said, "Please see letter attached."
[00:06:24] [00:06:25] I know.
[00:06:25] So I got very interested in, well, how do we design forms so that people actually [00:06:30] write processable answers on them? And it's kind of a passion that's never let me [00:06:35] go. I still find forms very fascinating. So that's me in a [00:06:40] nutshell.
[00:06:40] No, absolutely. I, I need to give a shout out to Gerry Gaffney, uh, Gerry Gaffney in, [00:06:45] in Melbourne, who I've worked with, um, you know, when I lived in Australia, [00:06:50] um, for a couple of years, and Gerry kind of introduced me to an awful lot of [00:06:55] the work that he and you had done in that first book, which, uh, I have no [00:07:00] idea how many years ago that was, but it's, it's now a collector's edition is [00:07:05] what he likes to tell me.
[00:07:06] But, um-
[00:07:08] Well, I, I've [00:07:10] managed to position my forms book just so you can see it behind my head.
I see [00:07:15] it, yeah, in the, in the background there.
[00:07:16] Yeah, that's Gerry and me. We, we met sort of online. We were both [00:07:20] members of a, um, um ... Do you remember in the days when we used to have email [00:07:25] communities- Email lists.
[00:07:26] Email lists. Mailing lists, yeah. Yeah,
[00:07:27] yeah.
[00:07:27] So we were both on that and we started [00:07:30] chatting about forms and he was interested in writing a book on forms and I twisted his arm to [00:07:35] help me write a book on forms. Wow. And a mere 10 years later, the book came out.
[00:07:39] A great [00:07:40] combination of the two.
[00:07:41] Yeah, he's, uh, I really valued his [00:07:45] contributions to the book.
[00:07:46] Um, and yeah, he's a great, great guy.
[00:07:49] So, [00:07:50] like, I, I know you said to me before we were chatting, I said, "I can't believe you keep on referring to me as a [00:07:55] forms expert and a surveys expert and all of these different aspects." But you [00:08:00] have written two books and one of them is titled Forms That Work, so that's, that's [00:08:05] maybe where it comes from.
[00:08:06] I'm more than happy to be referred to as a forms, forms person. [00:08:10] That, that's my, my ideal identity. But what happens is [00:08:15] that, that one thing led to another and I ended up writing a book on surveys. [00:08:20] Yeah. Um, sort of accidentally wrote a book on surveys because, [00:08:25] broadly speaking, I mean, you can see a few bookshelves behind me and I've got pretty much one bookshelf [00:08:30] that's full of survey stuff, right?
[00:08:32] And, and I've got about three books on [00:08:35] forms, you know, the number of books on forms around is really, really small. Um, when [00:08:40] my book came out, people, when they heard, "Oh, you, you've written a, you know, you're a [00:08:45] forms person," people would, in a lovely way, [00:08:50] um, suggest to me I should read Luke Wroblewski's book on forms- Oh yeah.
[00:08:53] which is also good. And I [00:08:55] would just smile politely because I'm a smile, politely sort of person, but they hadn't [00:09:00] noticed I actually contributed a perspective to that book.
[00:09:03] Yeah.
[00:09:04] So it's like, [00:09:05] yeah, uh, but there's only perhaps 10 or 15 books around on forms. Another [00:09:10] one's been written by Jessica Enders, who's also Australian, like Gerry.
[00:09:14] Um, [00:09:15] but surveys is the opposite. There's an absolutely [00:09:20] vast, constantly growing survey literature. And of course the survey people are [00:09:25] also interested in some of the things that matter for forms, like how do people answer a [00:09:30] question, you know?
[00:09:30] Yeah.
[00:09:31] So there's a torrent of survey stuff. It's absolutely [00:09:35] impossible to keep up.
[00:09:36] I went to the European Survey Research [00:09:40] Association, which has a conference every two years, and they will have something like [00:09:45] 300 papers at that conference. That's just one conference [00:09:50] every two years, and that's just, just to give you a flavor. And then another one is that one [00:09:55] of the most cited papers in all of academia is Likert's famous [00:10:00] paper-
[00:10:00] Mm.
[00:10:01] on measuring attitudes, which is from 1932. And, you know, [00:10:05] nearly 100 years later, that's still very, very highly cited. Yeah. So it's not only a [00:10:10] massive volume of stuff, but it's also very, very historic.
[00:10:13] Yeah.
[00:10:13] Um, so [00:10:15] yeah, I ended up reading a lot about surveys and then my mentor, [00:10:20] Ginny Reddish, told me I had to write a book on surveys.
[00:10:22] So I said, "Yes, Ginny," and wrote a book on surveys. [00:10:25] And, um, I won't say I mind when people ask [00:10:30] me about surveys, but I'm not as passionate about them.
[00:10:33] I know, but in, [00:10:35] in terms of, we'll focus on forms first, okay, because some people will use them [00:10:40] interchangeably. Like a form and a survey, like, you know, they're, they, they go hand in hand, people who may not be from the industry, [00:10:45] maybe they are from the industry.
[00:10:46] Um, I'd like to understand a little bit more [00:10:50] from a conversation that I had with Gerry Gaffney, and that was probably in [00:10:55] 2017 when I, um, was kind of starting this podcast, to be honest. [00:11:00] And Gerry was likening forms to being like a conversation [00:11:05] between people, but also being able to identify, [00:11:10] from the forms that an organization produces, what [00:11:15] the organization is like.
[00:11:18] What are your thoughts on [00:11:20] that in the current world of 2026? Where do you see, [00:11:25] um, the same patterns and, and how is it manifesting in what organizations are [00:11:30] producing? What are the problems they're producing at the moment when they're creating their forms? [00:11:35]
[00:11:35] Well, I think, um, the, the, the ... [00:11:40] I have a, a diagram on my website that was created by Tim [00:11:45] Paul, who, um, used to be head of interaction design at, at gov.uk, and [00:11:50] he's now, um, working in artificial intelligence for our government.[00:11:55]
[00:11:55] And he, um, his diagram has got, like, a series of mountain peaks, [00:12:00] with little red peaks of the forms peeking up, and everything else underneath is the [00:12:05] service. And there's only teeny, tiny little red peaks, you know? Um, and most of us [00:12:10] working in user experience, we spend most of our time not on the forms, right?
[00:12:14] Mm-hmm.
[00:12:14] [00:12:15] But the forms are the little peaks that people actually see; that's the only compulsory part. [00:12:20] So I'll see, you know, enormous amounts of stuff about all sorts of [00:12:25] things to do with website design, experience design, everything about interacting with [00:12:30] customers, but it's actually the forms that are the compulsory bit.
[00:12:32] Yeah. You know, that's the bit that you can't escape; [00:12:35] you have to do it. Sure. And one of the main differences for me between a form and a [00:12:40] survey is that, broadly speaking, you know, a form is something you have to do, [00:12:45] whereas a survey is something you can opt out of. Mm-hmm. Yeah. So a form is often a, a [00:12:50] barrier between someone trying to get something done-
[00:12:53] Yeah.
[00:12:53] and getting something done. [00:12:55] And so if organizations can kind of make those [00:13:00] barriers almost feel invisible, um, then that's gonna be better for everybody. [00:13:05] Yeah. You know, somebody asked me about, well, should you want to delight people with forms? And [00:13:10] my answer to that is, well, I think you should probably kind of try, you know, hugging a loved one or [00:13:15] going for a walk in the fresh air for your delight and try and give [00:13:20] people time back to use for their delight in the way they want to- Yeah.
[00:13:23] by making their forms [00:13:25] experience be as fluid and easy and almost unnoticeable as possible. [00:13:30] Um-
[00:13:30] Do you, do, do you see, um, any kind of shift and [00:13:35] improvement with the advent of AI in forms design? Because, you [00:13:40] know, apparently some people out there, maybe they're in design, maybe they're not, [00:13:45] they're speaking about, you know, AI is gonna change everything.
[00:13:47] We're able to produce forms and [00:13:50] websites and, you know, what, what's it like from your perspective when you [00:13:55] look at the web now? What are the glaring risks that [00:14:00] you see that are staring us in the face with the advent of AI?
[00:14:04] I think one [00:14:05] of the things I'm noticing more and more is kind of where [00:14:10] many of us already rely on some level of agentic AI, which is to say AI that does stuff [00:14:15] for you.
[00:14:15] Yeah.
[00:14:15] Okay. So there's a type of agentic AI that's been around for [00:14:20] quite a long while, which we know is AutoCorrect.
[00:14:22] Yeah.
[00:14:22] Okay? That's an Agentic AI. [00:14:25] And, um, many of us have been embarrassed in some way by AutoCorrect [00:14:30] doing something, you know, typically, I think many of us have the [00:14:35] experience that practically every day AutoCorrect does something silly for us.
[00:14:39] Yeah. You know, okay, [00:14:40] if I'm just sending a text to my husband, like, "What time's dinner?", does it really matter [00:14:45] when it sends "What time's banana dinner?" or something random? I mean, why does it [00:14:50] insert these words? But it does. It doesn't really matter. But in terms of [00:14:55] having that sort of stuff in something that really matters, it can be quite worrying.
[00:14:58] Yeah.
[00:14:59] And then the [00:15:00] next level of agentic AI that many of us are also very, very familiar with and rely on: [00:15:05] in fact, just logging in today, my browser suggested some stuff for [00:15:10] me to log into. Mm-hmm. Now, if I hadn't been deliberate, you know, when it asked me [00:15:15] for a name to put into the, you know, login- Yeah.
um, [00:15:20] my browser suggested some names. Now, it suggested some names which are [00:15:25] appropriate. I want to be known by the name I usually use, like Caroline Jarrett, but [00:15:30] because in this country we still use titles, I, I call myself Mrs. Caroline Jarrett, I prefer [00:15:35] to be known by that, so it will suggest that. But it also [00:15:40] suggests some names like, for example ... I used to do a lot of paperwork for my late parents, so [00:15:45] it's got my parents' names.
[00:15:47] Now I had to be on top of that, making choices, [00:15:50] even in that tiny, trivial thing. So one of the things I [00:15:55] think many of us have experienced is that our browsers are quite handy for doing that kind of [00:16:00] filling in for us. Yeah. But we've also accidentally polluted that stuff with [00:16:05] perhaps a typo or something.
[00:16:06] Yeah.
[00:16:06] So you can end up propagating a mistake. [00:16:10] And many of us, I mean, myself included, I'm sure I could work out how to eliminate that [00:16:15] crud that my browser has accumulated, but I can't quite be bothered. Yeah. [00:16:20] And I'm, and I'm someone I've worked in and around computers from, since, you know, [00:16:25] 1977. Right. Do you know what I mean?
[00:16:26] I've got a long experience with this. But I think a lot of people my [00:16:30] age and older wouldn't even begin to know where to start on doing that. Yeah. [00:16:35] And lots of young people, like, their computers are as natural as breathing to [00:16:40] them, but they still may not know things like how to eliminate that crud.
[00:16:44] So now we're looking and [00:16:45] saying, "Well, those are two AI agents that many of us just use all the time, but we know they're [00:16:50] problematic." If we then say, "Well, we're gonna hand over to AI a bit [00:16:55] more and say, okay, AI, you know, buy a book for me. " Well, fine, maybe I get the [00:17:00] wrong book every now and then. Oh, okay, AI, write my will.
[00:17:04] Really? [00:17:05] You know, you really want to do that? Okay, AI, file my taxes. I mean, I don't know what [00:17:10] the Irish tax authorities are like. Amazing. But I can tell you for sure that the UK [00:17:15] tax authorities will not accept "the AI did my tax return" as a- Yeah. ... [00:17:20] reason for getting it wrong. So we have to look at the potential level of problems.
[00:17:24] [00:17:25] Now, that was all just about filling in the forms.
[00:17:27] Yeah.
[00:17:27] But what I'm saying is that the [00:17:30] user behavior of how they interact with the forms is also a crucial part of the way [00:17:35] we design them. And, um, so that's one aspect. [00:17:40] Another aspect is to say, "Oh, well, okay, AI, just build me a form." You know? All right, well, what form is it [00:17:45] gonna build?
[00:17:45] Is it gonna ask useful questions? Is it going to ask questions you can answer? [00:17:50] And I don't know. It might be. I mean, I'm hearing a lot from [00:17:55] developer colleagues about the perils of vibe coding, you know, let the AI [00:18:00] code, and the answer is it will code something. Yeah. Does it code robust, [00:18:05] truly accessible, effective, maintainable code?
[00:18:08] No. No, no, [00:18:10] no.
[00:18:10] You still need, I would still say, about 50% understanding [00:18:15] of code to be able to really vibe code. Something that I, I love to [00:18:20] do in my spare time, I prototype with it, I'm always playing with that [00:18:25] space, but, like, you can really code/design yourself into a corner and then it just says, [00:18:30] "Oh, I don't know what to do now."
[00:18:31] And then you're stuck. So it takes [00:18:35] a lot of that, um, early kind of flux out of [00:18:40] your, your process, but it doesn't scale very nicely unless you're very, very careful or skillful [00:18:45] about it. Can I ask you a little bit more around, [00:18:50] uh, the, the topic that you spoke about at Service Design in Government? Because I know [00:18:55] when I spoke to, to Martin in, um, the German government, we were both kind [00:19:00] of interested in total error, um, uh, across the [00:19:05] sequence of a service.
[00:19:08] Um, [00:19:10] where did this come from? W- w- like I know you were, you were very much passionate about this. Your workshop was [00:19:15] brilliant. Like myself and Mark went to it, we loved it, uh, and Owen as well from [00:19:20] Dublin City, we thought it was really, really good and it got us thinking about an [00:19:25] important metric to measure.
[00:19:27] Um, and it's very often [00:19:30] not included in, uh, a lot of those core metrics when I look within an [00:19:35] organization. Why do you think organizations don't do it more? [00:19:40] What's holding them back?
[00:19:41] The workshop that I did was about error rates. Like, do we know our error [00:19:45] rates and do we understand them? And that was partly as well because I feel [00:19:50] that to be fair, uh, you know, many organizations are looking to use [00:19:55] AI in different ways.
[00:19:57] Mm-hmm. Um, and there's an [00:20:00] argument, I think, that says, if we're gonna be putting data into AI, we [00:20:05] should probably try and give the AI the best chance by giving it decent quality data- Yeah. ... in the first [00:20:10] place. If our data is riddled with errors, chances are the [00:20:15] AI will create even more.
[00:20:16] Yeah. You know, so it's like, I'm hoping that [00:20:20] the whole conversation about implementing AI- Yeah. ... will help people to think about [00:20:25] what is the accuracy of our data like, what quality of data are we looking at? [00:20:30] And so the workshop was really about getting people to think about what errors [00:20:35] happen, um, why they happen, are we looking at error rates in our service [00:20:40] and is there a possibility of, of thinking about the total error across the service?[00:20:45]
[00:20:45] Yeah.
[00:20:45] And that partly came about because of something that Martin Jordan. Martin is, um, [00:20:50] head of design for the German government-
[00:20:52] Yeah. ...
[00:20:52] and he put a, a, a, a, [00:20:55] a post on socials out saying that measuring error rates would be a [00:21:00] metric in German government. And I, I think that's fascinating. Huge. Yeah. Because, um, it's very [00:21:05] difficult to get any sort of government metrics on any government service.
[00:21:08] In particular, I think error [00:21:10] rates are quite a challenging one to measure.
[00:21:11] Mm-hmm.
[00:21:12] But, um, uh, that really got me [00:21:15] thinking. And it got me thinking that I hadn't actually thought that much about errors in forms- Yeah. ... [00:21:20] for ages, really. Um, back in the day, you know, around about the year 2000, I was [00:21:25] talking about error rates and data capture processes, and then I hadn't really thought about it much [00:21:30] in between, apart from ... it turned out that there's a central [00:21:35] concept in survey methodology called total survey error.
[00:21:38] Yeah.
[00:21:38] So, which is about looking at [00:21:40] all the sources of error in a survey and trying to minimize across all of them. You know, so to [00:21:45] encapsulate that, um, a, a lot of us might think about sampling error [00:21:50] with surveys, like when you ask a sample, there's an inherent mathematical- Yeah. ... sampling error built [00:21:55] in.
[00:21:55] But there's also something called measurement error, which I would describe as asking the [00:22:00] wrong question. And so, you know, one of the things is that no matter how [00:22:05] much you increase your sample size to reduce your sampling error, if you're asking the [00:22:10] wrong question, it won't help you, you know? Yeah. Those are independent errors.
[00:22:14] So, [00:22:15] um, you've got to kind of balance the time, like, in a survey-
[00:22:19] [00:22:20] Yeah. ...
[00:22:20] don't put all your time into asking a ton of people; put some of your time into thinking about [00:22:25] whether you're asking correctly- The right question. And so that whole total survey [00:22:30] error thing also fed into why I was really thinking about errors.
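Caroline's point that sampling error and measurement error are independent can be made concrete with a tiny simulation. This is purely an illustrative sketch (the scale, bias value, and function names are invented, not from the episode): a badly-worded question adds a fixed bias that no amount of extra respondents removes.

```python
import random

random.seed(42)

TRUE_MEAN = 3.0     # the attitude we actually want to measure, on a 1-5 scale
WORDING_BIAS = 0.5  # measurement error: a leading question shifts every answer

def run_survey(n):
    # Each respondent's honest answer, plus individual noise (sampling error),
    # plus the fixed shift caused by the question wording (measurement error).
    answers = [TRUE_MEAN + random.gauss(0, 1) + WORDING_BIAS for _ in range(n)]
    return sum(answers) / n

for n in (50, 500, 50_000):
    estimate = run_survey(n)
    print(f"n={n:>6}: estimate={estimate:.2f} (truth is {TRUE_MEAN})")
```

As n grows, the estimate settles near 3.5 rather than 3.0: the sampling error shrinks with sample size, but the half-point of measurement error from the bad question stays exactly where it was.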
[00:22:34] But I'm just [00:22:35] gonna put a quick advert in, if you don't mind.
[00:22:37] Come on, yeah,
[00:22:38] stick an ad in. [00:22:40] I did an online version of the workshop for- Yeah. ... um, [00:22:45] Lou Rosenfeld in the Rosenverse. Yeah.
[00:22:47] Give a plug there.
Big up to Lou.
[00:22:49] [00:22:50] Big up for Lou, yeah. Yeah. Uh, so if people want a, a slightly different [00:22:55] but online version of the workshop, that's available, um, and I'm also repeating an [00:23:00] online version of it at the, um, Service Design in Government virtual conference in March.
[00:23:04] Ah, [00:23:05] class. Yeah, yeah.
[00:23:06] Come along.
[00:23:06] I mean, Service Design in Government is one of the best conferences. [00:23:10] I don't want to, I don't want to harp on about it. I absolutely love the team that puts [00:23:15] together, um, those events, but I just want to come back to the error piece, [00:23:20] okay, and, and total survey error. Depending on the zoom level that you're looking at [00:23:25] in an organization, what constitutes an error?
[00:23:28] Like, like, asking the wrong [00:23:30] question is pretty high up on the zoom level versus a micro-interaction, [00:23:35] where you might be able to track that, like, you know, maybe [00:23:40] pre-populating something that it shouldn't be, or placeholder text that isn't clearing. [00:23:45] Walk me through errors and what falls into [00:23:50] something that constitutes moving the dial in the metric for total survey [00:23:55] error.
[00:23:55] I'd like to understand that a little bit more. What, what gets tracked?
[00:23:59] Well, let's, [00:24:00] let's look more at total service error rather than survey error.
[00:24:03] Okay,
[00:24:03] yeah. Let's look at services and, [00:24:05] and experiences that we're designing. So, um, our, our, our [00:24:10] colleagues who work primarily in e-commerce, I don't know if that's you, but I, I do a bit of e-commerce, but it's [00:24:15] not my main thing.
[00:24:16] Yeah.
[00:24:16] So, um, recently I've been working with an e-commerce [00:24:20] business and in e-commerce, they're a lot better than we are in [00:24:25] government. Um, my main thing is government.
[00:24:27] Yeah. The,
[00:24:28] the e-commerce folks tend to be [00:24:30] pretty good at tracking their conversion rates- 100%. ... which is another way of looking at it to say, [00:24:35] "How many people are on our website?
[00:24:36] How many people actually progress to buying?" Yeah. [00:24:40] So one of the ways of looking at is to say, how many people have we lost on the journey? [00:24:45] Mm-hmm. Um, e-commerce people tend to call that conversion rate, [00:24:50] um, elsewhere we tend to call it completion rate. Like, did they start the process? Did they [00:24:55] succeed?
[00:24:55] Mm-hmm. So that's one way of saying where are people dropping out?
[00:24:58] Yeah.
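The conversion-or-completion rate Caroline describes can be sketched as a simple funnel calculation. A minimal illustration, with a hypothetical event log (the user IDs, step names, and log format are all invented for the example):

```python
# Hypothetical event log: one (user, step) pair per step a person reached.
events = [
    ("u1", "start"), ("u1", "details"), ("u1", "submit"),
    ("u2", "start"), ("u2", "details"),
    ("u3", "start"),
    ("u4", "start"), ("u4", "details"), ("u4", "submit"),
]

steps = ["start", "details", "submit"]

# Which distinct users reached each step of the form?
reached = {step: {user for user, s in events if s == step} for step in steps}

starters = len(reached["start"])
for step in steps:
    share = len(reached[step]) / starters
    print(f"{step:>8}: {len(reached[step])} users ({share:.0%} of starters)")

# Completion rate: of everyone who started, how many got to the end?
completion_rate = len(reached["submit"]) / starters
```

Here two of the four starters submit, a 50% completion rate, and the drop at each step shows where on the journey people were lost.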
[00:24:58] Um, and are [00:25:00] they dropping out because of their errors or our errors? Did they answer questions wrongly? [00:25:05] Uh, at the moment, I'm thinking a lot about why people lie on forms, for example. So, [00:25:10] um, you might say, "Well, no one ever lies on forms."
[00:25:12] And like, everybody lies on forms and the [00:25:15] con- the classic example is, did you actually read the terms and [00:25:20] conditions? No, you did not. You just ticked the box, right?
[00:25:23] How did you know? [00:25:25] How did you know-
[00:25:26] I've seen you filling in forms.
[00:25:28] I knew, I thought [00:25:30] I heard the door move when I was ... I knew you were here.
[00:25:34] There you [00:25:35] go. You see? So we've, we've all done it. We've all- Yeah. ... we've all met a [00:25:40] situation where we- Okay. You know, you know my dirty little secret, okay?
Exactly. I don't read the terms and conditions.
[00:25:44] [00:25:45] Some of us do, a few of us do, but I don't read them consistently for sure, you know? [00:25:50]
[00:25:50] Yeah.
[00:25:50] And, um, so we've got that kind of, how do we [00:25:55] work with, with what people's actual behavior?
[00:25:58] Um, all of those things, [00:26:00] we've got problems like one of the things that's very important for people in Ireland [00:26:05] is many Irish people have got apostrophes in their names, like O'Connor-
And [00:26:10] fadas as well.
Yeah, there you go. And, um-
And Wales as well, they've got their own-
And so, you know, [00:26:15] a lot of websites won't accept an apostrophe in someone's name.
[00:26:19] Right, yeah. [00:26:20] So then people are forced into giving a wrong answer. Yeah. You know, so we can force people into [00:26:25] lying by saying, "Your name isn't valid." It's like, "It's my name. It's you that's not valid." [00:26:30]
[00:26:30] Yeah.
[00:26:30] You know, and my screenshot library has an Irish airline rejecting someone's [00:26:35] Irish name. Like, really?
You should have done better. Um, but-
I know, that's a [00:26:40] huge thing.
[00:26:40] So it's kind of understanding that at the kind of interaction [00:26:45] level. Yeah. And I also have a rant about: don't ask people questions [00:26:50] that only have the answers yes and no, because the real world is always more complicated. [00:26:55] There's always some exception, there's some extra thing; always have some [00:27:00] other option.
[00:27:01] It might be "other" or it might be "something else", or you have to work [00:27:05] on the wording of the extra option, but you always want to have a further option. [00:27:10] If nobody ever fills it in after five years, take it away, but make sure [00:27:15] you're designing for it. So that was one level of sort of scale that you asked about. [00:27:20] Yeah.
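The apostrophes-and-fadas problem comes down to validation rules that enumerate "allowed" characters too strictly. A deliberately permissive sketch (my illustration only, not a standard or anything Caroline recommends by name) flips the logic: reject input that clearly isn't a name, and accept everything else, including apostrophes, accented characters, hyphens, and spaces.

```python
import unicodedata

def looks_like_a_name(value: str) -> bool:
    """Permissive check: reject empty input, digits, and control characters,
    but accept apostrophes, fadas (á é í ó ú), hyphens, and spaces."""
    value = unicodedata.normalize("NFC", value.strip())
    if not value:
        return False
    # Unicode categories starting with "C" are control/format characters.
    return not any(ch.isdigit() or unicodedata.category(ch).startswith("C")
                   for ch in value)

for name in ["O'Connor", "Séamus Ó Briain", "Siân Lloyd-Jones"]:
    print(name, "->", looks_like_a_name(name))  # all accepted
```

Names that strict regexes like `[A-Za-z]+` would bounce all pass here; the trade-off is that a permissive rule accepts some junk, which is usually the lesser harm than telling someone their own name "isn't valid".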
[00:27:20] But the other one is like an outcomes level. Yeah, so just going back to [00:27:25] e-commerce: did they actually deliver? You know, many of us [00:27:30] do online shopping quite a bit. If I [00:27:35] placed my order with the supermarket, did they actually deliver what I asked for?
[00:27:39] [00:27:40] Yeah.
[00:27:40] Or not.
[00:27:41] Huge
[00:27:42] thing.
[00:27:42] I've been saying this for a while.
[00:27:43] Do you see
[00:27:44] what I mean? [00:27:45] Yeah. So it's kind of, every now and then, particularly in government, [00:27:50] services will hit the headlines because there's been some tragically wrong outcome, you know?
[00:27:54] Yeah.
[00:27:54] [00:27:55] Um, I don't know if you've heard, but we had a scandal recently which affected people in Northern [00:28:00] Ireland a lot-
[00:28:01] mm-hmm.
[00:28:01] where our tax authorities, bless them, decided that they [00:28:05] would stop paying child benefit to people who'd left the country. Now, [00:28:10] why did this affect people in Northern Ireland in particular? Because what can [00:28:15] happen is you can leave from Northern Ireland, so let's say you [00:28:20] fly from Northern Ireland to Paris, right?
[00:28:22] Yeah. So there's a record that you've left, right? [00:28:25] But for whatever reason, you decide to come back via Dublin. Well, there's no entry record [00:28:30] because you've not come back, so now you've emigrated.
[00:28:33] Wow. [00:28:35]
[00:28:35] Okay. So there's, there were lots of ways it could affect people, not in Northern Ireland, but that's just a very [00:28:40] particular example.
[00:28:41] To give any of our people in other parts of the world who don't understand the [00:28:45] local politics some context: Ireland as an island, um, is made up [00:28:50] of two, um, distinct jurisdictions, Northern Ireland and the [00:28:55] Republic of Ireland. Depending on how you see things, it's the north of Ireland or [00:29:00] the south of Ireland. But what happens is there's no border between those two [00:29:05] entities, and as a result you can enter, uh, through Dublin and just drive up, and [00:29:10] there's no border control. Uh, that's because of the Good Friday [00:29:15] Agreement, because we want peace in our country, uh, and our island.
[00:29:19] And as a [00:29:20] result, it makes it much more difficult to track these things, because the systems don't speak to each other. Dublin [00:29:25] is in the Republic of Ireland, and Northern Ireland is governed by the [00:29:30] United Kingdom. So even though the Irish have a lot of say in Northern Ireland, [00:29:35] they're still two separate systems.
[00:29:37] Yeah. So you can [00:29:40] imagine that, that, like, that was a-
[00:29:42] Huge. ...
a problem of outcomes. And in the [00:29:45] end, they've had to roll back the whole policy, because it was basically not well thought through. But, [00:29:50] you know, we can look at, well, what are the actual outcomes we're [00:29:55] achieving? And, um, that can ... So in [00:30:00] my, um, workshop and on my website, I've developed a [00:30:05] sort of six types of errors for people to think about.
[00:30:08] Mm-hmm.
[00:30:08] Um, [00:30:10] and that can be quite, I hope, people might find that thought provoking to say, well-
[00:30:14] Where is that? Is that on your [00:30:15] blog?
[00:30:15] Yeah. Yeah.
[00:30:16] Is that the one that you did, um, because I was on your website and I'm on it [00:30:20] again here at the moment. When did you write that?
[00:30:22] Last year.
[00:30:23] May, was that the total? Yeah.[00:30:25]
[00:30:25] Yeah, I see it here. So there are six error types in services to really focus on. [00:30:30] That's: one, problems along the way; two, wrong results; three, unnecessary [00:30:35] action; four, delayed impact problems; five, non-uptake [00:30:40] or over-uptake; and six, technology problems. [00:30:45] Um, they're pretty decent. I'm gonna say pretty decent... that sounds really rude.
[00:30:50] They're, they're- It's all working.
I'm gonna put a link to that in the show notes for this [00:30:55] episode.
I'm, I'm hoping that this whole thing about AI and [00:31:00] errors and so on can open some conversations- Yeah. ... perhaps with me, but perhaps within organizations, you [00:31:05] know. Wherever I can, I make all my slides, for example, Creative [00:31:10] Commons, by all means- Yes.
[00:31:11] people, help yourselves, run your own workshop, tell me if it worked [00:31:15] or not, as the case may be. Yeah, I love it. But, um, actually think about: [00:31:20] what is an error? How do we measure it? What are our error rates? Mm-hmm. Like, [00:31:25] is this happening all the time? Um, often, you know, for example, there's the good [00:31:30] old-fashioned technique of going and listening in to your call center to find out what [00:31:35] people are calling about.
Yeah. Does anybody still do that? We used to in the olden days, and it was [00:31:40] incredibly instructive. Okay. Um, I'm not hearing so much about it now.
[00:31:44] Can I ask, like: [00:31:45] obviously usability, um, [00:31:50] studies are an amazing way to find out how people are using your product and your service. But, [00:31:55] generally speaking, are you seeing more organizations tracking failure at the micro [00:32:00] interaction level of forms?
[00:32:01] So, like, the failures where, many [00:32:05] times, they click, you know, "next" and an error pops up. Is that something that's being measured, are you [00:32:10] seeing?
[00:32:11] As I say, our e-commerce colleagues are better at that.
[00:32:14] [00:32:15] Yeah. It's
[00:32:15] quite common for e-commerce people to be looking pretty closely- [00:32:20] Yeah. ... at that sort of stuff.
[00:32:21] And I mean, one of the books on the shelf behind me is Erin Vigil's [00:32:25] book, um, about A/B testing. Um, in e-commerce, [00:32:30] it's much more common to do A/B testing and look to see whether that's improving [00:32:35] conversion rates, which is the same as reducing error rates, really.
[00:32:37] Yeah.
[00:32:38] Um, in [00:32:40] service design and other things, not so much.
[00:32:43] Yeah. You know, broadly speaking, [00:32:45] from my workshop experiences last year, I'd say that probably [00:32:50] only a third of people had any idea what their error rates might be. [00:32:55]
[00:32:55] Yeah.
[00:32:55] Yeah. Now, I'm not ... I don't want to cast any shade on that [00:33:00] because I say I haven't thought much about error rates for about 20 years, so why would I [00:33:05] expect anybody else to really?
[00:33:07] Yeah.
[00:33:07] Um, but I'm hoping that kind of getting that out [00:33:10] in the open a bit will encourage us all to do a bit more measuring and a bit more thinking about it.
But to [00:33:15] measure it... look, what are the instruments that we need to use to find the [00:33:20] errors more effectively? I guess that's a better way of looking at the question.
Is it [00:33:25] through asynchronous, Google Analytics-type [00:33:30] services, or are you seeing more of an increase in a manual, [00:33:35] usability and design research kind of approach to determining the [00:33:40] metric? How do you see it being used?
[00:33:42] Well, those things, [00:33:45] yes, and, you know, like- Yeah. ... one of the ways of finding out where, for example, people might be [00:33:50] forced into wrong answers is your classic usability test, where you watch people use it.
[00:33:54] [00:33:55] Sure. Yeah. Um, your analytics: where are you getting dropout rates, and [00:34:00] trying to figure out why. Like, the analytics might tell you where people are dropping out, but they won't tell you why [00:34:05] they're dropping out. Yeah. You've got to do that in another way. Um, there's [00:34:10] your classic observational work, as I mentioned, of going and listening in, or [00:34:15] watching how many calls come into your call center and what they're about.
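The dropout analysis described here can be sketched in a few lines. The step names and counts below are invented for illustration; as noted, analytics like these only locate the dropout, they don't explain it:

```python
# Invented funnel for a multi-step form: how many people reached each step.
# Comparing adjacent steps shows WHERE people drop out; finding out WHY
# needs usability research or observation.
funnel = [("start", 1000), ("details", 820), ("payment", 610), ("done", 590)]

for (step, n), (_, reached_next) in zip(funnel, funnel[1:]):
    lost = n - reached_next
    print(f"{step:>8}: {lost} of {n} dropped out ({lost / n:.0%})")
```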
[00:34:19] Yeah. Um, [00:34:20] remembering that call center staff are rewarded on answering the calls, not [00:34:25] on analyzing why people are calling. You've got to do that for them. And I, I [00:34:30] found that attempts to get call center people to record the purpose of the [00:34:35] call are generally not very successful, that they're, [00:34:40] they don't really have time to be that analytical about it.
Mm-hmm. Um, it's much better to actually [00:34:45] go and do some observation and find out yourself. But just knowing what the volumes are [00:34:50] is important. Like, how many [00:34:55] people are using your thing, um, compared to the number of people who ought to be using [00:35:00] it, you know? Yeah. What's your take-up rate?
How many people are using paper [00:35:05] compared to digital, for example, and why? That can be very instructive. [00:35:10] Like, are people being forced back into paper even though they don't want to be?
[00:35:13] Yeah.
[00:35:14] Um, [00:35:15] there's, uh, yeah, so a- actually [00:35:20] looking at, um, units that you've sold, you know-
[00:35:24] Yeah,
[00:35:24] absolutely. ... [00:35:25] in e-commerce or how many of this thing have you sent out?
[00:35:28] What, what have you [00:35:30] delivered? Um, so there's kind of higher level things that you can do. [00:35:35]
[00:35:35] Sure.
[00:35:35] Um-
[00:35:37] I've got a question for you that I wanted to [00:35:40] ask in Edinburgh, okay, when we were hanging out, like, you know, going for dinner and [00:35:45] going for drinkies, um. And that was: the title of your [00:35:50] workshop was "garbage in, garbage out", okay, right?
[00:35:53] Now, I sometimes use a different way of [00:35:55] saying that, but generally speaking, when the [00:36:00] listeners are looking at that and saying, "Garbage in, garbage out," how do they know if they've got [00:36:05] garbage in the first place?
Well, [00:36:10] I guess that-
In their data is what I'm talking about.
In the, in the data. So, you know- [00:36:15]
[00:36:15] And the re- sorry, just to cut across you, the reason why I think that's so important as a question [00:36:20] is because most designers are like, "We know we've got crap data, we know what's in there at [00:36:25] the moment."
But the other side of the organization may say, "It's fine. Let's [00:36:30] rush towards jumping into bed with AI, jumping into bed with all of these things," when, well, we're [00:36:35] not set up for it[00:36:40] yet.
[00:36:40] I think it's been very interesting that there are things [00:36:45] like the Government Data Quality Framework, for example, in the UK. Yeah. [00:36:50] Um, other government initiatives too: the Centre for Digital [00:36:55] Public Services, I'm sorry, I may have the name slightly wrong, but the Welsh equivalent of our [00:37:00] UK, um, GOV.UK, has got a very interesting thing about getting ready for AI, [00:37:05] which says, you know, you've got to make sure your data is accurate.
[00:37:09] Yeah. But I'm not [00:37:10] seeing that much advice about how you do that. I think, looking at the data... [00:37:15] for me, when I think about whether my data is [00:37:20] accurate, I think about it very much in terms of getting my dataset, which I would normally be [00:37:25] doing in survey design. Like, in survey design, you have your responses, there's your dataset, [00:37:30] and you can start cleaning it.
[00:37:31] Yeah.
[00:37:32] Um, or anyone genuinely working in statistics will [00:37:35] talk about looking at the dataset [00:37:40] and cleaning it. So you're looking for things like: have you got missing [00:37:45] entries? Have you got duplicated entries? Have you got inconsistent entries? Um, [00:37:50] to see whether the data is coherent with itself, and perhaps coherent with other [00:37:55] sources.
[00:37:55] Yeah. Try matching: anyone who's tried data matching of two different [00:38:00] datasets will know the fun and games, that they never match nearly as well as you hope. Um, [00:38:05] you've got duplicate entries, you've got missing entries, you've got all sorts [00:38:10] of stuff- Yeah. ... at the data level. Um, and there's [00:38:15] no real substitute for actually interrogating it: pulling some records, looking at [00:38:20] some customer records, looking at things like outliers. You know, how many [00:38:25] customers in your database have got a date of birth, if you record [00:38:30] that, of, um, January 1900? You know, how [00:38:35] many people are over 120 years old?
[00:38:37] Yeah.
[00:38:38] How many people are under [00:38:40] five, even though they seem to be a current customer? All those sorts of things, you [00:38:45] can start to ... It's hard work. I mean, I'll tell you, the data scientists earn their money. [00:38:50]
[00:38:50] I know.
Um, and if you're not quite sure how to go about it, [00:38:55] see if you can find a data scientist or a statistician to help you.
[00:38:59] Yeah.
[00:38:59] Or [00:39:00] just sit there and think about it. You know, does this make sense? Does this, does this line up [00:39:05] with that, what's going on here?
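The checks listed here (missing entries, duplicates, and implausible outliers such as a January 1900 date of birth) might look roughly like this in pandas. The dataset and column names are invented for illustration:

```python
import pandas as pd

# A tiny invented customer table with the classic problems baked in:
# a duplicated record, a missing name, a placeholder 1900 date of birth,
# and a "customer" apparently under five years old.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "name": ["O'Connor", "Smith", "Smith", None, "Jones"],
    "date_of_birth": ["1985-06-01", "1900-01-01", "1900-01-01",
                      "1999-12-31", "2023-05-05"],
})
df["date_of_birth"] = pd.to_datetime(df["date_of_birth"])

print(df.isna().sum())                                      # missing entries
print(df[df.duplicated(subset="customer_id", keep=False)])  # duplicate IDs

# Outliers: roughly computed ages that are implausibly high or low.
today = pd.Timestamp("2025-01-01")
age = (today - df["date_of_birth"]).dt.days // 365
print(df[(age > 120) | (age < 5)])
```

On a real dataset the same three checks scale up unchanged; the hard work is deciding what to do with each anomaly once you've found it.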
[00:39:07] Yeah, okay. So, for a [00:39:10] lot of designers out there, and I know I'm coaching a few at the moment who are having those [00:39:15] conversations, they're saying, "Well, we just don't have accurate data within our [00:39:20] reach, and all of a sudden one part of the organization is, you know, [00:39:25] using agentic AI, and, man, if they start to implement that over here, we're on a [00:39:30] fast road to failure," like, you know?
[00:39:33] And it's a challenge for a [00:39:35] lot of people at the moment, trying to balance those conversations where there are different cadences happening [00:39:40] within the organization. So with the garbage in, garbage out kind of stuff, [00:39:45] uh, that conversation is ongoing, and, you know, I'm really [00:39:50] happy to hear you give some details on what we can do about it.
One of the [00:39:55] tips for those of us... I mean, although my [00:40:00] degree is in mathematics, so you'd think I could understand numbers, I actually have mild dyscalculia; [00:40:05] I'm not good with actual numbers, I like patterns. Yeah. And I [00:40:10] think that's pretty common in designers: we're more comfortable with visualizations than with [00:40:15] the actual numbers.
So a real tip from that is- Yeah. ... [00:40:20] try getting hold of some data and drawing some graphs and charts, and see if you can spot [00:40:25] anomalies in that. Ah, that's nice. Yeah. 'Cause I'm more comfortable if I [00:40:30] graph it... I mean, I have an example from where I was working with a survey [00:40:35] where, um, I looked at people answering how many children they had at [00:40:40] different ages, and it should be fairly consistent: if you group the [00:40:45] children by age, you should have roughly similar numbers.
[00:40:49] And then I looked [00:40:50] at it and there was a big spike at over 18. And I realized that we were asking people if [00:40:55] they had children over 18, right?
[00:40:58] Ah.
[00:40:58] Which is, um, [00:41:00] many of us, our parents still consider us to be their children- Yeah. ... even though we're grown up, [00:41:05] you know, so that was capturing everyone who'd ever had a child.
[00:41:09] Yeah.
[00:41:09] [00:41:10] ... assuming, rather than 18 to 21 or whatever it was. So I had a big spike, and that made me [00:41:15] realize I had a problem: it was the wrong question. Whereas, for many of us who are much more [00:41:20] visual, we can feel much more comfortable with a chart or a graph or something- [00:41:25]
[00:41:25] Yeah. ...
[00:41:25] that is a more visual thing of representing our data.
[00:41:28] Yeah.
And then even [00:41:30] asking ourselves the question of how would I represent this [00:41:35] data in a chart- Yeah. ... helps us to get to grips with it.
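The point about drawing the data works even with a crude text chart. The counts below are invented, with the same kind of spike described in the survey example with the open-ended "over 18" band:

```python
# Invented counts of survey respondents' children per age band. The bands
# should be roughly level; the open-ended "over 18" band spikes because it
# captured everyone who had ever had a child, not just a four-year cohort.
counts = {"0-4": 120, "5-9": 115, "10-13": 118, "14-17": 110, "over 18": 410}

for band, n in counts.items():
    print(f"{band:>8} | {'#' * (n // 10)} ({n})")
```

The anomaly that might hide in a table of numbers is immediately visible as the longest bar.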
That's a [00:41:40] really, really good way of finding out if your data is actually a bit [00:41:45] off. Um, and I love that. It always comes back to visualization for me, being able to [00:41:50] visualize- Yeah, me too.
[00:41:51] ... what you've got. I'm such a visual person, which is why [00:41:55] a lot of the stuff that you're speaking about here really makes sense, uh, and I love it. [00:42:00] Caroline, um, you're so interesting, and I mean that in a [00:42:05] positive way. And I mean, I had that as well when we first connected over dinner, um, there was a bunch of us, [00:42:10] and I was like, "You had me thinking."
[00:42:12] And that's usually the sign of somebody who's got a lot, [00:42:15] lot of wisdom in there. Your books, you, you've got two books, isn't that right? [00:42:20]
[00:42:20] I've got, well, I've got three- Or
[00:42:21] three, is it?
I mean, uh, you probably... I know, again, I'll [00:42:25] go the other way. We've talked about Forms that Work, which I co-authored with Gerry Gaffney.
[00:42:29] Yeah.
[00:42:29] We've
[00:42:29] [00:42:30] talked a bit about surveys that work, which was just me-
[00:42:32] Sure. ...
um, published [00:42:35] by, uh, Lou Rosenfeld, Rosenfeld Media.
[00:42:38] Yeah.
And then the other one is called [00:42:40] User Interface Design and Evaluation, which was a textbook- Yeah. ... that came out of an [00:42:45] Open University course, um, so it's a little old.
[00:42:49] Yeah. Um, [00:42:50] you can still get it. You can still get it secondhand, frankly. Um-
Three [00:42:55] grand a book or something like that though, Caroline, let's be honest.
It's readily available. It's a textbook.
[00:42:59] Once they go [00:43:00] out of publications, it gets, it gets pretty kind of gnarly to try and find the book at a reasonable [00:43:05] price.
Yeah, we, we-
I don't get any money from that book. Any royalties go to the Open [00:43:10] University, so buy it where you like... In fact, buy all of them where you like. I don't mind. Read it. I don't care. [00:43:15] Yeah. Um, obviously I'd much rather, if you buy the survey book, you buy it from the publisher, because [00:43:20] my publisher's lovely and they get a bit more money that way.
Yeah. Um, Forms that Work is published [00:43:25] by Elsevier, which is sort of the evil empire, so buy that wherever you like. Okay. [00:43:30]
[00:43:31] But you're, you're, you're waving a lightsaber here and you're just taking everybody out [00:43:35] in the, in the final parts of this podcast.
I guess the User Interface Design and [00:43:40] Evaluation book, we really wrote that in the late 1990s, and then we edited it in the [00:43:45] 2000s.
[00:43:45] So some of the stuff is kind of [00:43:50] old-fashioned, you know; there's a lot about requirements, which we don't really do anymore, we now work in an agile way. [00:43:55] Yeah. But there are some chapters at the back about how to persuade people to do this [00:44:00] stuff, which actually I'm quite proud of, because I wrote them. And I think they still have [00:44:05] value; we still have a situation where, uh, many of us are confronted with, well, how do [00:44:10] I persuade people to do this?
[00:44:11] Yeah.
To look at user experience. And [00:44:15] so, that kind of brings together the thread of the garbage in, garbage out idea, which is to say: [00:44:20] well, how do we persuade people to look at the data quality they [00:44:25] have? Well, if I say to you, "I think you should look at your data quality," it's like, uh, [00:44:30] boring. Yeah. If I say, "Oh, you're thinking of doing some AI? Maybe [00:44:35] looking at your data quality will give you better results from that."
Absolutely. Ah, yeah, okay. You know, [00:44:40] I can see the point. So we can sort of hopefully use AI to [00:44:45] crack that open. I
[00:44:45] know, absolutely.
[00:44:46] And so that kind of persuasive thing, how do we persuade people to be [00:44:50] interested? I think there's still some value in, in that from the book.
[00:44:54] 100%. [00:44:55] Just going back to your books, just going back to your books.
There was one last book that we didn't [00:45:00] mention: Surveys that Work, on Rosenfeld.
[00:45:02] Yeah.
And there's no lightsaber treatment [00:45:05] for that, that organization, um-
Oh, no, no, Louis definitely-
I'm encouraging you to go and buy from [00:45:10] Rosenfeld. Look, I'm doing your job here for you. Go and buy Surveys that [00:45:15] Work from Rosenfeld.
[00:45:16] They produce brilliant books, have done, and they continue [00:45:20] to do so. And I'll put a link to that book into the show notes as well, [00:45:25] Caroline.
[00:45:25] Yeah. But, but also, you know, I try and publish a lot of stuff on my [00:45:30] website.
Yeah, I'm just looking at it. Amazing.
You're welcome to, you know... don't feel you have to buy [00:45:35] a book; I just hope that- To learn.
... you find something useful on the website. And if you don't, [00:45:40] then write to me, because a really important thing for me at the [00:45:45] moment is refreshing and bringing my website up to date, so-
[00:45:48] I love
it. Please feel [00:45:50] welcome: write to me, link up with me on LinkedIn, or I do Bluesky; ask me [00:45:55] questions, and that will, I hope, provoke me into actually writing some of the [00:46:00] 150 blog posts that are in my queue that I haven't quite managed to write yet.
[00:46:04] Yeah, no, you're [00:46:05] brilliant, Caroline. I'll put a link to your LinkedIn in there for people to connect with you. Um, [00:46:10] and I'll put a link to your website as well, because, as you said, it's a great repository of [00:46:15] everything that you've done in your career so far. It's been an absolute pleasure speaking with you.
Oh, thank you. [00:46:20]
[00:46:20] Not just today, but also, like, when we got to hang out quite a bit in Edinburgh. I really, really [00:46:25] enjoyed it. And I messaged Gerry to say what a joy it was to get to hang out and learn [00:46:30] all about the way you think, the way you've operated in your career, and the large body of [00:46:35] work.
You've really helped the design practice grow over the [00:46:40] last 20, 30 years in particular. So thank you, on behalf of everyone else, for all that [00:46:45] work. Um, thanks so much again, folks, for listening. Uh, I hope you [00:46:50] enjoyed Caroline. Again, check out her website, buy the books, and hopefully [00:46:55] Caroline and I will get to speak to each other soon.
[00:46:57] I'd love that. Thank you so much. I really appreciate the [00:47:00] [00:47:05] opportunity.