Smoke Testing, Knowledge Work and Testing in Production · Episode 1 · 36:57

Vernon (00:00)
Three, two, one. Greetings and welcome to the Vernon Richard show. This is episode one or maybe episode zero of our new pod. And please note that was Vernon Richard without the S because I'm one of the hosts. And we also have a co-host whose name is

Richard (00:22)
Richard, hello everyone. My name's Richard Bradshaw. I am one of the co-hosts of the Vernon Richard Show. Oh, even I got it wrong. The Vernon Richard Show. We'll get there eventually, right? Ha ha ha. Um.

Vernon (00:33)
Yeah. So what we're talking about is software testing, software quality and all related topics. And it's going to be just a conversational style. Things that we've noticed in between episodes or things that are currently on our mind. And we're just going to get together. How often are we doing this? Every two weeks? And we're just going to

Richard (00:53)
To start with, every couple of weeks. Yeah, so those are the topics, but also just generally life in tech. The stuff that goes with that, obviously we have our expertise in that field, but everything else around life in tech. And I imagine football is gonna come up a fair bit as well, given our ongoing football rivalry. But yeah, we're gonna keep this very conversational format and hopefully we're gonna have a few kind of regular episodes in a slightly different format, maybe a bit more AMA style.

Vernon (00:59)
Yeah.

Richard (01:23)
take some questions from the audience throughout, over the weeks and maybe we'll ask them live on the show instead of me and Vernon discussing whatever topics are coming up in our lives or careers. Yeah, so Vernon, obviously I know you, but for anyone listening who doesn't know us yet, tell us a bit about yourself.

Vernon (01:23)
Mm.

Okay, so my name is Vernon. I am based in Leicester in the UK, which is one of those annoying places that if you see it written down, it's not pronounced how it's spelled at all. And I've been in the software and testing game for 20 years. And I know what you're thinking, 20 years? But you're only 22 years old. I'm a little bit older than I look. So I've been in this for 20 years. I started out testing video games, which was very cool. And then from there, I just went and did websites and vending machines.

Richard (02:01)
Oof.

Vernon (02:14)
At one point I was testing Formula One cars, that was very cool. Uh, and then over the years I've moved into speaking and delivering training workshops and hosting events in person and online and all kinds of stuff. So yeah, that's me. Richie Rich. What about you?

Richard (02:32)
Awesome. Yeah. So I think I think I'm 15 years now. So that makes me what? 18. Yeah. I think I'm 15 years into my, my testing software quality journey. Started off in, in banking as part of my placement degree. And then, yeah, I went through a lot of mobile space, got into talks, workshops, just general being active around the people in the industry. And

Vernon (02:39)
What's up, old man?

Richard (03:02)
At the moment, I am a senior architect within quality engineering. That's the, yeah. Basically, job titles are job titles. We should have a whole episode about job titles, but I, go on, yeah.

Vernon (03:16)
We should. So, mine right now: I have a day job with a health tech company based in Germany with a lot of incredible people. And my job title, clear my throat for this one: senior expert quality engineer. That's right, ladies and gentlemen. Not just an expert quality engineer. I'm not just a senior quality engineer. I'm a senior expert quality engineer. Thank you very much. All bow down before me when you see me.

Richard (03:36)
What?

Vernon (03:47)
But I also like to do a bit of quality coaching in that job and, you know, outside as well.

Richard (03:47)
Buh.

That is a great job title. I didn't even know that. Awesome.

Vernon (03:58)
Oh mate, shout out to Dan Ashby for that one. I think this is all his doing.

Richard (04:03)
I'm curious what other job titles exist in the company now, but anyway, let's get into our first topic. So we decided to call this topic smoke test, or the smoke test, or smoke testing. We're going to see where it goes, but we haven't done this before. Not independently anyway. I've never added podcasts to platforms. I've never edited a podcast.

Vernon (04:08)
Thank you.

Richard (04:28)
So I've only really been a guest on podcasts. So yeah, this is all new. So we're calling it the smoke test cause we don't know if this is going to work. There's lots of stuff to set up, lots of stuff to even see whether it'll work. Like, just me and Vernon might just talk about football for a whole 30 minutes and no one's going to listen to that.

Vernon (04:44)
I love.

I love how you said that. Oh yeah, we've never done this before. Translation: neither of us knows what we're doing.

Richard (04:55)
So that's why we called it the smoke test, but let's dive into smoke testing, Vernon. What is it? Or what is it in your world?

Vernon (05:06)
To me, whenever I've used this term...

I haven't used it for a long time at work, it feels like, actually. But whenever I've used it in the past, it's been those really quick tests that just tell you, is this even alive or valid or, like, breathing? It's very quick. It's not deep, they're very shallow and fast. You know, it's literally turn it on and did the light come on, okay cool, then I'm off. That's it.

And you might do that for a bunch of different reasons. So you might do it when you get a new build, depending on how you work. That's where I first ran into it back in the day in my waterfall projects. We'd get a new build, we'd deploy it to our environment. Or we'd get a new CD with the version of the game on it. And we'd put it in the test machine. And if it worked.

Richard (06:00)
Yes.

Vernon (06:05)
you know, turned on, did it boot to start the game? That was our smoke test. And so then that was like, okay, yeah. Then we'd let all the other team know, okay, this is working, we can actually use it. So it was like a signal to carry on. Or another common situation I've seen is, there have been situations and times where testing in production has been either difficult or frowned upon or both. So

you would deploy your... you know, you'd go through the different testing in all the different environments and you'd get to production. And because you couldn't go hammer and tongs in production, you would do a smoke test just to make sure: did we just deploy what we thought we deployed, and is it on fire or not? And once we'd confirmed those two things, we stopped. That was it, success.

Richard (07:01)
I think that's interesting because I've been having similar thoughts recently with some work. And it's like smoke tests for me is that, you know, is it alive? Like I'm configured to talk to third party A. Can we talk? We might be talking utter gibberish, right? But there's a connection there. It's alive. It might not be working, but that's not what I'm looking for. I'm looking for are you up and running and can I talk to you?

Vernon (07:21)
I'm gonna go.

Richard (07:31)
And again, you do all that for the various third-party pieces, or not even just third parties, but also the moving parts of your own system. Like, I'm currently learning a lot about AWS. It's all new to me, and the amount of configuration that you can do, and how easy it is to break your app or misconfigure it... so there's all these internal, like, smoke tests or health checks going on.

So that's kind of how I've come to use it. Like you said, it's that health check. I think it does come from the original term in electronics, right? Like, you turn the thing on and there's no smoke. But I don't have any evidence for that memory.
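A minimal sketch of the kind of "is it even alive?" smoke test being described here, in Python; the service names and health-check URLs are made-up placeholders, not anything from the episode:

import sys
import requests

# Hypothetical endpoints; swap in whatever your own system and third parties expose.
HEALTH_ENDPOINTS = {
    "web app": "https://example.com/health",
    "payments API": "https://payments.example.com/health",
    "third party A": "https://partner-a.example.com/ping",
}

def smoke_test(timeout_seconds: float = 5.0) -> bool:
    """Return True if every dependency answers at all. Shallow and fast by design."""
    all_alive = True
    for name, url in HEALTH_ENDPOINTS.items():
        try:
            response = requests.get(url, timeout=timeout_seconds)
            alive = response.status_code < 500  # any answer counts as "alive", not "correct"
        except requests.RequestException:
            alive = False
        print(f"{name}: {'alive' if alive else 'NOT responding'}")
        all_alive = all_alive and alive
    return all_alive

if __name__ == "__main__":
    # Non-zero exit lets a deployment pipeline stop the release when something isn't breathing.
    sys.exit(0 if smoke_test() else 1)

The point, as in the conversation above, is that it only asks whether each piece answers at all; anything deeper belongs to other kinds of testing.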

Vernon (08:10)
Yeah.

Yeah, you know what, I'd heard that. I'd heard that too. I haven't actually researched that to know if that's a real thing or not, but it sounds plausible. Makes a lot of sense. And I'm so glad

Richard (08:21)
If you're listening and you know, tell us.

Vernon (08:25)
I'm so glad you used the term health check because I think that's what they are now, are health checks. You know, it's all mixed up with monitoring and things like that these days, or at least that's what it feels like.

Richard (08:39)
And I think that's interesting. Like, we did a release today actually, and there was monitoring in place to tell us that things were working. And we could see payments coming in from someone else. So we could have tested a payment flow, but we could see in real time that someone had just paid. So obviously it might not all be working, right? But it's alive, and someone was able to pay. But yeah, I think that's really interesting, but like I've also

Vernon (09:01)
Yeah.

Richard (09:08)
I think we're kind of in agreement on how we view what a smoke test is, but I've also seen it used slightly differently. I've seen it kind of morph into the world of regression, or like a set of end-to-end checks, or kind of validation tests. I've seen that kind of space as well. It wants to go a little bit further: instead of just checking that the things can talk, it's checking that they can talk in this planned way. So

Vernon (09:18)
Oh.

Mmm.

Richard (09:36)
Yes, they can talk, but I want to know that they're talking correctly for this specific scenario. I don't know if you've come across that.

Vernon (09:44)
The example that always springs to mind, and you just reminded me of it again, is I worked at a media company some years ago, quite a big one, at least if you're in the UK. And whenever we did a release, when I was there on this team, there's probably, depending on how you count it, there's basically one thing you wanted to check, and you checked it in maybe two or three different ways. And that is, can we still sell?

the TV packages that sell the most. And so long as we can do that, we are golden. But I don't think we called it a smoke test actually, it was something else. We still needed to be able to sell, and not just sell anything, it was our most popular packages we had to be able to sell. And so long as we could do that, everything was deemed to be, well, we're fine.

We haven't just doomed the company to hideous pain.

Richard (10:45)
Have you ever had to do the old hack of creating a 99% discount code so you can actually buy it in production using a real card instead of paying full price? I did that at Tesco, I think.

Vernon (10:54)
Haha! That... Yeah, that... 100% did that and also...

getting the CTO's credit card or somebody's credit card, because they've got access to whatever levers there are to get that money back or whatever. So it's like, give us your credit card.

Richard (11:08)
Yeah, yeah, done that.

We used to have it where you'd use our card. I don't know whose it was, it was probably one of the officers'. But then someone would eventually remember it and be like, did you cancel it? It's not going to, like, subscribe again or resubscribe? Make sure you cancel it. And then you also had to go and dig out the real receipts, because they treated it as a genuine, you know, expense. So then you had to go into the system and get the receipts. Yeah. So yeah.

Vernon (11:35)
Yeah. Yep. Done that. It's nightmare fuel when you think...

Richard (11:43)
Yeah, absolutely. I think, well, you know, you touched on testing in production. I think it's a similar thing. In a lot of systems where you want to do that kind of testing, you can get all the way to the basket. But then it's like, well, I don't want to buy it because I don't want to put an order in. I don't want to start shipping things to some made-up test address. I don't want to start doing any of that. So you are limited with these production kind of, you know,

smoke tests, full end-to-end,

whatever we want to call them, like validation tests. Yeah, you are limited with what you can do, but yeah, smoke testing, I think like you said, is that health check: keep the lights on, are the lights on, even. And then, yeah, I think all these things are valid to do in production. I want to make that clear. I don't think you shouldn't be doing an end-to-end test or going through a purchase, you know, if that suits the context.

Vernon (12:24)
Mm-hmm.

Richard (12:40)
I'm not saying you should always be doing that. In some systems, there's a lot of things I can think of that could go wrong, like metrics and analytics, and you go, we keep selling this product that no one buys. We sell one every Friday at 4 PM. Who keeps buying it on Friday at 4 PM?

Vernon (12:49)
The main...

The main thing in my context is that testing in production screws up a lot of metrics for people. Yeah, yeah. So it's a thing we're trying to figure out how to get better at, that's for sure.

Richard (13:04)
Does it.

Nice. So our smoke test in this instance is number one, can we record it? Can we edit it? And can we somehow put it into some tool that distributes it to where hopefully you are listening to it now? And can you watch it on YouTube? So yeah, that's the plan with this. We're going to put the video on YouTube and we will try and distribute the audio podcast kind of version of it across.

Vernon (13:21)
Yeah.

Yeah.

Richard (13:40)
all major platforms as I hear on some of these more professional pods or wherever you get your pods from.

Vernon (13:45)
Yeah, yeah, this is where we drop the cliches now, right? It's like make sure you smash that like button, comment, subscribe.

Richard (13:50)
Yeah.

Follow and subscribe, where's the link? I don't know where the link is because I don't know where it goes.

Vernon (13:58)
Yeah, wherever you're listening to this, please leave a five star review.

Richard (14:03)
I keep listening to The Diary of a CEO, very good podcast with Steven. And his latest thing is: the most important thing you can do for this podcast is to follow us on whatever platform you are listening on. It's not about sharing or anything else. All I need you to do is just to follow us on the platform. It will do more good than I can put into words. So yeah, I'm going to copy him. Follow us, like it, subscribe, and do all those fun things that you do on all these platforms.

Vernon (14:06)
Oh yeah.

Yeah.

Richard (14:28)
Right. Next topic then.

Vernon (14:32)
This is super sketchy, but the thing I wanted to talk about was knowledge work. So why was I thinking about knowledge work? Well, over the past, I don't know, man, it feels like since the beginning of the year, maybe even December, there've been a ton of... well, LinkedIn is the new Twitter anyway, so that's where a lot of the cool conversations are happening. And so.

Richard (14:56)
It's true, yeah.

Vernon (14:58)
But there are a lot of conversations that were actually arguments and rows and, you know, people sharing opinions and then getting smashed and blasted and then people reacting to that and, you know, going around and around in circles like that.

from developers saying, you know, flamey things about testers and testing, to testers using, I'm using air quotes if you're listening, the wrong language for certain individuals in certain communities and getting inflamed and smashed for that. And it always makes me think of, I don't know why, but it always makes me think of this.

The thing that I first heard Alan Page say like years ago, and it really, really stuck with me. And it was something, I think it was as simple as, look man, it's all just knowledge work, which was his way of saying.

Testing is not special in that regard because it's another variation or manifestation of knowledge work. So, over the years I've always thought about what is knowledge work? And then I read this really cool book by, I'm gonna get his name wrong.

Richard (16:11)
Pressure Tantan.

Vernon (16:12)
See, I can't remember the name of the author, but I can remember the name of the book. The book is called Same As Ever, and I'll just Google the name of the author in a bit. And the last chapter of that book is absolutely brilliant. Morgan Housel, yeah, that's it. Absolute legend. He was on Diary of a CEO, yeah, actually. Definitely go watch that episode. Yeah, and the last.

Richard (16:20)
I'll do it for you while you carry on.

Morgan Housel.

Okay.

Vernon (16:38)
I was reading the last chapter of that book in and around the time that these arguments were happening. And he talks in there about why these things happen. And, you know, it was around, well, if you're an expert in a thing, then people can say things to you and

they've completely missed the nuance of the topic and they've just tried to skirt over a million and one things, and you're like, how dare you minimise my whole sphere of study into one pithy line, you devils. And I think that rang true. So that's why it was on my mind. So I went and looked at what knowledge work was and things like that. What do you think knowledge work is? Have you thought about this much?

Richard (17:11)
Yeah.

So I'm thinking two things right now. I'm thinking the context of where you got this from was obviously an argument or discussions on LinkedIn, right? So is it that being an active member of a community, or being active in the industry and trying to share your knowledge, is knowledge work? Or is it that the job of a tester or QA, QE, whatever super-duper energetic quality engineer or whatever it is that you're called at your company...

Is it that that's knowledge work or is it sharing that's knowledge work? But just to clarify for me.

Vernon (18:02)
I think the definition.

of knowledge work? Well, actually, I'll start with what isn't knowledge work. So I think the work that isn't knowledge work, in the before times, is basically manual labor and physical labor. And then knowledge work is not that. So it's not work where you're always doing stuff with your hands, you're not grinding the gears in the machine or tilling the land or anything, it's more around

Richard (18:24)
Yeah.

Vernon (18:33)
concepts and information and manipulating that information and processing it in some way, shape or form. So.

Richard (18:43)
Cause I...

Vernon (18:43)
It's more cognitive in nature and it's more based on expertise, I think. Let me see if I can quickly find the definitions.

Richard (18:51)
Cause yeah, cause when you mentioned it to me, I had a quick Google, and it was kind of, your capital is knowledge. So if you're building something, if you're, let's stay in engineering, right, you're a developer, your capital is probably code or functions or software. That's what you deliver, right? That's what you build. Obviously you contribute other stuff too, but predominantly your capital, your work, is software. And

Vernon (18:56)
Bye.

Hmm.

Mm-hmm.

Richard (19:22)
When you first mentioned this to me, what I started thinking about, and something I ran into recently at the new job, was that I struggled to get going originally because I was like, I don't have any knowledge. Like, you know, we're getting tickets and we're discussing things and I don't know the system yet. I'm hearing the words you're saying, and I'm trying to make a model in my head, but it ain't forming because it's full of holes. It's full of gaps.

And I don't have this knowledge, but previously I've kind of always sold myself on, or treated, that knowledge as kind of my number one asset: that I have this deep knowledge of the system. And I think, interestingly as well, the way that changed for me was I kind of went, I had deep knowledge of the system in terms of users, functionality, the pages.

Vernon (20:00)
Okay.

Richard (20:18)
And then when I moved into the automation space, I started deepening that knowledge into what's the technology, what's the protocol, how does that talk, how does this happen. So I've kind of viewed myself as having all these great skills, all these techniques and heuristics and coding skills and all this stuff. But my end goal, what I want, is information, knowledge, things that I can communicate back to the team. Like, again, an engineer would go, hey, here's some working code. And I'm like,

Vernon (20:27)
Mm-hmm.

Mm-hmm.

Richard (20:48)
here's some knowledge, you know. Like, I thought that's what I'm trading in. I'm saying my knowledge is how the system behaves and whether it works. So that's what made me think of knowledge work. And I think I mentioned to you off air, that's how I've used it in the past. Like, I traded knowledge, and I gave a talk last year about gap analysis and how a lot of my work is: there's suddenly a gap in my knowledge and I need to fill it. A new feature?

Vernon (20:50)
Mm-hmm.

Mm.

Richard (21:16)
I don't know anything about that. I need to learn about it. There's a bug in production. You need to learn about that. There's a random error that keeps happening. I want to learn about that. So like most of my day is like basically my mental model of the system is my knowledge and I spot these gaps and I've got to go and plug them basically.

Vernon (21:32)
Hmm. Yeah, to me, that is exactly it. And I love that example, because that is what testing is about. It's about trading the knowledge. But what I think the issue is... so I've dug through my notes here, I've got some stuff written down. I was using, you know, ChatGPT, searching, trying to scan a few articles and things.

And there are some characteristics of knowledge work in what you just talked about. So cognitive effort or complexity is one, specialized knowledge or expertise and skills, innovation and creativity, information processing, and non-routine work, so like problem solving and continuous learning. And so I think at least part of why

software testers don't always get the credit, and why it's sometimes easy to, you know, disparage them or not take them seriously, is because the parts that are visible look routine, which a lot of other knowledge workers have too. So, you know, some of the folks will know that my wife is a solicitor, and solicitors are absolute classics for this:

people go and talk to them for hours, and then they won't hear from them for days and days, and then they'll get given a letter and they're like, what? That letter has got like two paragraphs in it. Why have you charged me $10 million for this thing? But what they don't realize is all the training and the expertise and the knowledge and the creativity that has to go into, well, how do I create those paragraphs? What needs to go in there? What do I need to leave out?

What are the key points to cover, et cetera, et cetera. And I think it's the same with testing. Certainly traditionally, when I started back in the day, early 2000s, it was very much around test cases, at least when I left video games; video games was weirdly a little bit forward thinking to some degree. Like if you don't have any test cases, nothing happened. That was basically how it worked. So if you don't have any test cases,

Richard (23:51)
Oh wow. Yeah.

Vernon (24:01)
and your work is not visible, then people just think, well, you're not doing anything. Or they might think, whenever I see Vernon working, he's just smashing buttons on a keyboard. That's all he's doing. And it's interesting that you talked about developers and your experience

being an automation engineer, because I think they suffer less from that, because there's almost this tacit understanding.

or this assumption that, oh yeah, well, that's, you know, that's difficult to do. Yeah, like I know that the work isn't bashing keys on the keyboard. I know they've gone and thought about something because their work product

Richard (24:32)
That's hard. Yeah, yeah.

Vernon (24:42)
is kind of intangible. I mean, it isn't because you're making a product, but it's kind of intangible. I know that they do stuff on the keyboard and then some magic happened and now I've got something on the screen and I don't really understand that. So, yeah. And I think...

Richard (24:56)
Yeah, but...

Vernon (24:57)
it's trickier for us to make that connection with people.

Richard (25:01)
I think it is, but I think we don't help ourselves. I can think of a few examples now, but with the programming, with the developers one, one thing I've always come across, and it's a quote I heard at a conference that I can never find, I always try and find it, I think it's just someone's opinion, but it's the distinction between programming and coding. They made the argument, and I might get them the wrong way around, but I think it's this way around: programming is thinking of the models of the system, the architecture,

the designing, the design patterns, what should be a config, what shouldn't, right? All the hard work that in some companies, and I'll relate this back to testing in a second, is done by somebody else. In some companies you are given: this is how you're going to do it, here's the architecture. I remember the architect role in my very early days was the person who built all the models, built all the DTOs, built all the stack. And then all you were doing was using bits of it,

Vernon (25:45)
Mm-hmm.

Yeah.

Mm-hmm.

Richard (26:01)
right, to do what you wanted. And then they described coding as simply being the bashing of the keys. And I think we had a similar distinction in the testing space a long time ago. Again, it might be my own experience, but in my early career, in my early role, sorry, it was the test leads that would write all the test cases.

Vernon (26:07)
Mm-hmm.

Okay.

Richard (26:25)
And I would just get given the test case, and then I would have to run said test case, obviously using just the explicit knowledge that had been codified into the test case, not all that tacit gold that was sat in the test lead's head as they wrote that test case. I'd get nothing. I'd get none of the stuff that was left out. Why have you decided this flow and not that flow? Why is this not here? And why is that not here?

Vernon (26:28)
Hmm.

Okay.

Mm.

Richard (26:53)
So I wasn't getting any of that information. I was just getting given a test case. So that led to me then obviously doing some of this work of, you know, trying to understand the test case and running it. But back to what you said earlier as well, my output of that was a pile of passed or failed test cases and maybe a few bug reports, right? My own thinking wasn't captured. My own models, my little scribbles on a piece of paper.

Vernon (27:06)
Mm-hmm.

Yeah.

Richard (27:21)
My testing story, and I always reference Huib Schoots whenever I say testing story, that wasn't being told. My perceived work was a pile of passed or failed test cases. So all they believed I did was read some words, follow the instructions and tick a box. And I didn't know how to argue back about that at the time. So yeah.

Vernon (27:48)
Mm-hmm. I couldn't agree more. Like, the intellectual effort is difficult to perceive. So all of your thinking

transformed into a tick or a cross, basically. And it's like, oh well, if that's all there is to it. Coupled with the fact that everybody tests. And I'm not just talking about at work in a software team. I'm talking about, you know, after I've had a hard workout at the gym and I run myself a hot bath, I don't just dive straight in the bath. I'll, you know, dip my toe in or ease myself in. That's testing. You make yourself something to eat,

Richard (28:07)
Yeah.

Vernon (28:32)
taste it. It's missing some.

Richard (28:34)
I just made some Peri sauce just tonight and as soon as I'd blitzed it up, I was like, ooh, bit hot that. I always test them.

Vernon (28:42)
Yeah, yeah, exactly. I'll, you know, go out with.

Richard (28:47)
I've got a fun test for you, just while you talk about always testing. My car is currently sat at 4% outside my house, and after this I have to drive it just a couple of miles to charge it, cause I haven't got a charger yet. So testing, always testing.

Vernon (28:51)
I'm gone.

Charge anxiety is seriously kicking in for me right now. It's not even like up.

Richard (29:11)
But back onto this idea of knowledge work. From the bit of Googling I did, there's a difference between a knowledge worker and knowledge work, I know that's a very subtle difference, but those examples that we gave, I've seen the same thing in the automation space. And it's why, again, I don't think we've shifted the problem of the perception of the work that testers do or QAs do or QEs do or

Vernon (29:25)
Mm-hmm.

Richard (29:40)
automation engineers do, because the same thing's happening in automation. Now you're being judged by how many tests you automate, right? Or your perceived value is the number of tests that you've automated. But actually, if you think of building a feature for a piece of software, you've probably got a product owner, a facilitating scrum master or something like that, one of these roles you've got.

Vernon (29:54)
Mm.

Yep.

Richard (30:06)
You might have BAs in there. You might have developers, engineers, seniors, all levels, right? They're all contributing to this feature that then gets built and then gets tested. So there's like five, six people or whatever contributing. When it comes to an automated test, it's me, myself and I, right? I'm thinking of what to automate. I'm thinking of the test. I need the knowledge, I need the risk, I need the system knowledge to know what layer to do it on.

Vernon (30:26)
show.

Richard (30:35)
I need the skills to build it in the first place. Then I need the skills to test it. So I'm doing all those things myself, but the output is still viewed as, oh, you wrote an automated test. Well done, Richard. Well done. Pat on the back. And I'm like, no, I did all this. And sometimes you don't get the opportunity to talk about all that extra knowledge work, let's say. And I feel like that's a very similar problem that we've taken on.

Vernon (30:35)
and

Yeah.

Mmm.

Yeah.

Richard (31:05)
We could probably talk about it for hours, but I think there is a talk or an article or a conversation to be had that automated tests are basically test case 2.0. They've gone the same way. Like, so.

Vernon (31:13)
write it.

Oh, hell yeah. Yeah, I had a workshop that I did a few years ago that was about this, because I'd noticed that that's what I was doing, and that I had this allergic reaction to test cases, which was actually daft, because there are places where that makes sense, because I'd made that connection between

Oh, well, if I'd automated these, suddenly I think that they're valuable just because the mechanism has changed. So there must be something valuable in there. And I just need to get over myself and just figure out, okay, when is it appropriate? What are the strengths and weaknesses of using a test case? And what are the strengths and weaknesses of exploring? And when does it make sense to use one or the other in any given moment?

So I think you're spot on there. The article I'm definitely gonna write is around knowledge work. Cause I wanna explore this idea a bit more. So I wanna write some stuff about knowledge work. And if you too are so inspired on hearing us talk about this to create some stuff about it, please do. And tag us on all your favorite social media platforms and we

Richard (32:27)
I'm out.

Vernon (32:35)
will give you a signal boost, for sure.

Richard (32:37)
Absolutely. And I think, just what you said then about the automated tests, it's probably the same thing as programmers building software. Like you said, they don't get judged the same way, and it's probably the same with us. And I think you're absolutely right, there are a hundred percent places for it. If I think of kind of the four key things, you've got this instruction script, test case, whatever you want to call it. You've got an automated version of that,

which can obviously run itself. You've then got the exploratory kind of, the exploration-based type of testing, and then you've got tools that can support it. And I think it's funny what you said, though, that suddenly it's valuable because I've automated it. Yet in some of the training that I do, I talk about how a lot of its value has been lost, because if I give you a test case to run, you don't follow my test case.

Vernon (33:25)
Mm-hmm.

Mm-hmm.

Richard (33:31)
You'll read it and you'll definitely do what it says, but you're also going to do 50, 100, 200 other things that aren't written on that test case. And even more fun, back to knowledge work, you ain't going to tell me that you did those things either, and you ain't going to tell anyone else, like...

Vernon (33:37)
Exactly. Yeah.

you may not be explicitly aware of it yourself. That's the really, really interesting thing.

Richard (33:51)
Yes, absolutely. Yeah. And then when you automate it, you kind of lose that serendipity aspect of it, but you gain speed, repetition. So it's this trade-off. And I think, like you said, it's the same with a test case. Sometimes, in some domains, you're going to have to write very lengthy, explicit, anyone-can-follow test cases.

Vernon (34:00)
Yeah. Yeah, that's it.

That's it. Yeah.

Mm-hmm.

Richard (34:17)
And sometimes you might meet in the middle. I'm in a context now where we're kind of meeting in the middle. We're avoiding writing, you know... we'll say, log in as an X user. We're not defining what X user is or what the credentials are. We're telling you to buy a premium package, but we're not telling you which premium package to buy, because it's not necessarily required. So we're saving a bit of time by not going into the very explicit. But it's still a lot more than a charter.

Vernon (34:42)
Yeah.

Richard (34:46)
Right. It's still a set of instructions you can follow with a little bit of knowledge, a little bit of domain knowledge. And it's part of our transition, the journey that we're on to getting it automated. There are a lot of hurdles in place right now that mean it can't just be automated: access, knowledge, a working architecture, a working framework. So we know we're on our journey, and I'm not saying we're going to be in this position for a long time.

Vernon (34:47)
Mm-hmm.

Mm.

Mm-mm.

Richard (35:16)
We're probably only going to be in this position for a few months, but in those few months period, a test case makes sense. It fits the problem space. It helps.

Vernon (35:25)
Yep. Well said.

Richard (35:29)
Well said. All right. So there we go. That was, uh, the smoke test. So if you enjoyed listening to the pod, do all the things that podcast hosts are meant to say: subscribe, follow, like, reshare, tell your friends, tell your family. And yeah, listen, and, you know, give us any comments, any feedback, and definitely start conversations around the topics that we've been

Vernon (35:47)
Yeah.

I just wanted to say, if there's anything that you liked about the podcast, please send it to me. If anything you didn't like, send it to Rich. That's key.

Richard (36:09)
We need to get a domain set up and I've got some rules. I'm going to code some rules, get some, run it through an AI that says, is this nice or bad? Redirect it this way. Get a little model on the go.

Vernon (36:09)
I'm a

But jokes aside, welcome, because this is an experiment for us. We're trying to figure out if this is, we're just doing this kind of for ourselves, first and foremost, but we also want it to be useful for more than us too. So if you've got any feedback, comments, concerns, please let us know, we would appreciate it.

Richard (36:33)
Absolutely.

And if you are listening to this or you've just finished listening to this, you'll be pleased to know that all the smoke tests have passed. Um, so thank you all for listening and we'll see you at the next one.

Vernon (36:54)
See you, everybody.



Creators and Guests

Richard Bradshaw (Host)
A true driving force in the software testing and quality domain. I’m a tester, automator, speaker, writer, teacher, strategist, leader, and a friendly human.

Vernon Richards | Ghostwriter & Coach (Host)
I ghostwrite Educational Email Courses for Software Testing SaaS Founders | 20+ years testing & coaching in tech | Will Smith's Virtual Stunt Double
