In this episode, Dave and Peter have a special guest, Peter Budden, who has over 15 years of experience working in QA and testing. We discuss what quality means in terms of agile and DevOps teams.
This week's takeaways:
· Make sure that you're having conversations about what quality means for you and for your team.
· Figure out ways to help your teams improve their testing skills.
· Testing is a mindset: asking good questions and giving people the space to be inquisitive.
We'd love to hear your feedback! If you have questions, would like to propose a topic, or even want to join us for a conversation, contact us here: feedback@definitelymaybeagile.com
New episodes released every Thursday to challenge your thinking and inspire action.
Listen and subscribe:
Peter: Welcome to Definitely Maybe Agile, the podcast where Peter Maddison and Dave Sharrock discuss the complexities of adopting new ways of working at scale. Hello and welcome. Dave and I are here again, joined by a special guest, Peter Budden, who's going to share some wonderful insights about testing and its many different aspects today. Would you like to go ahead and introduce yourself, Peter?
Peter Budden: Yeah, sure. I've been working in QA and testing for the last 15 years, on projects large and small, at companies ranging from bare-bones startups all the way to large enterprises. A lot of that time has been spent in agile and DevOps, problem-solving projects that have gone wrong and thinking, from a QA mindset, about how we can improve the way QA works and the way testing happens, and how to bring that into an agile process. Over the last 15 years we've seen a huge change in testing; things have massively changed. On this podcast you've been talking about cross-functional teams, and I think using a testing lens to look at that, and thinking about how we deal with the problem of quality, is a really useful perspective for a lot of agile teams who are having trouble in this space.
Dave: Well, welcome to our conversation, Peter. I'm looking forward to this discussion, not least because I'll be able to throw a question out, say "Peter", and leave the two of you to figure out who answers first. One of the really interesting points here is how quality has changed. The topic of testing and quality is almost treated as an add-on. It's core to agile teams and it's core to DevOps, but it's very rarely given its day in the sun so we can really understand what it means to be at the heart of agile teams: building quality in, and focusing as a team on quality. And if you look at DevOps, with things like automation, it's again right in the middle of that. So, Peter, you mentioned you've worked a lot with agile teams, but also a lot with teams that aren't agile. What's the difference?
Peter Budden: We can look at differences, and we should, but let's also look at some things that are the same. Quality is essentially about the consumer's perspective. The difference is in how we approach quality; the consumer just sees quality in terms of the product they receive. And of course, agile is about how we get to that quality in the end product. So when we talk about agile teams versus more traditional V-model or waterfall teams, the difference is that instead of having one person taking accountability for testing, we all need to be part of and present in that process. More than in any other methodology, there's an emphasis on everyone owning a part of that problem. It's not enough to just do testing; we also need to fill in all the gaps in the knowledge and skill set that testers in a traditional team would have built up. Everyone needs to understand some of that. And if you look at really high-performing agile teams, what distinguishes them from a lot of waterfall teams is that everyone in the team has a tremendous amount of discipline. When I talk to developers on those teams, they're already at a relatively high level of understanding of all the things that testers intrinsically understand; whether through formal or informal training, they're doing good testing. So in an agile team you may have someone with a background as a tester, and they may be involved, but they're not doing all the testing, and the reason they can get away from doing all the testing is that everyone else in the team already knows a lot about it.
When we think about how we create a great-performing team with a cross-functional skill set, what distinguishes the successful from the unsuccessful is that, from a testing perspective, everyone has that discipline. And when we talk about discipline, it's also about wanting to get involved in testing and understanding the value it has. When I look at it purely through a QA or testing lens, I come to the agile world focused on that thread that runs through it all, and in successful teams I see that everyone has taken a little bit of it away from the tester and brought it into their own world, so the tester can focus on other things. Testing in an agile world becomes a process of coaching and mentorship. You may have someone in the team who's a tester, and one of the questions I often ask Scrum Masters is: is the tester actually spending time with other people in the team? Do they have the bandwidth to coach and to mentor? Are we seeing testing improve over time in everyone's work? That's one question I find really helpful to ask.
Dave: It's quite interesting, because one of the patterns I often see, and I'll ask the other Peter about this in a minute, is this setup of: we've got automated testing, we've got some sort of DevOps team, and they're over here; I have an agile team doing their agile thing, and they're flinging work over to this other testing team. I had a conversation with an organization this week where their test automation team is three or four sprints behind their agile delivery team. For me, that's an anti-pattern: we need to stop, look at what's happening, and bring those teams closer together. But what you're describing is that with a really good understanding, that gap between test automation and the agile delivery team won't open up in the first place, because the agile delivery team gets uncomfortable knowing the gap means there's a whole bunch of undiscovered stuff that's going to come back and kick them in the shins as they go forward. So there's an understanding and appreciation there. But, other Peter, in terms of how to address test automation and how to bring these skills and technologies up, what do you see as patterns that work and patterns that don't?
Peter: I think you touched on it there. It's this move away from the idea that you can actually assure quality, which you can't, really. Quality is in the eye of the beholder; we can't assure it, we can't say, hey, you've got 94.3% quality now, you've succeeded, you're done, you're awesome. But you can start to move to a world where you engineer quality into your processes and your practices. You can build out the test platform, the engineering, the capability to make it easier to test; build out the test harnesses to simplify your ability to engage. And I completely agree that the engineering piece can't be behind; it needs to be ahead of wherever the teams are. It has to be engineering capability that enables those teams to go faster, to offload some of the complexity of understanding how to glue the different pieces together so that the automation is actually helping them. You're moving some of the complexity out into the system, into the platform, so that you can focus on the value-add on top of that.
Peter Budden: I think the situation you've described is really typical, and there are a couple of things underneath it that we should explore. If an automation team is four weeks behind, what do we mean by four weeks behind? Behind on what? I think what we usually hear is that the automation team is four weeks behind on a target of achieving some percentage of coverage, using end-to-end test automation, for the new features or stories we're building. One of the important things to look at is that automation doesn't need to sit only at the end-to-end or functional level. Unit testing, integration testing, and test automation at those levels are also an important part of the test coverage picture. When people look at an automation team sitting off in their silo, part of the issue may be that they're not taking into account all of the testing that's actually going on.
Unknown: Right?
Peter Budden: And we know that as test automation gets more complex and covers more business features, it gets far more brittle. So when we talk about quality and testing, one of the things I really hope people take away is to ask: am I looking at all of the testing, or just at what the team most obviously doing testing is doing? In this case, that's the test automation team. Sure, they're four weeks behind based on a target of some coverage percentage, but what does that mean? Is it really taking into account everything that's going on?
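Peter Budden's point that coverage can live below the end-to-end layer can be sketched in code. This is a minimal, hypothetical example, not from the episode: the `cart` functions are invented for illustration, and the pytest-style test functions show the unit and integration coverage that an end-to-end-only target would never count.

```python
# Hypothetical 'production' code under test (invented for illustration).
def add_item(cart: list, item: str) -> list:
    """Return a new cart with the item appended."""
    return cart + [item]

def checkout_total(prices: dict, cart: list) -> int:
    """Sum the price of every item in the cart."""
    return sum(prices[item] for item in cart)

# Unit test: fast, isolated, no external systems involved.
def test_add_item_unit():
    assert add_item([], "apple") == ["apple"]

# Integration test: exercises two units working together.
def test_checkout_total_integration():
    cart = add_item(add_item([], "apple"), "pear")
    assert checkout_total({"apple": 3, "pear": 2}, cart) == 5

# End-to-end tests (driving a browser against a deployed system) would sit
# on top of these, reserved for a thin slice of critical user journeys.
```

Counting only the end-to-end layer in a "four weeks behind" coverage number ignores the two lower layers entirely, which is exactly the silo effect described above.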
Dave: I always find this conversation interesting, because so many stakeholders in feature development view testing as somehow linear: I add a feature, I add a test, and therefore that feature is now tested. It's a linear challenge to them, at least when they're working out how many testers a team needs or what the coverage should be. And yet testing is much more a mindset about the experience customers have. Anybody working in IT has had the conversation: I've tested it, it works here, I can prove it's working; and yet from the customer's perspective, it might be working here, but it's not working where I am. I distinctly remember a data center migration where we couldn't get past a firewall. All the testing inside the data center looked perfect, but everybody outside the data center had a very different experience. The point is that testing isn't a linear march through all the different integrations, functionality, and features; it's that whole exploratory piece. One way of looking at the challenge with automation is that automation exists to get the boring stuff out of the way so you can do the testing you really need to do, which is the exploratory, customer-experience testing.
Peter Budden: Yeah, I really like that idea: automation is great for verifying that things that did work still work, but it's not the same as sitting down at the application and really thinking, hey, what might we have broken? What new tests should we be adding? How do I investigate, explore, think more deeply, and be inquisitive about this system? That's the mindset of a real tester. Of course a developer can write automation scripts, and of course an automation team can write automation scripts, but that's not the same as testing. So one of the conversations I've had with Scrum Masters and Scrum teams is: when do you actually do testing? Forget the verification stuff. What time is there in the week when you sit down with nothing else on your plate? You're not delivering new code, you're not looking at features, you're not verifying that existing things still work, or fixing tests that just broke. When are you actually testing and investigating? That's a perspective, a mindset. And like you said, Dave, we need to do this because customers' expectations have changed over the last 10 years. What they expect from systems has just changed. Amazon, Microsoft, and all the amazing enterprises that have upped the game in IT mean that customers' expectations are relative to what else is out there. And it's subjective; it's not an absolute. You can't just say, hey, I have 80% coverage of my code, so I'm good enough. What matters is what the product owner says is good enough. It's possible to completely overdo testing, to spend an endless amount of time on it; you can never cover all the combinations.
So that's not the objective. What I would like to see in all Scrum teams is regular conversations about what good enough looks like. Instead of saying, hey, we're four weeks behind on the coverage percentage we need to hit, the question is: are we achieving the level of quality we need? Does everyone in the team understand what that quality level really means? Should we be doing more quality-related activities, more testing, or can we get away with less? And then, how are we spending time in that testing mindset? When are we doing it? Testing Thursday, right? Pick a time.
Peter: Yeah, it's an interesting one, isn't it? When we push stuff out and I think of the amount of complexity, one thing I've noticed driving this is that we have many, many more platforms we're testing across, many more places we need to go, and so many tests automated across them. You can end up with a massive heat map of all the different areas: every platform, every browser combination, every device combination, every place it could be, crossed with all the different tests. If I've got 10 or 15 different device targets and a thousand tests, that's 10,000 to 15,000 combinations I'm testing across. At that point I cannot possibly guarantee that every single piece of it is going to work. At some point I have to make the call that yes, this is good enough. I'm okay living with the fact that some parts of it I know aren't going to be perfect, but the quality is good enough, and it's working well enough in enough places that I'm confident I can give this to our customers, and our customers can give us feedback and we can learn and say, okay, that was a good idea.
Peter Budden: Yeah, what you're describing is essentially combinatorial explosion, and I think everyone's come up against it at some point: the number of potential combinations of things you could test is so huge that you can never test them all. That's the essence of testing; it's about risk. And we were dealing with this in waterfall too. One of the things we should really learn from traditional-style testing is to think back to the techniques that worked then to manage this problem of risk, and make sure they're well understood. There are testing skills that are still relevant in an agile world, skills that developers should understand, and in agile organizations we should always make sure there's someone to learn from. So if you're a Scrum Master or a product owner listening to this and asking how you can build those skill sets, one easy question is: in your team, who would you go to for advice on how to test better and more efficiently? Is there someone in the organization who can be an oracle of good testing habits? And it's not just something I talk about; Google have been talking about this for years with their Testing on the Toilet thing, which has been going on for a long time. Testing skills don't happen by magic. Avoiding the trap of over-emphasizing one area of the solution, covering it completely in tests, and then not hitting other areas: this is a skill, and you can get better at it over time. So we should find ways to coach and draw that through the organization.
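The back-of-the-envelope math Peter walks through (device targets multiplied by tests) can be sketched directly. The platform names and smoke-suite size below are invented for illustration, and the risk-based split shown is one common way to manage the explosion, not a prescription from the episode:

```python
from itertools import product

# Hypothetical platform matrix and test suite (names are illustrative).
browsers = ["chrome", "firefox", "safari", "edge", "samsung"]
devices = ["desktop", "tablet", "phone"]
tests = [f"test_{n}" for n in range(1000)]

# Full matrix: every test on every browser/device target.
targets = list(product(browsers, devices))   # 15 targets
full_matrix = len(targets) * len(tests)      # 15,000 runs: Peter's heat map

# One risk-based cut: run the whole suite on a single reference target,
# plus a small smoke subset on every other target.
smoke = tests[:20]
risk_based = len(tests) + (len(targets) - 1) * len(smoke)

print(full_matrix)  # 15000
print(risk_based)   # 1280
```

The point of the sketch is the ratio: accepting risk on the non-reference targets cuts roughly 15,000 runs down to about 1,280, which is the "good enough" call Peter describes making explicitly rather than by accident.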
Dave: Peter, I'm just going to follow up. Testing on the Toilet at Google: you're going to have to say more about that.
Peter: Yeah, I was wondering the same thing.
Peter Budden: Honestly, it's hilarious. It's a very Google thing, and I wouldn't recommend taking it verbatim, but they instituted it around 2007, when they started posting different testing techniques in the toilets on their campuses, because they recognized that testing was a major part of making teams productive: a major part of making developers productive is giving them feedback as quickly as possible. So they said, hey, let's create a team that can explain, coach, and advocate for good-quality testing. They developed some rules by which they assessed different teams: are you following good practices in your test automation? Do you have good practices in testing? And then they posted things for people to read while they were, you know, using the loo, as a way to raise awareness of testing and how important it is to being fast in development.
Dave: What's interesting there, and you said this right at the beginning, is that testing is not a unique skill that one individual has. It's something the whole team has to understand, and it comes into so many aspects of how features are developed, designed, and coded. It really becomes a topic of conversation across the team. I'd say the same thing: in the best agile teams I've seen, it's not just testing; all of the skills are distributed across the team. There's an appreciation of how to do the different skills on the team. Not necessarily that everyone could build a career in each one, but there's an appreciation of the handoff and what's required, rather than tossing something over a fence and not worrying too much about what gets picked up on the other end. Maybe a follow-on question around that. We've got a minute or so left, so let me ask it a different way: if you're asking our listeners to walk away with something, what two or three things should they have front of mind when they go back and talk to the agile teams they're working with? Peter, what would they be?
Peter Budden: Let me start with the idea that quality isn't a given. There's no single number we can put on it for all software. Make sure you're having conversations about what quality means for you and for the product you're delivering to market; make sure everyone in the team understands that, and have those conversations regularly. Don't be the team that spends too much time on testing, and don't be the team that breaks production. Talk about quality and what it means to your customer, in terms that both you and the customer would understand. Another takeaway: you can see skills atrophy in testing, and it's important to figure out how people in your team can get better at testing over time. Try testing yourself if you're a product owner or a Scrum Master; get in there, figure out what the team is doing, and ask questions about what other people in the team are doing, to get a full picture of their work and their skills. Every organization should think about how testing skills are kept up and advanced. And the third thing everyone should understand is that testing is a mindset. It's about asking awkward questions, and it's about giving people space. I know a lot of what you guys talk about is psychological safety; giving people the space to explore and be inquisitive is hugely important to finding the things you didn't think of when you were designing the system. Brilliant.
Dave: Peter, did you want to close up and wrap things up?
Peter: I was about to do that; I thought that would be a wonderful thing to do. I think we've hit the top of our 20 minutes, and I was going to say so. Thank you very much, Peter, and thank you very much, Dave. It's been an interesting conversation as always, and I look forward to hearing some feedback from our listeners.
Peter Budden: Thanks again.
Peter: So thank you.
Peter Budden: Awesome, thanks.
Peter: You've been listening to Definitely Maybe Agile, the podcast where your hosts, Peter Maddison and Dave Sharrock, focus on the art and science of digital, agile, and DevOps at scale.



