Transcript: Agile FM radio for the agile community.

[00:00:04] Joe Krebs: In today's episode of Agile FM, I have Lisa Crispin with me. She can be reached at the very easy to remember lisacrispin.com. Lisa is an author of a total of five books. There are three, or actually four, I want to highlight here. Obviously, a lot of people have talked about Agile Testing, which came out in 2009, a practical guide for testers and agile teams. Then following that, More Agile Testing, right? Then I thought it would be Most Agile Testing, but it turned into Agile Testing Condensed in 2019, and just very recently a downloadable mini book, Holistic Testing. Welcome to the podcast, Lisa.

[00:00:47] Lisa Crispin: Thank you for inviting me. I'm honored to be part of the podcast. You've had so many amazing people on so many episodes, so it's great.

[00:00:54] Joe Krebs: Thank you. And now it's one more with you. So thank you for joining. And we will be talking a little bit about a totally different topic than maybe the last 20 episodes. Way back I did some testing topics, but not in the last 20 episodes or so, as far as I can recall. So now we're talking about testing, a super important topic. I would not consider myself an expert in it, and I don't know if the audience who has been listening to maybe the last 20 episodes is very familiar with agile testing. Maybe everybody has a feeling about testing when they hear the word, but there is a huge difference between agile testing and, let's say, traditional testing methods. If you just want to summarize very briefly, I know a lot of people are familiar with some of those things, but if somebody asks what agile testing is, how is it different from traditional testing methods?

[00:01:47] Lisa Crispin: Yeah. I think that there are a couple of big differences. One is that testing, and this is just a truth and not necessarily something to do with agile, but testing is really just part of software development.
So many people think of it as a phase that happens after you write code, but in modern software development we're testing all the time, all the way around that whole DevOps loop, really. And so the whole team gets engaged in it through the whole lifecycle, and the focus is on bug prevention rather than bug detection. Of course, we want to detect the bugs that make it out to production so we can fix them quickly. But really what we want to do is prevent those bugs from happening in the first place. So there are all these great practices that were popularized by extreme programming and agile: things like test driven development, continuous integration, test automation, all the things that go into the planning workshops where we talk about our new features and break them into stories and what's going to be valuable to customers, having those early conversations, getting that shared understanding, things like behavior driven development, where we think about what we're going to code before we code it. That's all really different from, I guess I would say, a more traditional software development approach where we focus on the requirements, and a lot of people think about testing as just making sure it met the requirements. But there's so much more to it. We've got all these quality attributes, like security and performance, that we also need to test. So it's a huge area, but it's woven into software development, just like coding, just like design, just like architecture, just like monitoring and observability. It's all part of the process.

[00:03:31] Joe Krebs: Yeah. It's like QA baked in, if you want to see it this way. And then also the automation of all that, right? So automating everything you just said is probably also a concern.
Not that that's necessarily new to agile, but that's a focus as well. Now, I don't necessarily have data points around that, but I have worked with a lot of Scrum teams and agile teams in my career. And it seems, if somebody asked what the challenges within these teams are, one you can almost always highlight, and I say almost purposely because there are good exceptions, is to build an increment of work once per sprint. A lot of teams do not accomplish that, and it's often related to testing activities. Why is that, in your opinion? Why do we see these teams struggle to put out an increment of work, or a piece of the product or whatever you want to call it if you don't use Scrum, something that could potentially go out at the quality standards required for going out? What are the struggles out there for teams, especially on the testing side? I see that, as you just said, testing always happens, or often happens, at the end rather than up front.

[00:04:46] Lisa Crispin: Yes. Unfortunately, I still see a whole lot of Scrum teams and other agile teams doing a mini waterfall, where they have testers on the cross functional team, but the testers are not being involved in the whole process, and the developers aren't taking up practices like test driven development, because those things are hard to learn and a lot of places don't enable the non-testers to learn testing skills, because they don't put those skills into the skills matrix those people need to advance their careers. And the places I've worked where we succeeded with this sort of whole team, holistic approach, everybody had testing skills in their skills matrix. And we all had to learn from each other, and testers had other skills in their skills matrix, like database skills and at least the ability to read code and be able to pair or ensemble with somebody. So that's part of it. And I just think.
People don't focus enough on the early part of the process: a business stakeholder has brought us a new feature, and we need to test that idea before we do anything. What's the purpose of the feature? What value is it going to give to the customer and to the business? A lot of times we don't ask those questions up front, and the stakeholders don't ask themselves, and then you deliver the feature and it's something the customers didn't even want.

[00:06:11] Joe Krebs: Lisa, we need to code. We need to get software. Why would we talk about that? Why would we not just code? I'm kidding.

[00:06:18] Lisa Crispin: Yeah. And that's why the whole concept of a self organizing team works really well, when you really let the teams be autonomous, because then they can think about how best to accomplish this. Let's do some risk storming before we try to slice this into stories, and let's use good practices to slice that feature into small, consistently sized stories that give us a reliable, predictable cadence the business can plan around. Take those risks that we identified, get concrete examples from the business stakeholders of how this should behave, and turn those into tests that guide development. Then we can automate those tests, and now we have regression tests to provide a safety net. So that all fits together. And of course, these days, we also need to put the effort into the right side of the DevOps loop. We're not going to prevent all the bugs. We're not going to know about all the unknown unknowns, no matter how hard we try. And these cloud architectures are very complex. Our test environments never look like production, so there's always something unexpected that happens. And so we have to really do a good job of the telemetry for our code, gathering all the data, all the events, all the logging data for monitoring.
For alerting, and also for observability: if something happens that we didn't anticipate, so it wasn't on our dashboard and we didn't have an alert for it, we need to be able to quickly diagnose that problem and know what to do. And what if we didn't have enough telemetry to diagnose that problem without having to go back, add more logging, and redeploy to production so we could figure it out? Oh, how many times has my team done that? That's all part of it. And then learning from production. We've got fantastic analytics tools these days. What do the customers do? What was most valuable to them? I've mostly worked on web applications: we released this new feature in the UI, how did they use it? We can know that stuff now. So that feeds back into what changes we should make next.

[00:08:29] Joe Krebs: All right. So it comes full circle, right? What's interesting is there's this company that's all over the news: Boeing, right? We're recording this in 2024, with all the quality related issues. Now, that is an extreme example, obviously, but we do have these kinds of aha and wake-up moments in software development too, right? We're shipping products, and I remember times when testing personnel, I purposely call it testing and not QA, was outsourced. That was many years ago. We actually felt, oh, this activity can be outsourced somewhere else. And you just made a point about self organizing teams starting with testing and feeding the end of the loop back into the development efforts, and how important that is. How we treated these activities in the past, and what we thought of them, is shocking now looking back in '24, isn't it?

[00:09:23] Lisa Crispin: Yeah, it just became so much a part of our lives to run into that. And the inevitable happened: it generally didn't work very well.
I've actually known somebody who led an outsourced test team in India and was working with companies in the UK and Europe. They actually were able to take an agile approach and keep the testers involved through the whole loop. They had to work really hard to do that, and there were a lot of good practices they embraced to make it work. But you have to be very conscious of it, and both sides have to be willing to do that extra work.

[00:09:56] Joe Krebs: You just mentioned that there were some really cool analytics tools. I don't know if you want to share any of those, because you seem very excited about this.

[00:10:05] Lisa Crispin: The one that I found the most useful, and a couple of different places I worked at used it, is called FullStory. It captures all the events that are happening in the user interface and plays them back for you as a screencast. Now, it does block anything users type in; it keeps it anonymized. But you can see the cursor. And I can remember one time, on a team I was on, we put a whole new page in our UI, a new feature. We thought people would really love it. We worked really hard on it, and we tried to do a minimum viable version of it, but we still put some effort in, and we put it out there. And then we looked at the analytics in FullStory, and we could see that people got to the page, their cursor moved around, and then they navigated off the page. So either it wasn't clear what that page was for, or they just couldn't figure it out. So that was really valuable. I was like, okay, can we come up with a new design for this page, if we think that's what the problem is? Or should we just say, okay, that was a good learning opportunity.
But as a tester especially, it helps there, because sometimes we can't reproduce problems: we know there's a problem in production, but we can't reproduce it. With FullStory we can go and watch a session where somebody had the problem. And there are other tools: Mixpanel won't play it back for you, but you can see every step that the person did. And even observability tools like Honeycomb and Lightstep can trace the whole path of what the user did. And that really helps us not only understand the production problem, but also realize, oh, there's a whole scenario we didn't even think about testing. There's so much we can learn, because we're so bound by our cognitive biases, our unconscious biases; we know how we wanted it to work.

[00:11:54] Joe Krebs: Yeah.

[00:11:55] Lisa Crispin: And it's really hard to think outside the box, get away from your biases, and really approach it like a customer who never saw it before would do.

[00:12:03] Joe Krebs: Yeah. This is the typical thing, right? If software engineers demonstrate their own software, it's "works on my machine," I'm sure you have heard that. And it's obvious to them that you would use it this way, right? It's just not necessarily obvious for somebody else. But if you're sitting in front of a screen developing something for a long time, it just becomes natural that you would be working like this. I myself have engineered software and fell into that trap, right? It's an eye-opening event when somebody else looks at it.

[00:12:33] Lisa Crispin: Even when you have different people looking. I can remember an occasion, on a team I was on, again with a web application, where we just changed something in the UI, just added something, and I tested it. My manager tested it. Our product owner tested it. And we all thought it looked great, and it did look great.
We didn't notice the other thing we had broken on the screen until we put it in production and customers were like, hey! So I really do think things like pair programming, pair testing, ensembles, working in ensembles for both programming and testing, doing all the work together and getting those diverse viewpoints, help hugely with that. My theory is we all have different unconscious biases, so maybe if we're all together, somebody will notice a problem. I don't have any science to back that up, but that's why those kinds of practices are especially important.

[00:13:28] Joe Krebs: Yeah.

[00:13:28] Lisa Crispin: To catch as many things as we can.

[00:13:30] Joe Krebs: Yeah. So we both didn't have any science to back this up, but let's talk a little bit about science, okay? Because metrics, data points, evidence. What are some of the KPIs, if somebody listens to this and says, oh, that sounds interesting, and we definitely have shortcomings in testing activities within agile teams? Obviously there's the traditional way of testing, using very different data points. I have used some in the past, and I just want to verify with you whether those are even useful and still up to date. What would be some good KPIs, when somebody approaches you and says, you've got to have that on your dashboard?

[00:14:08] Lisa Crispin: I actually think one of my favorite metrics to use is cycle time, although that encompasses so many things; just watching trends in cycle time. For example, if you've got good test coverage with your automated regression tests, you're going to be able to make changes really quickly and confidently. And if you have a good deployment pipeline, you're going to go faster. Again, there's a lot of testing that goes into making sure your infrastructure is good and your pipeline is performing as it should, because it's all code too. So cycle time reflects a whole lot of things.
It's hard to isolate one thing in your cycle time, but what counts is: how consistent are we at being able to frequently deliver small changes? So I think that's an important one. And in terms of, did we catch all the problems? I think it gets really dangerous to do things like, let's count how many bugs got into production, because all measures can be gamed, and that's a really easy one to game. But things like, how many times did we have to roll back or revert a change in production because there was something we didn't catch? And hopefully we detected that ourselves with an alert or with monitoring before the customers saw it. Now we have so many release strategies, like canary releases or blue-green deploys, so that we can do testing in production safely. But still: how many times did we have to roll back? How many times did we get to that point and realize we didn't catch everything? That can be a good thing to track, depending on what caused it. If we had a production failure because somebody pulled the plug of the server out of the wall, that's just something that happened. But if the team's process failed in some way, we want to know about that and improve it. And, how frequently can we deploy? I think with continuous delivery, which so many teams are trying to practice, you're not going to succeed if you're not preventing defects and if you don't have good test automation, good automation the whole way through.

[00:16:08] Joe Krebs: Yeah.

[00:16:08] Lisa Crispin: And deployment frequency, that's another one of the DORA key metrics. That's one we know correlates with high performing teams. And of course we shouldn't ignore how people feel: are people burned out, or do they feel happy about their jobs? That's a little harder metric to get.
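As an editorial aside, the delivery metrics Lisa lists here (cycle time trends, deployment frequency, and how often a change had to be rolled back) can be computed from very little data. The sketch below is a hypothetical illustration; the record format and values are invented, not from any real tool.

```python
# Hypothetical deployment records: (work started, deployed to production,
# was the change rolled back?). Invented data for illustration only.
from datetime import datetime, timedelta

deploys = [
    (datetime(2024, 3, 1, 9), datetime(2024, 3, 2, 15), False),
    (datetime(2024, 3, 2, 10), datetime(2024, 3, 4, 11), False),
    (datetime(2024, 3, 4, 9), datetime(2024, 3, 5, 16), True),
]

# Cycle time: elapsed time from starting the work to deploying it.
cycle_times = [done - start for start, done, _ in deploys]
avg_cycle = sum(cycle_times, timedelta()) / len(cycle_times)

# Deployment frequency: deploys per day over the observed window.
span_days = (deploys[-1][1] - deploys[0][1]).days or 1
deploy_frequency = len(deploys) / span_days

# Rollback rate: the share of changes we had to revert.
rollback_rate = sum(rolled for _, _, rolled in deploys) / len(deploys)

print(f"average cycle time: {avg_cycle}")
print(f"deploys per day: {deploy_frequency:.2f}")
print(f"rollback rate: {rollback_rate:.0%}")
```

As Lisa notes, the trend over time matters more than any single number, and a count like this only helps if the team also asks what caused each rollback.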
I was on a team at my last full time job where we really focused on cycle time as a metric, and we didn't really have that many problems in production. So we didn't bother to track how many times we had to revert, because we were doing a good job there. But how frequently were we going? What was our cycle time? We also did a little developer joy survey. Once a week, we sent out a little five-question survey based on Amy Edmondson's work. Now I would also base it on Nicole Forsgren's SPACE model, but this was just a little before that came out. We asked just a few questions: on a scale from one to five, how did you feel about this? And it was really interesting, because over time, if cycle time was longer, developer joy was down. So there's something happening here, people are not happy, and something's going wrong that's affecting our cycle time. And the reverse was true: when our cycle time was shorter, joy went up. So I think it's important, and you don't have to get real fancy with your measurements. Just start. I think you should first focus on what you're trying to improve and then find a metric to measure that.

[00:17:41] Joe Krebs: I'm glad you mentioned coverage. That's one of those I mentioned earlier; I've been working with it quite a bit, and cycle time. Very powerful stuff. Now, with somebody like you, who has written and published about agile testing extensively: we are in 2024, there are the years ahead, there are agile conferences, there is a lot going on. What are the trends you currently see in the testing world? What's happening right now, and what do you think is influencing the days to come? I know you have holistic testing yourself, so maybe that is one, but I just want to hear: what do you see happening in agile testing?

[00:18:24] Lisa Crispin: Oh, all of software development is definitely evolving.
I think one of the things is that we're starting to be more realistic and realize that executives don't care about testing. They care about how well their product sells, how much money the company is making. We know that product quality obviously affects that, and that's from a customer point of view: it's the customer who defines quality. Back in the nineties, we testers thought we were defining quality, so that's a change that's occurred over time. We're really thinking about that, and also knowing that our process quality has a huge impact on product quality. So what are the most important practices we can be doing? Janet Gregory, who is my coauthor on four of those books, and Selena Delesie have been consultants for years and helped so many companies, even big ones, through an agile transformation. And they've distilled their magic into what they call a quality practices assessment model. They identified ten quality practices that they feel are the most important, things like feedback loops, things like communication, right? And the model helps you ask questions and see where the team is in these different aspects of practices that would give them a quality process, which would help them have a quality product. And it gives teams a kind of roadmap: here's where we are now, what do we need to improve? Oh, we really need to get to continuous delivery, and these things are in our way, things like that. So I think that's one realization, and it ties back to the idea that testing is just part of software development. For years we asked, how can I make the president of this company understand that we testers are so important? We're not. But it's important that the team builds that quality in.
[00:20:29] Joe Krebs: But you could also argue that maybe a CEO of a company or the leadership team would say, we also don't care if this is expressed in one line of code or two lines of code. So it's not specific to testing; I think they're just saying, here's our product. But I think what has changed is that your competition is just one mouse click away. Quality is a determining factor. Now, let's take this hypothetical CEO out there right now, listening to our conversation and saying, I do want to start to embrace agile testing, and agile in general, and more of those things you just mentioned. What would be a good starting point for them? Obviously there's a lot of information, keywords and buzzwords we shared today. What would be a good starting point for that journey? Because that is obviously not something that's coming overnight.

[00:21:20] Lisa Crispin: I think one of the most important things that leadership can do is to enable the whole team to learn the testing skills that will help them build quality in. And that means making it part of their job descriptions, making it part of their skills matrix for career advancement, because that gives them time. If developers are paid to write lines of code, that's what they're going to do. But if it's, okay, you're an autonomous team, you decide what practices you think will work for you, we're going to support you, and it's going to slow things down at first, that works. I was on a team in 2003 that was given this mission: do what you think you need to do. First, we decided what level of quality we wanted, of course. We wanted to write code that we would take home and show our moms and put on our refrigerators, and we all committed to that level of quality. How can we achieve that? We're seeing that test driven development has worked really well for a lot of teams.
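As an editorial aside, the test-first rhythm being described here can be shown in a few lines: the test is written before the code exists, watched to fail, and then drives the smallest implementation that passes. The pricing rule and all names below are invented for illustration; a runner such as pytest would normally collect the test function.

```python
# Test-first: this test was written before price_cents existed, and the
# function below is the minimal code that makes it pass.
# The volume-discount rule and all names are invented for illustration.

def test_volume_discount():
    # 10 or more items: 10 percent off
    assert price_cents(quantity=10, unit_price_cents=500) == 4500
    # below the threshold: full price
    assert price_cents(quantity=2, unit_price_cents=500) == 1000

def price_cents(quantity: int, unit_price_cents: int) -> int:
    """Smallest implementation that satisfies the test above."""
    total = quantity * unit_price_cents
    return total * 9 // 10 if quantity >= 10 else total

test_volume_discount()
print("test passes")
```

Prices are kept in integer cents so the assertions stay exact; the passing test then becomes one of the regression tests Lisa mentions.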
So let's do test driven development, which is really not that easy to learn. But when you have leadership that gives you that time to learn and supports you, it pays off in the long run, because eventually you're a lot more confident. You've got all these regression tests. You can go faster. And things like continuous integration, refactoring, all these practices that we know are good, we were empowered to adopt those. It was part of all of our job descriptions. And so we became a high performing team, not overnight, but within a few years. Part of what we did was spend a lot of time learning the business domain; it's a very complicated business domain. And so when the stakeholders came and said, we want this feature, and we asked them why they wanted it, what it was supposed to do, and what the value was, we could usually cut out half of what they thought they wanted. We could say, okay, if we did all of this, we think it's going to be this much effort, but we could do 80 percent of it for half the cost. How's that? Oh yeah. Nobody ever turned us down on that one. So that's another way you go fast: eliminating things that customers don't want or need. So yeah, it's the unicorn magic of a true self organizing team.

[00:23:30] Joe Krebs: Yeah. One thing you said just stood out to me: it is an investment, an investment into the future. It's a really good feeling to have, later on, the capability of releasing software whenever you want, if that is not a massive burden where the whole company needs to come together for all-nighters to get a piece of software out of the door. Now, you're not only an author here, you're also a practitioner. You work with teams, and I just want to come back to the business case of agile testing one more time. Do you have an example from a client, recent or further back, where you would say that stands out, or that's an easy one?
One where you remember agile testing made a huge difference for an organization. I'm sure there are tons where you would say there was a significant impact for them from introducing agile testing practices.

[00:24:29] Lisa Crispin: Certainly, especially early on in the extreme programming and agile adoption, there were a few occasions where I joined a team that had never had testers. They were doing the extreme programming practices, and you may recall that the original extreme programming publications did not mention testers. They were all about testing and quality, but they didn't mention testers. So these teams were doing test driven development and continuous integration. They were writing really good code, and they were doing their two week sprints, and maybe it took them three sprints to develop what the customer wanted, and then they'd give it to the customer, and the customer would say, but that's not what I wanted. So they thought, maybe we need a tester, and then they hired me. And I was like, okay, we're going to do some new features, so let's have some brainstorming sessions. What is this new feature for? How are we going to implement it? What are the risks? And start doing risk assessments: how are we going to mitigate those risks? Are we going to do it through testing? Are we going to do it through monitoring? And just asking those what-if questions. What's the worst thing that could happen when we release this? That's my favorite question. Could we deploy this feature to production and have it not solve the customer's problem? Anyone could ask those questions; it doesn't have to be a tester. But I find that on teams without professional testers, specialists, nobody else thinks of those questions. They could, but testing is a big area. It is a big set of skills.
And I know lots of developers who have those skills, but not every team has a developer like that. Other specialists, like business analysts, could also help, but there were even fewer business analysts back in the day than there were testers. One team I joined early on said, okay, Lisa, you can be our tester, but you can't come to the planning meetings and you can't come to the standups. That's a little weird. I did the best I could without being involved in any of the planning. And at the end of the two weeks, they weren't finished; nothing was really working. And I said, hey, can we try it my way? Let me be involved in those early planning discussions, let me be part of the standup. And, amazing, the next time we met our target. I couldn't support everything, there were 30 developers and one tester, but we agreed that one or two other people would wear the testing hat along with me every sprint, or at least on a daily basis. And so they all started to get those testing skills. Like I say, testing is a big area, and you don't know what you don't know. I see teams even today that don't have any testers, because years ago they were told they didn't need them if they did these extreme programming practices. And they're doing test driven development, they're doing continuous integration, they're maybe even doing a little exploratory testing. They're doing pair programming, even some ensemble or mob programming. They're doing great stuff, but they're missing all that work at the beginning to get the shared understanding with the stakeholders of what to build.

[00:27:43] Joe Krebs: All those lines of code that weren't needed wouldn't need to be tested.

[00:27:48] Lisa Crispin: And so they release the feature, and bugs come in. They're missing features. It's not what the customer needed. Too many gaps.
And of course, I want to say those aren't really bugs, but they're bad. If you'd had a risk storming session, if you'd had planning sessions, example mapping sessions for instance, where you got the business rules for the story and concrete examples for each business rule, and then turned those into tests to guide your development with behavior driven development, that would have solved the problem. But they didn't know to do that. Anybody could have learned those things, but we can't all know everything.

[00:28:25] Joe Krebs: Yeah. We're almost out of time, but there's one question I wanted to ask you, and it might be a short answer; I hope you can condense it a little bit. When somebody gets to your LinkedIn page, Lisa Crispin, there is a picture of you plus a donkey. And you have donkeys yourself. Does this relate to your work? What do you find inspirational about donkeys, and why did you even make it your LinkedIn profile? There has to be a story around it.

[00:28:55] Lisa Crispin: It's interesting. A few years ago at the European Testing Conference, we had an open space, and somebody said, oh, let's have an open space session on Lisa's donkeys. And then we got to talking about this, and I realized I actually have learned a lot about agile from my donkeys. I think the biggest thing is trust. Donkeys work on trust. With horses, and I've ridden horses and had horses all my life as well, you can bribe or bully a horse into doing something. They're just different. If you reward them enough, okay, they'll go along with you. If you kick them hard enough, maybe they'll go. Donkeys are not that way. They're looking out for number one, for their own safety. And if they think you might be getting them into a situation that's bad for them, they just flat won't do it. So that's how they get the reputation of being stubborn.
You could beat them bloody or offer them any bribe you want; they're not doing it. And so I learned I had to earn my donkeys' trust. That's so true of teams. We all have to trust each other, and when we don't trust each other, we can't make progress. The teams I've been on that were high performing teams had that trust, so we could have discussions where we had different opinions, and we could express our opinions without anyone taking it personally, because we knew that we were all in it together and it was okay. Anybody could feel safe asking a question, anybody could feel safe failing, because you have that trust that nothing bad is going to happen. And so I could bring my donkey right in that door into the house. I've taken them into schools, I've taken them to senior centers, because they trust me. And if they came to harm while in my care, let's say I was driving the cart and the collar rubbed a big sore on them, that would destroy the trust, and it would be really hard to build it back. So we always need to be conscious of how we're treating each other on our software teams.

[00:30:55] Joe Krebs: Yeah, wonderful. I did hear about the rumor of donkeys being stubborn, but I also always knew that they are hardworking animals.

[00:31:02] Lisa Crispin: They love to work hard. Yeah.

[00:31:05] Joe Krebs: Awesome. Lisa, what a great ending. I'm glad we had time to even touch on that. That was a great insight. Thank you so much for all your insights around testing, but also, at the end, about donkeys. Thank you so much, Lisa.

[00:31:17] Lisa Crispin: Oh, it's my pleasure.
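As a closing editorial aside, the example mapping flow Lisa described, where concrete examples for each business rule become the tests that guide development, might be sketched like this. The rule, the examples, and the code are hypothetical illustrations, not from the episode.

```python
# A business rule from a hypothetical example-mapping session:
#   "Orders of 50 dollars or more ship free; below that, shipping is 5 dollars."
# Each concrete example the stakeholders gave becomes one table row,
# and the table drives the tests before the code is written.

EXAMPLES = [
    # (order_total, expected_shipping)
    (50.00, 0.00),   # exactly at the threshold: free
    (120.00, 0.00),  # well above the threshold: free
    (49.99, 5.00),   # just below the threshold: charged
]

def shipping_cost(order_total: float) -> float:
    """Implementation guided by the stakeholder examples above."""
    return 0.0 if order_total >= 50.0 else 5.0

for total, expected in EXAMPLES:
    assert shipping_cost(total) == expected, (total, expected)
print("all examples pass")
```

In a behavior driven development tool the same rows would typically appear as Given/When/Then scenarios; the point is that the stakeholders' own examples become the automated safety net.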
Embark on a captivating journey through the Agile Mentors Podcast in 2023 with Brian Milner. Explore a spectrum of Agile topics, from Scrum Master challenges to leadership insights. Join Brian for insightful summaries, memorable moments, and a walk through the rich tapestry of Agile wisdom on the show. Overview In this episode of the Agile Mentors Podcast, Brian embarks on a retrospective journey through the standout moments of the podcast in 2023. Explore carefully curated episodes, offering solutions to the common challenges and then delving into the world of Agile beyond software development. Listen in as Brian shares insightful summaries featuring memorable moments and a diverse landscape of Agile wisdom shared by his esteemed guests. Categorized into topics like Scrum Masters, Product Owners, Developers, Agile’s use beyond software, general career advice, and leadership and coaching, this retrospective is a treasure trove of practical advice, actionable insights, and real-world experiences. Tune in for an inspiring tour through the rich tapestry of the Agile Mentors Podcast 2023 episodes. Listen Now to Discover: [01:16] - Brian introduces the episode and invites listeners to join him in a retrospective of the year's episodes, highlighting ones that may have been missed or are hidden gems worth revisiting, which he will group by listener preferences and areas of interest. [02:39] - For Scrum Masters: Brian begins discussing the first episodes tailored for Scrum Masters, kicking things off with #47, "Exploring Lean Thinking and Agile Development," featuring guest Bob Payne, who shares insights into lean thinking, a foundational principle in agile development. Brian recommends this episode for Scrum Masters aiming to enhance their understanding of Agile's fundamentals. [03:34] - Episode #52, "The Birth of Agile: How 17 Adventurous Techies Changed the World," features Agile icon Mr. Jim Highsmith, one of the authors of the Agile Manifesto. 
Jim provides a glimpse into the past and offers insights into the future of Agile. [04:06] - Episode #59, "Revising the Scrum Guide," features Don McGreal, who played a key role in the guide's revision, shedding light on the thinking behind the revisions. [05:31] - In Episode #62, "Effective Sprint Goals," Maarten Dalmijn delves into effective crafting techniques and the finer details of achieving success with Sprint Goals. [06:12] - In Episode #69, "Should Scrum Masters Be Technical with Allison Pollard," Allison and Brian explore the question of whether Scrum Masters should possess technical skills. If you grapple with how technical a Scrum Master should be, this episode provides valuable insights and perspectives. [06:51] - In Episode #39, Mike Cohn, an authority on user stories, shares valuable insights into the art of crafting effective user stories. [07:15] - In Episode #65 with Randy Hale titled "Unlocking Lean Portfolio Management," Brian and Randy explore the concept of moving beyond a single-team focus as a product owner, delving into the realm of lean portfolio management building upon insights shared by Bob in episode #47. [07:50] - For Product Owners: Must listen bonus from last year, Episode #22, with Roman Pichler, who shares his insights on "How to Create Helpful Product Roadmaps," addressing challenges commonly faced by product owners in dealing with the nuanced aspects of their role. The episode covers strategies to avoid pitfalls, especially the dangers of rigidly locking into scope and schedule timelines. [08:54] - For Developers: Episode #33, "Mob Programming with Woody Zuill," introduces developers to the transformative practice of mob programming. Woody Zuill, a pioneer in this way of working, shares insights and a practical and thoughtful approach that makes it worth exploring. 
[10:00] - In Episode #48, Brian hosts a unique episode featuring the renowned Lisa Crispin and Janet Gregory, experts in Agile testing, in a show called "Holistic Agile Testing." This episode is particularly recommended for developers specializing in testing or involved in testing within a Scrum team. [11:00] - In Episode #54, "Unlocking Agile's Power in the World of Data Science," Brian and Lance Dacy explore the intersection of Agile methodologies and data science. The popularity of this episode prompted a sequel, Episode #63, on the fusion of Agile and data science. [11:58] - In the final developer-focused episode, Carlos Nunez joins Brian to delve into the world of DevOps. Carlos, a speaker at Agile 2023, shares insights on the significance of DevOps in today's Agile landscape, emphasizing DevOps as a means of empowerment rather than gatekeeping. [12:38] - Agile Outside of Software: Episode #32 with Cort Sharp focuses on Scrum in High School Sports—specifically high school swimming. Cort shares his experience applying Scrum principles to create practice schedules and routines for the swim team he coaches, providing valuable insights for those interested in using Agile beyond the software realm. [13:24] - In #38: "Using Agile for Social and Societal Transformation with Kubair Shirazee," Kubair walks listeners through how his nonprofit employs Agile methodologies to empower micro-entrepreneurs in developing countries. The episode highlights success stories, such as a barber's journey from a rented spot to owning a professional store, demonstrating Agile's transformative impact beyond the tech industry. [14:40] - Episode #45 with Scott Dunn explores "Overcoming Agile Challenges in Regulatory Environments." This crucial topic addresses the unique challenges faced in tightly regulated sectors like government, legal, and medical professions, offering a compelling dialogue on navigating regulatory hurdles within an agile framework. 
[16:00] - Episode #64 features John Grant discussing "How Agile Methodologies Reshape Legal Practices." This episode reveals the transformative impact of Agile in the legal profession and offers a unique perspective on Agile as a philosophy rather than just a practice, illustrating its broader applicability beyond the software realm. [17:00] - Today's episode is brought to you by Mountain Goat Software's Certified Scrum Product Owner (CSPO) course. This is a two-day training course taught by one of our certified Scrum trainers that teaches you how to use the product backlog as a tool for project success and how to respond to changes in business conditions by restructuring the product backlog. For the schedule, visit the Mountain Goat Training Schedule. [17:27] - General Career Advice: #34: "I'm Trained, Now What? with Julie Chickering" addresses the post-training phase for Scrum Masters and Product Owners. Julie shares insights on taking the next steps, implementing knowledge, and finding opportunities to build a resume in Agile roles. [18:29] - In #40: "Is it Time to Go Out on Your Own? Tips and Insights with Chris Li" Brian and Chris Li discuss considerations for those at later stages of their careers contemplating the transition to independent consulting. If you're pondering whether it's time to establish your consultancy, this episode provides valuable insights and considerations to guide your decision-making process. [19:00] - In #42: "The Importance of Self-Mastery with Bob Galen," Bob emphasizes the value of constant learning, even after years of experience, highlighting the importance of staying open to new discoveries and others' experiences. This episode serves as a compelling guide for personal growth and continuous improvement. [20:28] - Episode #46 with Christina Ambers: In this episode, Christina shares insights on "How to Assess Company Culture Before Accepting a Job Offer." 
As the year closes and people consider new job opportunities, Christina guides listeners through the crucial step of evaluating company culture and the importance of understanding if a company truly embraces Agile values or merely pays lip service to them. [21:14] - Episode #50 celebrated the milestone of the 50th episode. Lance Dacy was on the show to discuss "Choosing Your Path: Exploring the Roles of Scrum Master and Product Owner." The episode offers guidance for individuals at crossroads, helping them decide between Scrum Master and Product Owner roles. It serves as a valuable resource for those navigating career decisions in the Agile landscape. [22:13] - Leadership and Coaching: In the Leadership and Coaching category, Episode #37 features Brad Swanson discussing "Servant Leadership, Not Spineless Leadership." Brad dispels misconceptions and offers valuable insights into the essence of servant leadership, making it a compelling resource for those interested in effective leadership approaches. [23:28] - In Episode 41, Karim Harbott explores "Cultural Transformations in Organizations." The episode delves into the challenges of changing organizational culture, emphasizing the time and effort required beyond implementing specific practices. [24:00] - In "#44: Transformations Take People with Anu Smalley", Anu highlights the often-overlooked aspect of involving people in organizational transformations, shedding light on the human dynamics that can either support or hinder the process. [24:35] - In Episode #53, "Debunking Myths in Agile Coaching with Lucy O'Keefe," we tackle the common myths surrounding Agile coaching and provide insights on unlocking excellence in Agile coaching practices. [25:01] - Episode #66 is a solo episode where Brian shares his insights into navigating team conflicts, laying the foundation for understanding and mastering the essential skill of conflict navigation. 
[26:00] - In Episode #68, Brian hosts Mike Hall for a discussion of "The Pros and Cons and Real-World Applications of SAFe." Whether you're new to SAFe or deeply involved, Mike's expertise provides valuable perspectives and tips for navigating this framework. [26:42] - In Episode #70, Mike Cohn joins Brian to explore "The Role of a Leader in Agile." Here, Mike shares valuable insights based on his extensive experience, offering sound advice and perspective on the crucial role of leaders in self-organizing teams. [28:10] - Brian encourages listeners, especially newcomers, to explore relevant episodes based on their roles, with the goal being to offer practical advice and solutions on specific issues rather than lengthy discussions. All episodes are available in the show notes for convenient access. [29:33] - Brian expresses gratitude to listeners for the past year, reflecting on the unique nature of podcasting and letting listeners know he cherishes the encouragement and connections made, especially at events like Agile 2023. [31:00] - What do you want to hear in 2024? What are some of the hot-button topics that haven’t been covered on the show or guests you want to hear from? Send Brian an email with your ideas. [32:28] - And don’t forget to share and subscribe to the Agile Mentors Podcast on Apple Podcasts so you never miss an episode. [33:00] - We also have our Agile Mentors Community, where we have discussions about every podcast [33:24] - Wishing you a Happy Holiday Season! We'll see you early again in 2024. 
References and resources mentioned in the show: #47: Exploring Lean Thinking in Agile Development with Bob Payne #52: The Birth of Agile: How 17 Adventurous Techies Changed the World with Jim Highsmith #59: Revising the Scrum Guide with Don McGreal #62: Effective Sprint Goals with Maarten Dalmijn #69: Should Scrum Masters Be Technical with Allison Pollard #39: The Art of Writing User Stories with Mike Cohn #65: Unlocking Lean Portfolio Management with Randy Hale #22: How to Create Helpful Product Roadmaps with Roman Pichler #33 Mob Programming with Woody Zuill #48: Holistic Agile Testing with Lisa Crispin and Janet Gregory #54 Unlocking Agile's Power in the World of Data Science #63: The Interplay Between Data Science and Agile with Lance Dacy #71: The World of DevOps with Carlos Nunez #32: Scrum in High School Sports with Cort Sharp #38: Using Agile for Social and Societal Transformation with Kubair Shirazee #45: Overcoming the Challenges of Agile in Regulatory Environments with Scott Dunn #64: How Agile Methodologies are Reshaping Legal Practices with John Grant #34: I'm Trained, Now What? with Julie Chickering #40: Is it Time to Go Out on Your Own? 
Tips and Insights with Chris Li #42: The Importance of Self-Mastery with Bob Galen #46: How to Assess Company Culture Before Accepting a Job Offer with Christina Ambers #50: Choosing Your Path: Exploring the Roles of Scrum Master and Product Owner with Lance Dacy #37: Servant Leadership, Not Spineless Leadership with Brad Swanson #41: Cultural Transformation in Organizations with Karim Harbott #53: Agile Coaching: Debunking Myths and Unlocking Excellence with Lucy O'Keefe #66: Successful Strategies for Navigating Team Conflicts #68: The Pros and Cons and Real World Applications of SAFe with Mike Hall #70: The Role of a Leader in Agile with Mike Cohn #49: Celebrating One Year: A Look Back at 50 Episodes of the Agile Mentor Podcast Certified Scrum Master Training and Scrum Certification Certified Scrum Product Owner Training Advanced Certified ScrumMaster® Advanced Certified Scrum Product Owner® Mountain Goat Software Certified Scrum and Agile Training Schedule Join the Agile Mentors Community Subscribe to the Agile Mentors Podcast on Apple Podcasts Want to get involved? This show is designed for you, and we’d love your input. Enjoyed what you heard today? Please leave a rating and a review. It really helps, and we read every single one. Got an Agile subject you’d like us to discuss or a question that needs an answer? Share your thoughts with us at podcast@mountaingoatsoftware.com This episode’s presenters are: Brian Milner is SVP of coaching and training at Mountain Goat Software. He's passionate about making a difference in people's day-to-day work, influenced by his own experience of transitioning to Scrum and seeing improvements in work/life balance, honesty, respect, and the quality of work.
In this captivating episode, host Gunesh Patil sits down with Lisa Crispin, a renowned author and influential figure in Agile testing, for an exploration of her remarkable journey in software testing, and a deep dive into the ever-evolving world of Agile methodologies. The podcast begins with a trip down memory lane as Lisa recounts her early days in customer support during the 1980s. She shares vivid anecdotes of dealing with irate customers on the other end of the line and takes us back to a time when software was deployed via tapes and she sent fixes via mail. The technology landscape of the era, including Wang OS, DB2, SQL, PCs, Xerox Star, Apple Lisa, and NeXT, provides a colorful backdrop to her journey. As the conversation shifts gears, Lisa offers insights into the evolution of Agile methodologies, reflecting on Agile then and now. She shares experiences from the world of Extreme Programming and notes how, contrary to popular belief, customers didn't always crave frequent changes. Lisa unveils what she considers the secret ingredient of Agile: releasing small, frequent chunks of software. She suggests that the fifth Agile value should be "Joy." The discussion touches on the magic of the Agile Testing Mindset and the pursuit of joy within teams. Listeners gain valuable insights into biases, including confirmation bias, and how these biases can affect teams and lead to catastrophic results. Lisa underscores the importance of diverse teams in covering all bases and minimizing biases. Prepare to be inspired and enlightened as Lisa Crispin shares her incredible journey and offers valuable wisdom on Agile methodologies, testing mindset, biases, and decision-making in this thought-provoking episode. Whether you're a seasoned professional or an aspiring tester, this conversation promises to broaden your horizons and spark your curiosity. This episode is sponsored by ShiftSync, a Tricentis Community.
It is a community for anyone interested in all aspects of quality engineering, from left to right across the software development spectrum. Join here https://bit.ly/LT-SS-Reg-Podcast ➥ Telegram Channel Follow on: Apple | Google | Amazon | Spotify | Gaana | JioSaavn
From a trainee reconversion program in the early 80s, Lisa took us on a fantastic testing journey. From discovering Agile before its time, living through highly collaborative Waterfall projects, to embracing XP and being one of the first to challenge the absence of "Testers" in the first installments of the method. Lisa spoke of how she came to write her first book, working with legends of our industry, and how she kept being fascinated by quality.
Here are the links from the show:
https://www.twitter.com/lisacrispin
https://octodon.social/@lisacrispin@mastodon.social
https://www.linkedin.com/in/lisa-crispin-88420a/
https://lisacrispin.com/
https://leanpub.com/agiletesting-condensed
https://agiletester.ca
https://agiletestingfellow.com
Credits: Cover "Legends" by HoliznaCC0 is licensed under the CC0 1.0 Universal License. Your host is Timothée (Tim) Bourguignon; more about him at timbourguignon.fr. Gift the podcast a rating on one of the significant platforms: https://devjourney.info/subscribe
Support the show
Join Brian and his guests, Janet Gregory and Lisa Crispin, as they share their expertise on integrating testing into Agile teams. Discover how to bridge the gap between programmers and testers for collaboration and success. Overview In this episode of the "Agile Mentors," Brian Milner sits down with Janet Gregory and Lisa Crispin, founders of the Agile Testing Fellowship, to discuss integrating testing into Agile teams. They discuss the history of the divide between programmers and testers and the importance of collaboration and communication between the two groups. Listen in as they explore the different levels of holistic testing, the mindset shift needed for bug prevention, and the tools and strategies for planning and estimating testing activities. Plus, the role of AI in testing. Listen Now to Discover: [00:05] - Brian Milner introduces the guests for this episode, Janet Gregory and Lisa Crispin, who are advocates for integrating testing into Agile teams and the Founders of the Agile Testing Fellowship. [02:25] - Lisa explains the most important goal for collaboration and success. [03:34] - Janet talks about the history of the gulf between programmers and testers. [05:09] - How to bridge the gap between programmers and testers and the value of collaboration. [07:29] - What the values of Agile and Extreme Programming emphasize. [09:49] - The mindset shift needed for bug prevention. [11:17] - Managers behaving badly: Brian shares a story about how measuring the wrong things can drive the wrong behaviors. [12:13] - Brian discusses the micro view of testing instead of a system view. [12:17] - How to handle intense forms of testing that take a long time to complete. [14:02] - Janet explains the different levels of testing and that teams should determine where testing belongs based on when it can be performed earliest. [15:23] - Avoiding a "hardening sprint."
[16:48] - Lisa shares how to use visual models like the agile testing quadrants and the holistic testing model to help plan and communicate the testing activities needed throughout the software development lifecycle. [17:25] - The website where you can find the training written by Lisa and Janet, including “More Agile Testing” and “Agile Testing Condensed” (recently released), and where you can download the FREE mini-book "Holistic Testing: Weave Quality into Your Product." [18:29] - Brian introduces the sponsor for the podcast, Mountain Goat Software. If you are thinking about getting certified as a Scrum Master, check out the resources and training options where certification classes are available every week. [19:26] - The key to fitting testing into a normal sprint cycle and integrating testing with other system pieces. [20:52] - Janet shares a tip for ensuring testing is not overlooked. [20:59] - Lisa shares how to remind teams to do testing at the right time. [22:31] - Why have a visible reminder for testing? [23:54] - The importance of accounting for testing and not treating it as a separate thing to do. [24:37] - Lisa shares her experience using planning poker for estimation and her preference to get every story the same size so they can be completed in a day or two. [25:50] - Janet suggests sizing stories and estimating tasks, why she estimates her tasks herself, and what she’s learned in that process. [26:44] - How to reduce the time needed in estimation meetings: Lisa shares some insight to identify when a story is too big and needs to be split up. [27:35] - The importance of conversation and understanding to avoid creating a wall between programmers and testers during estimation. [28:03] - Another tool in the toolbox: how ChatGPT will revolutionize testing (and who it might replace). [29:01] - There will never be enough time to do all the testing required.
[29:31] - Lisa highlights how AI as a tool saves time with testing and allows more time for critical thinking skills. [30:12] - The need for a human presence in the use of AI. [31:19] - Janet shares information about her and Lisa's two courses, Basic Strategies for Agile Teams and Holistic Testing for Continuous Delivery, based on the holistic testing model of looking at testing activities throughout the software development lifecycle. These courses can be found here. [36:37] - Lisa mentions that her book, “Assessing Agile Quality Practices,” helps teams identify where they are and where they can improve, using a framework that looks at ten different quality aspects. Plus, information on the book they are working on now on how to facilitate an assessment. [39:03] - Brian provides a list of resources available from Lisa and Janet, including their books “Agile Testing Condensed: A Brief Introduction,” “Agile Testing,” “More Agile Testing,” and “Assessing Agile Quality Practices,” and their free download "Holistic Testing: Weave Quality into Your Product.” [40:14] - Join the Agile Mentors Community to continue the discussion. If you have topics for future episodes, email us by clicking here. And don’t forget to subscribe to the “Agile Mentors” Podcast on Apple Podcasts so you never miss an episode. References and resources mentioned in the show: Agile Testing Fellowship Agile Testing - The Book Agile Testing Condensed: A Brief Introduction More Agile Testing Holistic Testing: Weave Quality into Your Product Assessing Agile Quality Practices Mountain Goat Software's Advanced Certified Product Owner course Mountain Goat Software Certified Scrum and Agile Training Schedule Join the Agile Mentors Community Subscribe to the Agile Mentors Podcast on Apple Podcasts Want to get involved? This show is designed for you, and we’d love your input. Enjoyed what you heard today? Please leave a rating and a review. It really helps, and we read every single one.
Got an Agile subject you’d like us to discuss or a question that needs an answer? Share your thoughts with us at podcast@mountaingoatsoftware.com This episode’s presenters are: Brian Milner is SVP of coaching and training at Mountain Goat Software. He's passionate about making a difference in people's day-to-day work, influenced by his own experience of transitioning to Scrum and seeing improvements in work/life balance, honesty, respect, and the quality of work. Lisa Crispin is the Co-founder of the Agile Testing Fellowship, an author, and an Agile tester and coach, who helps practitioners deliver quality software frequently and sustainably. Janet Gregory is the Co-founder of the Agile Testing Fellowship, an author, and a consultant, specializing in building quality systems and helping companies promote agile quality processes.
Welcome to another episode of the Testing Peers. Regulars Chris, Simon, Russell and David talk all things strategy this week. Before diving into the main topic, we talk about strategies in our home lives. We then try our best to describe what a strategy is; we all have slightly different yet similar definitions. We share how context impacts strategies, and we mention Janet Gregory and Lisa Crispin's agile testing books and a few things they suggest considering. The discussion moves on to how strategies transcend testing, shifting into quality and teams. Claire Reckless's "One Page Test Plan" comes up, as it sometimes feels like a strategy if you keep it high level, which highlights the blur between a strategy and a plan we often face. Strategies can take people on journeys, and it helps if you ensure the problem they're solving is clear; we mention Simon Sinek and "Start with Why". We talk about motivation, seeing the results of the strategy empowering people, and how 'why' helps us grasp and implement the strategy better. We share a few examples of projects that didn't go well and some that did. Chris quotes himself from a previous talk, "make it our strategy, not your strategy," as we elaborate on how to make strategies that work by engaging others.
What are your experiences with building strategies? What has worked or failed for you? We hope you found the discussion useful and would love to hear your feedback: ContactUs@TestingPeers.com
Twitter (https://twitter.com/testingpeers)
LinkedIn (https://www.linkedin.com/company/testing-peers)
Instagram (https://www.instagram.com/testingpeers/)
Facebook (https://www.facebook.com/TestingPeers)
We're also now on GoodPods; check it out via the mobile app stores.
If you like what we do and are able to, please visit our Patreon to explore how you could support us going forwards: https://www.patreon.com/testingpeers
Saffron QA is a provider of recruitment and consultancy services, exclusively for the software testing industry. You can find out more at https://saffronqa.co.uk/ or on LinkedIn at https://www.linkedin.com/company/saffron-qa/
Support the show
After having a good time mob programming with introductory roles like driver/navigator, what are some examples of "turning up the good" even more? What are unique ways to help people in the ensemble grow? What mobbing tactics can you employ to improve the product in very particular ways? Picking up from where they left off last time, come join Chris and Austin having another fantastic time playing the Mob Programming Role Playing Game with Lisa Crispin, Joe Justice, and Willem Larsen. This time, they focus on embodying level 2+ roles like Sponsor, Archivist, Rear Admiral, Nose, Traffic Cop, Automationist, Anthropologist, and Dr. Feel Good. Not only do you get to see how things like "graphic kanban" can be used alongside ensemble programming, but you also get a feel for some of the many unique ways to contribute in a mob. Video and Show Notes: https://youtu.be/V1ZgaX99UJ4
Is mob programming a group of people watching one person type? Just a driver and a bunch of watchers? If not, what are some examples of things to do in the ensemble? What roles amplify learning, contribution, continuous improvement, and flow efficiency for all? How can people learn in a safe environment how to take on these mob roles? Come see the Mob Programming Role Playing Game in action with Chris, Austin, Lisa Crispin, Joe Justice, and Willem Larsen as they focus on role levels 1 and 2. After Willem, creator of the RPG game, introduces the game, they have a fantastic time embodying several roles including Driver, Navigator, Mobber, Researcher, and Sponsor. Not only do they seek to help each other take on the different mob roles, but they also seek to do it with kindness, consideration, and respect. We hope this episode inspires you to play the game with some colleagues and friends and "join the annals of the great mobs of history." P.S. Stay tuned next week for round 2 of the RPG in action for role levels 3+ Video and Show Notes: https://youtu.be/ixV8YG5vwyM
Peter Galison, Image and Logic: A Material Culture of Microphysics, 1997
Wikipedia on academic genealogy
@made_in_cosmos had a tweet about tradition that I mentioned
Paul Hoffman, The Man Who Loved Only Numbers: The Story of Paul Erdős and the Search for Mathematical Truth, 1998
Context-driven testing website and book
The Agile Fusion workshop description
People mentioned: Lisa Crispin, Ward Cunningham, Janet Gregory, GeePaw Hill, Simon Peyton-Jones
Credits: An image from an undated review of a staging of "Fiddler on the Roof". DuckDuckGo claims it's CC-licensed, but I can't tell. I'm gonna risk it.
“Testing is an activity that happens throughout. It is not a phase that happens at the end. Start thinking about the risks at the very beginning, and how we are going to mitigate those with testing." Janet Gregory and Lisa Crispin are the co-authors of several books on Agile Testing and the co-founders of the Agile Testing Fellowship. In this episode, Janet and Lisa shared the agile testing concept and mindset with an emphasis on the whole team approach, followed by an explanation of the holistic testing concept with a complete walkthrough of how we can use the approach in our product development cycle, including how Continuous Delivery fits into holistic testing. Janet and Lisa also described some important concepts in agile testing, such as the agile testing quadrants (to help classify our tests) and the power of three (aka the Three Amigos). Towards the end, Janet and Lisa also shared their perspective on exploratory testing and testing in production. Listen out for: Career Journey - [00:06:35] Agile Testing - [00:13:56] Whole Team - [00:15:17] Agile Testing Mindset - [00:19:19] Holistic Testing - [00:24:42] Continuous Delivery - [00:34:53] Agile Testing Quadrants - [00:39:03] The Power of Three - [00:42:50] Exploratory Testing - [00:47:08] Testing in Production - [00:50:49] 3 Tech Lead Wisdom - [00:54:10] _____ Follow Janet and Lisa: Janet's Website – https://janetgregory.ca Janet's Twitter – @janetgregoryca Janet's Linkedin – https://www.linkedin.com/in/janetgregory Lisa's Website – https://lisacrispin.com Lisa's Twitter – @lisacrispin Lisa's Linkedin – https://www.linkedin.com/in/lisa-crispin-88420a Agile Tester Blog – https://agiletester.ca/blog Agile Testing Fellowship Website – https://agiletestingfellow.com Our Sponsor Today's episode is proudly sponsored by Skills Matter, the global community and events platform for software professionals.
Skills Matter is an easier way for technologists to grow their careers by connecting you and your peers with the best-in-class tech industry experts and communities. You get on-demand access to their latest content, thought leadership insights as well as the exciting schedule of tech events running across all time zones. Head on over to skillsmatter.com to become part of the tech community that matters most to you - it's free to join and easy to keep up with the latest tech trends. Like this episode? Subscribe on your favorite podcast app and submit your feedback. Follow @techleadjournal on LinkedIn, Twitter, and Instagram. Pledge your support by becoming a patron. For more info about the episode (including quotes and transcript), visit techleadjournal.dev/episodes/92.
Join Murray Robinson and Shane Gibson in a conversation with Lisa Crispin on Holistic Testing. Build quality in. Continuous testing for continuous delivery. Building shared understanding of requirements. Shorten feedback loops. Moving from traditional testing to agile testing. Test automation. The test pyramid and agile testing quadrants. Business-facing tests with behavior-driven and acceptance test-driven techniques. High performing teams. Listen to the podcast on your favourite podcast app: | Spotify | Apple Podcasts | iHeart Radio | PlayerFM | Amazon Music | Listen Notes | TuneIn | Audible | Connect with Lisa via Linkedin or https://lisacrispin.com/ and https://agiletestingfellow.com/ , Murray via email or Shane in the Twitter-sphere @shagility. The No Nonsense agile Podcast is sponsored by: Simply Magical Data
Welcome to another episode of the Testing Peers podcast. Today we talk Agile and our experiences around it.

This week Simon borrows his banter ideas from the More Than Work podcast. Once the banter is done, we move on to talk about agile/Agile and what it isn't. We talk about the mindset that is needed and how transformations go badly or well. We talk about shortcuts and bad implementations along with the good. We also share a quote from Vince Lombardi about chasing perfection: 'Gentlemen, we will chase perfection, and we will chase it relentlessly, knowing all the while we can never attain it. But along the way, we shall catch excellence.'

After talking agile we share a few useful resources: 'Agile Testing', 'More Agile Testing' and 'Agile Testing Condensed' from Lisa Crispin and Janet Gregory; the Agile Manifesto; Mike Cohn - Mountain Goat Software; Alsatian blogs.

We hope you found the discussion useful and would love to hear your feedback.
ContactUs@TestingPeers.com
Twitter (https://twitter.com/testingpeers)
LinkedIn (https://www.linkedin.com/company/testing-peers)
Instagram (https://www.instagram.com/testingpeers/)
Facebook (https://www.facebook.com/TestingPeers)
We're also now on GoodPods; check it out via the mobile app stores.
If you like what we do and are able to, please visit our Patreon to explore how you could support us going forwards: https://www.patreon.com/testingpeers
Saffron QA is a provider of recruitment and consultancy services, exclusively for the software testing industry. You can find out more at https://saffronqa.co.uk/ or on LinkedIn at https://www.linkedin.com/company/saffron-qa/
Support the show (https://www.patreon.com/testingpeers)
Organizations must meet demands for speed, quality, and flexibility to manage the expectations of both the business team and end customers. Organizations have made significant improvements in the past decade by implementing Agile methodologies like Lean-Agile, Kanban, Scrum, and Extreme Programming, along with modern software development tools. However, they are challenged by a lack of collaboration, low automation levels, and a low focus on continuous improvement. The pandemic and the remote mode of working have also impacted some Agile practices and ceremonies. In this episode, we talk with Lisa Crispin, an industry-renowned Agile Testing Coach and Practitioner. Lisa talks about improving software testing effectiveness, driving continuous improvement, and increasing test automation levels. Lisa also explains the need for a common automation toolset and for reviewing feedback from production.
There's so much more to testing than just writing automated tests that run in CI. Testers on high-performing teams don't just write tests. They work closely with site reliability engineers, ensuring that the infrastructure is tested as well. They get involved in production. But how does one introduce this holistic approach to testing to one's team? Even more so, how does one introduce continuous integration to an organization, if it hasn't been adopted yet?

In this podcast episode, we welcome Lisa Crispin: author, agile testing coach, and "tester by trade", in her own words. Among other things, we talk about a holistic approach to testing, how to shift from shipping many times a day to once a month, and how to help organizations adopt continuous integration. Listen to the full episode or read the edited transcript.

Table of contents:
What's more to testing than just writing tests?
Exploratory testing
Holistic approach to testing
Modern trends in testing
Implementing CI in organizations
Testing as a practice: patterns and antipatterns

You can also get Semaphore Uncut on Apple Podcasts, Spotify, Google Podcasts, Stitcher, and more. Like this episode? Be sure to leave a ⭐️⭐️⭐️⭐️⭐️ review on the podcast player of your choice and share it with your friends.
On this episode of Member Spotlight, WCTV member Lisa Crispin, who is the host of "Cookin' The Books" and also an Assistant Librarian at the Wilmington Library, talks about her partnership with WCTV and hosting her own cooking show. She also shares how the show came to be and why she does it. Find out why Lisa feels WCTV is a valuable resource for our community and what she's learned about herself by hosting her own TV show!
Episode 122 is a summary of our Prestigious Pints series, where we catch up with some of our agile inspirations, and this time Geoff and Paul grab a drink with agile testing legend Lisa Crispin. We chatted about agile testing as a mindset and the challenges it faces in the coming years. We also get to know a little more about Lisa herself and how she feels about being known as "the agile testing lady". Other topics and questions included: Do you need to be technical to be a tester? How do you increase the appetite for agile testing in an organisation? What makes agile testing hard? What is the biggest misunderstanding about agile testing? If you would like to hear the interview in full you can subscribe as a FULL PINT patron at https://www.patreon.com/theagilepubcast
Lisa Crispin is a co-founder of the Agile Testing Fellowship: https://agiletestingfellow.com Lisa is a co-author of Testing Extreme Programming: https://www.amazon.com/Testing-Extreme-Programming-Lisa-Crispin/dp/0321113551 Other mentions: The Strangler Fig pattern for legacy code rescue: https://martinfowler.com/bliki/StranglerFigApplication.html
Lisa Crispin is a co-founder of the Agile Testing Fellowship: https://agiletestingfellow.com Mentions: Testing Extreme Programming: https://www.amazon.com/Testing-Extreme-Programming-Lisa-Crispin/dp/0321113551
In this episode, Richard interviews Lisa Crispin, a quality owner at OutSystems, co-founder of Agile Testing Fellowship Inc, and one of the most influential testing professionals in the software industry. Lisa co-authored several books, including Agile Testing and Testing Extreme Programming, and was a curator of the www.testingindevops.org website. Lisa tells us about the significance of cultivating trust among team members and how the resulting feeling of safety contributes to the team's productivity. When you finish listening to the episode, make sure to connect with Lisa on LinkedIn at https://www.linkedin.com/in/lisa-crispin-88420a/ or Twitter at https://twitter.com/lisacrispin, and visit her website at www.lisacrispin.com. You can read the full transcript of the episode at kasperowski.com/podcast-53-lisa-crispin/.
In this episode, Lisa Crispin shares her experiences with the pattern "Delayed Automation" from the Cloud Native Patterns repository (https://www.cnpatterns.org/development-design/delayed-automation). We discuss the trade-offs of applying it in different contexts. I also ask a long-standing question: what can we learn from donkeys? If you are curious why, donkeys are Lisa's brand! Lisa recommends: the Quality Coaching Roadshow podcast by Anne-Marie Charrett; Accelerate by Nicole Forsgren, Jez Humble, and Gene Kim; and Leading Quality by Ronald Cummings-John and Owais Peer. Lisa Crispin (@lisacrispin) is the co-author, with Janet Gregory, of three books: Agile Testing Condensed: A Brief Introduction; More Agile Testing: Learning Journeys for the Whole Team; and Agile Testing: A Practical Guide for Testers and Agile Teams; as well as the LiveLessons Agile Testing Essentials video course and "The Whole Team Approach to Agile Testing" 3-day training course offered through the Agile Testing Fellowship. Lisa was voted by her peers the Most Influential Agile Testing Professional Person at Agile Testing Days in 2012. She is the co-founder, with Janet, of Agile Testing Fellowship, Inc. Please visit www.lisacrispin.com, www.agiletestingfellow.com, and www.agiletester.ca for more. Lisa is currently a Fellow Quality Owner at OutSystems, helping with the observability practice.
In this podcast recorded at Agile 2019, Shane Hastie, Lead Editor for Culture & Methods, spoke to Eric Willeke & Ronica Roth about supporting leadership through transformation, and then he spoke to Lisa Crispin about the state of testing in agile and DevOps. Listen to the podcast for more. Curated transcript and more information on the podcast: https://bit.ly/2QyD3ES Follow us on Facebook, Twitter, LinkedIn, Youtube: @InfoQ Follow us on Instagram: @infoqdotcom Stay informed on emerging trends, peer-validated early adoption of technologies, and architectural best practices. Subscribe to The Software Architects’ Newsletter: www.infoq.com/software-architects-newsletter/
Read the full Show Notes and search through the world’s largest audio library on Scrum directly on the Scrum Master Toolbox Podcast website. Sometimes the perspective of the tester can be different from the team’s perspective when it comes to a bug or defect. When that difference of perspective exists, a conflict might arise. In this episode, we talk about how to bring testers and developers to a common understanding of how to improve quality, and how to avoid conflicts that escalate and can destroy a team. Featured Book of the Week: Agile Testing by Janet Gregory and Lisa Crispin. In Agile Testing: A Practical Guide for Testers and Agile Teams by Janet Gregory and Lisa Crispin, Julio found what he calls “the bible for Agile testing practitioners”. The book helped Julio understand how different the tester perspective needs to be when working in an Agile team. In this segment, we refer to the concept of exploratory testing, a critical concept for Agile testers. About Julio de Lima: Julio is a Principal QA Engineer working for Capco who believes in the Culture of QA. He has been sharing professional insights and experiences on a daily basis and has more than 4500 students in his 4 online courses. In 2020, he was elected the Brazilian Testing reference practitioner. You can link with Julio de Lima on LinkedIn and connect with Julio de Lima on Twitter.
Join a fun and insightful conversation that Chris and Austin have with Lisa Crispin on ensemble testing, mob testing, ensemble/mob programming, and agile testing. In addition to sharing success stories, we discuss challenges to collaboration and experiments to run to try to overcome them. Video and show notes: https://youtu.be/FIQcq5l1LHA
Welcome to episode four of the Testing Peers Podcast. This week the peers are talking cognitive bias. We find cognitive bias everywhere; being aware of our own biases and challenging ourselves on them is really important. We discuss cognitive biases that we have encountered in the testing world, biases that we ourselves have, and the power of influential language in projecting biases. This is a topic close to our hearts, and one that is so important to continue to discuss and challenge each other about. We touch upon areas of bias such as anchoring, decision making, mirroring, confirmation, and many more.

Further reading: Lisa Crispin - https://lisacrispin.com/2018/01/24/telling-stories-together-practicing-critical-thinking-vts/

We hope you enjoy this episode and encourage you to get in touch with any feedback, either @testingpeers on Twitter or contactus@testingpeers.com
Support the show (https://www.patreon.com/testingpeers)
In this episode I interviewed Lisa Crispin, one of "the mothers of agile testing", an expert and great contributor to our field. You can learn more about her here: https://agiletestingfellow.com/. You can also play the episode and read the full transcript here: https://bit.ly/30ZPHmA Highlights: - We discussed what the new term "observability" means and how it relates to monitoring, testing, and even chaos engineering. - Lisa talked about the open-source project OpenTelemetry and mentioned different tools that help implement observability. - At the end she suggested a book for learning more about Continuous Delivery, DevOps, and observability. Of course, we also talk about how collaboration and pairing is a great way to learn and improve whatever you do!
In this episode I interviewed Janet Gregory. She told us about her experience adapting the agile testing training that she and Lisa Crispin offer through the Agile Testing Fellowship to make it available online with remote facilitation. We discussed the challenges for agile teams working remotely and some ideas to overcome them (pairing, clarifying communication channels, staying connected). Lastly, she commented on her trip to Uruguay, where she was the keynote speaker at TestingUY, and, as usual, she suggested some books and told us about some good habits she has implemented. You can also play the episode and read the full transcript here: https://bit.ly/3ehzLjJ
The benefits gained from participating in international events such as EuroSTAR, STARCanada, and STAR(East|West), and building new knowledge on top of existing knowledge. An interview with Larissa Rosochansky & Rafael Cintra about Design Thinking in testing, and how their talk was submitted and accepted at one of the largest international software testing events, Agile Testing Days, in Chicago, IL. My answers to questions about preparing talks, international events, and building new knowledge by blending testing with other fields. Testers mentioned in this episode, in chronological order: James Bach, Michael Bolton, Janet Gregory, Lisa Crispin, Elisabeth Hendrickson, Lee Copeland, Elias Nogueira, Walmyr Filho, Moisés Ramírez, and Samanta Cicilia.
Agile Testing has now been around in some form or another for two decades, yet it seems that what people are calling Agile Testing and what Agile Testing actually is are still two different things. Why is there such a gap in both understanding and practice? Matt Heusser and Michael Larsen welcome Lisa Crispin, Elle Gee and Jamie Phillips to discuss exactly that. In the process, we get into how Agile is practiced in both small teams and in larger organizations, where it is practiced well, and some of the common pitfalls even the best of Agile organizations still face.
This month on the Cucumber Podcast we have another conversation with Janet Gregory and Lisa Crispin. They have released their latest book, Agile Testing Condensed (https://leanpub.com/agiletesting-condensed). In this conversation, Matt Wynne and Seb Rose ask the pair about the book and testing best practices in modern agile teams.
Lisa Crispin, Testing Advocate at mabl, chats with TechWell Community Manager Owen Gotimer about how testers can add value to development teams, the importance of exploratory testing in automated development pipelines, and why agile isn't all about speed. Continue the conversation with Lisa (@Lisa Crispin) and Owen (@owen) on the TechWell Hub (http://hub.techwell.com/)! Music by LiQWYD: @liqwyd www.instagram.com/liqwyd www.spoti.fi/2RPd66h www.bit.ly/youtubeliqwyd
Robby speaks with Lisa Crispin, co-author of Agile Testing and Testing Advocate at Mabl. Lisa speaks about "thinking skills" for developers, why testing professionals should be integrated into dev teams, testing and development cycles, and how to start building automated tests on a legacy application. Helpful Links Follow Lisa Crispin on Twitter Agile Testing Fellow Agile Testing with Lisa Crispin DevTestOps Community The Nightmare Headline Game by Elisabeth Hendrickson [Book] Agile Testing: A Practical Guide for Testers and Agile Teams [Book] More Agile Testing: Learning Journeys for the Whole Team [Book] More Fearless Change [Book] A Practical Guide to Testing [Book] Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing Subscribe to Maintainable on: Apple Podcasts Overcast Or search "Maintainable" wherever you stream your podcasts. Loving Maintainable? Leave a rating and review on Apple Podcasts to help grow our reach. Brought to you by the team at Planet Argon.
DevOps is a common term, yet one that seems hard to pin down or define. Additionally, for many, the idea behind DevOps seems to mean the elimination of testers or testing. Our guests Lisa Crispin of mabl and Jessica Ingrassellino of Salesforce.org are here to discuss how software testers are indeed important in the process of DevOps, and to suggest some ways to get involved that we may not have considered. A key element is observability (#o11y), and we get into the details of considering observability as part of the DevTestOps equation.
REBROADCAST - Join Andrew Leff in his conversation with four of the agile community’s strongest female voices as they discuss how far we have come and how far we have to go in creating trust, safety and opportunity for women in agile. Listen in as Janice Linden-Reed, Becky Hartman, Lisa Crispin and Colleen Johnson share their personal stories and challenge each of us to find ways to mentor women at all points of their careers by building strong relationships and providing honest feedback.
In today’s episode, we chat with Lisa Crispin, Testing Advocate at Mabl, the co-author of the Agile Testing: A Practical Guide for Testers and Agile Teams and one of the most influential testing professionals in the industry. Join us to hear about the significance of the whole team approach, collaboration and feedback in testing, and how the cues of success through teamwork can come from even the most unexpected sources – including from the miniature donkeys!
This month on the podcast we speak to Janet Gregory and Lisa Crispin about testing in DevOps. The conversation covers the whole team approach and why testers are as important as ever. Asking the questions from Cucumber is Matt Wynne, Sallyann Freudenberg, and Steve Tooke. Shownotes: Janet & Lisa's website - https://agiletester.ca/ Agile Testing - https://www.amazon.com/Agile-Testing-Practical-Guide-Testers/dp/0321534468/ref=sr_1_3?s=books&ie=UTF8&qid=1547738499&sr=1-3&keywords=agile+testing More Agile Testing - https://www.amazon.com/More-Agile-Testing-Addison-WesleySignature/dp/0321967054/ref=cm_cr_arp_d_product_top?ie=UTF8 Agile Testing Essentials - https://www.frontrowagile.com/courses/agile-testing-essentials/overview On Twitter: Janet Gregory (@janetgregoryca) Lisa Crispin (@lisacrispin)
GUEST BIO: Lisa is a software tester who enjoys sharing her experiences and learning from others. She is also the co-author of “More Agile Testing: Learning Journeys for the Whole Team” and “Agile Testing: A Practical Guide for Testers and Agile Teams”. And in 2012 Lisa was voted the most influential agile testing professional person. EPISODE DESCRIPTION: Phil’s guest on today’s show is Lisa Crispin. She has spent much of her career working within the testing sphere. Today, she is also an author, public speaker and trainer. Lisa is the co-author of several books, including Agile Testing: A Practical Guide for Testers and Agile Teams. In 2012, she was voted the Most Influential Agile Testing Professional Person. She is currently working with mabl, who specialize in automated regression testing services. KEY TAKEAWAYS: (1.02) – So Lisa, can you expand on that brief introduction and tell us a little bit more about yourself? Lisa explains that she has been involved in the industry for a long time, so she has seen a lot of change. She is currently working with mabl out of Boston, a start-up that provides an innovative automated testing service. But Lisa does a lot of other things too. For example, with Janet Gregory she has written books and put together a video course. Lisa also said that she likes to spend her free time looking after her donkeys, who are still adapting to the move from Colorado to Vermont. (2.12) – Phil asks Lisa for a unique IT career tip. Lisa’s biggest tip is to ask questions. It helps you to learn and lets others know that you like to learn. It also helps the person answering the questions to think. As a tester, that is second nature for Lisa, but she knows this is not the case for everyone. (3.04) – Phil comments that a lot of people who are new to the industry are concerned about asking questions because they are afraid that it shows a lack of knowledge. Do you see that often?
Lisa says yes; people want to come across as confident and accentuate what they know. But as a tester you have to ask questions. Doing that is the only way to uncover the unknown unknowns, and that only happens when you ask the right questions. Lisa explains that testers have to be big-picture people. They have to keep the end user in mind at all times, which their role as tester allows them to do because they are not focused on the code. (4.07) – Lisa is asked by Phil to share her worst career moment. Lisa explains that some years ago a company tracked her down, told her they were admirers of her work, and offered her a job. She was flattered and intrigued, but still did her due diligence, after which she accepted the job. But on the first day they did something that was not in line with her values. She knew immediately she had made a mistake, but pushed that feeling aside and carried on working with them. Within 6 months she ended up leaving and, fortunately, going back to her old job. (6.02) – Phil asks Lisa what she would do differently now. Lisa said there were no warning signs before she started the job, but now she would listen to her gut. She would pause and ask herself why she felt that way. Often your subconscious is telling you something important, so it is best to pay attention to those feelings. If she had done that, she would have left that unsuitable job straight away. (6.35) – Phil asks Lisa what her best career moment was. Lisa starts by sharing the fact that helping people is something she loves, so being able to do that is a big plus for her. She was also lucky enough to spend many years working for a company that valued, respected and trusted their IT team. The team was great; they really gelled. Importantly, the IT team was involved in many business decisions and had significant input into what tools they developed for the firm. (9.00) – Phil asks what excites Lisa about the future of the IT industry.
As a tester, Lisa can see the need for and benefits of using machine learning for testing. So that excites her, and she is currently learning as much as she can about it. AI has the potential to take on the burden of much of the boring, tedious work, which frees up our time to do more with our brains and intuition. (10.24) – What drew you to a career in IT, Lisa? Basically, it was the fact that she needed a job and wanted to move to Austin. She saw an advert for programmer trainees and took the aptitude test. They wanted people with business knowledge to work on accounting and payroll systems, knowledge Lisa had because she had formerly worked in a government job. (11.22) – What is the best career advice you’ve ever received? Lisa says that came from one of her line managers. He explained that a good leader makes sure people know what they and their team are contributing. She feels that this is part of the reason she has had such a successful career. The role of testers is not well understood, and what they contribute can easily be overlooked. It is important to know how to get around that issue so that you and your team are properly recognized and rewarded. Phil agrees; he has also noticed that it is hard for testers to demonstrate their contribution. (13.01) – If you were to start your IT career again, now, what would you do? Lisa says she would actually stick with testing and helping others. (13.32) – Phil asks Lisa what career objectives she is currently focusing on. She believes that she has a great future as a testing advocate. The way she likes to work and her experience mean that she is able to reach out to both the testers and the people who use their services and draw them together. She has a deep understanding of both worlds and is a good connector. Lisa also enjoys helping people to learn, so that will be one of her focuses. She will carry on with her blog and public speaking. (14.41) – What is the non-technical skill that has helped you the most in your IT career?
Lisa thinks it is probably her leadership skills. From an early stage she knew she wanted to manage, so Lisa has always worked to hone those skills. Interestingly, she pointed out that you can be a leader regardless of what your title is. You just need to be willing to be a change agent and show the way to make things better. (15.44) – Phil asks Lisa to share a final piece of career advice. Be brave and push yourself out of your comfort zone. Lisa is a shy person, so she often has to do things that make her feel a little uneasy. She works within her comfort zone for a while to build up the energy she needs for working outside of her comfort zone, so that she can get important things done. She also points out that you need to overcome your fear of asking for help. After all, not asking for help when you need it can easily lead to disaster. BEST MOMENTS: (2.37) LISA – “My biggest tip is going to be to ask questions, you know, learn and show that you'd like to learn and learn what you need to learn about.” (5.58) LISA – “We all learn from failure. There's no real failure, right? Just learning moments.” (10.06) LISA – “I think AI just has a lot of potential to help us put more of the boring, tedious, repetitive work onto machines, and free up our time to use our human brains and senses and intuition.” (11.36) LISA – “An important part of leadership is making sure that people know what you contributed, and what your team contributed; you have to make that visible.” CONTACT LISA CRISPIN: Twitter: https://twitter.com/lisacrispin LinkedIn: https://www.linkedin.com/in/lisa-crispin-88420a/ Website: https://www.lisacrispin.com
TestTalks | Automation Awesomeness | Helping YOU Succeed with Test Automation
Are you looking for ways to improve your continuous delivery testing process? Want to start incorporating new technologies like AI and machine learning to accomplish that? In this episode testing superstar and popular conference speaker Lisa Crispin, co-author of More Agile Testing: Learning Journeys for the Whole Team and Agile Testing: A Practical Guide for Testers and Agile Teams, will share her views and tips on testing and continuous delivery as well as what she’ll be doing in her new role as a testing advocate at Mabl. Don’t miss it!
The Agile Toolkit Podcast

"I always say that DevOps in one sense is really an extension of agile principles out to everybody on the ship." - Jeffery Payne

Bob Payne chats with Jeffery Payne, Founder of Coveros, at Lean+Agile DC 2018. The Payne cousins (not really) chat DevOps and tips for pairing developers and testers. The discussion covers moving toward a generalized-specialist model, testers showing up like a demolition crew, and the true meaning of pairing.

[Photo: Jeffery Payne sits down with Bob Payne (not cousins..)]

TRANSCRIPT

Bob Payne: [00:00:02] Hi, I'm your host and technical idiot, Bob Payne. Just struggling with the equipment there for a little bit, making the big newbie mistake of hitting play instead of record. So I'm here at Lean + Agile DC 2018, and I'm here with Jeff Payne of Coveros.

Jeffery Payne: [00:00:25] Your cousin, right.

Bob Payne: [00:00:26] Yeah. Cousin.

Jeffery Payne: [00:00:27] Yeah.

Bob Payne: [00:00:27] Yep. So Jeff, what are you talking about here today, since I am out here in the hall and not in the talks?

Jeffery Payne: [00:00:38] Yes, I'm talking about dev-test pairing. Okay, so trying to get developers and testers to work together better. We find that that's one of the biggest issues we see on teams from an engineering perspective.

Bob Payne: [00:00:52] Yeah, I mean, I think the early agilists were a lot of XP teams that sort of did away with testers because everybody was considered to be a tester. I think it was also sort of the chemistry of the particular group of folks that were on that first team. And you had folks like Elisabeth Hendrickson and Lisa Crispin; a lot of folks sort of brought testing back into the Agile fold. What do you think the biggest problems are with testing on agile teams, or with trying to get testers and coders to pair?
Jeffery Payne: [00:01:31] Yeah, I think obviously one of the biggest problems is that they historically haven't worked well together. They're kind of on different sides of the fence, as a check and a balance in some organizations, right.

Jeffery Payne: [00:01:42] And a lot of organizations even prefer that their testers not talk to their developers. They want them to be independent, so to speak, because they think it's kind of like an editor: if you haven't seen it and then you review it, another set of eyes, you're not influenced by the development. Sort of a clean room, actually; that's the traditional approach. Of course, it's always been very late-lifecycle and very manual, right. None of those things work well in agile, right.

Bob Payne: [00:02:11] Well, none of those things actually work well in life. It's not just an agile thing.

Jeffery Payne: [00:02:16] So, you know, how do we change that?

Bob Payne: [00:02:17] Shoot, it's not secure and doesn't scale. I'm glad we have 12 hours to fix this before production.

Jeffery Payne: [00:02:22] Yeah, exactly. Here you go, have it done by tonight. So yeah. And so we try to help teams fix that.

Bob Payne: [00:02:30] Yeah.

Jeffery Payne: [00:02:30] Address those issues.

Bob Payne: [00:02:33] What do you think has been most beneficial recently for helping you in that quest of getting folks to pair together?

Jeffery Payne: [00:02:42] Well, we have some techniques and approaches that we like to use to try to get them to work together and also learn from each other. Because, you know, if you're moving toward a generalized-specialist model on your teams (we like that model), and you want collective code ownership, and you want whole-team quality, all these, you know, motherhood and apple pie concepts that we espouse, you've got to get everybody productive during the entire sprint, or whatever you're doing, story development or whatever.
And the only way you can do that is if people start learning from each other and cross-fertilize. Historically, you know, I was a developer; developers aren't great testers, for a number of reasons.

Jeffery Payne: [00:03:20] Just, you know, out of the gate they're not very good testers. And testers, oftentimes, particularly if they are manual testers, don't have a very strong technical background; they don't know code, they can't write automation, right. Those two things together don't work very well. So we've found that by pairing dev and test, they can learn from each other, become stronger teammates, and collectively own the code better.

Bob Payne: [00:03:43] Now do you find that tools like Cucumber, or others - I don't know if you're running into teams using FitNesse, but early on FitNesse was one of those tools, Cucumber most recently, SpecFlow - help bridge that gap, so that testers can blow out those scenarios a little more directly after the fixtures are done, or even before the fixtures are done?

Jeffery Payne: [00:04:09] Definitely. Yeah, I mean, the BDD-oriented...

Bob Payne: [00:04:12] Yeah.

Jeffery Payne: [00:04:12] ...Cucumber with Gherkin, kind of natural-language approach, is a great way to start moving manual testers in particular toward understanding how to automate, without having to dive right in and start, like, you know, trying to write good maintainable Selenium scripts, for instance, or whatever. I mean, it's hard to write maintainable scripts of any kind.

Bob Payne: [00:04:33] Write would be better than record - those are a nightmare to maintain.

Jeffery Payne: [00:04:39] No doubt. Recording any test is a bad idea, but that's how those tools are often sold, so...

Bob Payne: [00:04:44] Right.

Jeffery Payne: [00:04:45] That's how, you know, people think you're supposed to use those tools. We definitely like those kinds of tools; we think they help move a tester toward being more capable of providing automation support.
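The Gherkin-style, natural-language approach discussed above might look like the toy sketch below. The feature and business rule are hypothetical, and no real BDD framework (Cucumber, SpecFlow, behave) is used here; this only illustrates how the scenario text and the automation line up.

```python
# A Gherkin-style scenario paired with plain-Python "steps".
# In a real tool, the Given/When/Then lines would live in a .feature
# file and be matched to step definitions automatically, so a manual
# tester can write the scenario while a developer wires up the steps.

SCENARIO = """
Feature: Shipping cost
  Scenario: Order over the free-shipping threshold
    Given a cart totaling 60.00
    When shipping is calculated
    Then shipping costs 0.00
"""

def shipping_cost(cart_total, threshold=50.00, flat_rate=5.99):
    """Hypothetical rule: orders at or above the threshold ship free."""
    return 0.00 if cart_total >= threshold else flat_rate

def run_scenario():
    cart_total = 60.00                  # Given a cart totaling 60.00
    cost = shipping_cost(cart_total)    # When shipping is calculated
    assert cost == 0.00                 # Then shipping costs 0.00
    return cost
```

The point of the natural-language layer is that the scenario stays readable to non-programmers while still being backed by executable checks.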
Bob Payne: [00:04:57] What sort of behavior... I mean, you mentioned the word pairing. What does that mean when you say that? Because I see a lot of misuse of the word. I'm assuming you're not, but there's a lot of misuse of the word pairing. Jeffery Payne: [00:05:09] I might be, who knows. Maybe you'll tell me I'm wrong, Bob. Bob Payne: [00:05:11] And TDD; I see a lot of people misusing or not really understanding TDD. That's most common, but... Jeffery Payne: [00:05:17] Yes. Yeah. So to me, I'm basing it off of the definition of pair programming: getting two people together to work collectively on some task. When you talk dev and test, you're really talking about those two people working together on code, almost pair programming, and one of our techniques is to have a dev and a tester pair program, which is a little different, right, because one of them maybe doesn't actually know how to write code. So what does that mean in pairing? The other thing we use it for is to review each other's tests. If you're going to ask developers to do unit testing, you want them to learn how to write good unit tests, meaning think through not just the happy path but the errors and boundary conditions and exceptions and all those kinds of things. They usually don't inherently know how to do that, and a tester, by working with them, can help them understand how to do it better. Second, if you're asking your testers, even if it's manual, to create tests for integration, for system testing, for the combinations of things across use cases and your business flows, they often won't know the design well enough, particularly if they haven't been involved in those activities; they should be, but often aren't.
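The test review Jeffery describes, pushing a developer past the happy path to boundaries and errors, might look like this in Python. The `parse_age` function and its 0-to-150 range are hypothetical, invented purely to illustrate the three layers of cases a tester would ask about:

```python
def parse_age(value):
    """Parse a user-supplied age string (hypothetical example function)."""
    age = int(value)  # raises ValueError on non-numeric input
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age


# Happy path: the case a developer usually covers first.
assert parse_age("42") == 42

# Boundary conditions: the edges a tester pushes the pair to think about.
assert parse_age("0") == 0
assert parse_age("150") == 150

# Error cases: invalid and out-of-range input must fail loudly, not silently.
for bad in ("-1", "151", "forty-two", ""):
    try:
        parse_age(bad)
    except ValueError:
        pass  # expected
    else:
        raise AssertionError(f"expected ValueError for {bad!r}")

print("all checks passed")
```

The happy-path assertion alone would pass against many buggy implementations; it's the boundary and error cases that do most of the bug-prevention work.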
Jeffery Payne: [00:06:34] Yeah, and the developer can help them think through and understand how the software all pieces together to meet the flow we're looking for in our application and how users use it. So they can help each other from a testing perspective, we've found. Bob Payne: [00:06:47] And one of the other things that I think a lot of testers can help with as well is the business rules. If you're doing an under-the-UI test, which quite often happens in the developer's domain, what are those conditions? The happy path is easy, and that's usually where developers go, because they know the happy path works, but they don't necessarily test those boundary conditions or the business rules, right. If I had a whole bunch of JRules or other stuff, I wouldn't test that through the UI, right? Jeffery Payne: [00:07:26] Yeah, no doubt. Bob Payne: [00:07:28] Yeah. Jeffery Payne: [00:07:28] And to your point about the happy path, the other thing we've seen, and not every developer's like this, but a lot of developers consider what they're building to be a work of art. Right? They're like Michelangelo painting the Sistine Chapel, in their mind. Yeah, and they're all about creating this beautiful, incredible thing that's going to last forever, and people are going to ooh and aah over it, even if it's just their peers. Bob Payne: [00:07:49] Yep. Jeffery Payne: [00:07:50] And then the tester shows up, and testers are like a demolition crew. Bob Payne: [00:07:52] Yeah. Jeffery Payne: [00:07:53] Right. They're trying to poke holes in it and figure out what's wrong with it, and it's kind of like calling your baby ugly.
It's hard if you're asked to test your own code, because you might have every intention, but in the back of your mind you might be thinking, I don't really want my Sistine Chapel to have problems in it or look bad. Changing that mindset is part of getting dev and test to work together: understanding that the best way to build something great is to find any issues as fast as you can and eradicate them. It's really about what it looks like when it gets delivered, not what it looks like while you're making the sausage, right. Bob Payne: [00:08:27] Yeah. I find a lot of people use the term pairing when they're really talking about working together on just acceptance criteria or something like that; that's necessary but not sufficient. I think the deeper you can go in interaction and in understanding, the better off your team is, clearly. Jeffery Payne: [00:08:52] We've had good success getting developers involved in doing some exploratory testing as well. Bob Payne: [00:08:57] Sure. Jeffery Payne: [00:08:57] You know, a lot of times testers get together and do session-based exploratory testing across stories or whatever. What about the idea of just getting the dev and the tester together for a story they're working on and having an exploratory testing session, where they work together, explore the product, talk about it, and identify bugs? Again, that gets the developers a little more comfortable doing testing, knowing what to look for, thinking critically about the app. And of course it helps the tester better understand the app, because if they don't understand something about what they're testing, they've got the developer right there to ask: hey, what was this supposed to do, or how was this supposed to work? Jeffery Payne: [00:09:32] And maybe the story is vague: did we really build the right thing, are we testing it properly? That dialogue's very helpful. Bob Payne: [00:09:38] Yeah.
What else is exciting in your world right now? Jeffery Payne: [00:09:42] Nothing. Bob Payne: [00:09:42] No? Jeffery Payne: [00:09:42] Nothing. Well, as you know, we do a lot of DevOps work. Bob Payne: [00:09:47] Yeah, sure. Yeah, it's the new hot thing. Jeffery Payne: [00:09:51] Yeah, exactly. Bob Payne: [00:09:52] Yeah. Actually, you know, we're going to be talking later with some folks about how, in many ways, Agile has sort of hit a ceiling, and I'm hoping this will open up gaps where we can get to real agility. All too often it's seen as a fix for the delivery team, right, not a systemic change that can build better value faster. Jeffery Payne: [00:10:23] Yeah, and I totally agree. I mean, I think one of the mistakes the founding fathers of Agile made is, you know, they were all about collaboration, getting everybody to work together, but they forgot a key piece of the lifecycle, which was delivery and release and production operations. Bob Payne: [00:10:41] And actually intake on the business side. Jeffery Payne: [00:10:45] Exactly. You know, it's funny: this group that was all about collaboration and getting everybody on the same page left all these people out, right, by mistake. Obviously they were creating it as they went, so I understand. So I always say that DevOps, in one sense, is really an extension of agile principles out to everybody involved in the software delivery process, across the full software lifecycle. Bob Payne: [00:11:09] Yeah, and agile and DevOps are both, you know, great-grandchildren of lean, which was all about that whole process, right. Jeffery Payne: [00:11:21] Yes. Bob Payne: [00:11:22] Yeah. You know, this reintroduction of the concept of value streams and value teams and stuff; it's like Back to the Future.
Jeffery Payne: [00:11:32] I'm sure you've studied up on the history, you know, all the way back through Deming and statistical process control and even beyond that. I mean, it's clearly standing on the shoulders of giants, like everything we do. It's amazing how many people don't understand that, or take the time to find that out. Bob Payne: [00:11:50] And the idea that DevOps, yeah, there's a whole bunch of cool technical stuff going on, but it's about closing the loop to be able to learn. And my favorite Deming quote about that is: learning is not compulsory; neither is survival. Jeffery Payne: [00:12:07] He had some great pithy comments. You know, there was an article I read that compared it to an extinction-level event, where we've got the Internet of Things and big data, and competitors being able to go extraordinarily fast and learn and reintegrate that learning. For many organizations, that will mean their doom. I'm not going to pretend that DevOps or Agile is any silver bullet in allowing them to survive, but I just know the status quo is not the strategy I would take. Jeffery Payne: [00:12:56] Yeah. Well, yeah, I mean, if software is really eating the world, which I think we would agree it is, then you'd better figure out how to optimize how you build, deploy, deliver, and feed back information fast, because otherwise you are going to be out of business, eventually. Bob Payne: [00:13:15] So what's happening over at your company, Coveros? Jeffery Payne: [00:13:18] Coveros, yes! So we're busy little beavers, helping people with Agile and DevOps, just trying to get it right. We focus more on the engineering aspects of both of those things, but I often get asked to, you know, help pull teams together and figure out how to make it all work.
Bob Payne: [00:13:36] Yup. Jeffery Payne: [00:13:36] But we really like the engineering aspects, as I call it. You know, automation doesn't solve all your problems, right. I always say a tool with a fool is still a fool. So you have to know what you're doing, and you have to collaborate and work together. But automation can help, and as long as you take that philosophy, you can leverage test automation and CI/CD automation and other types of automation effectively. If your view is that automation somehow solves all your problems, that it's a magic bullet that takes the place of culture and magically makes it all work, then you're going to be really upset, because it's not going to work. So that's kind of what we're focusing on. Bob Payne: [00:14:16] Magical thinking as a strategy has also proved to be not the greatest. Jeffery Payne: [00:14:21] Hope. Hope is not a strategy. One of my favorite sales books, and I use that line a lot. When everybody says something's not grounded in reality, I say: just remember, hope is not a strategy. Bob Payne: [00:14:30] Yeah, exactly. Well, great. What's exciting coming up? What do you see coming down the pike? I know prediction is tough, especially about the future. Jeffery Payne: [00:14:47] Yeah, because if I could predict the future, I wouldn't be in this business, or I'd have retired long ago. Bob Payne: [00:14:52] Yeah, exactly. Jeffery Payne: [00:14:53] Well, I'm excited about... Bob Payne: [00:14:54] What do you actually see that's here? Jeffery Payne: [00:14:56] Well, I was very skeptical at first, but I am a little bit excited about what's going on with the integration of AI into dev and test. There are some interesting things going on around how you can leverage AI capabilities to build better tests for your applications, to do testing in a better way. Those actually look interesting.
Are they going to scale, are they going to work? Right, we've been talking about AI and, you know, robots taking over the world forever, which of course is not going to happen. Bob Payne: [00:15:30] The joke is AI is the next big thing, and always will be. Jeffery Payne: [00:15:34] Yeah, it's very true, because you and I are probably about the same age, and as we were coming up through the techie ranks, AI got really hot for a while. Bob Payne: [00:15:43] I was in the computer-architectures-for-AI master's program, so... Jeffery Payne: [00:15:47] Yeah! It was hot, hot, hot. The first VR systems came out, and everyone was talking about these awesome things and how we were going to live in alternative worlds, and all that stuff. And of course, like a lot of things, it didn't really happen and kind of disappeared, but it bubbled along and now it's popped its head up again. Bob Payne: [00:16:05] So I'm not familiar with the applications folks have found in the testing area. Is this especially for, I mean, if you look at big data, you don't know what's in there necessarily, so you don't know what to test for? Where's the current application? Jeffery Payne: [00:16:33] Well, there are a couple. One is, of course, everybody's trying to figure out how to even test AI-based systems, whether it's BI or whatever it is. How do we know the answers, right? That's the age-old problem with these systems: how do you actually know whether what you got is true or not? Because you kind of need that for testing, right. But the other side of it, which we're more focused on, is building better approaches to automation that analyze the product, analyze what you're building, and, while not completely writing the scripts for you, take a step toward providing you test automation capabilities and scripting without you having to do it all yourself.
There are some new tools out on the market, really small startup stuff, that are taking a completely different look at how we create automated tests and how we might do that automatically. Yeah, and that's a really hard problem. Bob Payne: [00:17:37] Yeah, I can extraordinarily easily imagine doing really good deep regression testing by looking at big data about user behavior. You know, we've kind of done that with heatmaps: we really need this piece to be bulletproof because of risk. I'm sure there are folks out there mapping the usage. But I could also imagine very easily just observing what folks are doing and learning from that. I mean, it's the way to go. [00:18:20] You know, AlphaGo learned how to play Go, and the vast majority of the learning in a system like that is not from the initial ruleset; it's from playing another copy of itself and going through the database of previous games, which for Go is actually harder than chess. But apparently you've never played Go. It sounds easy; it's Go. How hard could it be? Jeffery Payne: [00:18:53] Just Go, right. Just Go. So what's up with that? It just sounds a lot harder. Four letters. Just kidding, but... Bob Payne: [00:19:04] It is; four letters is twice as many. Jeffery Payne: [00:19:08] That's fine. We're just having a great time here, right. Bob Payne: [00:19:15] Yeah. Jeffery Payne: [00:19:16] So yeah, that's what I'm interested in: trying to take the DevOps concept to the next level. You mentioned round trip, right. A lot of people spent their early instantiations of automation just focusing on how do I get code from a change into production as fast as possible, with quality and stability as well; you have to balance those.
But now I think the more sophisticated companies are saying, OK, well, it's great to get there, but what happens if you get there and something's wrong? What's the fastest round-trip approach to fixing and addressing that? Is it rolling back? Is it going round trip and coming through again? And people say, why is that important if we're not the kind of company like, say, Amazon, who's pushing code out every 11 seconds, right? Jeffery Payne: [00:20:05] Why do you need that? We need it for security and stability and performance service-level agreements. I mean, if you've got a problem in production, it costs you money every minute, every second it's down, or there's a risk out there from a security perspective; you've got to figure out how to round-trip a change as fast as possible. And that's an exciting area that I think has been overlooked. It hasn't really been the focal point, but now, I think, it's starting to be. I mean, it is really ironic that the safest way to go is to be able to go fast. Bob Payne: [00:20:41] I mean... Jeffery Payne: [00:20:41] Oh yeah. Bob Payne: [00:20:43] I mean, I remember those days where a company would have to fail over to their dark site, and emphasis on fail, right, because it would be days, hours of downtime before they could... oh shoot, the Oracle logs didn't replicate. Yeah. Or whatever. And extreme programming and some of its techniques, early on, were seen as risky, in the same way that DevOps seems risky. If you're doing it the way you and I think it should be done, it's actually the least risky way of behaving. Jeffery Payne: [00:21:37] Right. Yeah, it is. Of course, there are some apps that you'd like to be able to push into production quickly but that maybe can't ever fail. So the Amazon concept of rolling something out there doesn't really work.
Jeffery Payne: [00:21:53] Roll it out, roll it back, tune it, roll it back out; you're kind of using your customers to test for you. That doesn't work for something life-critical. So there are certain systems where you need to just double down on the assurance process within your DevOps capability, because it can't fail in the field. For a lot of others, you know... Bob Payne: [00:22:10] Well, one of the things I've been thinking about, because I quite often talk about high quality, is that someone came up to me and said, what you're really looking for is expected quality. And he had an example from a big oil and gas company, where they said: your labels are too good. He's like, what do you mean? They said, we need the labels to start to deteriorate immediately. We do not want to see someone pouring a lubricant into a cooking pan in Africa, or in some other area where this is unfortunately a common practice, with a brand-spanking-new company logo on the outside of that thing. We actually need that label to deteriorate. And I started to think about that, because, as you mentioned, you know, fine, I may not have critical transactions; push something out, and if there's a problem, roll it back. That might be fine. A canary rollout at Spotify, right? Fine. Jeffery Payne: [00:23:39] Yep. Bob Payne: [00:23:40] A canary rollout on the firmware in a medical device, maybe not so fun. Jeffery Payne: [00:23:46] Yeah. Bob Payne: [00:23:46] Because the canary dies. Jeffery Payne: [00:23:48] And it's a big canary. Bob Payne: Oh yeah, yeah. Jeffery Payne: [00:23:55] Yeah. No doubt. Bob Payne: [00:23:56] Yeah. Jeffery Payne: [00:23:56] And that is something I think people misunderstand about DevOps. [00:24:00] You know, when I speak about DevOps at conferences it's always well attended; everybody's interested in the topic because it's hot. Bob Payne: [00:24:06] Right.
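The promote-or-roll-back judgment behind the canary rollouts being discussed can be sketched roughly as follows. The error-rate comparison and the one-percent tolerance are invented for illustration; they are not Spotify's or any particular platform's actual policy:

```python
def canary_decision(canary_errors, canary_requests, baseline_rate, tolerance=0.01):
    """Decide whether to promote or roll back a canary release.

    Compares the canary's observed error rate against the baseline
    fleet's rate plus a tolerance. Thresholds here are illustrative.
    """
    if canary_requests == 0:
        return "wait"  # not enough traffic to judge yet
    canary_rate = canary_errors / canary_requests
    return "promote" if canary_rate <= baseline_rate + tolerance else "rollback"


# Healthy canary: error rate in line with the baseline, safe to promote.
print(canary_decision(canary_errors=2, canary_requests=1000, baseline_rate=0.002))

# Dying canary: error rate well above baseline, roll back fast.
print(canary_decision(canary_errors=50, canary_requests=1000, baseline_rate=0.002))
```

The "big canary" point in the conversation is exactly why this logic is only acceptable when a bad release is cheap to observe and reverse; for firmware in a medical device there is no tolerable error budget to measure against.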
Jeffery Payne: [00:24:07] People have this perception, and unfortunately senior management does too, that DevOps means speed and speed alone: the goal is how fast can I push things into production. Bob Payne: [00:24:17] But imagine a life-critical system where you have test automation, every single infrastructure-as-code line change is auditable, and you can get the level of safety we used to pursue with, you know, extraordinary manual testing. Jeffery Payne: [00:24:44] Yes, and it was very expensive. Bob Payne: [00:24:45] And it's prone to non-repeatable results: somebody makes a mistake, somebody's configuration is off. And now, with tools where you have immutable infrastructure and software-configured networks, you can actually know, to a greater degree of certainty than we were able to in the past, that you have a conformant test system. And that adds a lot of safety. Jeffery Payne: [00:25:24] It does, and it helps with regulatory issues, yes, right. I mean, one of the underrepresented aspects of DevOps is CM, configuration management. Bob Payne: [00:25:34] Right. Jeffery Payne: [00:25:34] Because if you're doing it right, you're dragging the entire manifest of your software, your tests, your environments, even your rollback and recovery procedures, your monitoring capability, all the way through to production, in a way that you know where everything came from and everything ties together. And that's what regulators want, right. Bob Payne: [00:26:00] Those that know. They actually want safety; they don't care about the stack of documents they sadly use to, hopefully, inspect that you knew what you were doing. Jeffery Payne: [00:26:08] They want you to demonstrate that you have a process that delivers quality, and they want to see that there are relationships between the various things you're using to do that. And DevOps gives you all that, if you do it right.  Bob Payne: [00:26:20] Yeah.
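The "everything ties together" manifest idea Jeffery describes can be illustrated with a rough sketch: content-address a release manifest so any change to code, tests, environment, rollback procedure, or monitoring is detectable for audit. The field names and version strings below are invented for illustration:

```python
import hashlib
import json


def manifest_digest(manifest):
    """Content-address a release manifest so any change is auditable.

    Serializing with sorted keys makes the digest deterministic;
    the manifest fields below are hypothetical.
    """
    canonical = json.dumps(manifest, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()


release = {
    "app": "payments-service:2.4.1",
    "tests": "regression-suite:2.4.1",
    "environment": "prod-vpc-east/terraform:9f3c2d",
    "rollback": "payments-service:2.4.0",
    "monitoring": "alerts-bundle:1.7",
}

before = manifest_digest(release)
release["environment"] = "prod-vpc-east/terraform:badcafe"  # environment drift
after = manifest_digest(release)

# Drift in any tracked artifact changes the digest, so an auditor can
# tie exactly what ran in production to what was reviewed and approved.
print(before != after)  # True
```

This is the same property that makes immutable infrastructure attractive to regulators: instead of a stack of documents, the relationship between software, tests, environment, and rollback plan is mechanically verifiable.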
Jeffery Payne: [00:26:21] If you do it wrong, it just, you know, throws your code down through there while everything around it is changing constantly, and you're never really going to get the speed or quality that you want. Bob Payne: [00:26:30] Yep. Well, great. Anything you'd like to close out with, Jeff? Jeffery Payne: [00:26:36] Well, just thanks for the chance to talk. I know you've been doing this a long time, and it seems like a great podcast, and we're really enjoying the conference. Looking forward to the rest of it. Bob Payne: [00:26:49] And if you can stand to hear me talk, then listen to some of the older ones, I think. Jeffery Payne: [00:26:55] Definitely. Bob Payne: [00:26:56] OK, cool. Jeffery Payne: [00:26:56] I'll get some popcorn and listen to the early ones. I wish you had started it maybe five years earlier than that, right? I mean... Bob Payne: [00:27:03] Yeah, yeah. Jeffery Payne: [00:27:03] If you had started right around 2000... Bob Payne: [00:27:05] Yeah. Jeffery Payne: [00:27:05] Then... Bob Payne: [00:27:06] Yeah. Jeffery Payne: [00:27:07] You know, you would have had some interesting... Bob Payne: [00:27:08] There were some gap years as well. Jeffery Payne: [00:27:12] Well, thank you very much for having me. Bob Payne: [00:27:14] Thanks.
David and Jill are joined by Lisa Crispin. They talk representation, pacing and Jean-Ralphio. facebook.com/rhrpodcast | soapandropetheatre.com | inthecarmedia.com
Steve Porter (@stevevporter) joined Ryan Ripley (@ryanripley) to discuss a wide variety of deep Scrum topics and approaches. In this episode you'll discover: Ways to think about Scrum Why context is so important when discussing Scrum How Scrum is built for safety Deeper insights about the concept of a Sprint Goal Links from the show: Scrum.org – http://www.scrum.org Why a Product Owner Should Trust their Scrum Team – https://www.youtube.com/watch?v=6xFgjNvPGaw Tag Along Travel – Steve's wife Deborah chronicles her travels as she joins Steve all around the world. She has great tips and tricks for world travelers and great insights for people who live with a spouse who frequently travels. https://tagalongtravel.com/ How to support the show: Thank you for your support. Here are some of the ways to contribute to the show: Share the show with friends, family, colleagues, and co-workers. Sharing helps get the word out about Agile for Humans. Rate us on iTunes and leave an honest review. Join the mailing list – check out the form on the right side of the page. Take the survey – totally anonymous, and it helps us get a better idea of who is listening and what they are interested in. Leadership Gift Program. Make a donation via Patreon. Book of the Week: The Nexus Framework is the simplest, most effective approach to applying Scrum at scale across multiple teams, sites, and time zones. Created by Scrum.org, the pioneering Scrum training and certification organization founded by Scrum co-creator Ken Schwaber, Nexus draws on decades of experience to address the unique challenges teams face in coming together, sharing work, and managing and minimizing dependencies. Click here to purchase on Amazon. Which topic resonated with you? Please leave your thoughts in the comment section below. Related Episode: Want to hear another podcast about the life of an agile coach?
— Listen to my conversation with Zach Bonaker, Diane Zajac-Woodie, and Amitai Schlair on episode 39. We discuss growing an agile practice and how coaches help create the environments where agile ideas can flourish. Help promote the show on iTunes: one tiny favor. Please take 30 seconds now and leave a review on iTunes. This helps others learn about the show and grows our audience. It will help the show tremendously, including my ability to bring on more great guests for all of us to learn from. Thanks! Have you heard the big news? Agile Testing Days USA is coming to Boston from June 25–29 for a festival of learning and sharing for the agile and testing community. Agile for Humans listeners can use the code HUMANS to receive an additional 10% off Super Early Lobster pricing when you register by April 27. Check out the entire program at https://well.tc/agiletestingdaysusa. The agile-focused learning experience will provide an interactive way to get deep insights and the latest developments in testing and agile excellence, as well as many opportunities to network with fellow passionate agile software professionals. It is going to be a fantastic week in Boston, so consider joining the inaugural event! Don't just take our word for it: see what well-known expert and speaker Lisa Crispin has to say about Agile Testing Days USA: https://well.tc/wemm The post AFH 086: Talking Scrum with Steve Porter appeared first on Ryan Ripley. See omnystudio.com/listener for privacy information.
Lisa Crispin (@lisacrispin) and Amitai Schleier (@schmonz) joined me (@RyanRipley) to discuss co-presenting at conferences, co-writing books, and agile testing. Lisa Crispin and Janet Gregory co-presenting a conference talk. Lisa is a tester who enjoys sharing her experiences and learning from others. She is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley, 2014) and Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009). Lisa is a tester on a fabulous agile team. She specializes in showing testers and agile teams how testers can add value and how to guide development with business-facing tests. Amitai is a software development coach, speaker, legacy code wrestler, non-award-winning musician, award-winning bad poet, and the creator of the Agile in 3 Minutes podcast. He blogs at schmonz.com and is a frequent guest on Agile for Humans. Amitai has published many of his agile observations and musings in his new book, Agile in 3 Minutes, on Leanpub. In this episode you'll discover: How to get started in conference speaking by co-presenting The joys and techniques of writing a book with a partner What is being observed in the agile testing world today Links from the show: More Agile Testing: Learning Journeys for the Whole Team Agile Testing: A Practical Guide for Testers and Agile Teams Lisa's website: lisacrispin.com Self.Conference – May 19th and 20th Janet Gregory and Lisa Crispin pioneered the agile testing discipline with their previous work, Agile Testing. Now, in More Agile Testing, they reflect on all they've learned since. They address crucial emerging issues, share evolved agile practices, and cover key issues agile testers have asked to learn more about. Packed with new examples from real teams, this insightful guide offers detailed information about adapting agile testing for your environment; learning from experience and continually improving your test processes; scaling agile testing across teams; and overcoming the pitfalls of automated testing. You'll find brand-new coverage of agile testing for the enterprise, distributed teams, mobile/embedded systems, regulated environments, data warehouse/BI systems, and DevOps practices. Click here to purchase on Amazon. What are your thoughts about this episode? Please leave them in the comments section below. Want to hear another podcast about getting started with speaking at technical conferences? — Listen to my conversation with Don Gray, Tim Ottinger, Amitai Schleier, and Jason Tice on episode 32. We discuss how to write a compelling abstract, what track reviewers are looking for in a submission, and how to give yourself the best chance of getting selected. One tiny favor: please take 30 seconds now and leave a review on iTunes. This helps others learn about the show and grows our audience. It will help the show tremendously, including my ability to bring on more great guests for all of us to learn from. Thanks! This podcast is brought to you by Audible. I have used Audible for years, and I love audio books. I have three to recommend: Agile and Lean Program Management by Johanna Rothman; Scrum: The Art of Doing Twice the Work in Half the Time by Jeff Sutherland; The Lean Startup by Eric Ries. All you need to do to get your free 30-day Audible trial is go to Audibletrial.com/agile. Choose one of the above books, or choose between more than 180,000 audio programs. It's that easy. Go to Audibletrial.com/agile and get started today. Enjoy! The post AFH 062: Agile Testing with Lisa Crispin [PODCAST] appeared first on Ryan Ripley. See omnystudio.com/listener for privacy information.
An interview about the software quality field with Carol Abdo. Links: Article: https://www.thoughtworks.com/pt/insights/blog/agile-tester-30 Book: Agile Testing, by Lisa Crispin and Janet Gregory http://www.saraiva.com.br/agile-testing-a-practical-guide-for-testers-and-agile-teams-2852257.html Agile Manifesto: http://www.manifestoagil.com.br/ Certifications: http://www.bstqb.org.br/
Software Engineering Radio - The Podcast for Professional Software Developers
Felienne talks with Alexander Tarlinder on Developer Testing. Topics include Developer Testing, Agile Testing, Programming by Contract, Specification Based Testing, Venue: KTH, Stockholm Related Links Alexander on Twitter https://twitter.com/alexander_tar Agile Testing: A Practical Guide for Testers and Agile Teams by Lisa Crispin and Janet Gregory https://www.amazon.com/Agile-Testing-Practical-Guide-Testers/dp/0321534468 Clean Code https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship-ebook/dp/B001GSTOAM Alexander’s book review site http://www.techbookreader.com/ Developer […]
Guest: Lisa Crispin @lisacrispin Full show notes are at https://developeronfire.com/podcast/episode-174-lisa-crispin-just-try-stuff
In this week's podcast Richard Seroter talks to Lisa Crispin, who works on the Tracker team at Pivotal Labs and is an organizer of the Agile Alliance Technical Conference. Lisa is the co-author of several books on agile testing and is the 2012 recipient of the Agile Testing Days award for Most Influential Agile Testing Professional Person. Richard also talks to Justin Searls, software craftsman, presenter of "How to Stop Hating Your Tests" and co-founder of Test Double, a company whose goal is to "improve how the world writes software." Why listen to this podcast: - Agile is mainstream and being adopted by big enterprises, but there's a place to help small companies and startups. - On Cloud Foundry, testers pair with programmers to write production code. - Developers have to be focused on right now; testers have the freedom to look at more of the big picture. - People know testing is good and there are a lot of tools for it, but some tools are ill-conceived. - We need a better language for talking about good QA and full-stack testing. Notes and links can be found on InfoQ: http://bit.ly/1U0ip8Q 2m:00s - The first XP Universe conferences were mainly about XP practices, values, and principles, and were attended by developers. 2m:17s - Over time, topics moved towards processes and frameworks, and the number of developers who attend Agile conferences has gone down dramatically. 3m:51s - Now that Agile is mainstream and being adopted by big enterprises, there's a place to help small companies and startups. That's usually where the innovation comes from, and the Agile Alliance wants to encourage innovation. Quick-scan our curated show notes on InfoQ. http://bit.ly/1U0ip8Q You can also subscribe to the InfoQ newsletter to receive weekly updates on the hottest topics from professional software development. http://bit.ly/24x3IVq
After talking with Agile tester extraordinaire Lisa Crispin about test automation code, George Dinwiddie noticed that "the example test codes online are really tiny... They teach people enough to understand the test framework, but... over time they have a hard time putting the tests into practice." The problem is that as the code becomes larger and more complex, it's harder to maintain long term. In his AATC2016 session, "The Evolutionary Anatomy of Test Automation Code", George will share some patterns he has noticed for making tests maintainable long term as the system grows larger and more complex, as well as ideas about how to organize test code. The trick, according to George, is to start thinking about the organization early, before the code is large and complex, so that when that does happen, you're better prepared.

George Dinwiddie helps organizations develop software more effectively. He brings thirty-five years of development experience, from electronic hardware and embedded firmware to business information technology. SolutionsIQ's Neville Poole hosts.

About Agile Amped: The Agile Amped podcast series engages with industry thought leaders at Agile events across the country to bring valuable content to subscribers anytime, anywhere. To receive real-time updates, subscribe at YouTube, iTunes or SolutionsIQ.com. Subscribe: http://bit.ly/SIQYouTube, http://bit.ly/SIQiTunes, http://www.solutionsiq.com/agile-amped/ Follow: http://bit.ly/SIQTwitter Like: http://bit.ly/SIQFacebook
Agile tester JoEllen Carter sits down with Agile Amped at Mile High Agile 2016 to chat about using "Testing to Build the Right Thing", the topic and title of the hands-on session she presented with Lisa Crispin. After enjoying their experience diving into story mapping, the duo decided to share it with a wider audience. Though testers traditionally aren't always invited to story mapping sessions, JoEllen points out that testers can help determine where the weaknesses in a story are before it gets built into the product.

JoEllen Carter has more than ten years of experience defining the role of tester on agile teams. Her experience in software development and testing began in the highly regulated and QA-intensive nuclear power industry, and now includes direct marketing donor management software, staffing software, e-commerce systems, and project management software. SolutionsIQ's Howard Sublett hosts.
Agile tester extraordinaire Lisa Crispin is back for another round with Agile Amped. This time she's talking about her presentation at Mile High Agile, "Agile Testing to Build the Right Thing" (co-presented with JoEllen Carter). She also touches on some trends that she sees, some positive (e.g., people in the Agile field are recognizing testers as really valuable), others not so much (companies expecting that hiring an SDET will solve all their code quality problems).

Lisa Crispin is the co-author of More Agile Testing: Learning Journeys for the Whole Team (2014, with Janet Gregory), Agile Testing: A Practical Guide for Testers and Agile Teams (2009, also with Janet Gregory), and Testing Extreme Programming (2002, with Tip House). Lisa enjoys working as a tester and sharing her experiences in the agile and testing communities. See our last podcast with Lisa here: http://www.solutionsiq.com/resources/... SolutionsIQ's Howard Sublett hosts.
Brains, Beauty, Agile Testing, and Donkeys with Lisa Crispin at Agile 2015- what more could you possibly need?
We interviewed Lisa Crispin asking her some questions about her latest book, More Agile Testing, her new team, and test conferences she recommends. Here are links to the conferences Lisa mentioned:
- Agile Testing Days
- Belgium Testing Days
- TestBash
- Let's Test
- CAST (Lisa forgot to mention, but asked us to add this one)
Want to learn […] The post Interview with Lisa Crispin appeared first on Growing Agile.
SPaMCAST 314 features our interview with Janet Gregory and Lisa Crispin. We discussed their new book More Agile Testing. Testing is core to success in all forms of development, and Agile development and testing are no different. More Agile Testing builds on Gregory and Crispin's first collaborative effort, the extremely successful Agile Testing, to ensure everyone that uses an Agile framework delivers the most value possible.

The Bios! Janet Gregory is an agile testing coach and process consultant with DragonFire Inc. Janet is the co-author with Lisa Crispin of Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009) and More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley, 2014). She is also a contributor to 97 Things Every Programmer Should Know. Janet specializes in showing Agile teams how testers can add value in areas beyond critiquing the product; for example, guiding development with business-facing tests. Janet works with teams to transition to Agile development, and teaches Agile testing courses and tutorials worldwide. She contributes articles to publications such as Better Software, Software Test & Performance Magazine and Agile Journal, and enjoys sharing her experiences at conferences and user group meetings around the world. For more about Janet's work and her blog, visit www.janetgregory.ca. You can also follow her on Twitter @janetgregoryca.

Lisa Crispin is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley, 2014) and Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009), co-author with Tip House of Testing Extreme Programming (Addison-Wesley, 2002), and a contributor to Experiences of Test Automation by Dorothy Graham and Mark Fewster (Addison-Wesley, 2011) and Beautiful Testing (O'Reilly, 2009). Lisa was honored by her peers by being voted the Most Influential Agile Testing Professional Person at Agile Testing Days 2012.
Lisa enjoys working as a tester with an awesome Agile team. She shares her experiences via writing, presenting, teaching and participating in agile testing communities around the world. For more about Lisa's work, visit www.lisacrispin.com, and follow @lisacrispin on Twitter.

Call to action! What are the two books that have most influenced your career (business, technical or philosophical)? Send the titles to spamcastinfo@gmail.com. What will we do with this list? We have two ideas. First, we will compile a list and publish it on the blog. Second, we will use the list to drive "Re-read" Saturday, an exciting new feature we will begin on the Software Process and Measurement blog on November 8th with a re-read of Leading Change. So feel free to choose your platform and send an email, leave a message on the blog or Facebook, or just tweet the list (use hashtag #SPaMCAST)!

Next, SPaMCAST 315 features our essay on Scrum Masters. Scrum Masters are the voice of the process at the team level and a critical member of every Agile team. The team's need for a Scrum Master is not transitory, because they evolve together as a team.

Upcoming Events
DCG Webinars:
How to Split User Stories. Date: November 20th, 2014. Time: 12:30pm EST. Register Now.
Agile Risk Management - It Is Still Important. Date: December 18th, 2014. Time: 11:30am EST. Register Now.

The Software Process and Measurement Cast has a sponsor. As many of you know, I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession. ITMPI's mission is to pull together the expertise and educational efforts of the world's leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs.
The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU accredited webinar recordings, and waives the PDU processing fees on all live and recorded webinars. The Software Process and Measurement Cast receives some support if you sign up here. All the revenue our sponsorship generates goes for bandwidth, hosting and new cool equipment to create more and better content for you. Support the SPaMCAST and learn from the ITMPI.

Shameless ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chemuturi and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, neither for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.
SPaMCAST 313 features our essay on developing an initial backlog. Developing an initial backlog is an important step to get projects going and moving in the right direction. If a project does not start well, it is hard for it to end well. We will provide techniques to help you begin well!

The essay begins: Many discussions of Agile techniques begin with the assumption that a backlog has magically appeared on the team's doorstep. Anyone that has participated in any form of project, whether related to information technology, operations or physical engineering, knows that requirements don't grow on trees. They need to be developed before a team can start to satisfy those requirements. There are three primary ways to gather requirements, based on how information is elicited. Listen to the rest on the Software Process and Measurement Cast!

Call to action! What are the two books that have most influenced your career (business, technical or philosophical)? Send the titles to spamcastinfo@gmail.com. What will we do with this list? We have two ideas. First, we will compile a list and publish it on the blog. Second, we will use the list to drive "Re-read" Saturday, an exciting new feature we will begin in November with a re-read of Leading Change. More on this new feature next week. So feel free to choose your platform and send an email, leave a message on the blog or Facebook, or just tweet the list (use hashtag #SPaMCAST)!

Next, SPaMCAST 314 features our interview with Janet Gregory and Lisa Crispin. We discussed their new book More Agile Testing. Agile testing is evolving at the same rate as Agile, or maybe faster! Testing is still critical for delivering business value. Buy and read the book this week before listening to the interview!

The Software Process and Measurement Cast has a sponsor. As many of you know, I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession.
ITMPI's mission is to pull together the expertise and educational efforts of the world's leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs. The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU accredited webinar recordings, and waives the PDU processing fees on all live and recorded webinars. The Software Process and Measurement Cast receives some support if you sign up here. All the revenue our sponsorship generates goes for bandwidth, hosting and new cool equipment to create more and better content for you. Support the SPaMCAST and learn from the ITMPI.

Shameless ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chemuturi and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, neither for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.
Software Engineering Radio - The Podcast for Professional Software Developers
This episode covers the topic of agile testing. Michael interviews Lisa Crispin as a practitioner and book author on agile testing. We cover several topics, ranging from the role of the tester in agile teams, through test automation strategy and regression testing, to continuous integration.
Lisa Crispin is an agile testing coach and practitioner. She is the co-author, with Janet Gregory, of Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009). She specializes in showing testers and agile teams how testers can add value and how to guide development with business-facing tests. Her mission is to bring agile joy to the software testing world and testing joy to the agile development world. Lisa joined her first agile team in 2000, having enjoyed many years working as a programmer, analyst, tester, and QA director. Since 2003, she's been a tester on a Scrum/XP team at ePlan Services, Inc. She frequently leads tutorials and workshops on agile testing at conferences in North America and Europe. Lisa regularly contributes articles about agile testing to publications such as Better Software magazine, IEEE Software, and Methods and Tools. Lisa also co-authored Testing Extreme Programming (Boston: Addison-Wesley, 2002) with Tip House. For more about Lisa's work, visit www.lisacrispin.com.

Join the SPaMCAST community by joining the SPaMCAST Facebook page and get involved! http://tinyurl.com/62z5el

The essay is titled "A Really Simple Checklist for Change Readiness Assessment," Part 2. Planning for change is not very different from planning a vacation. The checklist will remind you of the big things to remember that sometimes get forgotten when dealing with the details of making change happen. Remember that part one was originally uploaded in SPaMCAST 51. The text of the whole essay can be found at www.tcagley.wordpress.com.

There are a number of ways to share your thoughts with SPaMCAST:
• Email SPaMCAST at spamcastinfo@gmail.com
• Voice messages can be left at 1-206-888-6111
• Twitter - www.twitter.com/tcagley
• Blog - www.tcagley.wordpress.com
• Facebook! Software Process and Measurement http://tinyurl.com/62z5el

Next Software Process and Measurement Cast: The next Software Process and Measurement Cast will feature an interview with Capers Jones discussing a wide range of software measurement topics. The interview with Capers was exciting, and I think you will find it an exciting listen. Listen with a friend!

One more item... my father has begun to podcast his fiction at www.talesbytom.com. Yours truly is doing the production. Feel free to check it out and give him feedback.
Show fifty-one is an interview with Tim Lister discussing his new book, "Adrenaline Junkies and Template Zombies". The interview discussed the impact of specific patterns and habits on how IT organizations work.

*** NEWS *** Adrenaline Junkies is one of 5 finalists for general computing book of the year.

Tim Lister is a software consultant at the Atlantic Systems Guild, Inc., based in the New York office. He divides his time between consulting, teaching, and writing. Tim is a co-author with his Guild partners of Adrenaline Junkies and Template Zombies: Understanding Patterns of Project Behavior (Dorset House, 2008, http://www.dorsethouse.com/books/ajtz.html). He is also co-author with Tom DeMarco of Waltzing With Bears: Managing Risk on Software Projects (Dorset House, 2003), which won Software Development magazine's Jolt Award as General Computing Book of the Year for 2003-2004. Tim and Tom are also co-authors of Peopleware: Productive Projects and Teams (Dorset House, 1999), now available in 14 languages.

Tim is currently a member of the Cutter IT Trends Council. He is a member of the IEEE and the ACM. He is in his 23rd year as a panelist for the American Arbitration Association, arbitrating disputes involving software and software services.

Contact information: Web site: http://www.systemsguild.com/ Email: lister@acm.org

Check out SPaMCAST's Facebook page and get involved! http://tinyurl.com/62z5el

The essay is titled "A Really Simple Checklist for Change Readiness Assessment," Part 1. The essay reminds us of the big things that sometimes get forgotten when dealing with the minutiae of getting a change project off the ground. Check out the text of the current essay at www.tcagley.wordpress.com.
I should be back with an essay next show.

There are a number of ways to share your thoughts with SPaMCAST:
• Email SPaMCAST at spamcastinfo@gmail.com
• Voice messages can be left at 1-206-888-6111
• Twitter - www.twitter.com/tcagley
• Blog - www.tcagley.wordpress.com
• Facebook! Software Process and Measurement http://tinyurl.com/62z5el

Next Software Process and Measurement Cast: The next Software Process and Measurement Cast will feature an interview with Lisa Crispin discussing agile testing. Lisa's most recent book is "Agile Testing: A Practical Guide for Testers and Agile Teams," co-authored with Janet Gregory. Testing and agile are highly interrelated, although sometimes understanding how all the parts fit together isn't obvious. Lisa makes agile testing very clear in her interview. Do not miss it.

The interview on Software Process and Measurement Cast 51 is with Tim Lister. We discussed Tim's new book, "Adrenaline Junkies and Template Zombies".