Podcasts about Agile testing

  • 40 PODCASTS
  • 73 EPISODES
  • 40m AVG DURATION
  • 1 MONTHLY NEW EPISODE
  • May 1, 2024 LATEST


Best podcasts about Agile testing

Latest podcast episodes about Agile testing

Agile FM
152: Lisa Crispin

Agile FM

May 1, 2024 · 31:25


Transcript: Agile FM radio for the agile community.

[00:00:04] Joe Krebs: In today's episode of Agile FM, I have Lisa Crispin with me. She can be reached at the very easy to remember lisacrispin.com. Lisa is an author of a total of five books. There are three, or actually four, I want to highlight here. Obviously, a lot of people have talked about the 2009 book Agile Testing: A Practical Guide for Testers and Agile Teams. Then following that, More Agile Testing, right? Then I thought it would be Most Agile Testing, but it turned into Agile Testing Condensed in 2019, and just very recently a downloadable mini book, Holistic Testing. Welcome to the podcast, Lisa.

[00:00:47] Lisa Crispin: Thank you for inviting me. I'm honored to be part of the podcast. You've had so many amazing people on so many episodes. So it's great.

[00:00:54] Joe Krebs: Thank you. And now it's one more with you. So thank you for joining. And we will be talking about a totally different topic than maybe the last 20 episodes. Way back I did some testing topics, but I cannot recall any in maybe the last 20 episodes. So now we're on to testing, a super important topic. I would not consider myself an expert in it, and I don't know if the audience who has been listening to maybe the last 20 episodes is very familiar with agile testing. Maybe everybody has a feeling about it when they hear the word testing, but there is a huge difference between agile testing and, let's say, traditional testing methods. If you just want to summarize very briefly, I know a lot of people are familiar with some of those things, but if somebody asks, what is agile testing, why is this different from traditional testing methods?

[00:01:47] Lisa Crispin: Yeah. I think that there are a couple of big differences. One is that testing (this is just a truth, and not necessarily something to do with agile) is really just part of software development.
So many people think of it as a phase that happens after you write code, but in modern software development we're testing all the time, all the way around that whole DevOps loop, really. And so the whole team's getting engaged in it through the whole lifecycle, and the focus is on bug prevention rather than bug detection. Of course, we want to detect the bugs that make it out to production so we can fix them quickly. But really what we want to do is prevent those bugs from happening in the first place. So there are all these great practices that were popularized by extreme programming and agile: things like test-driven development, continuous integration, test automation, all the things that go into the planning workshops where we talk about our new features and break them into stories and what's going to be valuable to customers, having those early conversations, getting that shared understanding, things like behavior-driven development, where we think about what we're going to code before we code it. That's all really different from, I guess I would say, a more traditional software development approach where we focus on these requirements, and a lot of people think about testing as just: make sure it met the requirements. But there's so much more than that. We've got all these quality attributes, like security and performance, and all the other things that we also need to test. So it's a huge area, but it's woven into software development, just like coding, just like design, just like architecture, just like monitoring and observability. It's all part of the process.

[00:03:31] Joe Krebs: Yeah. It's like QA baked in, if you want to see it this way. And then also the automation of all that, right? So automating everything you just said is probably also a concern.
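The test-driven development Lisa lists can be sketched in a few lines: write a small failing test first, then just enough code to make it pass. A minimal Python sketch; the `slugify` helper and its rules are invented here for illustration, not something discussed in the episode:

```python
# Test-first sketch: the tests below were written before the
# implementation, then slugify was grown until they passed.
# The slugify helper and its rules are hypothetical examples.

def slugify(title: str) -> str:
    """Lower-case a title, drop punctuation, join words with hyphens."""
    cleaned = "".join(c if c.isalnum() or c.isspace() else " " for c in title)
    return "-".join(word.lower() for word in cleaned.split())

def test_slugify_basic():
    # Written first (red), then slugify was implemented (green).
    assert slugify("Agile Testing") == "agile-testing"

def test_slugify_strips_punctuation():
    assert slugify("More Agile Testing!") == "more-agile-testing"

test_slugify_basic()
test_slugify_strips_punctuation()
```

Once green, these tests join the automated regression suite that Lisa later calls a safety net for making changes confidently.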
Not that that's necessarily new to agile, but that's a focus as well. Now, I don't necessarily have data points around that, but I have worked with a lot of Scrum teams and Agile teams in my career. And if somebody asks what the challenges within these teams are, one you can almost always highlight (and I say almost purposely, because there are good exceptions) is to build an increment of work once per sprint. A lot of teams do not accomplish that, and it's often related to testing activities. Why is that, in your opinion? When we see these teams struggle to put out an increment of work, or a piece of the product, or whatever you want to call it if you don't use Scrum, to build something that could potentially go out at the quality standards for going out, what are the struggles out there for teams, especially on the testing side? As you just said, testing often happens at the end, rather than up front.

[00:04:46] Lisa Crispin: Yes. Unfortunately, I still see a whole lot of Scrum teams and other agile teams doing a mini waterfall, where they have testers on the cross-functional team, but the testers are not being involved in the whole process, and the developers aren't taking up practices like test-driven development, because those things are hard to learn, and a lot of places don't enable the non-testers to learn testing skills because they don't put those skills into the skills matrix that those people need to advance their careers. The places I've worked where we succeeded with this sort of whole-team, holistic approach, everybody had testing skills in their skills matrix. We all had to learn from each other, and testers had other skills in their skills matrix, like database skills, and at least the ability to read code and be able to pair or ensemble with somebody. So that's part of it. And I just think:
People don't focus enough on that early process: the business stakeholder has brought us a new feature, and we need to test that idea before we do anything. What's the purpose of the feature? What value is it going to give to the customer and to the business? A lot of times we don't ask those questions up front, and the stakeholders don't ask themselves, and then you deliver the feature and it's something the customers didn't even want.

[00:06:11] Joe Krebs: Lisa, we need to code. We need to get software. Why would we talk about that? Why would we not just code? I'm kidding.

[00:06:18] Lisa Crispin: Yeah. And that's why the whole concept of a self-organizing team works really well, when you really let the teams be autonomous, because then they can think: how can we best accomplish this? Then they can say: let's do some risk storming before we try to slice this into stories, and let's use good practices to slice that feature into small, consistently sized stories that give us a reliable, predictable cadence the business can plan around. Take those risks that we identified, get concrete examples from the business stakeholders of how this should behave, and turn those into tests that guide development. Then we can automate those tests, and now we have regression tests to provide a safety net. So that all fits together. And of course, these days we also need to put effort into the right side of the DevOps loop. We're not going to prevent all the bugs. We're not going to know about all the unknown unknowns, no matter how hard we try. And these cloud architectures are very complex. Our test environments never look like production, so there's always something unexpected that happens. And so we have to really do a good job of the telemetry for our code, gathering all the data, all the events, all the logging data for monitoring.
For alerting, and also for observability: if something happens that we didn't anticipate, so it wasn't on our dashboard and we didn't have an alert for it, we need to be able to quickly diagnose that problem and know what to do. And if we didn't have enough telemetry for diagnosing that problem without having to go back, add more logging, and redeploy to production so we could figure it out... oh, how many times has my team done that? That's all part of it. And then learning from production. We've got fantastic analytics tools these days. Learning from those: what do the customers do? What was most valuable to them? I've mostly worked on web applications, so: we released this new feature in the UI, how did they use it? We can know that stuff now. So that feeds back into what changes we should make next.

[00:08:29] Joe Krebs: All right. So it comes full circle, right? What's interesting is there's this company that's all over the news: Boeing, right? We're recording this in 2024, with quality-related issues. Now, that is an extreme example, obviously, but we do have these kinds of aha and wake-up moments in software development too, right? We're shipping products, and I remember times when testing (I purposely call it testing and not QA) personnel was outsourced. That was many years ago. We actually felt, oh, this activity can be outsourced somewhere else. And you just made a point about self-organizing teams, starting with testing and feeding the end of the loop back into the development efforts, and how important that is. How we treated these activities in the past, and what we thought of them, is shocking now looking back in 2024, isn't it?

[00:09:23] Lisa Crispin: Yeah, it just became so much a part of our lives to run into that. And the inevitable happened: it generally didn't work very well.
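The telemetry Lisa describes, capturing enough context with every event that a production surprise can be diagnosed without redeploying to add more logging, is often done with structured log lines. A minimal sketch; the logger name and the event fields (user_id, feature, duration_ms) are hypothetical:

```python
import json
import logging

# Sketch of structured telemetry: every event is emitted as one JSON
# line carrying its context, so it can be searched and correlated later.
# The field names below are illustrative assumptions, not a real schema.

logger = logging.getLogger("checkout")

def record_event(event: str, **context) -> str:
    """Serialize an event plus its context as a single JSON log line."""
    line = json.dumps({"event": event, **context}, sort_keys=True)
    logger.info(line)
    return line

record_event("payment_failed", user_id=42, feature="new-checkout",
             duration_ms=812)
```

The design point is that the context travels with the event at emit time; when the unexpected alert fires, the diagnosis data is already in the logs.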
I've actually known somebody who led an outsourced test team in India and was working with companies in the UK and Europe. They actually were able to take an agile approach and keep the testers involved through the whole loop. They had to work really hard to do that, and there were a lot of good practices they embraced to make that work. But you have to be very conscious, and both sides have to be willing to do that extra work.

[00:09:56] Joe Krebs: You just mentioned that there were some really cool analytics tools. I don't know if you want to share any of those, because you seem very excited about this.

[00:10:05] Lisa Crispin: The one that I found the most useful, and a couple of different places I worked at used it, is called FullStory. It captures all the events that are happening in the user interface and plays them back for you as a screencast. Now, it does block anything users type in; it keeps it anonymized. But you can see the cursor. And I can remember one time, a team I was on, we put a whole new page in our UI, a new feature. We thought people would really love it. We worked really hard on it (we tried to do a minimum viable version of it, but we still put some effort in) and we put it out there. And then we looked at the analytics in FullStory, and we could see that people got to the page, their cursor moved around, and then they navigated off the page. So either it wasn't clear what that page was for, or they just couldn't figure it out. So that was really valuable. I was like, okay, can we come up with a new design for this page, if we think that's what the problem is? That was a good learning opportunity.
As a tester especially, that matters, because sometimes we can't reproduce problems: we know there's a problem in production and can't reproduce it. But if we go and watch a session where somebody had the problem, that helps. And there are other tools: Mixpanel won't play it back for you, but you can see every step that the person did. And even observability tools like Honeycomb and LightStep can trace the whole path of what the user did. And that really helps us not only understand the production problem but realize, oh, there's a whole scenario we didn't even think about testing. There's so much we can learn, because we're so bound by our cognitive biases, our unconscious biases: we know how we wanted it to work.

[00:11:54] Joe Krebs: Yeah.

[00:11:55] Lisa Crispin: And it's really hard to think outside the box, get away from your biases, and really approach it like a customer who never saw it before would.

[00:12:03] Joe Krebs: Yeah. This is the typical thing, right? A software engineer demonstrates their own software and says, but it works on my machine. I'm sure you have heard that. It's obvious to them that you would do it this way, and it's just not necessarily obvious for somebody else. If you're sitting in front of a screen developing something for a long time, it just becomes natural that you would be working like this. I myself have engineered software and fell into that trap, right? It's an eye-opening event when somebody else looks at it.

[00:12:33] Lisa Crispin: Even when you have different people looking. I can remember an occasion, a team I was on, again with a web application, where it was just a change in the UI, just adding something, and I tested it. My manager tested it. Our product owner tested it. And we all thought it looked great, and it did look great.
We didn't notice the other thing we had broken on the screen until we put it in production and customers were like, hey! I really do think things like pair programming, pair testing, working in ensembles for both programming and testing, doing all the work together, and getting those diverse viewpoints helps hugely with that. My theory is we all have different unconscious biases, so maybe if we're all together, somebody will notice a problem. I don't have any science to back that up, but that's why those kinds of practices are especially important.

[00:13:28] Joe Krebs: Yeah.

[00:13:28] Lisa Crispin: To catch as many things as we can.

[00:13:30] Joe Krebs: Yeah. So we both didn't have any science to back this up, but let's talk a little bit about science, okay? Because metrics, data points, evidence. What are some of the KPIs, if somebody listens to this and says, oh, that sounds interesting, and we definitely have shortcomings in testing activities within Agile teams? Obviously there's the traditional way of testing, using very different data points. I have used some in the past, and I just want to verify with you whether those are even useful and still up to date. What would be some good KPIs? When somebody approaches you, what would you say they've got to have on their dashboard?

[00:14:08] Lisa Crispin: I actually think one of my favorite metrics to use is cycle time, although that encompasses so many things; just watching trends in cycle time. For example, if you've got good test coverage with your automated regression tests, you're going to be able to make changes really quickly and confidently. And if you have a good deployment pipeline... again, there's a lot of testing that goes into making sure your infrastructure is good and your pipeline is performing as it should, because it's all code too. Cycle time reflects a whole lot of things.
It's hard to isolate one thing in your cycle time, but what counts is: how consistent are we at being able to frequently deliver small changes? So I think that's an important one. And in terms of, did we catch all the problems? I think it gets really dangerous to do things like, oh, let's count how many bugs got into production, because all measures can be gamed, and that's a really easy one to game. But things like: how many times did we have to roll back or revert a change in production because there was something we didn't catch? And hopefully we detected that ourselves, with an alert or with monitoring, before the customers saw it. And now we have so many release strategies, like canary releases or blue-green deploys, so that we can test in production safely. But still, how many times did we have to roll back? How many times did we get to that point and realize we didn't catch everything? That can be a good thing to track, depending on what caused it. If we had a production failure because somebody pulled the plug of the server out of the wall, that's just something that happened. But if the team's process failed in some way, we want to know about that. We want to improve it. And just: how frequently can we deploy? With continuous delivery, which so many teams are trying to practice, you're not going to succeed if you're not preventing defects, and if you don't have good test automation, good automation the whole way through.

[00:16:08] Joe Krebs: Yeah.

[00:16:08] Lisa Crispin: And I think deployment frequency, that's another one of the DORA key metrics, is a real one that we know correlates with high-performing teams. And of course we shouldn't ignore how people feel: are people burned out, or do they feel happy about their jobs? That's a little harder metric to get.
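The two team measures discussed here, cycle time trends and rollback counts, fall out of very simple deployment records. A rough sketch; the record shape and the sample data are invented for illustration:

```python
from datetime import datetime

# Sketch of computing cycle time and rollback count from per-change
# deployment records. The data below is invented sample data.

def cycle_time_days(started: str, deployed: str) -> int:
    """Days from starting work on a change to deploying it."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(deployed, fmt) - datetime.strptime(started, fmt)).days

changes = [
    {"started": "2024-04-01", "deployed": "2024-04-03", "rolled_back": False},
    {"started": "2024-04-02", "deployed": "2024-04-08", "rolled_back": True},
    {"started": "2024-04-05", "deployed": "2024-04-06", "rolled_back": False},
]

times = [cycle_time_days(c["started"], c["deployed"]) for c in changes]
avg_cycle_time = sum(times) / len(times)            # 3.0 days
rollbacks = sum(c["rolled_back"] for c in changes)  # 1 rollback
```

As Lisa notes, the trend over time matters more than any single number, and a rollback only counts against the process if the process could have caught it.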
I was on a team, at my last full-time job, where we really focused on cycle time as a metric, and we didn't really have that many problems in production. So we didn't really bother to track how many times we had to revert, because we were doing a good job. But how frequently were we deploying? What was our cycle time? We also did a little developer joy survey. Once a week, we sent out a little five-question survey based on Amy Edmondson's work. Now I would also base it on Nicole Forsgren's SPACE model, but this was just a little before that came out. We asked just a few questions: on a scale from one to five, how did you feel about this? And it was really interesting, because over time, if cycle time was longer, developer joy was down. So something's happening here, people are not happy, and something's going wrong that's affecting our cycle time. And the reverse was true: when our cycle time was shorter, joy went up. So I think it's important, and you don't have to get real fancy with your measurements. I think you should first focus on what you're trying to improve, and then find a metric to measure that.

[00:17:41] Joe Krebs: I'm glad you mentioned code coverage, one of those I mentioned earlier (I've been working with it quite a bit), and cycle time. Very powerful stuff. Now, as somebody who has written and published about agile testing extensively: we are in 2024, there are years ahead, there are agile conferences, there is a lot going on. What are the trends you currently see in the testing world? What's happening right now, and what do you think is influencing tomorrow, the days coming? I know you have holistic testing yourself, so maybe that is one, but I just want to hear: what do you see happening in agile testing?

[00:18:24] Lisa Crispin: Oh, all of software development is definitely evolving.
I think one of the things is that we're starting to be more realistic and realize that executives don't care about testing. They care about how well their product sells, how much money the company is making. We know that product quality obviously affects that, and that's from a customer point of view: it's the customer who defines quality. Back in the nineties, we testers thought we were defining quality. So that's a change that's occurred over time, really thinking about that, and also knowing that our process quality has a huge impact on product quality. So what are the most important practices we can be doing? Janet Gregory, who is my coauthor on four of those books, and Selena Delesie have been consultants for years and have helped so many companies, even big ones, through an agile transformation. They've distilled their magic into what they call a quality practices assessment model. They identified ten quality practices that they feel are the most important: things like feedback loops, things like communication, right? And the model helps you ask questions and see where the team is in these different aspects of practice that would give them a quality process, which would help them have a quality product. And it gives teams a kind of roadmap: here's where we are now, what do we need to improve? Oh, we really need to get to continuous delivery, and these things are in our way; things like that. So I think that's one realization, and it ties back to the idea that testing is just part of software development. For years we've asked: how can I make the president of this company understand that we testers are so important? We're not, but it's important that the team builds that quality in.
[00:20:29] Joe Krebs: But you could also argue that maybe a CEO of a company, or the leadership team, would say: we also don't care if this is being expressed in one line of code or two lines of code. So it's not specific to testing; I think they're just saying, here's our product. But I think what has changed is that your competition is just one mouse click away. Yeah. Quality is a determining factor. Now, let's take this hypothetical CEO out there right now, listening to our conversation and saying: I do want to start to embrace agile testing, and agile in general, and more of those things you just mentioned. What would be a good starting point for them? Obviously there's a lot of information, keywords and buzzwords we shared today. What would be a good starting point for that journey? Because that is obviously not something that's coming overnight.

[00:21:20] Lisa Crispin: I think one of the most important things that leadership can do is to enable the whole team to learn testing skills that will help them build quality in. And that means making it part of their job description, making it part of their skills matrix for career advancement, because that gives them time. If developers are paid to write lines of code, that's what they're going to do. But if it's: okay, you're an autonomous team, you decide what practices you think will work for you, we're going to support you, even though it's going to slow things down at first. Okay, I was on a team in 2003 that was given this mission: do what you think you need to do. First, we decided what level of quality we wanted, of course. We wanted to write code that we would take home and show our moms and put on our refrigerators, and we all committed to that level of quality. How can we achieve that? We're seeing that test-driven development has worked really well for a lot of teams. So let's do test-driven development, which is really
not that easy to learn. But when you have leadership that gives you that time to learn and supports you, it pays off in the long run, because eventually you're a lot more confident. You've got all these regression tests; you can go faster. And things like continuous integration, refactoring, all these practices that we know are good: we were empowered to adopt those. It was part of all of our job descriptions. And so we became a high-performing team. Not overnight; within a few years. And part of what we did was spend a lot of time learning the business domain. It's a very complicated business domain. So when the stakeholders came and said, we want this feature, we asked them: why do you want it? What is it supposed to do? What is the value? We could usually cut out half of what they thought they wanted. We could say, okay, if we did all of this, we think it's going to be this much effort, but we could do 80 percent of it for half the cost. How's that? Oh yeah. Nobody ever turned us down on that one. So that's another way you go fast: we eliminate things that customers don't want or need. So yeah, it's the unicorn magic of a true self-organizing team.

[00:23:30] Joe Krebs: Yeah. One thing you said just stood out to me: it is an investment, an investment into the future. It's a really good feeling to have, later on, the capability of releasing software whenever you want, when that is not a massive burden where the whole company needs to come together for all-nighters to get a piece of software out of the door. Now, you're not only an author, you're also a practitioner. You work with teams, and I just want to come back to that business case of agile testing one more time. Do you have an example from a client, recent or further back, where you would say that stands out, or that's an easy one?
Do you remember a case where agile testing made a huge difference for an organization? I'm sure there are tons where you would say there was a significant impact for them from introducing agile testing practices.

[00:24:29] Lisa Crispin: Certainly, especially early on in the extreme programming and Agile adoption, there were a few occasions where I joined a team that had never had testers. They were doing the extreme programming practices, and you may recall that the original extreme programming publications did not mention testers. They were all about testing and quality, but they didn't mention testers. So these teams were doing test-driven development and continuous integration. They were writing really good code, and they were doing their two-week sprints, and maybe it took them three sprints to develop what the customer wanted, and then they'd give it to the customer, and the customer would say, but that's not what I wanted. So they thought, maybe we need a tester. So then they hired me. And I was like, okay, we're going to do some new features, so let's have some brainstorming sessions. What is this new feature for? How are we going to implement it? What are the risks? And start doing risk assessments: how are we going to mitigate those risks? Are we going to do it through testing? Are we going to do it through monitoring? And just asking those what-if questions. What's the worst thing that could happen when we release this? That's my favorite question. Could we deploy this feature to production and have it not solve the customer's problem? Anyone could ask those questions; it doesn't have to be a tester. But I find that on teams that don't have professional testers, specialists, nobody else thinks of those questions. They could, but testing is a big area. It is a big set of skills.
And anybody on the team could have those skills. I know lots of developers who do, but not every team has a developer like that. Other specialists, like business analysts, could also help, but there were even fewer business analysts back in the day than there were testers. One team I joined early on said: okay, Lisa, you can be our tester, but you can't come to the planning meetings and you can't come to the standups. That's a little weird. I did as best I could without being involved in any of the planning. And at the end of the two weeks, they weren't finished; nothing was really working. And I said, oh, hey, can we try it my way? Let me be involved in those early planning discussions, let me be part of the standup. And, amazing: next time we met our target. I couldn't support everybody (there were 30 developers and one tester), but we agreed that one or two other people would wear the testing hat along with me every sprint, or at least on a daily basis. And so they all started to get those testing skills. Like I say, testing is a big area, and you don't know what you don't know. I see teams even today that don't have any testers, because years ago they were told they didn't need them if they did these extreme programming practices. And they're doing test-driven development, they're doing continuous integration, they're maybe even doing a little exploratory testing, they're doing pair programming, even some ensemble or mob programming. They're doing great stuff, but they're missing all that stuff at the beginning to get the shared understanding with the stakeholders of what to build.

[00:27:43] Joe Krebs: All those lines of code that weren't needed wouldn't need to be tested.

[00:27:48] Lisa Crispin: And so they release the feature, and bugs come in. They're really missing features; it's not what the customer needed. Too many gaps.
And of course, I want to say those aren't really bugs, but they're bad. Yeah. And if you'd had a risk-storming session, if you'd had planning sessions with example mapping, for example, where you got the business rules for the story and concrete examples for each business rule, and then turned those into tests to guide your development with behavior-driven development, that would have solved the problem. But they didn't know to do that. Anybody could have learned those things, but we can't all know everything.

[00:28:25] Joe Krebs: Yeah. We're almost out of time, but there's one question I wanted to ask you, and it might be a short answer; I hope you can condense it a little bit. When somebody gets on your LinkedIn page, Lisa Crispin, there's a picture of you plus a donkey. And you have donkeys yourself. How does this relate to your work? What do you find inspirational about donkeys? And why did you even make it your LinkedIn profile? There has to be a story around it.

[00:28:55] Lisa Crispin: It's interesting. A few years ago at the European Testing Conference, we had an open space, and somebody said, oh, let's have an open space session on Lisa's donkeys. And then we got to talking about this, and I realized I actually have learned a lot about Agile from my donkeys. I think the biggest thing is trust. Donkeys work on trust. With horses (I've ridden horses all my life and had horses all my life as well), you can bribe or bully a horse into doing something; they're just different. If you reward them enough, okay, they'll go along with you. If you kick them hard enough, maybe they'll go. Donkeys are not that way. They're looking out for number one. They're looking out for their own safety, and if they think you might be getting them into a situation that's bad for them, they just flat won't do it. So that's how they get the reputation of being stubborn.
You could beat them bloody, you could offer them any bribe you want; they're not doing it. And so I learned I had to earn my donkeys' trust. That's so true of teams. We all have to trust each other, and when we don't trust each other, we can't make progress. The teams I've been on that were high-performing teams had that trust, so we could have discussions where we had different opinions. We could express our opinions without anyone taking it personally, because we knew that we were all in it together and it was okay. Anybody could feel safe to ask a question, anybody could feel safe to fail, because you have that trust that nothing bad is going to happen. And so I could bring my donkeys right in the door of the house. I've taken them into schools, I've taken them to senior centers, because they trust me. And if I did anything, if they came to harm while in my care (let's say I was driving the cart and the collar rubbed a big sore on them), that would destroy the trust, and it would be really hard to build it back. And so we always need to be conscious of how we're treating each other on our software teams.

[00:30:55] Joe Krebs: Yeah, wonderful. I did hear the rumor about them being stubborn, but I also always knew that donkeys are hardworking animals.

[00:31:02] Lisa Crispin: They love to work hard. Yeah.

[00:31:05] Joe Krebs: Awesome. Lisa, what a great ending. I'm glad we had time to even touch on that. That was a great insight. Thank you so much for all your insights around testing, but also, at the end, about donkeys. Thank you so much, Lisa.

[00:31:17] Lisa Crispin: Oh, it's my pleasure.
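The example mapping Lisa mentioned works by collecting concrete examples for each business rule with the stakeholders, then turning those examples directly into tests that guide development. A minimal sketch; the shipping-fee rule and its threshold are hypothetical, invented only to show the shape of the practice:

```python
# Sketch of example mapping turned into tests: each concrete example
# agreed with the stakeholders becomes one test case. The shipping rule
# and the $50 threshold are hypothetical.

def shipping_fee(order_total: float) -> float:
    """Business rule: orders of $50 or more ship free, otherwise $5."""
    return 0.0 if order_total >= 50 else 5.0

# Concrete examples gathered in the example-mapping session:
# (order total, expected fee)
examples = [
    (49.99, 5.0),   # just under the threshold
    (50.00, 0.0),   # exactly at the threshold
    (120.0, 0.0),   # well over the threshold
]

for total, expected in examples:
    assert shipping_fee(total) == expected
```

The boundary cases (just under, exactly at) are where the shared understanding with stakeholders usually pays off; they are the examples most likely to expose a misread rule before any code is written.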

Arguing Agile Podcast
AA159 - Exploring the Evolving Role of Quality Assurance, with CEO Bob Crews

Arguing Agile Podcast

Play Episode Listen Later Apr 10, 2024 37:25 Transcription Available


In this episode of the Arguing Agile podcast, hosts Om Patel and Brian Orlando are joined by special guest Bob Crews, founder and CEO of Checkpoint Technologies. They dive deep into the critical role quality assurance plays in the agile development process. Discussions include:
- The difference between QA and QC and how they fit into agile
- How QA has evolved over the past 20 years, especially with the rise of automation and DevOps
- The importance of risk analysis as a QA function to determine test coverage
- How QA can bridge the gap between IT and the business
- The impact of AI on software testing
- Tips for spotting and developing top QA talent
Whether you're in QA, in product management, or an agile leader, you'll gain valuable insights on empowering your QA function to ensure high quality software. #QualityAssurance #AgileTesting #DevOps #TestAutomation #AI
Watch it on YouTube
Subscribe to our YouTube Channel: https://www.youtube.com/channel/UC8XUSoJPxGPI8EtuUAHOb6g?sub_confirmation=1
Apple Podcasts: https://podcasts.apple.com/us/podcast/agile-podcast/id1568557596
Spotify: https://open.spotify.com/show/362QvYORmtZRKAeTAE57v3
Amazon Music: https://music.amazon.com/podcasts/ee3506fc-38f2-46d1-a301-79681c55ed82/Agile-Podcast
Toronto Is My Beat (Music Sample) by Whitewolf (Source: https://ccmixter.org/files/whitewolf225/60181) CC BY 4.0 DEED (https://creativecommons.org/licenses/by/4.0/deed.en)

Agile Mentors Podcast
#48: Holistic Agile Testing with Lisa Crispin and Janet Gregory

Agile Mentors Podcast

Play Episode Listen Later May 17, 2023 41:24


Join Brian and his guests, Janet Gregory and Lisa Crispin, as they share their expertise on integrating testing into Agile teams. Discover how to bridge the gap between programmers and testers for collaboration and success.

Overview

In this episode of "Agile Mentors," Brian Milner sits down with Janet Gregory and Lisa Crispin, founders of Agile Testing Fellowship, to discuss integrating testing into Agile teams. They discuss the history of the divide between programmers and testers and the importance of collaboration and communication between the two groups. Listen in as they explore the different levels of holistic testing, the mindset shift needed for bug prevention, and the tools and strategies for planning and estimating testing activities. Plus, the role of AI in testing.

Listen Now to Discover:
[00:05] - Brian Milner introduces the guests for this episode, Janet Gregory and Lisa Crispin, who are advocates for integrating testing into Agile teams and the founders of Agile Testing Fellowship.
[02:25] - Lisa explains the most important goal for collaboration and success.
[03:34] - Janet talks about the history of the gulf between programmers and testers.
[05:09] - How to bridge the gap between programmers and testers and the value of collaboration.
[07:29] - What the values of Agile and Extreme Programming emphasize.
[09:49] - The mindset shift needed for bug prevention.
[11:17] - Managers behaving badly: Brian shares a story about how measuring the wrong things can drive the wrong behaviors.
[12:13] - Brian discusses the micro view of testing instead of a system view.
[12:17] - How to handle intense forms of testing that take a long time to complete.
[14:02] - Janet explains the different levels of testing and that teams should determine where testing belongs based on when it can be performed earliest.
[15:23] - Avoiding a "hardening sprint." 
[16:48] - Lisa shares how to use visual models like the agile testing quadrants and the holistic testing model to help plan and communicate the testing activities needed throughout the software development lifecycle.
[17:25] - The website where you can find the training written by Lisa and Janet, including "More Agile Testing" and "Agile Testing Condensed" (recently released), and where you can download the FREE mini-book "Holistic Testing: Weave Quality into Your Product."
[18:29] - Brian introduces the sponsor for the podcast, Mountain Goat Software. If you are thinking about getting certified as a Scrum Master, check out the resources and training options where certification classes are available every week.
[19:26] - The key to fitting testing into a normal sprint cycle and integrating testing with other system pieces.
[20:52] - Janet shares a tip for ensuring testing is not overlooked.
[20:59] - Lisa shares how to remind teams to do testing at the right time.
[22:31] - Why have a visible reminder for testing?
[23:54] - The importance of accounting for testing and not treating it as a separate thing to do.
[24:37] - Lisa shares her experience using planning poker for estimation and her preference to get every story the same size so they can be completed in a day or two.
[25:50] - Janet suggests sizing stories and estimating tasks, why she estimates her tasks herself, and what she's learned in that process.
[26:44] - How to reduce the time needed in estimation meetings: Lisa shares some insight to identify when a story is too big and needs to be split up.
[27:35] - The importance of conversation and understanding to avoid creating a wall between programmers and testers during estimation.
[28:03] - Another tool in the toolbox: how ChatGPT will revolutionize testing (and who it might replace).
[29:01] - There will never be enough time to do all the testing required. 
[29:31] - Lisa highlights how AI as a tool saves time with testing and allows more time for critical thinking skills.
[30:12] - The need for a human presence in the use of AI.
[31:19] - Janet shares information about her and Lisa's two courses, Basic Strategies for Agile Teams and Holistic Testing for Continuous Delivery, based on the holistic testing model of looking at testing activities throughout the software development lifecycle. These courses can be found here.
[36:37] - Lisa mentions that her book, "Assessing Agile Quality Practices," helps teams identify where they are and where they can improve, using a framework that looks at ten different quality aspects. Plus, information on the book they are working on now on how to facilitate an assessment.
[39:03] - Brian provides a list of resources available from Lisa and Janet, including their books "Agile Testing Condensed: A Brief Introduction," "Agile Testing," "More Agile Testing," and "Assessing Agile Quality Practices," and their free download "Holistic Testing: Weave Quality into Your Product."
[40:14] - Join the Agile Mentors Community to continue the discussion. If you have topics for future episodes, email us by clicking here. And don't forget to subscribe to the "Agile Mentors" Podcast on Apple Podcasts so you never miss an episode.

References and resources mentioned in the show:
- Agile Testing Fellowship
- Agile Testing - The Book
- Agile Testing Condensed: A Brief Introduction
- More Agile Testing
- Holistic Testing: Weave Quality into Your Product
- Assessing Agile Quality Practices
- Mountain Goat Software's Advanced Certified Product Owner course
- Mountain Goat Software Certified Scrum and Agile Training Schedule
- Join the Agile Mentors Community
- Subscribe to the Agile Mentors Podcast on Apple Podcasts

Want to get involved? This show is designed for you, and we'd love your input. Enjoyed what you heard today? Please leave a rating and a review. It really helps, and we read every single one. 
Got an Agile subject you'd like us to discuss or a question that needs an answer? Share your thoughts with us at podcast@mountaingoatsoftware.com

This episode's presenters are:

Brian Milner is SVP of coaching and training at Mountain Goat Software. He's passionate about making a difference in people's day-to-day work, influenced by his own experience of transitioning to Scrum and seeing improvements in work/life balance, honesty, respect, and the quality of work.

Lisa Crispin is the co-founder of the Agile Testing Fellowship, an author, and an Agile tester and coach who helps practitioners deliver quality software frequently and sustainably.

Janet Gregory is the co-founder of the Agile Testing Fellowship, an author, and a consultant specializing in building quality systems and helping companies promote agile quality processes.

Scrum Dynamics
Agile Testing for Dynamics 365 with Emma Beckett

Scrum Dynamics

Play Episode Listen Later Mar 29, 2023 41:11 Transcription Available


#139. Today's guest is Emma Beckett, an experienced test professional who runs her own company, Fortitude 17, in London, UK. A professional footballer, Emma pursued a career in software testing, even though her first role in tech was in desktop support — hardware, not software!

As you'll hear, Emma is a Certified ISTQB Test Consultant and trained in Microsoft Dynamics AX 2012 and Microsoft Dynamics 365 F&SCM, CE, and HR. She is currently training on the Microsoft Dynamics 365 Business Central and Ceridian Dayforce (HCM) solutions.

In this episode, Emma shares how she got into software testing and discusses 1) professional testing, 2) how she approaches business application projects, and 3) what testing at large can bring to Dynamics applications.

HIGHLIGHTS
[01:06] How Emma came to open her own testing consultancy
[09:01] Emma's approach to professional testing
[12:48] The qualities that Emma looks for in good test professionals
[18:09] The relationship between testing and training
[20:00] Approaches to acceptance testing that have worked well for Emma
[24:25] How Emma approaches other forms of testing on projects
[29:43] How Emma deals with testing challenges
[32:28] Other solutions that Emma's consultancy has worked on
[34:22] Emma talks about her recently launched podcast

RESOURCES
Fortitude 17
Connect with Emma Beckett on LinkedIn
Check out the elbeckio show

I've just registered for Microsoft Power Platform Conference in Las Vegas from 3-5 October. I'd love to see you there. Visit customery.com/mppc for a $150 discount voucher to register.

Support the show
CONNECT

Agile-Lean Ireland (ALI) Podcast
Building Agile Testing & BDD Example mapping ways of working - Jorge Luis Castro Toribio - Agile Lean Ireland

Agile-Lean Ireland (ALI) Podcast

Play Episode Listen Later Mar 22, 2023 49:09 Transcription Available


In this presentation, we discuss our firsthand experience of building agile testing ways of working using BDD (Behavior Driven Development) and example mapping. Our aim is to minimize gaps and functional errors while improving lead time.

Jorge Luis Castro Toribio

Jorge is an agilist, Agility & Digital Transformation Lead Coach, QA Manager, DevOps Program Manager, and global speaker who is passionate about test automation, Agile, DevOps, and business agility. He enjoys researching and learning about business agility and working with cutting-edge technologies. He has been learning continuously for more than 12 years and is still learning new ways to foster enterprise agility and team greatness. He has worked in several roles (developer, tester, IT program manager, software engineering in test manager, QA manager, agile coach), which helps him see the big picture and the operational and team dynamics happening inside organizations. This experience lets him help teams design, build, and implement digital, DevOps, and agile transformation strategies. He encourages a focus on people, productivity, continuous improvement, innovation, and having fun to enjoy success in our agile journeys.

Jorge has been a speaker at Agile and DevOps events organized in Peru, Canada, UK, Netherlands, India, Germany, Mexico, Colombia, Nigeria, Panama, Azerbaijan, Australia, and the US, among other countries.

Find us here: www.agileleanireland.org
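The example-mapping flow the talk describes — capture a business rule, attach concrete examples, then turn those examples into executable checks — can be sketched in plain Python. This is a minimal illustration only: the "free shipping over $50" rule and the `shipping_cost` function are hypothetical, not taken from the presentation.

```python
# A hypothetical business rule captured in an example-mapping session,
# expressed as code so the concrete examples become executable tests.

def shipping_cost(order_total: float) -> float:
    """Business rule (invented for illustration): orders of $50 or more
    ship free; otherwise a flat $5.99 applies."""
    return 0.0 if order_total >= 50 else 5.99

# The concrete examples from the example-mapping cards become test cases:
# (order total, expected shipping cost)
examples = [
    (49.99, 5.99),  # just under the threshold still pays shipping
    (50.00, 0.00),  # exactly at the threshold ships free
    (120.0, 0.00),  # well above the threshold ships free
]

for order_total, expected in examples:
    actual = shipping_cost(order_total)
    assert actual == expected, f"order {order_total}: expected {expected}, got {actual}"
```

In a BDD toolchain the same examples would typically live in a Given/When/Then scenario file driving the implementation; the point of the sketch is only that each card from the mapping session maps one-to-one onto a check.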

The Testing Show
The Testing Show: Assessing Quality Practices

The Testing Show

Play Episode Listen Later Jan 18, 2023 35:00


Agile Testing, and some would say Modern Testing, is built around understanding the processes and quality practices necessary to deliver a quality product to customers. How do we know whether the practices our company or organization uses actually deliver what we hope they will? To help answer that, Selena Delesie and Janet Gregory join Matthew Heusser and Michael Larsen to talk about their new book "Assessing Agile Practices With the Quality Practices Assessment Model (QPAM)" and to help determine whether the practices an organization uses are effective and will ultimately help deliver quality products.

Compiler
Testing, PDFs, And Donkeys

Compiler

Play Episode Listen Later Oct 27, 2022 28:38


We reach our penultimate episode of Stack/Unstuck, and arrive on the topic of testing. Testing isn't necessarily part of any technology stack, but it is a vital part of building software. Sometimes, it can feel like testing is an afterthought, or just a box for busy coders to tick once completed. We hear from our guests about how testing doesn't need to be saved for a curtain call. It can have a starring role when identifying problems within different components of a software stack. And as we include it more in discussions and planning, and as we start thinking about it earlier in development cycles, testing can further an application's potential, and help teams build software better.

Lean Blog Interviews
Luke Szymer on Agile, Testing Hypotheses, and Process Behavior Charts

Lean Blog Interviews

Play Episode Listen Later Aug 3, 2022 43:59


Founder of "Launch Tomorrow." Episode page with transcript, video, and more.

My guest for Episode #452 of the Lean Blog Interviews Podcast is Luke Szyrmer. He's the founder of "Launch Tomorrow." He helps new technology products get to market faster (even remotely). Luke is the author of the books Align Remotely: How to achieve together, when everyone is working from home and Launch Tomorrow: Take Your Product, Startup, or Business From Idea to Launch in One Day. He's the host of the highly rated "Managing Remote Teams" podcast. He comes from a product management background and has a BA in Economics and English from the University of Pennsylvania. He's joining us on the podcast from Poland.

Today, we discuss topics and questions including:
- Background question — How did you get introduced to Agile, Lean Startup, things like that?
- The "fuzzy side of innovation" — time wasted 20-30 years ago?
- Doing the wrong things righter?
- Tampering – and increasing variation
- Processes for creating software?
- When you were reading about "Lean Manufacturing," how did that resonate with you? How does it relate to your work?
- How easy is it to estimate "story points"?
- Lean Thinking – batch vs. flow… physical flow vs. work flow — adaptations to the flow of software?
- Takt time – how to translate this in terms of required software, requirements, points
- How did you learn about Process Behavior Charts? Why did that resonate with you?
- How do you incorporate PBCs into your work?
- Counting physical products vs. story points (something more esoteric)?
- Landing pages – a product or service that doesn't exist yet
- What to test BEFORE a landing page?
- How to make a good decision with limited data points?
- What's so powerful about testing an idea as a hypothesis?

The podcast is sponsored by Stiles Associates, now in their 30th year of business. They are the go-to Lean recruiting firm serving the manufacturing, private equity, and healthcare industries. Learn more. 
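For readers unfamiliar with Process Behavior Charts, the core arithmetic behind an XmR chart is small enough to sketch. The weekly story-point throughput numbers below are made up for illustration; they are not from the episode.

```python
# A minimal Process Behavior Chart (XmR chart) calculation: the mean line
# plus natural process limits derived from the average moving range.

def xmr_limits(values):
    """Return (mean, lower natural process limit, upper natural process limit)."""
    mean = sum(values) / len(values)
    # Average moving range between consecutive data points.
    avg_mr = sum(abs(b - a) for a, b in zip(values, values[1:])) / (len(values) - 1)
    # 2.66 is the standard XmR scaling constant for individual values.
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

# Hypothetical weekly story points completed by a team.
weekly_points = [21, 18, 25, 22, 19, 24, 20, 23]
mean, lnpl, unpl = xmr_limits(weekly_points)
print(f"mean={mean:.1f}, limits=({lnpl:.1f}, {unpl:.1f})")

# A point outside the limits would signal a special cause worth investigating,
# rather than routine week-to-week variation ("tampering" reacts to the latter).
for week, points in enumerate(weekly_points, start=1):
    if not lnpl <= points <= unpl:
        print(f"week {week}: {points} is outside the natural process limits")
```

With this sample data the limits work out to roughly (10.9, 32.1) around a mean of 21.5, so every week falls inside them: the variation is routine, and chasing any single week's number would be the tampering the episode warns about.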
This podcast is part of the #LeanCommunicators network. 

Tech Lead Journal
#92 - Agile and Holistic Testing - Janet Gregory & Lisa Crispin

Tech Lead Journal

Play Episode Listen Later Jun 13, 2022 61:16


“Testing is an activity that happens throughout. It is not a phase that happens at the end. Start thinking about the risks at the very beginning, and how we are going to mitigate those with testing."

Janet Gregory and Lisa Crispin are the co-authors of several books on Agile Testing and the co-founders of Agile Testing Fellowship. In this episode, Janet and Lisa shared the agile testing concept and mindset with an emphasis on the whole team approach, followed by an explanation of the holistic testing concept with a complete walkthrough of how we can use the approach in our product development cycle, including how Continuous Delivery fits into holistic testing. Janet and Lisa also described some important concepts in agile testing, such as the agile testing quadrants (to help classify our tests) and the power of three (aka the Three Amigos). Towards the end, Janet and Lisa also shared their perspective on exploratory testing and testing in production.

Listen out for:
Career Journey - [00:06:35]
Agile Testing - [00:13:56]
Whole Team - [00:15:17]
Agile Testing Mindset - [00:19:19]
Holistic Testing - [00:24:42]
Continuous Delivery - [00:34:53]
Agile Testing Quadrants - [00:39:03]
The Power of Three - [00:42:50]
Exploratory Testing - [00:47:08]
Testing in Production - [00:50:49]
3 Tech Lead Wisdom - [00:54:10]

Follow Janet and Lisa:
Janet's Website – https://janetgregory.ca
Janet's Twitter – @janetgregoryca
Janet's LinkedIn – https://www.linkedin.com/in/janetgregory
Lisa's Website – https://lisacrispin.com
Lisa's Twitter – @lisacrispin
Lisa's LinkedIn – https://www.linkedin.com/in/lisa-crispin-88420a
Agile Tester Blog – https://agiletester.ca/blog
Agile Testing Fellowship Website – https://agiletestingfellow.com

Our Sponsor: Today's episode is proudly sponsored by Skills Matter, the global community and events platform for software professionals. 
Skills Matter is an easier way for technologists to grow their careers by connecting you and your peers with the best-in-class tech industry experts and communities. You get on-demand access to their latest content, thought leadership insights as well as the exciting schedule of tech events running across all time zones. Head on over to skillsmatter.com to become part of the tech community that matters most to you - it's free to join and easy to keep up with the latest tech trends. Like this episode? Subscribe on your favorite podcast app and submit your feedback. Follow @techleadjournal on LinkedIn, Twitter, and Instagram. Pledge your support by becoming a patron. For more info about the episode (including quotes and transcript), visit techleadjournal.dev/episodes/92.

Testing Peers
Testing in Agile

Testing Peers

Play Episode Listen Later Apr 24, 2022 30:32


Welcome to another episode of the Testing Peers podcast. Today we talk Agile and our experiences around it.

This week Simon borrows his banter ideas from the More Than Work Podcast. Once the banter is done, we move on to talk about agile/Agile and what it isn't. We talk about the mindset that is needed and how transformations go badly or well. We talk about shortcuts and bad implementations along with the good. We also share a quote from Vince Lombardi about chasing perfection: 'Gentlemen, we will chase perfection, and we will chase it relentlessly, knowing all the while we can never attain it. But along the way, we shall catch excellence.'

After talking agile, we share a few useful resources: 'Agile Testing', 'More Agile Testing' and 'Agile Testing Condensed' from Lisa Crispin and Janet Gregory, the Agile Manifesto, Mike Cohn's Mountain Goat Software, and Alsatian blogs.

We hope you found the discussion useful and would love to hear your feedback.
ContactUs@TestingPeers.com
Twitter (https://twitter.com/testingpeers)
LinkedIn (https://www.linkedin.com/company/testing-peers)
Instagram (https://www.instagram.com/testingpeers/)
Facebook (https://www.facebook.com/TestingPeers)
We're also now on GoodPods, check it out via the mobile app stores.

If you like what we do and are able to, please visit our Patreon to explore how you could support us going forwards: https://www.patreon.com/testingpeers

Saffron QA is a provider of recruitment and consultancy services, exclusively for the software testing industry. You can find out more at https://saffronqa.co.uk/ or on LinkedIn at https://www.linkedin.com/company/saffron-qa/

Support the show (https://www.patreon.com/testingpeers)

Automation Hangout
How to Improve the Effectiveness of Agile Testing

Automation Hangout

Play Episode Listen Later Apr 5, 2022 34:00


Organizations must meet the demands for speed, quality, and flexibility to manage the expectations of both the business team and end customers. Organizations have made significant improvements in the past decade, implementing Agile methodologies like Lean-Agile, Kanban, Scrum, and Extreme Programming, along with modern software development tools. However, they are challenged by a lack of collaboration, low automation levels, and a low focus on continuous improvement. The pandemic and remote ways of working have also impacted some Agile practices and ceremonies. In this episode, we talk with Lisa Crispin, an industry-renowned Agile testing coach and practitioner. Lisa talks about improving software testing effectiveness, driving continuous improvement, and increasing test automation levels. Lisa also explains the need for a common automation toolset and for reviewing feedback from production.

Compile Podcast / پادکست کامپایل
An Overview of the Book Agile Testing

Compile Podcast / پادکست کامپایل

Play Episode Listen Later Feb 15, 2022 87:14


This is a joint episode with Pedram Keshavarzi from the Agile Gap podcast, in which we try to talk about the topic of testing in Agile. Pedram is himself an Agile coach and Scrum master, and on his podcast you can find good information about agile development. I definitely recommend checking it out. In this episode, we take a brief look at one book, a book whose subject matches the title of this episode. I hope you enjoy it.
Editing: Pedram Keshavarzi
Poster design: Seyed Mohammad Hossein Bathaei
Link to the Agile Gap podcast
Link to the book
Link to the video course

The Testing Show
The Testing Show: Do We Still Need the Phrase "Agile Testing"?

The Testing Show

Play Episode Listen Later Feb 2, 2022 34:50


Agile as a development practice (and by extension Agile Testing) has been around now for two decades. By virtue of that, many changes and adaptations have been made and it brings up a simple question: is the term "Agile Testing" all that relevant any longer? To this end, Matthew Heusser and Michael Larsen have a chat with Janet Gregory and Jenny Bramble about Agile Testing as a name and as a practice and dare to consider... is an Agile Testing practice by any other name just as effective?

Parlons UX Design - Podcast
#65 - UX Testing and Agile Methodology

Parlons UX Design - Podcast

Play Episode Listen Later Jul 2, 2021 18:49


I took a training course on the SAFe methodology, and I'm sharing my questions about the place given to UX testing within this methodology... Note that I am neither certified nor an expert. I'm sharing feedback from a training course and what I understood of it. Feel free to share your opinion in the podcast's comments section or to start a discussion with me on LinkedIn...

The SAFe methodology from Scaled Agile: https://www.scaledagileframework.com/

Moving away from the waterfall dynamic of doing tasks sequentially and in order, which can lead you into a tunnel effect. I talk about this in episode 28.

Agile Testing: https://www.scaledagileframework.com/agile-testing/

Thank you for listening to this podcast; I invite you to subscribe so you don't miss the next episodes. If you want to know more about me, check out my LinkedIn profile => https://www.linkedin.com/in/thomas-gaudy/. If you would like help implementing these concepts and tools in your teams and projects, you can call on my services as a UX Design consultant. Just contact me via my LinkedIn profile or visit our website (ludocielspourtous.org) under the Our Services section. Cheers!

Editing: Stéphanie Akré

Podcast jingle: We would like to warmly thank Gordon W. Hempton, The Sound Tracker®, who donated his entire wonderful library of sounds collected in nature.

Coding Over Cocktails
Changing your approach to testing with Alan Richardson

Coding Over Cocktails

Play Episode Listen Later Jun 4, 2021 38:55


In this episode of Cocktails, “The Evil Tester” Alan Richardson joins us for a round and discusses how we can change our approach towards testing, as well as the important qualities that a good tester should have. He also talks about Agile Testing in relation to development, as well as various models on testing that we're seeing today.

Agile World
Agile World S2 E8 Sabrina C E Bruce and Karl Smith talk with Andrew Palmer about his Agile journey

Agile World

Play Episode Listen Later Apr 13, 2021 34:54


Agile World with our hosts Sabrina C E Bruce and Karl Smith talk with Andrew Palmer https://www.linkedin.com/in/mrajpalmer/ about his Agile journey, Agile Testing, quality in organisations, and all manner of things. Agile World magazine show with Sabrina C E Bruce and Karl Smith on YouTube. Agile World is a spin-off from The Agile20Reflect Festival https://agile20reflect.org/ now called Access Agile https://access-agile.org/ and affirms its commitment to a Global Agile Community.

#Agile_World #AgileWorld #Agile #AgileTalkShow #AgileManifiesto #AgileCoach #ScrumMaster #Agile20ReflectFestival #Agile20ReflectEvent #Agile20Reflect

Online
Website https://agile-world.news/
LinkedIn https://www.linkedin.com/company/agile-world-news/
Facebook https://www.facebook.com/agileworldnews
Twitter https://twitter.com/AgileWorldNews
Tumblr https://www.tumblr.com/blog/view/agile-world
YouTube https://www.youtube.com/c/AgileWorld

Podcast
Spotify https://open.spotify.com/show/1aMY1R5ct7EqrehR4aZUat
Apple Podcasts https://podcasts.apple.com/gb/podcast/agile-world/id1553727032
Google Podcasts https://www.google.com/podcasts?feed=aHR0cHM6Ly9hbmNob3IuZm0vcy80Y2FmNDhmYy9wb2RjYXN0L3Jzcw==
Pocket Casts https://pca.st/vbyfqprr
Anchor https://anchor.fm/agile-world
Breaker https://www.breaker.audio/agile-world
Radio Public https://radiopublic.com/agile-world-WPNL9j

Co-hosts
Sabrina C E Bruce https://www.linkedin.com/in/sabrinabruce/
Karl Smith https://www.linkedin.com/in/karlsmith2/

Agile World © 2021 Karl Smith and Sabrina C E Bruce --- Send in a voice message: https://anchor.fm/agile-world/message

Software Engineering Unlocked
Episode 33: From intern to CEO with agile testing expert Alex Schladebeck

Software Engineering Unlocked

Play Episode Listen Later Dec 22, 2020 50:08


Today's episode is sponsored by CodeSubmit – the best take-home assignments for your tech hiring!

Links:
Alex's Twitter
Alex's Website
Mob testing blog post
Bredex website
Book reference about "being technical"

Subscribe on iTunes, Spotify, Google, Deezer, or via RSS.

With Great People
Lisa Crispin: How to transform your workplace into a safe place

With Great People

Play Episode Listen Later Nov 17, 2020 25:05


In this episode, Richard interviews Lisa Crispin, a quality owner at OutSystems, co-founder of Agile Testing Fellowship Inc, and one of the most influential testing professionals in the software industry. Lisa co-authored several books, including Agile Testing and Testing Extreme Programming, and was a curator of the www.testingindevops.org website. Lisa tells us about the significance of cultivating trust among the team members and how the resulting feeling of safety contributes to the increase of the team's productivity. When you finish listening to the episode, make sure to connect with Lisa on LinkedIn at https://www.linkedin.com/in/lisa-crispin-88420a/ or Twitter at https://twitter.com/lisacrispin, and visit her website at www.lisacrispin.com. You can read the full transcript of the episode at kasperowski.com/podcast-53-lisa-crispin/.

Software Crafts Podcast
Interview with Lisa Crispin

Software Crafts Podcast

Play Episode Listen Later Sep 29, 2020 34:33


In this episode, Lisa Crispin shares her experiences with the pattern "Delayed Automation" from the Cloud Native Patterns repository (https://www.cnpatterns.org/development-design/delayed-automation). We discuss the different trade-offs of applying it in different contexts. I also ask a long-standing question: what can we learn from donkeys? If you are curious why, donkeys are Lisa's brand!

Lisa recommends:
Quality Coaching Roadshow podcast from Anne-Marie Charrett
Accelerate book from Nicole Forsgren, Jez Humble and Gene Kim
Leading Quality book from Ronald Cummings-John and Owais Peer

Lisa Crispin (@lisacrispin) is the co-author, with Janet Gregory, of three books: Agile Testing Condensed: A Brief Introduction; More Agile Testing: Learning Journeys for the Whole Team; and Agile Testing: A Practical Guide for Testers and Agile Teams; the LiveLessons Agile Testing Essentials video course; and "The Whole Team Approach to Agile Testing" 3-day training course offered through the Agile Testing Fellowship. Lisa was voted by her peers as the Most Influential Agile Testing Professional Person at Agile Testing Days in 2012. She is co-founder with Janet of Agile Testing Fellowship, Inc. Please visit www.lisacrispin.com, www.agiletestingfellow.com, and www.agiletester.ca for more. Lisa is currently a Fellow Quality Owner at OutSystems, helping with the observability practice.

Agile Coaching Network
Agile Testing Strategies and Coaching an Overwhelmed Product Owner

Agile Coaching Network

Play Episode Listen Later Aug 15, 2020 48:07


In this episode, we talk about Agile testing strategies and what they look like in different companies. We also talk about coaching a new product owner who is overwhelmed by the job.

(00:00) Introduction
(01:41) Agile testing strategies
(34:17) Coaching the overwhelmed product owner
(46:00) Wrap up

This podcast is licensed under CC BY-NC-ND 4.0. If you want more information about the Agile Coaching Network, please go to AgileCoachingNetwork.org

Support the show (https://www.agilealliance.org/membership-pricing/)

Fast and easy tech!
Agile Testing : Real World Outlook

Fast and easy tech!

Play Episode Listen Later Aug 14, 2020 23:58


The test process is not a separate process that can be outsourced or left alone; it is merged with the development process. The true benefits of Agile testing are revealed when it is used as part of Agile development and accorded equal importance. The sooner the testing team and testers get involved, the better it is for the project. Ideally, testers should be part of the overall scheme of things from day one, simply because giving testers a seat at the table from the start provides a higher level of insight into requirements and goals, encourages collaboration, and helps hammer home the need to conduct frequent, continuous testing.

LinkedIn Ads Show
Ep 27 - Agile Testing For Your LinkedIn Ads Management

LinkedIn Ads Show

Play Episode Listen Later Aug 4, 2020 27:42


Show Resources:
Episode 25 - How to Optimize Your LinkedIn Ads For Better Performance
Episode 06 - LinkedIn Ads Bidding & Budgeting Strategies
Episode 15 - Benchmarking Your LinkedIn Ads
LinkedIn Learning course about LinkedIn Ads by AJ Wilcox: LinkedIn Advertising Course
Contact us at Podcast@B2Linked.com with ideas for what you'd like AJ to cover.

Show Transcript:

You've heard of agile development. But have you heard of Agile LinkedIn Ads management? Strap in, I'm gonna show you how it's done. Welcome to the LinkedIn Ads Show. Here's your host, AJ Wilcox.

0:21 Hey there LinkedIn Ads fanatics. So Episode 25 was all about the types of optimizations that you can make in your account to show improvement. But you've all been asking me how often you should be doing this optimization and how much data you need and how fast you can go. So I'm going to break down exactly how you can go and launch, test, and pivot with your LinkedIn Ads in a very agile way so you can move faster and conquer your competition. In the news, LinkedIn did a survey where they had just over 800 marketers across the globe kind of self report on how they'd been disrupted during the whole COVID situation, and the results were quite interesting. About 10% of respondents said that their business had been totally disrupted. So this is a group who are trying to save their marketing budgets, and basically try to stay in business and not lay anyone off. Then about 70% responded that it was business, but unusual. So they're spending time on strategic planning. They've been mildly disrupted, and they're trying to figure out ways to get around and get through. But business is still going. And then you had 20% who really said their business had been evolving. These were companies in a category that are really thriving right now. They're in a growing market. 
They're finding new customers, they're taking advantage of these new opportunities that the post-COVID world is providing. From those that I've spoken to, I would say that these percentages are probably about accurate. In the survey results, they linked to a document that they call Driving ROI. And it's a set of best practice resources that LinkedIn compiled to help marketers. So I clicked on it, I read it, it was from 2019. And it is something that I'd read before but had kind of forgotten about. And I want to point out just some interesting findings here. And I've linked to this in the show notes down below. But it's called the long and short of ROI. As you consume this, you can just hear LinkedIn ranting during the whole piece about the complaints that they hear so often from advertisers. They really come to three points here. Number one, marketers are measuring ROI too quickly. Number two, when they say they're measuring ROI, they're not really measuring ROI. And number three, internal pressures are causing marketers to move too fast and actually make poor decisions. And although it does sound like a bit of a rant, I totally understand where they're coming from. I'm sure they've had many advertisers quit where they say, you know what, we had to quit, we're just not seeing the return on our investment here. So this is LinkedIn reminding us: if your sales cycle is six months long, quit complaining when you haven't seen ROI after four months of advertising. We talked about this in Episode 24, all about funnels. But LinkedIn is going to look really expensive if you're just tracking to the cost per lead, and if you're not measuring past that, to cost per sales qualified lead, or cost per proposal, or cost per closed deal. Those are the points in your sales process where LinkedIn is gonna start looking really good.
So if you're not tracking all the way to that, you probably won't have enough faith in the platform to give it a real shot, or even keep going. We're also planning a Q&A episode here in the next few episodes. This is our first of hopefully many, and I want to get your Q&A questions. So any question you have about LinkedIn, I mean, we'll make it a total potpourri. Email them over to us at Podcast@B2Linked.com and we'll do our best to include it. I want to highlight a couple new reviews on our iTunes page. Naira Perez, who is the founder of SpringHill Digital up in Portland, is a LinkedIn Ads and social ads expert. I've gotten to talk to her on many occasions. She's amazing. Okay, so her review says two words, "amazing and useful". "AJ gives you actionable lessons. He doesn't keep secrets when it comes to optimizing ads on LinkedIn. He shares what has worked for him and what hasn't. If you run LinkedIn ads, listen to AJ, you will learn something in every episode, he is the gift that keeps on giving". Naira, thank you so much for the kind words, I'm so glad that you're getting a lot of use out of this. The next one comes from Mark Gustafson, who's the founder of 900Kings, and actually a close friend. He's actually the inspiration and my fact checker for episodes eight and nine, all about Facebook ads and Google ads. He's a fantastic paid search and social marketer. So Mark says "best B2B Advertising resource". "AJ is the best in the business. There isn't anyone else I turn to with B2B questions. He's easily the most knowledgeable about the LinkedIn ads platform. The podcast is pure value and perfect for the newest B2B marketer all the way to the most senior. Also, can we talk about that dreamy voice? I could listen to those dulcet tones for days." Mark, thanks for the kind words, and thanks for turning me bright red behind the mic. I'm so glad this isn't a video podcast. And seriously, listen to my voice. There is nothing sweet about this.
I'll rant for a second. I've got this crazy accent from growing up in both Utah and Arizona, learning Spanish in high school, and learning Russian after high school and before college. Basically, I don't even know how to describe my own accent. I sound real weird and I fully admit it. Okay, cool. Now I want to feature you, so please do make sure you review on whatever podcast player you listen to this on. Leave a review. I'd love to shout you out on air. Thanks in advance for that. Okay, with that being said, let's hit it.
5:59 Agile Testing
We're gonna talk about agile testing. So what is agile? Well, agile methodology really started out in project management as a way for cross functional teams to move quickly and build collaboratively through continuous improvement. Now, you may have heard of development teams working in two-week sprints, or doing daily stand up meetings. This all comes from the whole agile movement. We've adapted this to LinkedIn Ads management because it's a process that really requires continuous improvement, just like project management. So what is agile management of LinkedIn Ads? Well, to me, it's really making quick decisions on results from your LinkedIn Ads, so that you can learn more and test faster, find out what performs, and then do more of that and have success for longer. So I'm going to share the agile process that we follow. And it all starts the moment we launch new ads. So when we launch new ads, we try to launch on either a Monday or a Tuesday whenever possible. And that's because those two days are the days where LinkedIn traffic is the strongest. We try to stay in the morning because morning tends to be the strongest traffic time for LinkedIn. We're always going to launch two ads, an AB test, where we are varying something so that we have something to compare against. Because if you just launch one ad, whether it performs well or whether it performs poorly, you don't know what caused that.
But if you launch two at a time, you're giving yourself a better chance of having something that's going to be successful and getting to compare against what wasn't successful. Now if our Monday or Tuesday happens to land on a holiday, or the next few days are a holiday, we try to postpone and launch either a week earlier or a week later. Again, when possible. Sometimes you've got a gun to your head and you just have to get ads launched. I'm sure you've listened to episode six because it was one of our most popular episodes. It's all about bidding and budgeting, and it's the strategy we use to get the lowest cost from LinkedIn, no matter what your budget and your performance are. So if you've listened to that, you know that you're going to start with cost per click bidding. And you're going to bid really low to keep your risk low as you're testing. Right after your ads go live for the first day or day and a half, LinkedIn is testing your ads to try to figure out what the relevancy score is going to be. In order for them to test, they seem to give you pretty prime placement for your ads. And they're going to show quite a few impressions to your audience. And really, they're going to give you the benefit of the doubt in most situations. Regardless of how you're bidding, chances are, they'll probably show you towards the top of the rankings. So you're probably getting impressions that are worth more than what you're paying, especially if you're bidding low. But once a day to a day and a half has passed, LinkedIn has shown your ad enough times, they've given it enough impressions, that they can give you this relevancy score. And let's say it's a number between 0 and 10. Based off of that relevancy score, over your next few days it's going to become very evident how you're performing. So days two and three, we're watching to see what happens. LinkedIn has given you the relevancy score that they think you deserve.
I think they make the decision a little bit quick, but so do all the other platforms. But you're really on your own now. And it could result in three possible outcomes here. So the first outcome is, you have a really high click through rate right out of the gate. And so LinkedIn gives you a great relevancy score, and you immediately start spending everything you want to. And if you click performance chart inside of campaign manager and look at these campaigns by impressions, it will look like a couple of flat days where LinkedIn was giving you the benefit of the doubt, and then a spike upward when you actually outperformed their expectations. That's fantastic. On the opposite side of the spectrum, you could also come out of the gate with a low click through rate and a poor relevancy score. You'll know this happened when your impressions really fall off a cliff. So if you go to, again, performance chart inside of campaign manager, and you look at the impressions by day: your first day, LinkedIn gave you a bunch of impressions. The second day, it was kind of halfway through where they decided you were a poor performer. And then by the third day, they just didn't deliver much. It looks like the downslope of a mountain. And the third potential outcome is really you did okay, you got an okay click through rate and a decent relevancy score. And things might continue the way that LinkedIn kind of predicted they were. So now we'll dive into what you can do, based off of which outcome you really landed in.
10:40 Outcome Number 1
So outcome number one, you did great, your ads are attractive enough to get traffic, but don't rest on your laurels yet, it's not over. Once you get people to click, now you need to convert them. So assuming things are looking good on the click through rate, and people actually care about your ads, now you're going to go into data gathering mode: let your ads run for the next $300 to $1,000 in spend and get a feel for the conversion rate.
If you're happy with your conversion rate and your cost per lead, just let it ride. Go back into data gathering mode; you're gonna go hands off and leave it alone so you can accumulate enough data to analyze and use to optimize later. Go listen to Episode 15, if you haven't already, because it's all about benchmarks, and we go super deep into how you can tell what's working and what's not, so you can focus in the right area. And you want to make sure that you're watching this performance over time, because we have this thing happen in social advertising especially, called ad saturation, or audience fatigue. And what that is, is you are showing your same ads and offers to the same people over and over, and on a pretty good performing ad, still only about 1% of people who see it will end up clicking on it. So that means 99% of people potentially see your ad and go, nah, I don't want to click on it. Even the people who might want to click on it, if it's the fourth, fifth, eighth time they've seen this exact ad, they're going to take a mental note of it, and then just skip it next time; they become banner blind. And the way this will manifest itself in your account: if you go back into performance chart in campaign manager, and switch to looking by average click through rate, you can see over time that your click through rate is starting to drop. We found this period to be about 27 to 33 days on average, which is about a month. So what that tells us is, if things are going pretty well, we might check on it two or three weeks from now and just see, are our click through rates decreasing significantly? Does it feel like these ads have lost steam? Have they dropped in relevancy score, leading to higher cost per click or lower delivery? And we know, because the average is about a month.
That means once a month, we're going to plan on refreshing our ad creative or testing a new offer, even if it was a fantastically powerful, high performing offer to begin with, because over time anything will become a low performer if you've saturated your audience hard enough, and that's mostly ad saturation, people getting sick of seeing the same ads. And you can relieve them of this by simply just changing the image. You may also want to change your ad copy in case they've already clicked or maybe even converted. But what's most important here is the ad needs to look different and stand out.
13:35 Audience Fatigue
Now audience fatigue is something that's a little bit different. Let's say you've been advertising heavily for the last four or five years. Or maybe we'll simplify it, let's say for the last one year, you've been going heavy on an audience. You've been religiously changing your ad copy, keeping things fresh, trying new offers entirely. You may find that your performance decreases over time because that audience has already heard of your company and is now starting to ignore anything from your company. This isn't a great place to be in, because if you've been advertising heavily for years and years, chances are you are really relying on the leads that this platform pushes. But it could be a sign that you either need to really vary things up, or even try going on pause for a little bit, giving your audience a rest, and then see if they come back after a little while. So that's how I handle it when the launch was very successful and your ads did great.
14:30 Outcome Number 2
But what if you had a failed launch? This is outcome number two. It means your ads really didn't land with your audience. There could be a lot of reasons for that. You might have done a poor job of actually defining who that audience was. Your ad might not have looked attractive enough for them to stop scrolling.
And maybe most often, your ad copy didn't do a good job of pressing on a pain point, or calling out to them in a way that they cared about. But for whatever reason, your ads came out of the gate with a poor click through rate and you've got a bad relevancy score. LinkedIn gave you a small sample of impressions for a day to a day and a half, and it means your ads didn't get enough clicks to qualify for a high relevancy score. So performance is likely going to be terrible. You can force it, you can specifically go in and bid higher or do automated bidding. But if you do this, you will pay way too much per click, and you will just be getting robbed by the platform. When ads come out of the gate performing poorly, sometimes it's really unfair. Sometimes LinkedIn unfairly awarded you a poor relevancy score because in the sample they showed, they didn't see enough clicks, and so they assumed you had a poor relevancy score. But you can resurrect this by just launching the same ads again. So if we launch on a Friday or a Saturday and our ads just die, most of the time what we'll try doing is just go and relaunch exactly the same ads on a Monday or a Tuesday, and just see. We're giving LinkedIn just one more chance with these ads to see if they unfairly awarded a poor relevancy score. If they fail for a second time, though, especially on a Monday or a Tuesday morning launch, then we know something's wrong with that ad creative, and we need to go back to the drawing board, or continue to suffer the most expensive cost per click you've ever seen from any channel. If you're having a hard time getting people to click on your ads, or getting LinkedIn to even serve them, at this point you probably have not gotten enough traffic to find out how it's going to convert on your offer. Or maybe you did get a decent conversion rate, but since the clicks are costing so much, your cost per lead wouldn't be worthwhile in the end anyway.
So at this point, things are going poorly and you should plan on just pausing these ads and relaunching an entirely new test. Episode 24 was all about funnels, so make sure you're paying attention to the right thing. If your click through rates are doing great, don't go and test new ad copy. And if your conversion rates are great, don't go and make major landing page changes. First start with the lowest hanging fruit, the parts of the account that are having the most trouble. If what you're trying isn't working, you can really try something radically different. And like I mentioned earlier, an AB test is going to give you a better chance of finding success with at least one of your variations. So try launching two ads where you vary the pain point you're pressing on. Or maybe you're touching on a different motivation. Or maybe you're even testing different calls to action or offers. You never know how something is going to perform until you test it. So don't be afraid to launch new ads and quickly retire them or pause them if they're not living up to your standards. Then we have the third outcome, which is kind of just okay performance. There are certainly things you can do to try to increase performance. But if things are just going okay, I would say, just like outcome number one where we had a successful campaign, go into data gathering mode. And then once you have enough data, evaluate to see if it's worth continuing or if there's something in the account that needs to be improved. Here's a quick sponsor break, and then we'll dive into maintaining an account once it's doing well.
18:15 The LinkedIn Ads Show is proudly brought to you by B2Linked.com, the LinkedIn Ads Experts.
18:25 If the performance of your LinkedIn Ads is important to you, B2Linked is the agency you'll want to work with. We manage LinkedIn's largest accounts and we are the only media buying agency to be official LinkedIn partners. And performance to your goals is our only priority.
Fill out the contact form on any page of B2Linked.com to get in touch, and we'd love to help you absolutely demolish your goals.
18:47 Continued Maintenance
Alright, let's jump into continued maintenance. We talked about ad saturation, how after about a month your ads aren't going to be fresh anymore and you've got to change them up. So plan on doing that at least once per month, and follow the same agile testing steps that I mentioned before. Every new ad launch, you're going to take a look at it and say, okay, for the first three days, I'm going to watch and see, do these perform well? Are they getting a good click through rate? Are we getting a good cost per click? And then over the next week or two, you're saying, okay, is this leading to the conversions at the right costs that I want? You'll also want to keep your offers or your calls to action fresh, because you can change the image and the ad copy of your ads a ton of times, but eventually people are going to catch on if you've been advertising exactly the same offer for the last six months. Every offer really has its own life. We've seen some where after a month that audience is just done, and then we had one account where we had a winning offer that we couldn't dethrone for like seven months. No matter what other offers we threw at this audience, they kept preferring the one from seven months ago, and it was still converting, even though it wasn't converting at the same rate that it was at the beginning, so we were trying to get it off its throne. So if you've been running an offer for about a month, chances are you can refresh your ad creative, use a new visual, and get that offer to live for another month, maybe even two. So watch that performance. Specifically, watch your cost per lead and your conversion rate as you go.
As soon as you see that conversion rate start to slip, that's probably a great sign that you need to change up your call to action, give them a different kind of offer, something that's new, that they will actually consider if they've already seen the other one several times and have decided, oh, I've already converted, or no, I'm not interested in that. And then keep that up, rotate through new ad creative and new offers as needed as your performance starts to decrease. And if you do this, congratulations, your lead generation machine is complete. To maintain it, what you need to do is just keep feeding it new ads, new headlines, new intros, and new offers, only when they're needed. And then this entire time you're gaining knowledge, you're learning about your audience along the way, you're finding out what they like and what they don't.
21:14 Pain Points
Here at B2Linked, we do a lot of ads troubleshooting. And so I thought it would be helpful to at least share with you how we think about finding pain points and what potential solutions are. So let's say your ads aren't performing well. That means either you're getting a high cost per click, or a low click through rate, or even both; they oftentimes go together. What you can do is try new ad copy, new imagery. If it's a video ad, try new video creative. And after two or three different tests of messaging or visuals, if it's still not getting clicked, chances are the problem is your offer. You're probably asking people to do something that they are either unwilling to do and it scares them away, or they just don't see value in it. What about your conversion rates? Let's say you're not happy with how much you're paying per conversion, or your conversion rate is low. There are two things that we like to test here. And the first is to evaluate your landing page.
It's possible that your offer itself is really attractive, but maybe the way that your landing page is laid out, or the elements on it, are getting in the way or distracting, and it's decreasing your conversion rate. One easy way to test whether it's your landing page that's getting in the way, or the offer, is to test the same exact ad, but run it as a LinkedIn lead gen form. So you're asking people to fill out the same form, but you're skipping the landing page and the website visit altogether. Now lead gen forms, as I'm sure you know, tend to convert significantly higher. So we expect that when we do this test, the lead gen form is probably going to convert, let's say, 10 to 50% higher. But if it is significantly more than that, let's say the conversion rate doubles or triples, that's my first clue that something on the landing page was getting in the way and you need to do some testing there. If it's not the landing page, though, it's likely the offer itself. Listen to Episode 10, where we go really deep into offers. That'll give you some great ideas on how to ideate, formulate, and create new offers. What if your ads and conversions are all going really well, but sales reports back that they're not closing these deals? Well, there's a lot of different things that are possibly out of your control here as a marketer, but maybe your sales team isn't nurturing right? Or, and this is a hard realization, you might not have the right product market fit. Maybe the product or service that you're selling doesn't solve a significant enough pain point that people want to buy. And no amount of snazzy marketing can fix that. Those are obviously much deeper problems, but see if you can isolate where they are, take off your marketer hat, put on your sales enablement hat, and go and try to solve that problem, and that will earn you some significant quarterly bonuses.
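The landing-page-versus-offer check AJ walks through can be sketched as a quick heuristic. This is just an illustration with hypothetical numbers and a made-up function name, assuming the episode's rule of thumb: a lead gen form normally converts roughly 10 to 50% higher than a landing page, so a lift far beyond that range (a doubling or more) points at the landing page, while a lift inside the normal range suggests the offer itself is the problem.

```python
def diagnose_conversion_gap(landing_page_rate, lead_gen_form_rate):
    """Compare conversion rates for the same ad run against a landing page
    versus a LinkedIn lead gen form (heuristic from the episode)."""
    if landing_page_rate <= 0:
        raise ValueError("landing page conversion rate must be positive")
    # Relative lift of the lead gen form over the landing page
    lift = (lead_gen_form_rate - landing_page_rate) / landing_page_rate
    if lift >= 1.0:
        # Form converts 2x+ better: the landing page is likely in the way
        return "landing page problem"
    elif lift <= 0.5:
        # Within the expected 10-50% form advantage: suspect the offer
        return "offer problem"
    return "inconclusive"

# Example: 3% on the landing page vs. 9% on the lead gen form
print(diagnose_conversion_gap(0.03, 0.09))  # -> landing page problem
```

The thresholds here are rough translations of the spoken rule of thumb, not anything LinkedIn publishes; adjust them to your own account's history.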
24:06 Goal of agile management
The goal of agile management of LinkedIn Ads is really threefold. Number one, you want confidence that what you're doing is working and will lead to business. And this is hard, because in B2B we oftentimes have these long sales cycles. And it doesn't make sense to keep advertising for a year and a half if you don't know for sure that there's going to be some revenue that comes from that; you'd want to have stopped advertising much earlier. So as a marketer, you're looking for shorter term clues that the traffic will convert into profitable sales. Goal number two is you want to keep fresh with your audience so that they don't get sick of your ads, your offers, or even your company. If you can stay fresh, your performance won't decline significantly. And this is fantastic, because social ads of any kind really are a moving target, because the same thing that worked two months ago may totally fail today. And that's just because of ad saturation. So do your part to keep things looking fresh so that you never get stale to that audience. And then goal number three: you want to identify the problems and inefficiencies of your account so that you can fix them early and they don't turn into something terrible later on. Okay, I've got the episode resources for you coming up, so stick around.
25:28 Thank you for listening to the LinkedIn Ads Show. Hungry for more? AJ Wilcox, take it away.
25:39 References
I referenced quite a few other episodes in here, so check those out in the show notes if you haven't listened to them already. Episode 25 is all about optimization of your account, making it better when it's already doing well. Episode 6 is on bidding and budgeting. Episode 10 is all about offers. And Episode 15 is all about benchmarking, to see where you stand, whether you're performing well or poorly.
If you're new to LinkedIn ads, or if you have a colleague you're trying to train, check out the course I did: the LinkedIn Ads course on LinkedIn Learning. There's a link for that down below. But because it's LinkedIn Learning, it is insanely inexpensive, and it's a great training. It's the same thing that I charge $500 an hour for, and it would take me an hour and a half to train you and your team. And through LinkedIn, you can get it for 25 bucks. Or if you have a LinkedIn premium subscription, it's free. Take a look at your podcast player right now, especially if you're new, if this is your first episode. If so, congratulations. Welcome! And hit that subscribe button. We want to make sure you stick around and hear more awesome LinkedIn Ads strategies. Please do rate and review us on whatever podcast player you lean on. We're especially looking for Stitcher right now. So if you happen to be a Stitcher user, I would love it if you go and review us there, because we only have one lonely one there. And of course, I'll shout you out for your review, as long as I know that it's you. So give me something good to say about you there. And like I mentioned at the beginning of the show, get your Q&A questions in, email us at Podcast@B2Linked.com. And actually, feel free to email us with ideas or topics you'd like to see us cover, or questions, anything that you'd like. But especially I'm looking for good Q&A questions to cover in a whole Q&A episode. Hopefully, it's the first of many. So I'll see you back here next week, cheering you on in your LinkedIn Ads initiatives.

Podcast do Júlio
Episode 3 @ Season 1: International testing events, an interview with Larissa Rosochansky & Rafael Cintra on Design Thinking in testing, and answers about preparing talks

Podcast do Júlio

Play Episode Listen Later Mar 8, 2020 45:49


The benefits of attending international events such as EuroSTAR, STARCanada and STAR(East|West), and building new knowledge on top of existing knowledge. An interview with Larissa Rosochansky & Rafael Cintra about Design Thinking in testing and how their talk was submitted and accepted at one of the largest international software testing events, Agile Testing Days, in Chicago, IL. My answers to questions about preparing talks, international events, and building new knowledge by blending testing with other fields. Testers mentioned in this episode, in chronological order: James Bach, Michael Bolton, Janet Gregory, Lisa Crispin, Elisabeth Hendrickson, Lee Copeland, Elias Nogueira, Walmyr Filho, Moisés Ramírez and Samanta Cicilia.

De Voorproeverij
S01E11: All about (Agile) Testing

De Voorproeverij

Play Episode Listen Later Dec 24, 2019 71:59


The end of the year: 2019 is almost over, and with it comes the last episode of the first season of De Voorproeverij! In this 11th episode (or is it actually the 12th?) we have, as the cherry on our cake, Huib Schoots (@huibschoots) of the Agile testers as a guest. Huib gives his opinion on all kinds of things and …

The Testing Show
The Testing Show: Agile Testing – From DevOps to Continuous Delivery

The Testing Show

Play Episode Listen Later Dec 19, 2019 32:43


Agile Testing has now been around in some form or another for two decades, yet it seems that what people are calling Agile Testing and what Agile Testing actually is are still two different things. Why is there such a gap in both understanding and practice? Matt Heusser and Michael Larsen welcome Lisa Crispin, Elle Gee and Jamie Phillips to discuss exactly that. In the process, we get into how Agile is practiced in both small teams and in larger organizations, where it is practiced well, and some of the common pitfalls even the best of Agile organizations still face.

Cambiemos esto
Agile testing

Cambiemos esto

Play Episode Listen Later Dec 16, 2019 15:41


Our clients often ask us what change of approach Agile methodologies bring when it comes to testing. That's why we share our view of what Agile Testing is and answer the questions: what is the change of approach? Which development practices pave the way for testing? What types of tests are there? What strategy do we use? All episodes at: https://lk.autentia.com/CambiemosEsto Get to know Autentia! Twitter: https://goo.gl/MU5pUQ Instagram: https://lk.autentia.com/instagram LinkedIn: https://goo.gl/2On7Fj/ Facebook: https://goo.gl/o8HrWX

Cucumber Podcast RSS
Agile Testing Condensed with Janet Gregory and Lisa Crispin

Cucumber Podcast RSS

Play Episode Listen Later Oct 24, 2019 40:46


This month on the Cucumber Podcast we have another conversation with Janet Gregory and Lisa Crispin. They have released their latest book, Agile Testing Condensed (https://leanpub.com/agiletesting-condensed). In this conversation, Matt Wynne and Seb Rose ask the pair about the book and testing best practices in modern agile teams.

Maintainable
Lisa Crispin: Agile Testing & Technical Debt

Maintainable

Play Episode Listen Later Jun 10, 2019 39:22


Robby speaks with Lisa Crispin, co-author of Agile Testing and Testing Advocate at Mabl. Lisa speaks about "thinking skills" for developers, why testing professionals should be integrated into dev teams, testing and development cycles, and how to start building automated tests on a legacy application. Helpful Links Follow Lisa Crispin on Twitter Agile Testing Fellow Agile Testing with Lisa Crispin DevTestOps Community The Nightmare Headline Game by Elisabeth Hendrickson [Book] Agile Testing: A Practical Guide for Testers and Agile Teams [Book] More Agile Testing: Learning Journeys for the Whole Team [Book] More Fearless Change [Book] A Practical Guide to Testing [Book] Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing Subscribe to Maintainable on: Apple Podcasts Overcast Or search "Maintainable" wherever you stream your podcasts. Loving Maintainable? Leave a rating and review on Apple Podcasts to help grow our reach. Brought to you by the team at Planet Argon.

Cucumber Podcast RSS
Testing in DevOps with Janet Gregory and Lisa Crispin

Cucumber Podcast RSS

Play Episode Listen Later Jan 17, 2019 36:57


This month on the podcast we speak to Janet Gregory and Lisa Crispin about testing in DevOps. The conversation covers the whole team approach and why testers are as important as ever. Asking the questions from Cucumber are Matt Wynne, Sallyann Freudenberg, and Steve Tooke. Shownotes: Janet & Lisa's website - https://agiletester.ca/ Agile Testing - https://www.amazon.com/Agile-Testing-Practical-Guide-Testers/dp/0321534468/ref=sr_1_3?s=books&ie=UTF8&qid=1547738499&sr=1-3&keywords=agile+testing More Agile Testing - https://www.amazon.com/More-Agile-Testing-Addison-WesleySignature/dp/0321967054/ref=cm_cr_arp_d_product_top?ie=UTF8 Agile Testing Essentials - https://www.frontrowagile.com/courses/agile-testing-essentials/overview On Twitter: Janet Gregory (@janetgregoryca) Lisa Crispin (@lisacrispin)

Software Process and Measurement Cast
SPaMCAST 524 - Quality, Risk and Agile Testing, An Interview with Matt Heusser

Play Episode Listen Later Dec 9, 2018 38:33


SPaMCAST 524 features the return of Matt Heusser. Matt and I talk about the nuts and bolts of being a tester in today's software environment, including agile testing. We covered difficult areas that anyone interested in quality and risk needs to think about.
Matt's bio: The Managing Director of Excelon Development and a member of the board of directors for the Association for Software Testing, Matt Heusser leads test training and change efforts while making as many contributions as he can to active projects. Learn more about Matt at www.xndev.com, follow him on Twitter at @mheusser, or check out The Testing Show podcast on iTunes or online at https://www.qualitestgroup.com/resources/the-testing-show/
Re-Read Saturday News
We are re-reading Bad Blood: Secrets and Lies in a Silicon Valley Startup by John Carreyrou (published by Alfred A. Knopf, 2018 – buy a copy and read along!). Chapter 9, titled Wellness Play, continues the focus on overpromising and the toxicity overpromising generates inside and outside Theranos.
Week 7 - Wellness Play - https://bit.ly/2rqUYk6
Previous Entries:
Week 1 – Approach and Introduction – https://bit.ly/2J1pY2t
Week 2 -- A Purposeful Life and Gluebot - https://bit.ly/2RZANGh
Week 3 -- Apple Envy, Goodbye East Paly and Childhood Neighbors - https://bit.ly/2zbOTeO
Week 4 -- A Reflection - https://bit.ly/2RA6AfT
Week 5 -- Sunny - https://bit.ly/2AZ5tRq
Week 6 - The miniLab - https://bit.ly/2rfmwJh
Next SPaMCAST
SPaMCAST 525 will continue the conversation on story points. Many teams find that story points are only a partially useful tool to facilitate the flow of work within a team. Can story points be fixed, or better yet, can story points still be useful? We will also have a visit from Jeremy Berriault with a discussion from the QA Corner.

Software Process and Measurement Cast
SPaMCAST 516 - Agile Testing and More, An Interview With Nishi Grover Garg

Play Episode Listen Later Oct 14, 2018 29:21


SPaMCAST 516 features our interview with Nishi Grover Garg. Nishi and I started by discussing the major differences between agile and non-agile testing and ended with a discussion of agile pods. This is a wonderfully idea-rich interview. Note: I am recording part of this episode remotely from a hotel in Brazil!
Nishi's Bio: Nishi is a consulting Agile and software testing trainer. With a decade of experience working in an Agile environment at different product-based companies, she has worked in all stages of the software testing life cycle, from white-box, black-box, and automation testing to usability testing. Having now made it her full-time job, Nishi is a coach, trainer, and mentor in Agile and software testing, specializing in conducting QA induction boot camps, ISTQB workshops, DevOps Foundation and Selenium automation courses. She is certified by the Agile Testing Alliance (ATA) as a CP-DOF, CP-SAT, CP-AAT, and CP-MAT, and by ISTQB as a Foundation and Advanced Test Analyst, and likes to keep her skills up to date. She is also a passionate freelance writer and contributes to many online forums about new topics of interest in the industry, such as the Techwell community's AgileConnection.com, Stickyminds.com, and many more. Check out her blog at www.testwithnishi.com to find her articles and catch up on her latest professional activities!
Contact information:
Blog: www.testwithnishi.com
Email: grover.nishi@gmail.com
Re-Read Saturday News
This week we begin our read of Bad Blood (buy your copy today https://amzn.to/2zTEgPq and support the blog and the author). Bad Blood is a new book for me, therefore a "read" rather than a re-read. We begin with the introductory material and a proposed plan for the read.
Week 1 – Approach and Introduction – https://bit.ly/2J1pY2t
Conferences and More!
ISMA 16, Sao Paulo, Brazil, October 16. Register now: https://bit.ly/2PXH8A5 Presentation: Product Owners In Agile – The Really Hard Role!
ITMPI Webinar (virtual), October 31. Register now: https://bit.ly/2zo8MAV Webinar: Agile, Where Agile Fears to Tread
Next SPaMCAST
SPaMCAST 517 will feature our essay on a code of ethics for agile coaches. I am also considering a call to action to begin banding together to support a code of ethics. We will also have a discussion with Jon M. Quigley.

TestCast Brasil
TestCast 05 - Agile Testing (Testes Ágeis)

Play Episode Listen Later Sep 1, 2017 56:14


Greetings, BugHunters! In this episode we talk a bit about a subject that has been trending more and more in the software testing and quality market in Brazil and around the world: Agile Testing (Testes Ágeis)! We invited distinguished guests who work with Agile Testing day to day to share their points of view, clarify concepts and information, and much more on the topic. Come check it out!
Related links:
Concrete Solutions blog: https://www.concrete.com.br/blog
Youse Labs blog: https://labs.youse.com.br/
CodeAcademy: https://www.codecademy.com/
ThoughtWorks Technology Radar: https://www.thoughtworks.com/pt/radar
Agile Testing book, by Lisa Crispin: https://goo.gl/675voo
More Agile Testing book, by Lisa Crispin: https://goo.gl/Ju5CWS
Agile Testers forum: http://agiletesters.com.br/
Report on Agile market trends: https://goo.gl/3CLjjz
TestCast Brasil Facebook page: www.facebook.com/testcastbrasil
LinkedIn:
Samanta: https://www.linkedin.com/in/samantacici/
Hatada: https://www.linkedin.com/in/fernandohatada/
João: www.linkedin.com/in/joaolfc/
Lucas: https://www.linkedin.com/in/lucas-santos-ctfl-ctfl-at-cbts-ctal-ta-cfpp-5630578a/

AB Testing
AB Testing – Episode 64

Play Episode Listen Later Aug 8, 2017 50:29


We talk about Agile Testing (both the concept and the book), and how it relates to the “Modern Testing” we’ve talked about before. Also – there’s a mailbag question on Multi Armed Bandit Testing. This episode brought to you by the folks at Techwell. Attend Agile Dev East, the premier agile event covering the latest […] --- Support this podcast: https://anchor.fm/abtesting/support

Agile for Humans with Ryan Ripley
62: Agile Testing with Lisa Crispin

Play Episode Listen Later Apr 17, 2017 44:18


Lisa Crispin (@lisacrispin) and Amitai Schleier (@schmonz) joined me (@RyanRipley) to discuss co-presenting at conferences, co-writing books, and agile testing. [Image: Lisa Crispin and Janet Gregory co-presenting a conference talk] Lisa is a tester who enjoys sharing her experiences and learning from others. She is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley, 2014) and Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009). Lisa is a tester on a fabulous agile team. She specializes in showing testers and agile teams how testers can add value and how to guide development with business-facing tests. Amitai is a software development coach, speaker, legacy code wrestler, non-award-winning musician, award-winning bad poet, and the creator of the Agile in 3 Minutes podcast. He blogs at schmonz.com and is a frequent guest on Agile for Humans. Amitai has published many of his agile observations and musings in his new book, Agile in 3 Minutes, on Leanpub.
In this episode you'll discover:
How to get started in conference speaking with co-presenting
The joys and techniques of writing a book with a partner
What is being observed in the agile testing world today
Links from the show:
More Agile Testing: Learning Journeys for the Whole Team
Agile Testing: A Practical Guide for Testers and Agile Teams
Lisa's website: lisacrispin.com
Self.Conference – May 19th and 20th
Janet Gregory and Lisa Crispin pioneered the agile testing discipline with their previous work, Agile Testing. Now, in More Agile Testing, they reflect on all they've learned since. They address crucial emerging issues, share evolved agile practices, and cover key issues agile testers have asked to learn more about.
Packed with new examples from real teams, this insightful guide offers detailed information about adapting agile testing for your environment; learning from experience and continually improving your test processes; scaling agile testing across teams; and overcoming the pitfalls of automated testing. You'll find brand-new coverage of agile testing for the enterprise, distributed teams, mobile/embedded systems, regulated environments, data warehouse/BI systems, and DevOps practices. Click here to purchase on Amazon.
What are your thoughts about this episode? Please leave them in the comments section below.
Want to hear another podcast about getting started with speaking at technical conferences? Listen to my conversation with Don Gray, Tim Ottinger, Amitai Schleier, and Jason Tice on episode 32. We discuss how to write a compelling abstract, what track reviewers are looking for in a submission, and how to give yourself the best chance of getting selected.
One tiny favor. Please take 30 seconds now and leave a review on iTunes. This helps others learn about the show and grows our audience. It will help the show tremendously, including my ability to bring on more great guests for all of us to learn from. Thanks!
This podcast is brought to you by Audible. I have used Audible for years, and I love audio books. I have three to recommend:
Agile and Lean Program Management by Johanna Rothman
Scrum: The Art of Doing Twice the Work in Half the Time by Jeff Sutherland
The Lean Startup by Eric Ries
All you need to do to get your free 30-day Audible trial is go to Audibletrial.com/agile. Choose one of the above books, or choose between more than 180,000 audio programs. It's that easy. Go to Audibletrial.com/agile and get started today. Enjoy!
The post AFH 062: Agile Testing with Lisa Crispin [PODCAST] appeared first on Ryan Ripley. See omnystudio.com/listener for privacy information.

Software Engineering Radio - The Podcast for Professional Software Developers
SE-Radio Episode 283: Alexander Tarlinder on Developer Testing

Play Episode Listen Later Feb 28, 2017 69:41


Felienne talks with Alexander Tarlinder about developer testing. Topics include developer testing, agile testing, programming by contract, and specification-based testing. Venue: KTH, Stockholm.
Related Links:
Alexander on Twitter https://twitter.com/alexander_tar
Agile Testing: A Practical Guide for Testers and Agile Teams by Lisa Crispin and Janet Gregory https://www.amazon.com/Agile-Testing-Practical-Guide-Testers/dp/0321534468
Clean Code https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship-ebook/dp/B001GSTOAM
Alexander's book review site http://www.techbookreader.com/
Developer […]

Agile Amped Podcast - Inspiring Conversations
Bob Galen, Agile Testing Maturity and the 3 Pillars of Quality at Agile2016

Play Episode Listen Later Aug 19, 2016 10:10


Bob's session at Agile2016 is called "Agile Testing Maturity - What Does 'Good' Look Like?" In his session Bob reminds us that you can't add quality on at the end; you have to build it in. "A lot of folks focus on automation, TDD, BDD, and ATDD, and they forget things like good effective story writing or using the 3 Amigos," as well as regression testing or system testing. Bob champions balance across the three pillars of Agile quality: technology, testing tactics, and soft/collaborative skills. When it comes to quality, he is also a strong proponent of an "everyone owns it" mentality. SolutionsIQ's Alan Dayley hosts at Agile2016 in Atlanta, GA.
About Agile Amped
The Agile Amped podcast series connects the community through compelling stories, passionate people, shared knowledge, and innovative ideas. Fueled by inspiring conversations with industry thought leaders, Agile Amped offers valuable content – anytime, anywhere. To receive real-time updates, subscribe at YouTube, iTunes or SolutionsIQ.com.
Subscribe: http://bit.ly/SIQYouTube, http://bit.ly/SIQiTunes, http://www.solutionsiq.com/agile-amped/
Follow: http://bit.ly/SIQTwitter
Like: http://bit.ly/SIQFacebook

The InfoQ Podcast
Lisa Crispin and Justin Searls on Testing and Innovation in Front End Technology

Play Episode Listen Later May 27, 2016 29:02


In this week's podcast Richard Seroter talks to Lisa Crispin, who works on the Tracker team at Pivotal Labs and is an organiser of the Agile Alliance Technical Conference. Lisa is the co-author of several books on Agile Testing and the 2012 recipient of the Agile Testing Days award for Most Influential Agile Testing Professional Person. Richard also talks to Justin Searls, software craftsman, presenter of "How to Stop Hating Your Tests" and co-founder of Test Double, a company whose goal is to "improve how the world writes software."
Why listen to this podcast:
- Agile is mainstream and being adopted by big enterprises, but there's a place to help small companies and startups.
- Cloud Foundry pairs testers with programmers to write production code.
- Developers have to be focused on right now; testers have the freedom to look at more of the big picture.
- People know testing is good and there are a lot of tools for it, but some tools are ill-conceived.
- We need a better language for talking about good QA and full-stack testing.
Notes and links can be found on InfoQ: http://bit.ly/1U0ip8Q
2m:00s - The first XP Universe conferences were mainly about XP practices, values and principles, and were attended by developers.
2m:17s - Over time, topics moved towards processes and frameworks, and the number of developers who attend Agile conferences has gone down dramatically.
3m:51s - Now that Agile is mainstream and being adopted by big enterprises, there's a place to help small companies and startups. That's usually where the innovation comes from, and the Agile Alliance wants to encourage innovation.
Quick scan our curated show notes on InfoQ: http://bit.ly/1U0ip8Q
You can also subscribe to the InfoQ newsletter to receive weekly updates on the hottest topics from professional software development. http://bit.ly/24x3IVq

Agile Amped Podcast - Inspiring Conversations
JoEllen Carter Talks Agile Testing and Story Mapping with Agile Amped

Play Episode Listen Later May 5, 2016 9:04


Agile tester JoEllen Carter sits down with Agile Amped at Mile High Agile 2016 to chat about "Testing to Build the Right Thing", the topic and title of the hands-on session she presented with Lisa Crispin. After enjoying their experience diving into story mapping, the duo decided to share it with a wider audience. Though testers traditionally aren't always invited to story mapping sessions, JoEllen points out that testers can help determine where the weaknesses in a story are before it gets built into the product. JoEllen Carter has more than ten years of experience defining the role of tester on agile teams. Her experience in software development and testing began in the highly regulated and QA-intensive nuclear power industry, and now includes direct marketing donor management software, staffing software, e-commerce systems, and project management software. SolutionsIQ's Howard Sublett hosts.
About Agile Amped
The Agile Amped podcast series engages with industry thought leaders at Agile events across the country to bring valuable content to subscribers anytime, anywhere. To receive real-time updates, subscribe at YouTube, iTunes or SolutionsIQ.com.
Subscribe: http://bit.ly/SIQYouTube, http://bit.ly/SIQiTunes, http://www.solutionsiq.com/agile-amped/
Follow: http://bit.ly/SIQTwitter
Like: http://bit.ly/SIQFacebook

Meta-Cast, an agile podcast
Episode 93 - Revisiting Agile Testing

Play Episode Listen Later Apr 19, 2016 50:09


We originally discussed agile testing over 5 years (and 80 episodes) ago. Today we revisit the topic to learn how our views of testing in an agile world have evolved. We cover topics ranging from QA/dev ratios to automation. You'll find that we've learned a lot over the years, and this episode reflects that.
How has your view of agile testing evolved over time? Leave a comment below or mention us on Twitter using #AgileTesting to start the conversation.
Here are links to our original discussions from 2010 and 2011:
Episode 12 - Agile Testing, Part 1 (http://www.meta-cast.com/2010/11/episode-12-agile-testing-part-1.html)
Episode 13 - Agile Testing, Part 2 (http://www.meta-cast.com/2011/01/episode-13-agile-testing-part-2.html)
Support this podcast

Software Process and Measurement Cast
SPaMCAST 390 – Vinay Patankar, Agile Value and Lean Start-ups

Play Episode Listen Later Apr 17, 2016 23:38


The Software Process and Measurement Cast 390 features our interview with Vinay Patankar. We discussed his startup, Process Street, and the path Vinay and his partner took to embrace agile because it delivered value, not just because it was cool. We also discussed how Agile fits or helps in a lean startup and the lessons Vinay wants to pass on to others.
Vinay's Bio: Vinay Patankar is the co-founder and CEO of Process Street, the simplest way to manage your team's recurring processes and workflows. Easily set up new clients, onboard employees and manage content publishing with Process Street. Process Street is a venture-backed SaaS company and AngelPad alum with numerous Fortune 500 clients. When not running Process Street, Vinay loves to travel and spent 4 years as a digital nomad roaming the globe running different internet businesses. He enjoys food, fitness and talking shop.
Twitter: @vinayp10
Re-Read Saturday News
We continue our read of Commitment – Novel About Managing Project Risk by Maassen, Matts, and Geary. Buy your copy today and read along (use the link to support the podcast). This week we tackle Chapter Three, which explores visualization, knowledge options, and focusing on outcomes. Visit the Software Process and Measurement Blog to catch up on past installments of Re-Read Saturday.
Upcoming Events
I will be at QAI Quest 2016 in Chicago from April 18th through April 22nd. I will be teaching a full-day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now!
I will be speaking at the CMMI Institute's Capability Counts 2016 Conference in Annapolis, Maryland, May 10th and 11th. Register now!
Next SPaMCAST
The next three weeks will feature mix tapes with the "if you could fix two things" questions from the top downloads of 2007/08, 2009 and 2010. I will be doing a bit of vacationing while researching, writing content, and editing new interviews for the sprint to episode 400 and beyond.
Shameless Ad for my book!
Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 389 – AUAT, Soft Skills, OODA vs PDCA

Play Episode Listen Later Apr 10, 2016 34:08


The Software Process and Measurement Cast 389 features our essay on the different layers and anti-patterns of Agile acceptance testing. Many practitioners see Agile acceptance testing as focused solely on validating the business-facing functionality. This is a misunderstanding; acceptance testing is more varied.
We also have a column from Kim Pries, the Software Sensei. Kim discusses the significance of soft skills. Kim starts his essay with the statement, "The terms we use to talk about soft skills may reek of subjective hand-waving, but they can often be critical to a career."
Gene Hughson anchors the cast with a discussion from his blog Form Follows Function, titled OODA vs PDCA – What's the Difference? Gene concludes that OODA loops help address the fact that "We can't operate with a 'one and done' philosophy" when it comes to software architecture.
We are also changing and curtailing some of the comments at the end of the cast based on feedback from listeners. We will begin spreading out segments such as future events over the month so that if you binge listen, the last few minutes won't be as repetitive.
Re-Read Saturday News
This week we begin our read of Commitment – Novel About Managing Project Risk by Maassen, Matts, and Geary. Buy your copy today and read along (use the link to support the podcast). This week we tackle Chapters One and Two, which set the context for the novel and introduce the concept of real options.
Upcoming Events
I will be at QAI Quest 2016 in Chicago from April 18th through April 22nd. I will be teaching a full-day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now!
I will be speaking at the CMMI Institute's Capability Counts 2016 Conference in Annapolis, Maryland, May 10th and 11th. Register now!
Next SPaMCAST
The next Software Process and Measurement Cast features our interview with Vinay Patankar. We discussed his startup, Process Street, and the path Vinay and his partner took to embrace agile because it delivered value, not just because it was cool. We also discussed how Agile fits or helps in a lean startup and the lessons Vinay wants to pass on to others.
Shameless Ad for my book!
Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 388 – Dr Mark Bojeun, PMO As A Strategic Tool

Play Episode Listen Later Apr 3, 2016 29:23


The Software Process and Measurement Cast 388 features our interview with Dr. Mark Bojeun. Dr. Bojeun returns to the podcast to discuss how a PMO can be a strategic tool for an organization. If a PMO is merely a control point or an administrative function, its value and longevity are at risk. Mark suggests that there is a better way. Mark last visited the Software Process and Measurement Cast on SPaMCAST 280, where we discussed his book, Program Management Leadership: Creating Successful Team Dynamics (Kindle version).
Mark's Bio: Dr. Bojeun has more than 20 years of experience in providing strategic management and leadership through portfolio, project and program management. His experience includes developing and managing multi-million dollar portfolios, programs and projects, facilitating the achievement of strategic objectives, and creating best practice processes for program and project management efforts. Dr. Bojeun has designed and implemented multiple Enterprise Program Management Offices (EPMOs) for domestic and multinational firms and has extensive experience in organizational change management through transformational leadership, strategic support and staff empowerment to management professionals in the development and implementation of organizational vision, mission, objectives, and goals. Dr. Bojeun holds Program Management Professional (PgMP), Project Management Professional (PMP) and Risk Management (PMI-RMP) certifications from the Project Management Institute (PMI), is a Microsoft Certified Solution Developer (MCSD), and has a Bachelor's degree in Business Administration, an MBA from George Mason University and a PhD in Organizational Leadership. Dr.
Bojeun's new book, Program Management Leadership: Creating Successful Team Dynamics, part of CRC Publishing's Best Practices and Advances in Program Management Series, addresses the need for effective leadership styles in managing programs and projects and for achieving high-performing teams that consistently exceed expectations. Over the last ten years, Dr. Bojeun has provided commercial training courses in all aspects of program and project management and has been an adjunct professor for a number of universities. Dr. Bojeun is currently an Adjunct Professor at Strayer University, where he actively teaches business, logistics and project management courses for both undergraduate and graduate students. In addition, he provides motivational presentations to leaders throughout the world. Contact Mark on LinkedIn
Re-Read Saturday News
This week we have a few final notes on our re-read of How to Measure Anything: Finding the Value of Intangibles in Business, Third Edition, by Douglas W. Hubbard on the Software Process and Measurement Blog. In this week's installment we summarize our major takeaways and identify what we can do to improve based on our new knowledge. We will read Commitment – Novel About Managing Project Risk by Olav Maassen and Chris Matts for our next Re-Read, beginning next week. Buy your copy today and start reading (use the link to support the podcast). In the meantime, vote in our poll for the next, next book. As in past polls, please vote twice or suggest a write-in candidate in the comments. We will run the poll for one more week.
Upcoming Events
I will be at QAI Quest 2016 in Chicago from April 18th through April 22nd. I will be teaching a full-day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now!
I will be speaking at the CMMI Institute's Capability Counts 2016 Conference in Annapolis, Maryland, May 10th and 11th. Register now!
Next SPaMCAST
The next Software Process and Measurement Cast will feature our essay on the different layers and anti-patterns of Agile acceptance testing. Many practitioners see Agile acceptance testing as focused solely on the business-facing functionality. This is a misunderstanding; acceptance testing is more varied. We will also have columns from Kim Pries and Gene Hughson!
Shameless Ad for my book!
Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 387 –Storytelling As A Tool, Critical Roles, QA Career Path

Play Episode Listen Later Mar 27, 2016 50:01


The Software Process and Measurement Cast 387 includes three features. The first is our essay on storytelling. Storytelling is a tool that is useful in many scenarios: for presentations, to help people frame their thoughts, and for gathering information. A story provides both a deeper and more nuanced connection with information than most lists of PowerPoint bullets or even structured requirements documents. The essay provides an excellent supplement to our interview with Jason Little (which you can listen to here).
The second feature this week is Steve Tendon discussing Chapter 9 of Tame The Flow: Hyper-Productive Knowledge-Work Performance, The TameFlow Approach and Its Application to Scrum and Kanban, published by J. Ross. Chapter 9 is titled "Critical Roles, Leadership and More". We discuss why leadership roles are important to achieving hyper-productive performance. Sometimes in Agile and other approaches, it is easy to overlook the role of leaders outside of the team. Remember, Steve has a great offer for SPaMCAST listeners. Check out https://tameflow.com/spamcast for a way to get Tame The Flow: Hyper-Productive Knowledge-Work Performance, The TameFlow Approach, and Its Application to Scrum and Kanban at 40% off the list price.
Anchoring the cast this week is a visit to the QA Corner. Jeremy Berriault discusses whether a career in testing, and the path that career might take, is an individual or a team sport. Jeremy dispenses useful advice even if you are not involved in testing.
Re-Read Saturday News
This week we are back with Chapter 14 of How to Measure Anything: Finding the Value of Intangibles in Business, Third Edition, by Douglas W. Hubbard on the Software Process and Measurement Blog. Chapter 14 is titled A Universal Measurement Method. In this chapter, Hubbard provides readers with a process for applying Applied Information Economics. We will read Commitment – Novel About Managing Project Risk by Olav Maassen and Chris Matts for our next Re-Read. Buy your copy today and start reading (use the link to support the podcast). In the meantime, vote in our poll for the next book. As in past polls, please vote twice or suggest a write-in candidate in the comments. We will run the poll for two more weeks.
Upcoming Events
I will be at QAI Quest 2016 in Chicago from April 18th through April 22nd. I will be teaching a full-day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now!
I will be speaking at the CMMI Institute's Capability Counts 2016 Conference in Annapolis, Maryland, May 10th and 11th. Register now!
Next SPaMCAST
The next Software Process and Measurement Cast will feature our interview with Dr. Mark Bojeun. Dr. Bojeun returns to the podcast to discuss how a PMO can be a strategic tool for an organization. If a PMO is merely a control point or an administrative function, its value and longevity are at risk. Mark suggests that there is a better way.
Shameless Ad for my book!
Mastering Software Project Management: Best Practices, Tools and Techniques, co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, for you or your team." Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 386 – Jason Little, Storytelling in Change Management

Play Episode Listen Later Mar 20, 2016 35:30


The Software Process and Measurement Cast 386 features our interview with Jason Little. Jason and I discussed his exploration of storytelling in change management.  Stories are a powerful tool to develop and hone a big picture view of organizational change. Jason began his career as a web developer when Cold Fusion roamed the earth. Over the following years, he moved into management, Agile Coaching and consulting. The bumps and bruises collected along the way brought him to the realization that helping organizations adopt Agile practices is less about the practices, and all about change. In 2008, he attended an experiential learning conference about how people experience change, and since then he’s been writing and speaking all over the world about helping organizations discover more effective practices for managing organizational change. He is the author of Lean Change Management and an international speaker who has spoken all over the world from Canada, the US, Finland, Germany, Australia, Belgium and more. Contact Data:http://www.agilecoach.ca/about/http://ca.linkedin.com/in/jasonlittle/http://www.twitter.com/jasonlittle Re-Read Saturday News This week we are back with Chapter 13 of How to Measure Anything, Finding the Value of “Intangibles in Business” Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. In Chapter 13 we discuss New Measurement Instruments for Management.  Hubbard shifts gears in this chapter to focus the reader on the new tools that our dynamic, electronically-tethered environment has created.  Here is a summary of the chapter in a few bullet points:   Everyone creates data that is trackable and measurable. The internet is a measurement instrument. Prediction markets are a way to synthesize a wide variety of opinions.   It is time to begin the selection process for the next’ish book for the Re-Read Saturday.  
We will read Commitment – Novel About Managing Project Risk by Olav Maassen and Chris Matts, based on the recommendation of Steven Adams, first and then move to the next book. As in past polls, please vote twice or suggest a write-in candidate in the comments. We will run the poll for three weeks. Upcoming Events I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now! I will be speaking at the CMMI Institute’s Capability Counts 2016 Conference in Annapolis, Maryland, May 10th and 11th. Register Now! Next SPaMCAST The next Software Process and Measurement Cast will feature our essay on storytelling. In the Harvard Business Review article The Irresistible Power of Storytelling as a Strategic Business Tool by Harrison Monarth (March 11, 2014), Keith Quesenberry, a researcher from Johns Hopkins, notes, “People are attracted to stories because we’re social creatures and we relate to other people.” The power of storytelling is that it helps us understand each other and develop empathy. Storytelling is a tool that is useful in many scenarios: for presentations, but also to help people frame their thoughts and to gather information. A story provides both a deeper and more nuanced connection with information than most lists of PowerPoint bullets or even structured requirements documents. The essay provides an excellent supplement to our interview with Jason Little. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing.
We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 385 - Agile Portfolio Metrics, Why Diversity, Fast Is Not Enough

Software Process and Measurement Cast

Play Episode Listen Later Mar 13, 2016 36:55


The Software Process and Measurement Cast 385 features our essay on Agile portfolio metrics. Agile portfolio metrics are integral to prioritization and validating the flow of work. But Agile portfolio metrics are only useful if they provide value. Metrics and measures add value if they reduce uncertainty so that we can make better decisions. In the second segment, Kim Pries, the Software Sensei, asks the question, “Why should we care about diversity?” No spoilers here, but the answer might have something to do with value! Anchoring the cast, Gene Hughson discusses Architecture and OODA Loops: Fast Is Not Enough from his blog Form Follows Function! For those of you that don’t remember, OODA stands for observe, orient, decide, and act. Re-Read Saturday News This week we are back with Chapter 12 of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. In Chapter 12 we discussed The Ultimate Measurement Instrument: Human Judges. Humans can be a valuable measurement tool; however, that value requires using techniques to correct for the errors that are common in unaided human judgment. Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance. Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on March 15th at 1 PM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now!
Next SPaMCAST The next Software Process and Measurement Cast will feature our interview with Jason Little. Jason and I discussed his exploration of the use of storytelling in change management. Stories are a powerful tool to develop and hone a big picture view of organizational change. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 384 - Gwen Walsh, Leadership and End Annual Reviews

Software Process and Measurement Cast

Play Episode Listen Later Mar 6, 2016 40:17


The Software Process and Measurement Cast 384 features our interview with Gwen Walsh. Gwen is the President of TechEdge LLC. We discuss leadership and why leadership is important. We also discuss the topic of performance appraisals and how classic methods can hurt your organization. Gwen’s advice both redefines industry standards and provides you with an idea of what is truly possible. Gwen Walsh has built a career creating and implementing business and technology solutions that redefine the industry standards for both Fortune 100 corporations and entrepreneurial organizations. With over 25 years of experience in leadership development and organizational transformation, Ms. Walsh, founder of TechEdge LLC, helps her clients stay ahead of their competition, stay in touch with their customers and stay in high demand. Ms. Walsh's client portfolio includes Kaiser Permanente, Hospital Corporation of America, Hewlett-Packard, KeyBank, Medical Mutual of Ohio, General Motors, Omaha Public Power District and Anheuser-Busch. Contact information: gwalsh@techedgellc.com http://techedgellc.com/ https://www.linkedin.com/company/techedge-llc https://www.linkedin.com/in/gwenwalsh Re-Read Saturday News This week we are back with Chapter 11 of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. Chapter 11 begins section four of the book and is titled Preferences and Attitudes: The Softer Side of Measurement. The softer side is a euphemism for attitudes and opinions. In this chapter, we visit how to: measure opinions and feelings; design out bias in surveys and questions; observe opinions and feelings through trade-offs; and use trade-offs to describe risk tolerance. Anyone living in the United States knows that every election year there is a plethora of opinion polls.
One of my favorite blogs is Nate Silver’s FiveThirtyEight, which shows a wealth of statistical information about sports, economics, culture, and politics (a form of sport). Much of the data presented is a reflection of opinions and attitudes. Often they are real predictors of behavior and product success. Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance. Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on March 15th at 1 PM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now! Next SPaMCAST The next Software Process and Measurement Cast features our essay on portfolio metrics. Agile portfolio metrics are integral to prioritization and validating the flow of work. But Agile portfolio metrics are only useful if they provide value. Metrics and measures add value if they reduce uncertainty so that we can make better decisions. We will also have a new installment from the Software Sensei. Kim asks the question, “Why should we care about diversity?” Gene Hughson will anchor the cast with another entry from his wonderful blog Form Follows Function! Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing.
We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 383 – Peer Reviews, Responsibility without Authority, Shared Visions

Software Process and Measurement Cast

Play Episode Listen Later Feb 28, 2016 47:04


Software Process and Measurement Cast 383 features our essay on peer reviews. Peer reviews are a tool to remove defects before we need to either test them out or ask our customers to find them for us. While the data about the benefits of peer reviews is unambiguous, they are rarely practiced well and often turn into a blame apportionment tool. The essay discusses how to do peer reviews, whether you are using Agile or not, so that you get the benefits you expect! Our second segment is a visit to the QA Corner. Jeremy Berriault discusses a piece of advice he got from a mentor that continues to pay dividends. This installment of the QA Corner discusses how a QA leader can generate and leverage responsibility without formal authority. Steve Tendon anchors this week’s SPaMCAST discussing Chapter 8 of Tame The Flow: Hyper-Productive Knowledge-Work Performance, The TameFlow Approach and Its Application to Scrum and Kanban, published by J. Ross. Chapter 8 is titled “Creating A Shared Vision At The Team Level”. We discuss why it is important for the team to have a shared vision, the downside of not having a shared vision and, most importantly, how to get a shared vision. Remember, Steve has a great offer for SPaMCAST listeners. Check out https://tameflow.com/spamcast for a way to get Tame The Flow: Hyper-Productive Knowledge-Work Performance, The TameFlow Approach, and Its Application to Scrum and Kanban at 40% off the list price. Re-Read Saturday News This week we are back with Chapter 10 of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. In Chapter 10 we visited how to use Bayesian statistics to account for having prior knowledge before we begin measuring. Most common statistics assume that we don’t have prior knowledge of the potential range of what we are measuring or the shape of the distribution. This is often a gross simplification with ramifications!
Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance. Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on March 15th at 1 PM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now! Upcoming Webinars Budgeting, Estimation, Planning, #NoEstimates and the Agile Planning Onion, March 1, 2016, 11 AM EST. There are many levels of estimation, including budgeting, high-level estimation and task planning (detailed estimation). This webinar challenges the listener to consider estimation as a form of planning. Register Here Next SPaMCAST The next Software Process and Measurement Cast features our interview with Gwen Walsh. Gwen is the President of TechEdge LLC. We discussed leadership and why leadership is important. We also discussed the topic of performance appraisals and how classic methods can hurt your organization. Gwen’s advice both redefines industry standards and provides you with an idea of what is truly possible. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 382 – Ben Linders, Continuous Process Improvement

Software Process and Measurement Cast

Play Episode Listen Later Feb 21, 2016 34:06


Software Process and Measurement Cast 382 features our interview with Ben Linders. Ben revisits the Software Process and Measurement Cast to discuss his recent series of articles on targeting, finding, and eradicating impediments. Ben discussed a platform for continuous process improvement that delivers continuously increasing value! Ben’s Bio: Ben Linders is an Independent Consultant in Agile, Lean, Quality and Continuous Improvement, based in The Netherlands. He is the author of Getting Value out of Agile Retrospectives, Waardevolle Agile Retrospectives, What Drives Quality and Continuous Improvement. As an adviser, coach and trainer he helps organizations by deploying effective software development and management practices. He focuses on continuous improvement, collaboration and communication, and professional development, to deliver business value to customers. Ben is an active member of networks on Agile, Lean and Quality, and a frequent speaker and writer. He shares his experience in a bilingual blog (Dutch and English), as an editor for Agile at InfoQ and as an expert on TechTarget. Follow him on Twitter: @BenLinders. Ben's impediments articles: http://www.benlinders.com/2015/handling-impediments-why-it-matters/ Ben's new book on continuous improvement is available on Leanpub: https://leanpub.com/continuousimprovement Re-Read Saturday News We take a break for Podcamp Toronto and to begin the process of picking the next book. What are your suggestions? In the meantime, catch up on the re-read of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance.
Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on March 15th at 1 PM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now! Upcoming Webinars IIST Webinar: Scaling Agile Testing Using the TMMi. Date: February 26, 2016. Time: 11:00 AM EST. Presenter: Tom Cagley, VP of Consulting, TMMi Accredited Assessor. Agile methods, principles and techniques are core to how many IT organizations develop and maintain software. However, even though techniques like Test-Driven Development and Scrum are widely practiced, one common complaint is that it is difficult to scale these practices. The webinar will outline the TMMi and provide a process for using environmental, technical and project context to effectively integrate testing into an Agile development environment, measuring the effectiveness of the process. Budgeting, Estimation, Planning, #NoEstimates and the Agile Planning Onion. March 1, 2016, 11 AM EST. There are many levels of estimation, including budgeting, high-level estimation and task planning (detailed estimation). This webinar challenges the listener to consider estimation as a form of planning. Register Here Next SPaMCAST The next Software Process and Measurement Cast features our essay on focus.
The essay is a reaction to an earlier discussion of hyper-connectivity and the techniques to combat its downside, which convinced me that we are dancing around a bigger workplace issue: how can you stay focused on delivering real business value in an environment that seems designed to promote making incremental progress on lots of projects rather than getting any one of them done? We will also have new entries from Jeremy Berriault’s QA Corner and a discussion with Steve Tendon on Chapter 8 of Tame The Flow: Hyper-Productive Knowledge-Work Performance, The TameFlow Approach and Its Application to Scrum and Kanban. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you nor your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Next Gen Now with Rudina Seseri
Cloud Technologies and the Growing Cloud Business

Next Gen Now with Rudina Seseri

Play Episode Listen Later Feb 15, 2016 34:08


As CTO for Cloud at Pivotal, Colin Humphreys is responsible for the company’s big picture strategy and roadmap for its cloud platform offerings. Colin joins Pivotal from its acquisition of CloudCredo, where he was co-founder and CEO. Colin led the installation of the first SLA-driven production Cloud Foundry deployment, delivers tooling to the Cloud Foundry community, and is a regular conference speaker on PaaS-related topics. He also organizes the London PaaS User Group, and is passionate about “Infrastructure as Code”, “Continuous Delivery”, “DevOps”, and “Agile Testing”. Colin has spent the last 15 years sitting on the fence between development and operations, delivering solutions for eBay, Volkswagen, PayPal, Cineworld, and others.

Software Process and Measurement Cast
SPaMCAST 381 – Agile Adoption vs Transformation, Myths of Greenfield Development, Gender Gap in Computer Science

Software Process and Measurement Cast

Play Episode Listen Later Feb 14, 2016 28:25


Software Process and Measurement Cast 381 features our essay on Agile adoption. Words are important. They can rally people to your banner or create barriers. Every word communicates information and intent. There has been a significant amount of energy spent discussing whether the phrase ‘Agile transformation’ delivers the right message. There is a suggestion that ‘adoption’ is a better term. We shall see! We will also have an entry from Gene Hughson’s Form Follows Function Blog. Gene will discuss his blog entry, Seductive Myths of Greenfield Development. Gene wrote, “How often do we, or those around us, long for a chance to do things ‘from scratch’. The idea being, without the constraints of ‘legacy’ code, we could do things ‘right’. While it’s a nice idea, it has no basis in reality.” The discussion built from there! And a visit from the Software Sensei, Kim Pries! In the essay, Kim ruminates on the gender gap in computer science education leading to a gender gap in the industry. Re-Read Saturday News We continue the re-read of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. In Chapter Nine, we tackle sampling. Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance. Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on February 17 at 11 AM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd. I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th.
Register now!   Next SPaMCAST The next Software Process and Measurement Cast features our interview with Ben Linders.  Ben revisits the Software Process and Measurement Cast to discuss his recent series of articles on targeting, finding, and eradicating impediments.  Ben lays out a process that generates a platform for continuous process improvement that delivers continuously increasing value!   Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 380 - Kim Robertson, The Big Picture of Configuration Management

Software Process and Measurement Cast

Play Episode Listen Later Feb 7, 2016 44:49


Software Process and Measurement Cast 380 features our interview with Kim Robertson. Kim and I talked about big picture configuration management. Without good configuration management, work products and programs often go wildly astray. Kim describes a process that is as old as dirt . . . but WORKS and delivers value. We also discussed the book Kim co-authored with Jon M Quigley (Jon was interviewed in SPaMCAST 346), Configuration Management: Theory, Practice, and Application. Kim's Bio Kim Robertson is an NDIA Certified Configuration Management (CM) practitioner, consultant, and trainer with over 30 years of experience in contracts, subcontracts, finance, systems engineering and configuration management. He has an advanced degree in organizational management with a government contracts specialty and is the co-author of Configuration Management: Theory, Practice and Application. He can be reached at Kim.Robertson@ValueTransform.com. If you are interested in the seed questions used to frame our interview, please visit the SPaMCAST Facebook page. Re-Read Saturday News We continue the re-read of How to Measure Anything: Finding the Value of “Intangibles” in Business, Third Edition by Douglas W. Hubbard on the Software Process and Measurement Blog. In Chapter Eight, we begin the transition from what to measure to how to measure. Upcoming Events I am facilitating the CMMI Capability Challenge. This new competition showcases thought leaders who are building organizational capability and improving performance. Listeners will be asked to vote on the winning idea, which will be presented at the CMMI Institute’s Capability Counts 2016 conference. The next CMMI Capability Challenge session will be held on February 17 at 11 AM EST. http://cmmiinstitute.com/conferences#thecapabilitychallenge I will be at the QAI Quest 2016 in Chicago beginning April 18th through April 22nd.
I will be teaching a full day class on Agile Estimation on April 18 and presenting Budgeting, Estimating, Planning and #NoEstimates: They ALL Make Sense for Agile Testing! on Wednesday, April 20th. Register now! Next SPaMCAST The next Software Process and Measurement Cast features our essay on Agile adoption. Words are important. They can rally people to your banner or create barriers. Every word communicates information and intent. There has been a significant amount of energy spent discussing whether the phrase ‘Agile transformation’ delivers the right message. There is a suggestion that ‘adoption’ is a better term. We shall see! We will also have an entry from Gene Hughson’s Form Follows Function Blog. Gene will discuss his blog entry, Seductive Myths of Greenfield Development. And a visit from the Software Sensei, Kim Pries! Kim’s essay is on women in the tech field. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Agile Amped Podcast - Inspiring Conversations
Lisa Crispin talks about Agile Testing and Donkeys At Agile 2015

Agile Amped Podcast - Inspiring Conversations

Play Episode Listen Later Aug 11, 2015 11:53


Brains, Beauty, Agile Testing, and Donkeys with Lisa Crispin at Agile 2015. What more could you possibly need?

Ryn The Guardian Melberg
Agile Testing Methodologies

Ryn The Guardian Melberg

Play Episode Listen Later Aug 10, 2015 43:15


This week on The Guardian Podcast with Ryn Melberg, tests and testing for users of the Agile project management methodology are described and discussed. The difference lies in how testing gets done with Agile.

Software Process and Measurement Cast
SPaMCAST 349 - Agile Testing, QA Corner - Test Cases, TameFlow Column

Software Process and Measurement Cast

Play Episode Listen Later Jul 5, 2015 59:47


To paraphrase Ed Sullivan, “We have a big, big show this week,” so we will keep the up-front chit chat to a minimum. First up is our essay on Agile Testing. Even if you are not a tester, understanding how testing flows in Agile projects is important to maximize value. Second, we have a new installment from Jeremy Berriault’s QA Corner. In this installment Jeremy talks about test cases. More is not always the right answer. Anchoring the Cast is Steve Tendon’s column discussing the TameFlow methodology and his great new book, Hyper-Productive Knowledge Work Performance. Call to Action! I have a challenge for the Software Process and Measurement Cast listeners for the next few weeks. I would like you to find one person that you think would like the podcast and introduce them to the cast. This might mean sending them the URL or teaching them how to download podcasts. If you like the podcast and think it is valuable, they will be thankful to you for introducing them to the Software Process and Measurement Cast. Thank you in advance! Re-Read Saturday News We have just begun the Re-Read Saturday of The Mythical Man-Month. We are off to a rousing start, beginning with the Tar Pit. Get a copy now and start reading! The Re-Read Saturday and other great articles can be found on the Software Process and Measurement Blog. Remember: We just completed the Re-Read Saturday of Eliyahu M. Goldratt and Jeff Cox’s The Goal: A Process of Ongoing Improvement, which began on February 21st. What did you think? Did the re-read cause you to pick The Goal up for a refresher? Visit the Software Process and Measurement Blog and review the whole re-read. Note: If you don’t have a copy of the book, buy one. If you use the link below it will support the Software Process and Measurement blog and podcast.
Dead Tree Version or Kindle Version Upcoming Events Software Quality and Test Management, September 13 – 18, 2015, San Diego, California http://qualitymanagementconference.com/ I will be speaking on the impact of cognitive biases on teams! Let me know if you are attending! More on other great conferences soon! Next SPaMCast The next Software Process and Measurement Cast will feature our interview with Arlene Minkiewicz. Arlene and I talked about technical debt. Not sure what technical debt is? Well, to some people it is a metaphor for cut corners, and to others it is a measure of work that will need to be done later. In either case, a little goes a long way! Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you nor your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 348 - Woody Zuill, #NoEstimates

Software Process and Measurement Cast

Play Episode Listen Later Jun 28, 2015 34:20


The Software Process and Measurement Cast features our interview with Woody Zuill. We talked about the concept and controversy swirling around #NoEstimates. Even if the concept is a bridge too far for you, the conversation is important because we talked about why thinking and questioning is a critical survival technique. As Woody points out, it is important to peer past the “thou musts” to gain greater understanding of what you should be doing! Woody Zuill has been programming computers for 30+ years. Over the last 15+ years he has worked as an Agile Coach, Trainer, and Extreme Programmer and now works with Industrial Logic as a Trainer/Coach/Consultant for Agile and Lean software development. He believes code must be simple, clean, and maintainable to realize the Agile promise of Responding to Change. Contact Information Mob Programming: http://mobprogramming.org/ Blog: http://zuill.us/WoodyZuill/ Twitter: https://twitter.com/woodyzuill Call to action! I have a challenge for the Software Process and Measurement Cast listeners for the next few weeks. I would like you to find one person that you think would like the podcast and introduce them to the cast. This might mean sending them the URL or teaching them how to download podcasts. If you like the podcast and think it is valuable, they will be thankful to you for introducing them to the Software Process and Measurement Cast! Thank you in advance! Re-Read Saturday News We have just begun the Re-Read Saturday of The Mythical Man-Month. We are off to a rousing start, beginning with the Tar Pit. Get a copy now and start reading! The Re-Read Saturday and other great articles can be found on the Software Process and Measurement Blog. Remember: We just completed the Re-Read Saturday of Eliyahu M. Goldratt and Jeff Cox’s The Goal: A Process of Ongoing Improvement, which began on February 21st. What did you think? Did the re-read cause you to pick The Goal back up for a refresher?
Visit the Software Process and Measurement Blog and review the whole re-read. Note: If you don’t have a copy of the book, buy one. If you use the link below it will support the Software Process and Measurement blog and podcast. Dead Tree Version or Kindle Version Upcoming Events Software Quality and Test Management, September 13 – 18, 2015, San Diego, California http://qualitymanagementconference.com/ I will be speaking on the impact of cognitive biases on teams! Let me know if you are attending! More on other great conferences soon! Next SPaMCast The next Software Process and Measurement Cast is a magazine installment. We will feature our essay on Agile Testing. The flow of testing is different in an Agile project. In many cases, organizations have either not recognized the change in flow or have created Agile/waterfall hybrids with test groups holding onto waterfall patterns. While some of the hybrids are driven by mandated contractual relationships, the majority are driven by a lack of understanding or fear of how testing should flow in Agile projects. We will also have new installments from Jeremy Berriault’s QA Corner. Jeremy is a leader in the world of quality assurance and testing and was originally interviewed on the Software Process and Measurement Cast 274. The third column features Steve Tendon discussing more of his great new book, Hyper-Productive Knowledge Work Performance. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you nor your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 332 - Shirly Ronen-Harel, The Coaching Booster

Software Process and Measurement Cast

Play Episode Listen Later Mar 8, 2015 45:29


This week’s Software Process and Measurement Cast features our interview with Shirly Ronen-Harel. We began by talking about the book she is co-authoring, The Coaching Booster, which is 80% complete on LeanPub. We branched out into other topics including coaching, lean, Agile and using lean and Agile in startups. This was an incredibly content-rich podcast.  Have your notepad ready when you listen because Shirly provides ideas and advice that can change how you work! Shirly provides coaching and consulting on Agile/lean methods.  She provides Agile solutions using methods like Scrum, Kanban, Agile Testing, Agile product development, DevOps, Agile project management and more. Shirly is experienced with Agile assimilation with large companies, as well as small companies and startups. She holds a BSW degree from the University of Tel-Aviv (1995) and also has experience with family and individual therapy at the crisis stage. Shirly is also the author of an Agile parenting book and is currently writing a new book about personal Agile coaching.
Twitter: @shirlyronenrl
Linkedin: http://il.linkedin.com/pub/shirly-ronen-harel/0/653/249
Blog: http://agilopedia.blogspot.co.il/ http://agileandfamily.blogspot.co.il/
Contest The contest is over for the copy of Anthony Mersino’s new book Agile Project Management, and the winner is Paul Laberge! We will have another contest in a few weeks so keep listening. Call to action! Can you tell a friend about the podcast? If your friends don’t know how to subscribe or listen to a podcast, show them how you listen to the Software Process and Measurement Cast and subscribe them!  Remember to send us the name of the person you subscribed (and a picture) and I will give both you and the horde you have converted to listeners a call out on the show. Re-Read Saturday News The Re-Read Saturday focus on Eliyahu M. Goldratt and Jeff Cox’s The Goal: A Process of Ongoing Improvement began on February 21st.
The Goal has been hugely influential because it introduced the Theory of Constraints, which is central to lean thinking. The book is written as a business novel. Visit the Software Process and Measurement Blog and catch up on the re-read. Note: If you don’t have a copy of the book, buy one.  If you use the link below it will support the Software Process and Measurement blog and podcast. Dead Tree Version or Kindle Version  Upcoming Events
CMMI Institute Conference EMEA 2015
March 26 - 27, London, UK
I will be presenting “Agile Risk Management.”
http://cmmi.unicom.co.uk/
International Conference on Software Quality and Test Management
Washington D.C., May 31 - June 5, 2015
Wednesday June 3, 2015
http://qualitymanagementconference.com/
I will be presenting a new and improved version of “The Impact of Cognitive Biases on Test and Project Teams.”
Next SPaMCast In the next Software Process and Measurement Cast we will feature our essay on what is Agile. Agile is more than just behaviors!  Agile is values and principles and . . . I would be willing to fight over that definition! We will also have new entries from Kim Pries and Jo Ann Sweeney! Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 322 – Clareice and Clyneice Chaney, Contracting, Acquisition and Agile Testing

Software Process and Measurement Cast

Play Episode Listen Later Dec 28, 2014 42:47


SPaMCAST 322 features our interview with Clareice and Clyneice Chaney. Clareice and Clyneice provide insights and practical advice into how Agile and contracting work together.  The focus of the interview is on contracting and acquisition of Agile testing; however, the concepts we discussed can be applied to contracting for any type of service using Agile techniques. Clyneice Chaney brings over 30 years of testing, quality assurance, and process improvement experience. Clyneice holds certifications from the American Society for Quality as a Certified Quality Manager/Organizational Excellence and the Project Management Institute's Project Management Professional. She has participated as an examiner for Baldrige state quality awards for Georgia and Virginia. She is currently an instructor for an International Testing Certification organization and has presented technical papers at the Software Engineering Institute: SEPG Conference, American Society for Quality: Quality Manager's conference, Quality Assurance Institute International Testing Conference, International Conference on Software Process Improvement and Software Test and Performance Testing Conferences. Clareice Chaney has over 30 years’ experience in Commercial and Government Contracting with an emphasis in contracting within the information technology arena.  She holds a PMP certification with the Project Management Institute and is a certified Professional Contracts Manager (CPCM) through the National Contract Management Association (NCMA). She has presented at the National Contract Management Association World Congress and provided recent collaborations on agile testing and contracting at the Quality Assurance Institute International Conferences. Call to action! We are in the middle of a re-read of John Kotter’s classic Leading Change on the Software Process and Measurement Blog.  Are you participating in the re-read? Please feel free to jump in and add your thoughts and comments!
After we finish the current re-read we will need to decide which book will be next.  We are building a list of the books that have had the most influence on readers of the blog and listeners to the podcast.  Can you answer the question? What are the two books that have most influenced your career (business, technical or philosophical)?  Send the titles to spamcastinfo@gmail.com. First, we will compile a list and publish it on the blog.  Second, we will use the list to drive future “Re-read” Saturdays. Re-read Saturday is an exciting new feature that began on the Software Process and Measurement blog on November 8th.  Feel free to choose your platform; send an email, leave a message on the blog, Facebook or just tweet the list (use hashtag #SPaMCAST)! Next The next Software Process and Measurement Cast will feature our essay on the Attributes Leading to Failure with Agile. Agile projects don’t work when there isn’t open and honest communication within a team. Problems also can occur when all team members are not involved, or if the organization has not bought into the principles of Agile. Knowing what can go wrong with Agile implementations and projects is a step to making sure they do not happen! We will also have the next Form Follows Function column from Gene Hughson and Explaining Change with Jo Ann Sweeney. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Software Process and Measurement Cast
SPaMCAST 314 - Crispin, Gregory, More Agile Testing

Software Process and Measurement Cast

Play Episode Listen Later Nov 2, 2014 40:45


SPaMCAST 314 features our interview with Janet Gregory and Lisa Crispin.  We discussed their new book More Agile Testing. Testing is core to success in all forms of development.  Agile development and testing are no different. More Agile Testing builds on Gregory and Crispin’s first collaborative effort, the extremely successful Agile Testing, to ensure everyone who uses an Agile framework delivers the most value possible. The Bios! Janet Gregory is an agile testing coach and process consultant with DragonFire Inc. Janet is the co-author with Lisa Crispin of Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009), and More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley 2014). She is also a contributor to 97 Things Every Programmer Should Know. Janet specializes in showing Agile teams how testers can add value in areas beyond critiquing the product; for example, guiding development with business-facing tests. Janet works with teams to transition to Agile development, and teaches Agile testing courses and tutorials worldwide. She contributes articles to publications such as Better Software, Software Test & Performance Magazine and Agile Journal, and enjoys sharing her experiences at conferences and user group meetings around the world. For more about Janet’s work and her blog, visit www.janetgregory.ca. You can also follow her on twitter @janetgregoryca. Lisa Crispin is the co-author, with Janet Gregory, of More Agile Testing: Learning Journeys for the Whole Team (Addison-Wesley 2014), Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009), co-author with Tip House of Extreme Testing (Addison-Wesley, 2002), and a contributor to Experiences of Test Automation by Dorothy Graham and Mark Fewster (Addison-Wesley, 2011) and Beautiful Testing (O’Reilly, 2009). Lisa was honored by her peers by being voted the Most Influential Agile Testing Professional Person at Agile Testing Days 2012.
Lisa enjoys working as a tester with an awesome Agile team. She shares her experiences via writing, presenting, teaching and participating in agile testing communities around the world. For more about Lisa’s work, visit www.lisacrispin.com, and follow @lisacrispin on Twitter. Call to action! What are the two books that have most influenced your career (business, technical or philosophical)?  Send the titles to spamcastinfo@gmail.com.  What will we do with this list?  We have two ideas.  First, we will compile a list and publish it on the blog.  Second, we will use the list to drive “Re-read” Saturday. Re-read Saturday is an exciting new feature we will begin on the Software Process and Measurement blog on November 8th with a re-read of Leading Change. So feel free to choose your platform and send an email, leave a message on the blog, Facebook or just tweet the list (use hashtag #SPaMCAST)! Next SPaMCAST 315 features our essay on Scrum Masters.  Scrum Masters are the voice of the process at the team level.  Scrum Masters are a critical member of every Agile team. The team’s need for a Scrum Master is not transitory because they evolve together as a team. Upcoming Events DCG Webinars:
How to Split User Stories
Date: November 20th, 2014
Time: 12:30pm EST
Register Now
Agile Risk Management - It Is Still Important
Date: December 18th, 2014
Time: 11:30am EST
Register Now
The Software Process and Measurement Cast has a sponsor. As many of you know I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession. ITMPI’s mission is to pull together the expertise and educational efforts of the world’s leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs.
The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU accredited webinar recordings, and waives the PDU processing fees on all live and recorded webinars. The Software Process and Measurement Cast receives some support if you sign up here. All the revenue our sponsorship generates goes for bandwidth, hosting and new cool equipment to create more and better content for you. Support the SPaMCAST and learn from the ITMPI. Shameless Ad for my book! Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

STP Radio
STPCon Fall 2014 Speaker Teaser ep.2

STP Radio

Play Episode Listen Later Oct 9, 2014 38:07


Listen in as JeanAnn Harrison conducts even more interviews with STPCon Fall 2014 speakers in episode 2 of our STPCon Speaker Teaser. In this episode JeanAnn first speaks with Howard Chorney - Enterprise Test Architect at SOASTA. JeanAnn and Howard discuss his session and share some thoughts on STPCon. JeanAnn also chats with Bob Galen, Agile Consultant, Coach & Trainer at RGCG, LLC, about Transitioning from Traditional to Agile Testing. They also touch on his second extended session with Shaun Bradshaw, Yin and Yang: Metrics within Agile and Traditional Lifecycles. Tune in to this episode and get ready for STPCon coming up in Denver this November. Learn more at www.STPCon.com


Software Process and Measurement Cast
SPaMCAST 259 - Agile Testing, Pries, Rubrics

Software Process and Measurement Cast

Play Episode Listen Later Oct 13, 2013 43:53


Welcome to the Software Process and Measurement Cast 259 The Software Process and Measurement Cast 259 features an essay titled Agile Testing. Testing is an important step in the delivery of any piece of software. It is the process required to find and remove defects from functionality before that functionality is delivered into production. For those of you who have not written code or been an integral part of a software project, a project of any size rarely jumps from idea directly into executable code without a few hiccups (call ‘em whatever you’d like… expletive deleted, defects or problems). There are two basic ways to find these gremlins – testing (including reviews) before implementation or letting your customers find them after implementation. The SPaMCAST 259 also includes Kim Pries's column.  Kim discusses rubrics and why they are important to software development. And just in case you are confused . . . we are not talking about the "cube." The Software Process and Measurement Cast has a sponsor . . . As many of you know I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession. ITMPI's mission is to pull together the expertise and educational efforts of the world's leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs. The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU accredited webinar recordings, and waives the PDU processing fees on all live and recorded webinars.  The Software Process and Measurement Cast receives a fee if you sign up using the URL in the show notes.   http://mbsy.co/fGdw  All revenue from our sponsors goes toward bandwidth, hosting and new cool equipment to create more and better content for you!  Support the SPaMCAST and learn from the ITMPI!
The Software Process and Measurement Cast is a proud member of the Tech Podcast Network.  If it is tech it is on the Tech Podcast Network.  Check out the Software Process and Measurement Cast and other great podcasts on the TPN!   TPN:  www.techpodcast.com Do you have a Facebook account?  If you do please visit and like the Software Process and Measurement Cast page on Facebook.  http://ow.ly/mWAgU  The Daily Process Thoughts is my project designed to deliver a quick daily idea, thought or simple smile to help you become a better change agent. Each day you will get a piece of thought-provoking text and a picture or hand-drawn chart to illustrate the idea being presented. The goal is to deliver every day; rain or shine, in sickness or in health or for better or worse! Check it out at www.tcagley.wordpress.com.  Shameless Ad for my book!  Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, neither for you or your team." NOW AVAILABLE IN CHINESE!  Have you bought your copy? It is conference season!
Agile Philly - AgileTour 2013
http://www.ifpug.org/?page_id=980
Time: October 7, 2013 from 12:30pm to 4:30pm
Location: EbayEnterprise (previously known as GSI Commerce) in King of Prussia
Street: First Avenue
City/Town: King of Prussia
http://www.agilephilly.com/events/agiletour-2013
AgileDC - Agile, it's not just for big complex projects anymore.
Date: October 8, 2013
http://agiledc.org/
Testrek 2013
October 28-30, 2013 at the Eaton Chelsea Downtown Toronto!
http://www.qaitestrek.org/2013Toronto/
Agile Practical Techniques Workshop
Monday, October 28, 2013: 8:30 AM - 4:30 PM
Format: Full-day Class
Agile Practical Techniques Workshop helps developers, testers, business analysts, scrum masters and project managers to develop an understanding of Agile development techniques focusing on concepts such as test driven development that integrate testing into the Agile process. The workshop combines concepts from Agile (e.g. Scrum, XP and Test Driven Development) and Learning Organizations, providing participants with the tools to both participate on Agile projects and to develop and deploy related processes.
Lean Software Development Workshop
Tuesday, October 29, 2013: 8:30 AM - 12:00 PM
Format: Half-day Tutorial
Lean Software Development Workshop (e.g. Kanban, Flow and Kaizen) uses a lean-agile focus to help everyone involved in developing, enhancing and maintaining software employ the Principles of Lean to enhance the delivery of value-added work. This workshop is hands-on and “things” will be thrown!
Presentation: Agile Underperforming? Keys to Improving Delivery
Wednesday, October 30, 2013: 9:45 AM - 10:45 AM
Just because you have implemented Agile techniques does not mean you are performing at the level your organization is capable of. Is your implementation of Agile underperforming? Agile has been promoted as delivering higher customer satisfaction, better quality, faster time to market, increased productivity and — in some cases — to deliver world peace.
The question is do you know and if you think you know, is your knowledge more than anecdotal? The only way to truly know is to measure. Measurement is only the first step in finding issues and taking action. Measurement provides focus. Being aware of problems and not spending the time and effort to study performance is a waste. W. Edwards Deming admonished us to have “constancy of purpose.” I would use the term “attention-span” in an attempt to make the same argument. Once we understand we have a problem, our next step is to take action and to perhaps make a difference in the value we deliver. Is your Agile underperforming? It doesn’t matter if you’re not going to fix it.
Contact information for the Software Process and Measurement Cast
Email:  spamcastinfo@gmail.com
Voicemail:  +1-206-888-6111
Website: www.spamcast.net
Twitter: www.twitter.com/tcagley
Facebook:  http://bit.ly/16fBWV
One more thing!  Help support the SPaMCAST by reviewing and rating the Software Process and Measurement Cast on iTunes! It helps people find the cast.  Next: The Software Process and Measurement Cast 260 will feature my interview with Dr. Richard Sykes.  Dr. Sykes is the chair of the Board of Directors of the TMMi Foundation. The TMMi is the Testing Maturity Model Integration. When it comes to testing the TMMi is more than just a model!

Software Process and Measurement Cast
SPaMCAST 254 - Matt Heusser, Agile Testing, Test Professionalism

Software Process and Measurement Cast

Play Episode Listen Later Sep 8, 2013 38:16


Welcome to the Software Process and Measurement Cast 254 The Software Process and Measurement Cast 254 features my interview with Matt Heusser.  We discussed agile and testing.  It was a great interview full of thought-provoking discussion and controversial ideas.   Matthew Heusser is the managing consultant at Excelon Development, where he focuses on software project delivery and risk management. A board member for the Association for Software Testing, lead editor of "How to Reduce the Cost of Software Testing" (Taylor & Francis 2011), Matt recently served as co-chair of the test and quality track of the Agile Conference. You can learn more about Matt at www.xndev.com or follow him on twitter @mheusser. The Software Process and Measurement Cast has a sponsor . . . As many of you know I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession. ITMPI's mission is to pull together the expertise and educational efforts of the world's leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs. The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU accredited webinar recordings, and waives the PDU processing fees on all live and recorded webinars.  The Software Process and Measurement Cast receives a fee if you sign up using the URL in the show notes.   http://mbsy.co/fGdw  All revenue from our sponsors goes toward bandwidth, hosting and new cool equipment to create more and better content for you!  Support the SPaMCAST and learn from the ITMPI! The Software Process and Measurement Cast is a proud member of the Tech Podcast Network.  If it is tech it is on the Tech Podcast Network.  Check out the Software Process and Measurement Cast and other great podcasts on the TPN!   TPN:  www.techpodcast.com Do you have a Facebook account?
If you do please visit and like the Software Process and Measurement Cast page on Facebook.  http://ow.ly/mWAgU  The Daily Process Thoughts is my project designed to deliver a quick daily idea, thought or simple smile to help you become a better change agent. Each day you will get a piece of thought-provoking text and a picture or hand-drawn chart to illustrate the idea being presented. The goal is to deliver every day; rain or shine, in sickness or in health or for better or worse! Check it out at www.tcagley.wordpress.com.  Shameless Ad for my book!  Mastering Software Project Management: Best Practices, Tools and Techniques co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: "This book will prove that software projects should not be a tedious process, neither for you or your team." NOW AVAILABLE IN CHINESE!  Have you bought your copy? It is conference season! ISMA8 ISMA Rio will be an opportunity for all members of the Software Measurement Community to meet in beautiful Rio de Janeiro, Brazil, in the week of September 30th, 2013. The eighth edition of the IFPUG International Software Measurement & Analysis Conference will be co-located with the 1st edition of the ISBSG IT Confidence Conference. That will bring together many international consultants, practitioners, and researchers from the Software Measurement arena. You can’t afford to miss it! The New York Times has nominated Rio the #1 tourist destination in the world. See you in Rio!
http://www.ifpug.org/?page_id=980
Agile Philly - AgileTour 2013
http://www.ifpug.org/?page_id=980
Time: October 7, 2013 from 12:30pm to 4:30pm
Location: EbayEnterprise (previously known as GSI Commerce) in King of Prussia
Street: First Avenue
City/Town: King of Prussia
http://www.agilephilly.com/events/agiletour-2013
AgileDC - Agile, it's not just for big complex projects anymore.
Date: October 8, 2013
http://agiledc.org/
Testrek 2013
October 28-30, 2013 at the Eaton Chelsea Downtown Toronto!
http://www.qaitestrek.org/2013Toronto/
Agile Practical Techniques Workshop
Monday, October 28, 2013: 8:30 AM - 4:30 PM
Format: Full-day Class
Agile Practical Techniques Workshop helps developers, testers, business analysts, scrum masters and project managers to develop an understanding of Agile development techniques focusing on concepts such as test driven development that integrate testing into the Agile process. The workshop combines concepts from Agile (e.g. Scrum, XP and Test Driven Development) and Learning Organizations, providing participants with the tools to both participate on Agile projects and to develop and deploy related processes.
Lean Software Development Workshop
Tuesday, October 29, 2013: 8:30 AM - 12:00 PM
Format: Half-day Tutorial
Lean Software Development Workshop (e.g. Kanban, Flow and Kaizen) uses a lean-agile focus to help everyone involved in developing, enhancing and maintaining software employ the Principles of Lean to enhance the delivery of value-added work. This workshop is hands-on and “things” will be thrown!
Presentation: Agile Underperforming? Keys to Improving Delivery
Wednesday, October 30, 2013: 9:45 AM - 10:45 AM
Just because you have implemented Agile techniques does not mean you are performing at the level your organization is capable of. Is your implementation of Agile underperforming?
Agile has been promoted as delivering higher customer satisfaction, better quality, faster time to market, increased productivity and — in some cases — to deliver world peace. The question is do you know and if you think you know, is your knowledge more than anecdotal? The only way to truly know is to measure. Measurement is only the first step in finding issues and taking action. Measurement provides focus. Being aware of problems and not spending the time and effort to study performance is a waste. W. Edwards Deming admonished us to have “constancy of purpose.” I would use the term “attention-span” in an attempt to make the same argument. Once we understand we have a problem, our next step is to take action and to perhaps make a difference in the value we deliver. Is your Agile underperforming? It doesn’t matter if you’re not going to fix it.
Contact information for the Software Process and Measurement Cast
Email:  spamcastinfo@gmail.com
Voicemail:  +1-206-888-6111
Website: www.spamcast.net
Twitter: www.twitter.com/tcagley
Facebook:  http://bit.ly/16fBWV
One more thing!  Help support the SPaMCAST by reviewing and rating the Software Process and Measurement Cast on iTunes! It helps people find the cast.  Next: The Software Process and Measurement Cast 255 features my essay on self-management. The title of the essay is "Project Management Is Dead, Long Live Project Management." Self-management and agile go together like chocolate and peanut butter but . . . the concept is not well understood and rarely implemented well.

Teahour
#7 - 你应该知道的 Agile 和 Scrum

Teahour

Play Episode Listen Later Mar 15, 2013 87:45


This episode is hosted by Dingding Ye, with guests 徐毅 (Yi Xu) and Terry Tai. Yi Xu is currently an Agile and Lean coach at Nokia, specializing in Agile transitions for large organizations (500+ people). He has translated the Chinese editions of Beautiful Teams, Management 3.0 and Essential Scrum (Scrum 要素), and is currently writing a book on agile testing, 大测大悟. In this episode Yi Xu joins Teahour to share his insights and experience with Agile practice, Scrum practice and agile testing. Weibo Twitter LinkedIn Fengche.co Essential Scrum (Scrum 要素) Agile Manifesto (敏捷宣言) Agile Manifesto, Chinese version The Home of Scrum XP IT The New Product Development Game PDSA: Plan, Do, Study, Act Scrum Alliance Scrum Guide (Scrum 指南) Kanban Agile Testing Day Practical Agile Testing with Janet Gregory Agile Testing Practices of an Agile Developer Explore It! Special Guest: 徐毅 (Yi Xu).

Meta-Cast, an agile podcast
Episode 13 - Agile Testing, Part 2

Meta-Cast, an agile podcast

Play Episode Listen Later Jan 2, 2011 57:47


Bob and Josh wrap up their two-part discussion of testing in an agile team. This second session covers various agile testing subjects, from continuous integration to automated test development. Support this podcast

Meta-Cast, an agile podcast
Episode 12 - Agile Testing, Part 1

Meta-Cast, an agile podcast

Play Episode Listen Later Nov 8, 2010 50:55


Bob and Josh start their two-part discussion of testing in an agile team. This first session covers numerous agile testing subjects, from the role of testers in an agile team to test driven development. Support this podcast

Software Engineering Radio - The Podcast for Professional Software Developers

This episode covers the topic of agile testing. Michael interviews Lisa Crispin as a practitioner and book author on agile testing. We cover several topics ranging from the role of the tester in agile teams, through test automation strategy and regression testing, to continuous integration.


Software Process and Measurement Cast
SPaMCAST 52 - Lisa Crispin, Agile Testing, Change Checklist Part Two

Software Process and Measurement Cast

Play Episode Listen Later Feb 8, 2009 35:50


Lisa Crispin is an agile testing coach and practitioner. She is the co-author, with Janet Gregory, of Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009). She specializes in showing testers and agile teams how testers can add value and how to guide development with business-facing tests. Her mission is to bring agile joy to the software testing world and testing joy to the agile development world. Lisa joined her first agile team in 2000, having enjoyed many years working as a programmer, analyst, tester, and QA director. Since 2003, she's been a tester on a Scrum/XP team at ePlan Services, Inc. She frequently leads tutorials and workshops on agile testing at conferences in North America and Europe. Lisa regularly contributes articles about agile testing to publications such as Better Software magazine, IEEE Software, and Methods and Tools. Lisa also co-authored Testing Extreme Programming (Boston: Addison-Wesley, 2002) with Tip House. For more about Lisa's work, visit www.lisacrispin.com.
Join the SPaMCAST’s community by joining the SPaMCAST Facebook page and get involved!!!!  http://tinyurl.com/62z5el
The essay is titled “A Really Simple Checklist for Change Readiness Assessment” Part 2.  Planning for change is not very different from planning a vacation.  The Checklist will remind you of the big things to remember that sometimes get forgotten when dealing with the details of making change happen.  Remember that part one was originally uploaded in SPaMCAST 51.  The text of the whole essay can be found at www.tcagley.wordpress.com.
There are a number of ways to share your thoughts with SPaMCAST:
•    Email SPaMCAST at spamcastinfo@gmail.com
•    Voice messages can be left at 1-206-888-6111
•    Twitter - www.twitter.com/tcagley
•    BLOG – www.tcagley.wordpress.com
•    FACEBOOK!!!!
Software Process and Measurement      http://tinyurl.com/62z5el
Next Software Process and Measurement Cast: The next Software Process and Measurement Cast will feature an interview with Capers Jones discussing a wide range of software measurement topics.  The interview with Capers was exciting and I think you will find it an engaging listen.  Listen with a friend!
One more item . . . my father has begun to podcast his fiction at www.talesbytom.com.  Yours truly is doing the production.  Feel free to check it out and give him feedback.