Common Dysfunctions of “Scrum” Teams — Part 2

In the first part of this series, I talked about how many teams who try to transform into “Agile” teams fail because they don’t actually understand what being “agile” is all about, or because they try to cut corners by not fully embracing (at the outset, at least) the fundamental requirements of the methodology that they have selected.  Today, I’m going to focus on three additional complications that teams often run into when they run headlong down the path of “Agile” without actually embracing the precepts of “agility” that provide the highest likelihood of success.

The first is the importance of continuous improvement practices — how ignoring the process-review ceremonies hurts everyone involved.  The second is how a lack of clear goals and expectations leads to inevitable disappointment and disillusionment — from both the team and its stakeholders.  And the final point is that even though Scrum recommends specific practices and ceremonies, the entire point of being “agile” (not “Agile”) is to assess what you’re doing, how you’re doing it, and what you’re achieving, and make changes to get better — even if that means doing things that “aren’t” Scrum.

Failing to Implement Continuous Improvement

A lot of people seem to forget that Agile practices and methodologies have their origins in Lean manufacturing processes, those decades-old efforts to streamline and constantly improve line manufacturing.  Concepts like the Toyota Production System (which I discuss here in a post on “waste”), Just-In-Time inventory processes, and the Lean Management Model (and all its associated acronyms – TPM, TSM, TQM, etc.) are all spiritual precursors to the Agile Manifesto and its subsequent development methodologies.

And the single most important thing about all of these processes and philosophies and concepts throughout their history is reflecting on what you’ve done.  It’s not enough to just do the work and pat yourself on the back at the end of the sprint for a job well done.  Lean principles and Agile methodologies almost all have built into them some form of retrospective during which the team involved in the process meets, discusses what’s been working and what hasn’t, and makes proposals for change that can be tested out in future work efforts.

There are few things that I believe as absolutes, but this is one of them — if you’re not doing retrospectives, you’re not being “agile”.  Looking back over the last sprint, iteration, project, or other defined piece of work and critically examining how you executed it is essential to increasing not only the team’s effectiveness, its efficiency, and the quality of its results, but also (and perhaps most importantly) the team’s buy-in and commitment.

If you’re not doing some kind of retrospective, you’re not really being agile.

Remember, Scrum is a team-oriented model — the team makes commitments, the team works together to meet them, the team meets daily, the team demos their software to the stakeholders, and the team needs to be empowered to improve themselves.  There’s a little truth to the classic joke about how many psychologists it takes to screw in a lightbulb — for a team to really change and improve, they have to want to change.  And there’s literally nothing that motivates a team to change more than having open and honest discussions amongst themselves within a functioning retrospective.

Some people look at the retrospective as “fluff” — and think that if the team’s meeting their commitments there’s no need to improve.  That’s just bullshit.  It’s “slap yourself on the back for a good job” thinking that doesn’t embrace the origins behind Agile practices.  Every team can do better.  Every team messes something up while working.  Every team really, truly has to be open and honest about what’s not working and try new things.  Otherwise, you’ll just get the same results every single time — with the same frustrations, the same issues, and the same outcome.  Agile wants us to be better with every single iteration, or with every project, or with every user story.  And we simply can’t do that unless we’re looking back with a critical eye and trying new things.

And let me assure you — if you’re not constantly improving your practices, your processes, and your tools…your competition is.

Lack of Clear Goals and Expectations

Another common problem that I see with teams who have started to embrace Agile principles (and the Scrum methodology in particular) is a failure to really understand how goals are set under those practices, and how to ensure that the outcome of an iteration matches the expectations of the stakeholders to whom you’ll be showing your work at the end of the sprint.  There are two primary components that teams who cut corners on Scrum often overlook — one that focuses on the Development side of the equation, and one that focuses on the Stakeholders.

On the Development side of the equation, there is a concept in Scrum known as the “Definition of Done” — a set of criteria that the team agrees is necessary before any individual developer can call a story “done”.  Generally speaking, this includes things like unit testing, code review, deploying without breaking the build, and other things like that.  But it doesn’t have to be limited to those — for example, you could make a UX review part of the definition, or you could require some user acceptance testing.  The point is, every Scrum team needs a clear checklist of things that must be completed before a story is considered “done”.  These objective measures ensure a consistent level of quality in both the outcome and the implementation, and help to counteract some of the issues that Scrum teams run into related to code quality, lack of automated testing, and other things that should “just happen” but often don’t.  The team’s “Definition of Done” is the Bible by which they execute their stories — if you cut corners and don’t check one of those boxes…well, you’re not “done” with the story.
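To make the idea concrete, here’s a minimal sketch of a Definition of Done treated as an objective checklist — the specific item names are illustrative examples drawn from the discussion above, not a prescribed list:

```python
# A minimal sketch of a "Definition of Done" as an objective checklist.
# The item names below are hypothetical examples, not a prescribed list.
DEFINITION_OF_DONE = [
    "unit tests written and passing",
    "code reviewed by another developer",
    "deployed without breaking the build",
]

def is_done(completed_items):
    """A story is 'done' only when every checklist item is satisfied."""
    missing = [item for item in DEFINITION_OF_DONE
               if item not in completed_items]
    return (len(missing) == 0, missing)

# A story with an unchecked box is, by definition, not done.
ok, missing = is_done({"unit tests written and passing",
                       "code reviewed by another developer"})
```

The value is in the all-or-nothing check: there’s no partial credit, which is exactly what keeps “should just happen” work from quietly being skipped.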

The point of having both “Definition of Done” and clear “Acceptance Criteria” is so that everyone knows exactly what’s being committed to.  Without them, requirements are fluid and unspoken, and will be missed.

The Stakeholder side of the equation should be covered by the “Acceptance Criteria” attached to every single story that the team takes on.  These criteria should be clear, concise, and defined before the team makes its commitment.  That last part is particularly important — Acceptance Criteria allow the Product Owner (and, by proxy, the Stakeholders) to specify what they consider an acceptable solution to the problem posed by the User Story that the team is taking on.  They capture all of the things that matter to the Stakeholders but aren’t expressed in the traditional “User X wants to Y so that they can Z” format that User Stories follow.  If there are particular implementation notes that are important to the Stakeholders, they should be noted as Acceptance Criteria.  If there’s a particular UX design paradigm that needs to be used, it should be noted as Acceptance Criteria.  If there’s anything not captured by the User Story itself upon which the Product Owner’s review of the completed work will hinge, it needs to be captured as Acceptance Criteria.
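The same idea can be sketched from the story’s point of view — a hypothetical example (the story, field names, and criteria here are invented for illustration) where the Product Owner’s review hinges on every agreed criterion being met:

```python
# A minimal sketch of a User Story whose acceptance hinges on explicit,
# agreed-upon criteria.  The story and criteria here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class UserStory:
    summary: str                           # "User X wants to Y so that they can Z"
    acceptance_criteria: list = field(default_factory=list)
    met: set = field(default_factory=set)  # criteria demonstrated so far

    def product_owner_accepts(self) -> bool:
        # The Product Owner accepts only when every criterion is satisfied.
        return all(c in self.met for c in self.acceptance_criteria)

story = UserStory(
    summary="A shopper wants to filter by price so they can find deals",
    acceptance_criteria=[
        "filter selection persists across pages",
        "uses the existing dropdown UX pattern",
    ],
)
story.met.add("filter selection persists across pages")
# Not accepted yet — the UX criterion was agreed upon, so it gates review.
```

Because the criteria were written down before the commitment, neither side can be surprised at the Sprint review: the checklist the Product Owner reviews against is the same one the Developers built against.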

The point of having both a clear and agreed-upon “Definition of Done” and clear and objective “Acceptance Criteria” is so that everyone knows what’s being committed to.  The Developers know both what they’ll have to do on their side and what the Stakeholders expect…and the Stakeholders know what they’re going to get at the end of the Sprint.  If it’s not in the Definition of Done or in the Acceptance Criteria, then you can’t expect it to be done.  The team commits to what the team commits to — no more and no less.  But it has to be clear before they can make that commitment.  Without a clear Definition of Done and clearly articulated Acceptance Criteria, a team will fail more often than not — nobody can meet unspoken expectations every single time.

Holding the Process More Sacred Than Results

The last issue that teams have when they don’t fully embrace “agility” is likely to seem entirely contrary to almost everything I’ve said thus far.  But it’s not…let me tell you why.

Every team needs a place to start.  The specific “requirements” of Scrum provide a team with an excellent foundation upon which they can start to build their work.  Skipping out on the foundation makes the structure as a whole untenable, though — there’s a cost to every ceremony you don’t do, every principle you think is “fluff”, and every single corner that you cut.  If you think of a team like a building, you need a solid, clean, and “perfect” foundation in order to ensure that the building doesn’t topple over during a big storm.  That’s why teams that ignore the principles of agility, or teams that start out with “Scrumbut,” or teams that do everything but the retrospective wind up with problems and complaints about how “this doesn’t work.”  Of course it doesn’t work — you didn’t really try it out.

If you go shopping for clothes, grab a bunch of stuff off the rack, but only try on the jeans…well, who’s to blame when the rest of the outfit clashes?  Not the clothes.  And the same goes for teams that start out cutting corners and skipping out on well-established components of the process.

Agility requires that we assess what’s working and what’s not — but to know whether it’s working or not, we need to try it.

But that doesn’t mean that you have to strictly follow every single principle or component of a given methodology.  Adherence to some belief that every single thing required by any given methodology must be followed to the letter of the “law” is ridiculous.  The concept that there’s some “Holy Scrum” that only heretics breach and are punished for in roiling fires and Stakeholder meetings for eternity is, simply, bullshit.

Constant improvement requires that we assess what’s working and what’s not — and this isn’t limited to development practices or artifacts or tools.  It’s just as important to review your implementation of the methodology every two weeks as it is to review your work.  If something isn’t working, being “agile” requires that we change it.  Maybe retrospectives after every Sprint aren’t helpful because you’re not actually finding a lot to talk about — okay, make it every two Sprints.  Maybe daily in-person standups are difficult because you have a globally distributed team and it’s not really reasonable to have somebody in the UK call in at the end of their day — fine, make your standups daily email updates.

The point is, though, that you have to try the basics before you’ll know what to tweak and what to change.  You have to start somewhere, and if you start out by cutting corners…well, that’s going to be the driving principle of your processes from that point forward.  And nobody wants that.

Scrum isn’t perfect.  Scrum isn’t for everyone.  But it’s a well-known, tested methodology that gives a team that truly embraces “agility” as a part of their core being somewhere to start.  It’s a foundation upon which you can build the home that you always wanted — whether that’s a lavish Art Deco palace or a single-story Arts and Crafts home.  But if you start out by cutting things before you try them, don’t be surprised when all you wind up with is a shack made of old hunks of wood that nobody else wanted.
