
Behavior Driven Development and the Agile Analyst

I have been following up on my recent discovery of "Feature Injection" by trying to track down some more information about Behavior Driven Development, or "BDD," which is the software development technique into which feature injection fits.  Handily, someone really good at BDD has put what seems to be a lot of really high-quality information into the Wikipedia entry, so I urge you to go check that out immediately, before someone less-good changes it!

But in the meantime, I've been talking with colleagues about the practicalities of BDD, and I'm just enchanted by the helpful structure BDD provides for describing what's different about agile business analysis from its handy bête noire and sparring partner, "Big Upfront Design."

So, compare and contrast a "requirement" written in detail well before software development begins (the "BUFD" requirement) with one done "just in time," expressed in perpetuity, in business-readable language, as a set of acceptance criteria examples linked to an underlying software implementation (the "BDD" requirement).

BUFD requirement (also known as a "system requirement," a "system specification," a "user requirement," or a "non-functional requirement"):
  • is written in great detail, based on discussions between analysts and "business people," long before a programmer has even heard of it.
  • is approved in writing by a slew of people, along with a vast number of other requirements in a multi-page "systems specification," for Sarbanes-Oxley compliance (SOX).
  • is used by testers as the basis for the "test plan" they put together for testing the software as it is written by development.
  • is accompanied by a "traceability matrix" which links the requirement to the associated written "design" and "test plan" documents, on a point by point basis.
  • must be kept up to date once the programmers get going (along with the design, the test plan, and the traceability matrix).
  • is used as a point of reference if scope must be cut due to lack of time--the requirement is still in the document, but the behavior isn't in the program.  But at least there's a notation in the traceability matrix for reference, when an argument needs to happen.
  • is put into storage of one kind or another once the software goes live (any resemblance of the "archive" to a waste basket is purely coincidental).
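To make that bookkeeping concrete, here is a minimal sketch of what a couple of traceability-matrix entries might look like as plain data.  Every ID and name below is invented for illustration--real matrices usually live in spreadsheets or requirements-management tools, not code.

```python
# Hypothetical traceability-matrix entries: all IDs and names are invented.
# Each requirement is linked, point by point, to the written "design"
# sections and "test plan" cases that are supposed to cover it.
traceability_matrix = {
    "REQ-042": {
        "requirement": "Customer can withdraw cash up to their balance",
        "design": ["DES-3.1 Withdrawal flow"],
        "tests": ["TC-101 withdraw within balance",
                  "TC-102 withdraw exceeding balance"],
    },
    "REQ-043": {
        "requirement": "Customer can view withdrawal history",
        "design": ["DES-3.2 History screen"],
        "tests": [],  # cut for time: still in the document, not in the program
    },
}

def untested_requirements(matrix):
    """Return IDs of requirements with no linked test cases -- the kind of
    gap someone has to notice, and argue about, by hand."""
    return [req_id for req_id, entry in matrix.items() if not entry["tests"]]

print(untested_requirements(traceability_matrix))  # ['REQ-043']
```

The point of the sketch is the maintenance burden: nothing keeps this structure in sync with the actual documents or the actual code, so every change to any of them means a manual update here.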
BDD requirement (also known as "a set of acceptance criteria" or "our hero"):
  • is modeled at a high level with very little verbiage during a brief project "discovery" or "inception" phase at the beginning of a project, and recorded in writing on a 3x5 card after a discussion which includes business people, programmers and testers.
  • is explored in detail just before programmers begin work on it, by putting together written business-language scenarios that can be used to test whether the software is doing what the business would expect.  These are called "acceptance criteria," and they are written directly into an automated functional testing tool, where they can be referenced by the people who actually write the corresponding automated functional tests.
  • can be signed off at this time, for SOX compliance.
  • serves as the built-in documentation for the automated tests for as long as the software is operational.  Although the software itself may change, and those changes may require corresponding modifications to the actual automated tests, the business-language description of what those tests do is not likely to change.
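As a sketch of what such a business-readable acceptance criterion looks like, here is a hypothetical "withdrawal" scenario in the Given/When/Then style, with a tiny hand-rolled step implementation standing in for a real tool like Cucumber or FitNesse.  The feature and all the names are invented; the point is that the scenario text stays readable by business people while the test underneath it is fully automated.

```python
# Business-readable scenario text: this is what gets discussed and signed
# off.  In a real tool it would live in the tool itself, next to the tests.
SCENARIO = """
Scenario: Customer withdraws cash within their balance
  Given an account with a balance of 100 dollars
  When the customer withdraws 30 dollars
  Then the remaining balance is 70 dollars
"""

class Account:
    """Toy implementation that the scenario exercises (invented for
    illustration)."""
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def test_withdraw_within_balance():
    # Given an account with a balance of 100 dollars
    account = Account(balance=100)
    # When the customer withdraws 30 dollars
    account.withdraw(30)
    # Then the remaining balance is 70 dollars
    assert account.balance == 70

test_withdraw_within_balance()
print("scenario passed")
```

If the `Account` implementation is later rewritten, the test body may need to change--but the scenario text, the part the business signed off, almost certainly won't.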
Vast quantities of excess paper are avoided here:  we don't fully document any requirement unless we are actually going to develop it (avoiding waste associated with "comprehensive" up-front documentation for a system which will never be built to this scope).  We avoid writing the test plan and the traceability matrix by recording requirement details directly into the automated tool in the form of acceptance criteria.  We avoid rewriting requirements by not getting the details until just before coding.

But meanwhile, we actually improve business people's ability to judge whether the software meets their needs or not, and keep a record of the software's intent forever.  The analysis activity becomes lean, efficient, and immortal!

As an analyst, I like that.

Comments

  1. I am also a big fan of BDD. While reengineering a big legacy system, I always detected huge misconceptions about what the business thought the app was doing and what actually happened in the code. I brought in BDD as a measure to avoid this misconception in the future, but interest in a still mostly waterfall-like environment was not very big! Unfortunately. But we will not give up!

  2. Hey persillie! Would it still be feasible to have your BAs work closely with your QAs to ensure it's clear what the test cases in the test plan are doing, and why? Just doing that BA/QA sanity check can help a lot--build in some stuff that will help in UAT even while you system test! If you keep your well-commented system test and UAT plans around post-go-live, you can still get your measure of eternity! :-) Good luck with it!
