Done-ness, Denial, and Golf

>> 31 August 2008

Totally Test-driven Scrum Delivers Better Objectivity In Measuring Real Progress - Agile Software Process Improvement
Progress is measured entirely in terms of story points collected for tests passed. There is no "A for Effort" in my way of doing things. It's either testably delivered or it's not.

The analogy I use - and, yes, it's a golfing analogy - is to differentiate by measuring the completeness of a hole by whether or not the players took the shots they planned to vs. whether or not the ball actually finished up in the hole.
Jason Gorman puts his finger squarely on one of the keys to software development success. Performing tasks only matters if they result in complete and correct work product.
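The binary scoring Jason describes can be sketched in a few lines. This is purely illustrative (the story structure and function name are hypothetical, not from any Scrum tool): a story's points count toward progress only when every one of its acceptance tests passes, so a "close enough" story contributes nothing.

```python
# Illustrative sketch, not a real Scrum tool: progress counts a story's
# points only when all of its acceptance tests pass. Partially tested
# stories earn zero -- the ball is either in the hole or it is not.

def points_earned(stories):
    """Sum story points for stories whose acceptance tests all pass."""
    return sum(
        s["points"]
        for s in stories
        if s["tests"] and all(s["tests"])  # no credit without passing tests
    )

sprint = [
    {"name": "login",  "points": 5, "tests": [True, True]},   # done
    {"name": "search", "points": 8, "tests": [True, False]},  # "close enough"
    {"name": "export", "points": 3, "tests": []},             # untested
]

print(points_earned(sprint))  # only "login" counts: 5
```

Note that an empty test list earns nothing either: work with no way to verify it is the round you can't post a score for.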

Jason's golf metaphor conveniently illustrates a couple of related antipatterns, "close enough" and "pick it up after 10 strokes."

In "close enough," someone makes a decision that whatever is currently implemented seems pretty done, so the developer should move on and try to get something else implemented. This is like finding yourself on the green putting for double bogey, and deciding to stop and move on to the next tee. You cover all 18 holes in 4 hours, but you can't post a score because you haven't really played the round.

In "pick it up after 10" the team concedes its inability to bring a feature to testable completion. Like a course policy designed to keep play moving, the software team decides that they have spent all the time they can on a feature and must move on. Like the 10-stroke golf hole, they've used more time than they expected and didn't achieve the required result, but they figure they've done all they can in the time allowed. If the customer is lucky, the 10th stroke will be on the green close to the hole. If not, the 10th stroke will fail to clear the edge of the bunker and trickle back into the sand.

I am still learning about Agile methods, but I have to believe the Agile team would have a couple of responses. "Close enough" is essentially a quality issue; the team thinks it's implemented something acceptable even if not what was asked for or agreed to. If the product owner accepts the close enough solution, just like in the cartoons the hole can slide over under the ball so it can drop in. Voila! No longer close enough, the feature is now done.

"Pick it up after 10" seems like a reasonable strategy on the surface. It's the golf equivalent of timeboxing. Teams get into trouble only if they mark the feature done and check it off the list. Seems to me the Agile team might talk it over with the customer and decide they're only going to play the front nine today. Playing the hole to completion will mean they can't finish the back nine, and the customer just can't accept the ball-in-the-bunker solution. Teams create problems for themselves when they pick it up and move on to the next hole, card a 10, and don't tell anyone about the feature's unfinished business.

Closely related to "pick it up after 10" is the mulligan. In some respects mulligans are necessary, even a mark of good practice, in software development. You learn from the failed attempt, and try again. Just as in golf, though, mulligans don't show up on the score card. The proud player strides off the 18th green tallying up his 90-something score, but everyone in the group knows Mr. Mulligan couldn't have broken 100. On software projects there's a lot more at stake than who buys the first round. Unreported mulligans eat up time, and worse, keep the team from assessing its own likely performance on future efforts. Mulligans often lead to "close enough" and "10-stroke limit" situations.

Ah, golf. You can fool yourself if you like, but the rattle of the ball in the cup is what the game is all about.

Betting on Scrum

>> 23 August 2008

It's been a year since my ScrumMaster training - time to renew my Scrum Alliance membership. Time to consider the value of membership. Time to consider my place in the Agile community. Time to decide if, as an experience designer, I can find a place in that community!

First, what did I gain for my original investment of $1300 out of pocket and a couple of vacation days?

  • I discovered Jeff Patton's excellent work. He offers one of the best collections of experience design resources I've found at his site, AgileProductDesign.com. The discussion list he moderates on agile usability attracts leaders in both fields.
  • I'm happy to work in a framework that values empathy and story in addition to analysis and research.
  • I like working with and learning from engineers. I'm glad I get to talk with them more.
  • I'm passionate about enabling real people to do real things, with tools that come easily to hand and leave them happy. Seems like lots of Agile engineers are, too. I like that.
I've found some things less congenial.
  • I'm tired of "how many agilists can dance on the head of a pin" discussions. Who's really doing Agile? What methods are really Agile? Can you be an Agile practitioner if you don't produce code? Which Agile guru speaks with the most authentic voice? Can you trace your Agile sensei's pedigree through an unbroken chain of master and student to the founders? Are you still Agile if you learned from someone else? What does the Manifesto mean and whose interpretation is Agile?
  • The Agile world does not seem to be a comfortable place for user experience practitioners, still less for business analysts. I have been stunned by the number and intensity of insulting, patronizing, and downright hostile opinions I have seen expressed. Manifesto values like respect have at times been glaringly absent.
  • Scrum is a hard practice to evangelize. Managing a product backlog or sprint backlog seems to require at least as much discipline as "traditional" project planning, if not more. It's more structure than people who dislike project management overhead want, and people who have to pass regulatory audits fear it's not enough.
The software world has never been without its religious wars, and you can't please all the people all the time. Experience designers, product people, and engineers eventually will find their places under the Agile umbrella.

And somewhere there's a dogma-free Web app team full of people passionate about enabling people to express their personal and professional power through software - and room for one more like them.

At least, I just bet $50 on the possibility!

Designing for the Edges

"Explore your edge cases for the sake of innovation."
Nick Finck is just one of the prominent designers who see the value in edge cases. On the other hand, the casually dismissive "Oh, it's just an edge case!" is all too commonly heard. Before tossing out your edge cases altogether, ask these questions:
  • Is the case a user goal believed to be shared by only a few users?
  • Is the case created when technical limits prevent the application from fulfilling a user goal?
  • Is the case created when users interact with the application in unexpected ways?

If the answer is "Yes," the edge case indicates the opportunity to introduce delighters/exciters, usability improvements, technical innovation, or trend-anticipating product innovation. Capture this information somewhere for follow up if you must drop the case from your current design effort.
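The three questions above amount to a small triage table. A sketch of that mapping (the function and category names are my own invention, purely for illustration): answering "yes" to any question maps the edge case to an opportunity rather than the trash bin.

```python
# Hypothetical triage helper for the three edge-case questions.
# A "yes" on any question suggests an opportunity worth capturing;
# all "no" answers mean the case can reasonably be dropped.

OPPORTUNITIES = {
    "niche_user_goal": "delighters/exciters",
    "technical_limit": "technical innovation",
    "unexpected_use":  "usability or product innovation",
}

def triage_edge_case(niche_user_goal=False, technical_limit=False,
                     unexpected_use=False):
    """Return the opportunities an edge case suggests, or None to drop it."""
    answers = {
        "niche_user_goal": niche_user_goal,
        "technical_limit": technical_limit,
        "unexpected_use":  unexpected_use,
    }
    hits = [OPPORTUNITIES[key] for key, yes in answers.items() if yes]
    return hits or None  # non-empty list means: capture for follow-up

print(triage_edge_case(technical_limit=True))
```

The customer story below hit all three criteria at once, which is exactly the case a checklist like this is meant to catch before the request gets waved off.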

I saw these factors in action some time ago when I was designing a new, much-requested feature. One customer had what some believed to be a novel requirement. The requirement seemed to make business sense. The solution would extend an existing feature and require the addition of a minor option to the feature I was designing. Even so, my business stakeholders asserted this customer's request was wholly unique. They were sure no other customer would use the new option because, the stakeholders believed, none of them used our application the way this customer did.

I decided to inquire further. After all, these same people had asked us to invest a significant part of the project budget to design a different capability targeted at the very use pattern they now were sure was used by only one customer. It didn't add up.

Sure enough, there were customers who had asked for something similar. Engineering had not anticipated their requirement in their original design. Later, engineering removed a configuration option from a related feature. Without the configuration option, the first feature was unusable.

Someone had figured out a hack that mimicked the desired behavior, a hack that was subsequently employed for other customers. The hack was "good enough" - until a customer needed both the original feature and the flawed related feature. The workaround obscured an underlying limitation in our application that could have been easily remedied at any time.

This "edge case" qualified on all three criteria: people believed the request was idiosyncratic, a technical limitation prevented solving a customer's business problem, and the need to solve the business problem was unexpected. By repairing the flawed feature we could serve the insistent customer to their satisfaction, and deeper-than-face-value analysis revealed prospective customers would likely appreciate the feature, too.

Read the 4-part series Designing for the Edges at Functioning Form, where Luke Wroblewski collected a number of perspectives on the value of edge cases.

About

Faith Peterson is a versatile business analyst and user experience designer practicing in and around Chicago, IL. She works on Web-enabled business applications for content management, digital asset management, and business process management.
