Showing posts with label agile. Show all posts

IIBA tip explains «extends» use cases

>> 12 March 2009

The latest IIBA® Tips & Techniques Bulletin has what seems to be a pretty useful explanation of the «extends» use case relationship. Should help BAs who don't have a programming background. If you've been struggling with what «extends» is for, see if this helps.

Tip #2: «Extend»’ing vs. «Include»’ing a Use Case

When does it make sense to split up a Use Case, and what options do I have?

«extend» and «include» are useful ways to reduce duplication and help sequence your work.

Develop an Include use case if its functionality will be required by multiple use cases. While most Include use cases tend to be fairly simple, formally capturing the details in a single Include use case allows you to avoid repeating requirements.

Examples of Include use cases: logging into a system or automating a credit check.

Extend use cases formalize an alternate flow (from one use case) into a separate use case and apply only to that other use case. A common reason to create an Extend use case is to accommodate release planning. By moving an alternate flow off into a distinct use case, it can be prioritized and tracked separately from the parent use case while still being able to show how it fits into that bigger picture.
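For readers who do have a programming background, the two relationships map loosely onto familiar code structures. Here is a minimal Python sketch of that analogy (all names are invented for illustration, not drawn from the bulletin): an Include behaves like a shared helper that several flows call unconditionally, while an Extend behaves like an optional hook the base flow exposes but doesn't depend on.

```python
# Hypothetical sketch: a loose programming analogy for «include»
# and «extend». All names are invented for illustration.

def log_in(user):
    # An Include use case: shared behavior several base use cases
    # invoke unconditionally, captured once to avoid repeating
    # the same requirements.
    return [f"{user} logged in"]

def place_order(user, extensions=()):
    # A base use case with a defined extension point.
    events = log_in(user)            # «include»: always performed
    events.append("order placed")
    for extend in extensions:        # «extend»: optional alternate flow
        events.extend(extend(user))
    return events

def gift_wrap(user):
    # An Extend use case: an alternate flow split out so it can be
    # prioritized and released separately from its parent.
    return ["gift wrap added"]

# The base use case is complete without the extension; the extension
# can ship in a later release without changing the parent.
basic = place_order("ana")
wrapped = place_order("ana", extensions=[gift_wrap])
```

The analogy is imperfect, but it captures the planning point above: `place_order` never needs to change when `gift_wrap` is added or deferred.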

Those looking for a good basic description of a variety of modeling techniques, including use cases, might like Scott Ambler's Agile Modeling site. There are a lot of sources for nuts-and-bolts info about use cases, but Scott also discusses how to use good judgment when deciding what models to create.

More agile BA articles: Google

In search of great business analysis: imagination, analytical power, critical thinking skill, creativity, wisdom

>> 09 March 2009

Recently I've been asked how to identify and develop business analyst talent. I've found it to be an interesting question and would like to offer some observations.

The software development field has been grappling for nigh on 50 years with problems related to requirements. For a generation the business analyst was seen as the key to solving those problems. The Agile generation saw traditional requirements practice itself as the problem, with some members advocating the radical disintermediation of software development. The business analyst community has recently begun to address the challenge by introducing greater professionalization to its practice and by adapting its methods to the Agile world, even as an almost nihilistically pragmatic "post-Agile" movement emerges (apologies to Jason Gorman). This history raises the question of whether we, like Diogenes, will ultimately be disappointed in our search for a solution to requirements problems, and with it for successful project outcomes.

None of these approaches tries to identify what distinguishes great business analysts from the competent or the ineffective. Are BAs made or born? Can business analysts be trained, coached, or mentored to greatness? Will certification result in better requirements practice? Can using subject matter experts as BAs improve the chances for project success?

A case can be made that a handful of intangible qualities characterize truly great business analysts, whether they work under that title or some other (such as programmer/analyst, SME, product owner, user experience designer): imagination, analytical power, critical thinking, creativity, and wisdom. These qualities can be refined or developed if present in raw or latent form. I am skeptical, though, that anyone can create them through training or experience.

The information space has to be imagined in order to explore it (you have to know what question to ask before you can ask it), and re-imagined as new information is discovered. The information, once gathered, has to be interpreted, restructured, and communicated so that team members can create and craft a solution together. All along the way, information and ideas must be evaluated and good judgment must be exercised. Even Agile methods that omit the formal BA role and use building the solution itself as the exploration process often find that success depends, as it does for any approach, on the team's ability to manifest these qualities.

Frameworks and conventions serve, if these qualities are present, to reduce risk, bring stability to the effort, and help organize and communicate it. Without these intangibles, frameworks and templates can actually increase risk, because they create the illusion that the requirements work has been done: the "form over substance" problem.

I'd love to hear your comments as I “think out loud” in subsequent posts about whether or how

Chicago Agile Project Management meetup

>> 29 November 2008

If you are an Agile practitioner you might be interested in the Agile Project Management meetup. The meetup is new, already has 40+ members, and meets monthly.

The December 2nd topic is user story life-cycle patterns from backlog to production, of interest to business stakeholders and business analysts as well as project managers and engineers.

Get information at Chicago Agile Project Management at Meetup.

The road is paved with good intentions

>> 01 September 2008

There's an interesting discussion going on at Chad Meyers' blog about quality in both design and execution. Chad argues there is such a thing as Good Design, that knowledgeable observers can reliably discern its presence, and that "but it works!" is not a defense or justification for bad design choices.

The bad design characteristics Chad spotlights prevent teams from responding effectively to demands on the application. With each release the team is less and less able to respond, until it is locked into long projects that deliver little incremental value. How do teams start down the path that leads there?

Chad's post hinted at an important consideration that doesn't seem to get enough attention. Too often people make fast delivery of finished features the top priority, then define everything else as speeding up or slowing down delivery. The problem is that driving to "finished features" (where finished = fully realized) prevents the team from engaging in "fast fail, fast feedback" cycles. If there is also a stated desire to work in an "agile" fashion, the team might try to marry the "finished feature" driver to a short-timebox approach, with disastrous results. On a large project, few fully realized features can be started and delivered in a single 2-4 week period.

The Agile team will look for a way to decompose the desired end feature into meaningful chunks and distribute the work across the team so that each 2-4 week period delivers a coherent, complete piece of functionality. The wish-we-were-agile team instead calls on its members to keep the feature "simple" and not to "gold-plate" it. Once headed down this path, all kinds of things become gold-plating: error handling, usability, even the goal of working software can be set aside under the banner of producing "good enough" software quickly.

Of course there is nothing wrong with "simple," "good enough," or "rapid delivery." Problems arise when the team equates "simple" features and a "simple" process with undesigned, untested, and incomplete work. The team might find itself choosing to hack up a feature, not because that's a justifiable approach given the business needs, but because its inability to decompose the feature and plan across iterations has created artificial pressure to cross the feature off the project plan. That team can report that features are "done," but it pays the price later when the customer doesn't accept the feature or the market demands more.

Done-ness, Denial, and Golf

>> 31 August 2008

Totally Test-driven Scrum Delivers Better Objectivity In Measuring Real Progress - Agile Software Process Improvement
Progress is measured entirely in terms of story points collected for tests passed. There is no "A for Effort" in my way of doing things. It's either testably delivered or it's not.

The analogy I use - and, yes, it's a golfing analogy - is to differentiate by measuring the completeness of a hole by whether or not the players took the shots they planned to vs. whether or not the ball actually finished up in the hole.
Jason Gorman puts his finger squarely on one of the keys to software development success. Performing tasks only matters if they result in complete and correct work product.
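Gorman's scoring rule is simple enough to sketch in a few lines of hypothetical Python (the story names, point values, and test results below are invented for illustration): a story contributes its points only when all of its acceptance tests pass; partial passes and untested stories earn nothing.

```python
# Hypothetical sketch of the "no A for Effort" scoring rule:
# a story earns its points only when every acceptance test passes.

def points_delivered(stories):
    # Partial passes and untested stories score zero, just as a
    # ball merely near the hole doesn't count as holed out.
    return sum(
        story["points"]
        for story in stories
        if story["tests"] and all(story["tests"])
    )

backlog = [
    {"name": "login",    "points": 3, "tests": [True, True]},   # holed out
    {"name": "checkout", "points": 5, "tests": [True, False]},  # still on the green
    {"name": "search",   "points": 2, "tests": []},             # never teed off
]

progress = points_delivered(backlog)  # only "login" counts
```

The point of the rule is exactly that `checkout` and `search` add nothing to the score, no matter how much effort went into them.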

Jason's golf metaphor conveniently illustrates a couple of related antipatterns, "close enough" and "pick it up after 10 strokes."

In "close enough," someone makes a decision that whatever is currently implemented seems pretty done, so the developer should move on and try to get something else implemented. This is like finding yourself on the green putting for double bogey, and deciding to stop and move on to the next tee. You cover all 18 holes in 4 hours, but you can't post a score because you haven't really played the round.

In "pick it up after 10" the team concedes its inability to bring a feature to testable completion. Just like a course policy meant to keep play moving, the software team decides it has spent all the time it can on a feature and must move on. Like the 10-stroke golf hole, they've used more time than they expected and didn't achieve the required result, but they figure they've done all they can in the time allowed. If the customer is lucky, the 10th stroke will be on the green close to the hole. If not, the 10th stroke will fail to clear the edge of the bunker and trickle back into the sand.

I am still learning about Agile methods, but I have to believe the Agile team would have a couple of responses. "Close enough" is essentially a quality issue; the team thinks it's implemented something acceptable even if not what was asked for or agreed to. If the product owner accepts the close enough solution, just like in the cartoons the hole can slide over under the ball so it can drop in. Voila! No longer close enough, the feature is now done.

"Pick it up after 10" seems like a reasonable strategy on the surface. It's the golf equivalent of timeboxing. Teams get into trouble only if they mark the feature done and check it off the list. Seems to me the Agile team might talk it over with the customer and decide they're only going to play the front nine today. Playing the hole to completion will mean they can't finish the back nine, and the customer just can't accept the ball-in-the-bunker solution. Teams create problems for themselves when they pick it up, move on to the next hole, card a 10, and don't tell anyone about the feature's unfinished business.

Closely related to "pick it up after 10" is the mulligan. In some respects mulligans are necessary, even a mark of good practice, in software development: you learn from the failed attempt and try again. Just as in golf, though, mulligans don't show up on the scorecard. The proud player strides off the 18th green tallying up his 90-something score, but everyone in the group knows Mr. Mulligan couldn't have broken 100. On software projects there's a lot more at stake than who buys the first round. Unreported mulligans eat up time and, worse, keep the team from assessing its own likely performance on future efforts. They also tend to lead to "close enough" and "10-stroke limit" situations.

Ah, golf. You can fool yourself if you like, but the rattle of the ball in the cup is what the game is all about.

Betting on Scrum

>> 23 August 2008

It's been a year since my ScrumMaster training - time to renew my Scrum Alliance membership. Time to consider the value of membership. Time to consider my place in the Agile community. Time to decide if, as an experience designer, I can find a place in that community!

First, what did I gain for my original investment of $1300 out of pocket and a couple of vacation days?

  • I discovered Jeff Patton's excellent work. He offers one of the best collections of experience design resources I've found at his site, AgileProductDesign.com. The discussion list he moderates on agile usability attracts leaders in both fields.
  • I'm happy to work in a framework that values empathy and story in addition to analysis and research.
  • I like working with and learning from engineers. I'm glad I get to talk with them more.
  • I'm passionate about enabling real people to do real things, and with tools that come easily to hand and leave them happy. Seems like lots of Agile engineers are, too. I like that.
I've found some things less congenial.
  • I'm tired of "how many agilists can dance on the head of a pin" discussions. Who's really doing Agile? What methods are really Agile? Can you be an Agile practitioner if you don't produce code? Which Agile guru speaks with the most authentic voice? Can you trace your Agile sensei's pedigree through an unbroken chain of master and student to the founders? Are you still Agile if you learned from someone else? What does the Manifesto mean and whose interpretation is Agile?
  • The Agile world does not seem to be a comfortable place for user experience practitioners, still less for business analysts. I have been stunned by the number and intensity of insulting, patronizing, and downright hostile opinions I have seen expressed. Manifesto values like respect have at times been glaringly absent.
  • Scrum is a hard practice to evangelize. Managing a product backlog or sprint backlog seems to require discipline at least as great as, if not greater than, "traditional" project planning does. It's more structure than people who dislike project management overhead want, and people who have to pass regulatory audits are afraid it's not enough.
The software world has never been without its religious wars, and you can't please all the people all the time. Experience designers, product people, and engineers eventually will find their places under the Agile umbrella.

And somewhere there's a dogma-free Web app team full of people passionate about enabling people to express their personal and professional power through software - and room for one more like them.

At least, I just bet $50 on the possibility!

A recent trainee's comments on the Scott Ambler vs. CSM debate

>> 04 July 2007

Last week I completed the Scrum Alliance's Certified ScrumMaster training. Coincidentally, Scott Ambler's sidebar "Bringing Ethics to Scrum Certification" had appeared just a couple of weeks earlier (Coming Soon: Agile Certification, 8 Jun 2007). I had the opportunity to read Ambler's article between training days 1 and 2. I find myself thinking that if the 2-day course were called "Scrum Alliance-Certified Training in How to Be a ScrumMaster," the ethics issue, to whatever extent there is one, would shrink dramatically.

First, let me say I agree with Ambler that any professional certification worthy of the name should have some real teeth, with real professional development and demonstrated experience requirements as well as an exam. If a certifying body endorses an individual as competent, that body should have reason to do so. And - to whatever extent it might happen - it's just not OK for trainees or vendors to present attendance at a class as in any way equivalent to the achievement of a professional designation like the PMP.

That said, I found the training useful and I would take it again. I didn't expect to receive a professional designation. I expected only that someone who knows what he is talking about would transmit accurate and useful Scrum knowledge. I also did not expect to become a master of Scrum by taking the 2-day course. Rather, I expected to gain the understanding necessary to perform the duties of the ScrumMaster role on a real-world project. At the start of our training that's what our trainer said we could expect, and I believe it's what he delivered. I list the Certified ScrumMaster training on my resume and online profiles along with other professional development and training experiences, where I hope readers will understand it as evidence of my commitment both to ongoing professional development and to Agile, a commitment I've backed with $1200 and 16 hours of vacation time.

About

Faith Peterson is a versatile business analyst and user experience designer practicing in and around Chicago, IL. She works on Web-enabled business applications for content management, digital asset management, and business process management.
