Process for a change request which invalidates an existing user story and acceptance tests

Jun 11, 2013 at 4:10 PM
Edited Jun 11, 2013 at 4:11 PM
First of all thanks for a comprehensive document which answers a lot of questions I've had on how best to use TFS with SharePoint.

However, there seems to me to be a glaring omission in your proposed change control process (apologies if I've missed something).

You talk about additional requirements, extensions to existing requirements (which are essentially new requirements), and changes to a requirement not yet implemented, which is fine, but what about changes to a requirement that has already been implemented?

For example

During the Beta test phase of a project, the product is verified by a potential customer. They report back on the system and submit a change request. On review, it is found that this change request invalidates one or more existing stories that have been implemented (i.e. they are completely wrong or no longer even required in the system), to the point that the acceptance tests for those stories are either all wrong or mostly all wrong. What should you do?

Do you:
a) Reopen the old stories that are mostly wrong, change the details of the stories and test cases for these requirements, add them to the backlog, and re-estimate. Then close any stories no longer required, and all their associated test cases, with reason "rejected".

or

b) Create new stories with associated test cases, and then change the state of all the affected Completed/Closed stories to something like "rejected".

or

c) Something else?

This situation occurs quite regularly, so I see no reason your guidance should ignore it.

Also, what should happen to the release burndown to show the effects of these changes?

I'd appreciate any guidance you can give in this situation.
Developer
Jun 14, 2013 at 4:20 AM
Hi andyLight1.
There are many scenarios that you bring up.
The omission you mention is fodder for debate over whether you are asking for a requirements change or a product change. It may sound like I'm splitting hairs, but there is a legacy to every application that requires the team to keep it maintained.
I'm of the opinion that stories or requirements that have been implemented and accepted are no longer requirements. They need to be scrubbed, harvested, and stored as the legacy documentation and reusable assets for future development. That way, new development team members have something to review as they try to learn the application and become productive members of the team. This applies in a post-release sense, not during Beta testing.
For post-release, changes that are approved for development that invalidate existing functions or features require that the legacy documentation be changed in addition to the tests and code.

Beta Test
Because you suggest that acceptance of features goes through a Beta-testing phase, I can only assume that "true" agile methods are not being used. That, in and of itself, is not to be taken as either a good or a bad thing; just a comment. Agile teams deliver to their customers at the end of every sprint, which the purest communities consider releasable software.

The closer you get to true agility, the better your team is at defining what "Done" means: you finish a story, you write tests for that story, you build it, your user accepts it... Done. If a change is requested, the change is listed as a series of new stories. You don't go back and change a story that has already been accepted and closed. You DO, however, need to make sure that the product documentation reflects the changes being requested, and that all tests related to the closed stories are either updated or removed (if obsolete).
Personally, I would not change the state of any accepted stories to rejected. I would change the state of the tests to "incomplete" or "not tested" or whatever workflow state you are using. If, during testing, you realize that the tests are not accurate or are obsolete, you change those tests or remove them in favour of new tests.
If you DO have Beta-test cycles, however, then changes to stories that have been implemented should be documented as defects (because the stories have not been fully accepted yet) and linked to the stories that are in error. In that case, you still need to make sure that the documentation and tests reflect the changes required to fix the defect, and the affected stories should either be re-opened, or rejected in favour of new stories.
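The Beta-test handling described above can be sketched roughly as follows. This is a minimal illustration with made-up state names and record shapes, not the TFS work-item API: either re-open a story to fix the defect, or reject it in favour of new stories, and in both cases reset its linked tests so they must be re-run.

```python
# Sketch of the Beta-test defect handling described above.
# State names ("Active", "Rejected", "Not Tested") are illustrative.

def handle_beta_defect(story, linked_tests, superseded_by_new_stories):
    """Re-open a story whose defect will be fixed in place;
    reject it when new stories replace it. Either way, all linked
    tests go back to an untested state so they get revisited."""
    if superseded_by_new_stories:
        story["state"] = "Rejected"   # replaced by new stories
    else:
        story["state"] = "Active"     # re-opened to fix the defect
    for test in linked_tests:
        test["state"] = "Not Tested"  # must be updated or re-run
    return story, linked_tests

story = {"id": 7, "state": "Closed"}
tests = [{"id": "TC-9", "state": "Passed"}]
story, tests = handle_beta_defect(story, tests, superseded_by_new_stories=False)
# The story is Active again and TC-9 is back to Not Tested.
```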

In summary, I'm a firm believer that requirements are not reusable. Once implemented, they are disposable. That doesn't mean that all of the tests and documentation are disposable. Those assets need to be changed to reflect the change being made and, as such, during development, tasks should be written to maintain the affected documentation.

Release Burndown: Because you are beta-testing, your release burndown should not get credit for delivering stories that are found to be in error, as you describe. Your task or work remaining certainly should reflect that you thought you were done, but when you add new tasks to fix the defects, your work remaining should increase to reflect the new work remaining.
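That burndown arithmetic can be sketched with a simple hours-remaining model. This is an illustration only (the task records and field names are invented, not the TFS object model): you thought remaining work was low, then the defect tasks push it back up.

```python
# Sketch: work remaining increases when defect-fix tasks are added.
# A plain hours-based model, not the TFS object model.

def work_remaining(tasks):
    """Sum remaining hours over all tasks that are not Done."""
    return sum(t["remaining"] for t in tasks if t["state"] != "Done")

tasks = [
    {"id": 1, "remaining": 0, "state": "Done"},    # story believed finished
    {"id": 2, "remaining": 4, "state": "Active"},  # last bit of planned work
]
before = work_remaining(tasks)  # 4 hours remaining

# Beta test finds a defect: new tasks are added to fix it.
tasks.append({"id": 3, "remaining": 6, "state": "Active"})
after = work_remaining(tasks)   # 10 hours -- the burndown goes back up
```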

I realize there hasn't been an update to the guidance for some time, but this posting will be a reminder to address change management in future releases.
I hope that helps.
Jun 18, 2013 at 12:18 PM
Thanks for your comprehensive reply.

I'd agree with many points you have made.

In our case we are developing a product that goes out to many customers not just one. We are developing using Scrum.

We have a select few customers working with us who are involved in reviewing the software (review meetings) and who answer questions in an ad-hoc fashion as we develop it, plus a Domain expert who is the nominated PO. Even in this situation, however, there may be issues that don't get found until the product gets out into the wild. So a Beta Test phase is still required, especially when delivering to a customer not involved with the project. Is this not being truly agile, or is it ScrumBut? Maybe, or maybe it's just pragmatic. We want to use Agile for its key benefits: to control changes as they are found during development, to get each feature delivered to an acceptable quality level (definition of Done), and to get it out for review regularly with real customers in a deployable state. But we need to control the final acceptance process in more detail when the product is deployed to a customer site.

As to updating the documentation: this is fine to a point. Part of the reason Agile methodologies were developed was to overcome issues with too much documentation and to get people to actively communicate and collaborate. I know it doesn't mean no documentation, but rather just enough. I'm more of the opinion that the tests truly describe the functionality of the system, especially if they are automated, and therefore by their nature cannot be out of date. Documents will always become outdated to an extent, no matter how hard you try to keep them up to date.

How would you see the process changing if the issues were raised while the application is still in development, i.e. in a late sprint of the release? I think most of what you say still stands, but what about the release burndown in this case?

Usually, if the change or new requirement only affects one or two existing Done stories/requirements, it's not too much of an issue. It's when it requires a fundamental change, where 10 or more stories are affected, that things get complex, especially if your UAT tests are manual. Finding the affected tests when changes occur, and maintaining those tests (i.e. making sure they are up to date), is in general a big job.
Developer
Jun 18, 2013 at 2:00 PM
Hi, thank you for your support of the Rangers.
In light of what you describe, your situation is really no different from what I see every day. It comes down to documentation and traceability.
Documentation:
Sure, agile professes documenting only what needs to be documented. In your case, you owe it to your customers to document all of the stories and maintain that documentation. Your customers are not going to learn how to use your system by running your tests; neither are your new employees. You don't need to polish the documentation such that all the spelling and grammar errors are fixed, but you need to describe how the system works.
At least two types of documentation are valuable for your team:
1 - Functional documentation: a description of the usage of your system from the user's point of view.
2 - Application design: a description of all the moving parts underneath the presentation tier. Some call it an architecture spec, some call it a design spec... call it what you want, but it represents a reference for how the system was built.

Using the tests as the documentation is almost the same as saying that your code is "self-documenting".

Traceability:
In order to know which tests to run or change, you need to have them linked to the requirement for which they were written. When that requirement changes, all of the tests linked to it must be changed as well. If the requirement becomes obsolete, more than likely, so do its tests. It's that simple.

There is no magic to it. Making sure that everything is up-to-date IS a big job and must be done.
Jun 18, 2013 at 2:15 PM
When I said documentation I meant old-style specification and design documents, not customer documents such as the end-user guide, configuration and training material, which obviously do have to be kept up to date and are part of the final deliverable.

If UAT tests are phrased in Domain language, why can't they be seen as the living spec?

It is a big job, and I'm surprised that this isn't given more emphasis in the Scrum guides etc.

"Grooming the Done list/UAT tests" as well as grooming the backlog. Too much emphasis is placed on getting the next thing done rather than on ensuring/validating "the whole", in my opinion.