Problem Statement
Skimping on proper testing is easy; testing is too often approached subjectively rather than systematically. It is too easy for a developer to say, "Oh yeah, I tested that."
Too often, work on proper testing is set aside for a production break-fix or an environment enhancement. The value of a proper testing plan is hard to perceive because there is no framework for thinking about the testing value proposition.
Estimating Risk
Developers have two options when it comes to testing:
1. Develop and test whatever use cases come to mind as they work
2. Develop and test following a written testing plan
Hypothetically, if a developer chooses to test without a written plan, you can estimate that the code carries a 10% chance of failure or disruption within its first three months in production.
On the other hand, if a developer follows a written testing plan, that 10% risk number can be knocked down. For example, with 10 written test cases the risk drops from 10% to 7%; with 50 written test cases it drops from 10% to 2%.
Of course, the 10% figure is not an exact measurement; it is a ballpark number for estimating only. But starting with a risk number makes it easier to think properly about the value of formal testing on IDM projects.
How to Reduce Risk
Each formal test case executed reduces the risk score by a fraction of a percent. For a test case to count toward that reduction, it needs to be formally defined, reviewed, and executed.
Steps for Testing
1. Document the test case
2. Review the test case with stakeholders
3. Run the test case as defined
4. Document testing results with evidence (such as log statements)
5. Have a peer review the test
6. Demonstrate the test in front of stakeholders
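These steps are easy to track as structured data. Below is a minimal sketch, in Python, of what a documented test-case record might capture; the TestCase structure and its field names are illustrative assumptions, not part of any particular testing tool.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One formally documented test case (illustrative structure, not a real tool's schema)."""
    case_id: str                          # e.g. "TC-01"
    description: str                      # step 1: document the test case
    reviewed_by: list = field(default_factory=list)  # step 2: stakeholder reviewers
    executed_as_defined: bool = False     # step 3: run the test case as defined
    evidence: list = field(default_factory=list)     # step 4: log excerpts, screenshots
    peer_reviewer: str = ""               # step 5: peer who reviewed the test
    demoed_to_stakeholders: bool = False  # step 6: live demonstration

    def is_complete(self) -> bool:
        """A test case only counts toward risk reduction once all six steps are done."""
        return bool(self.reviewed_by and self.executed_as_defined
                    and self.evidence and self.peer_reviewer
                    and self.demoed_to_stakeholders)
```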
Example One: Case Study
One example of excellent testing execution comes from a GCA healthcare client with 100k+ identities. Since GCA started working with this client more than a decade ago, a systematic process has been meticulously followed for onboarding new functionality into the environment. The project has been so successful that the vendor has recognized it as one of the five largest implementations of Micro Focus Identity Manager in the world. The key reason for the project's success is its commitment to the testing procedure described above.
We started by following the normal process:
- The assigned GCA engineer documented 20 use cases
- The cases were reviewed by GCA’s team lead
- Based on team lead feedback, five more use cases were added to the document
- GCA held an hour-long meeting with the client to talk through each case
- Based on the client's feedback, we made minor refinements, adding detail to a handful of use cases
- The client formally signed off on the use cases
- We tested each use case in front of the client
All in all, GCA's testing process includes three tests of each use case by at least three people, plus documented evidence of successful tests. Everything checked out, so the functionality was deployed and rolled out to the entire 100k+ user community. If you want to roll out successful IDM initiatives, this is the way to do it.
Example Two: How to Calculate IAM Risk
Let's take a look at an example of reducing that 10% risk score with a test case. Imagine you are developing a connector that creates a network account in Active Directory for a new user once they've been hired in the HR system. Here is an example use case:
Test Case 1: If a network name is already taken, a digit will be appended to make the new hire's network name unique.
Expected Output: If the users JSmith and JSmith1 already exist, then when John Smith is added, his network name will be JSmith2.
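As a sketch of the behavior this test case pins down (not the connector's actual implementation), the uniqueness logic might look like the following in Python; the name_exists parameter is a hypothetical stand-in for a real directory lookup against Active Directory.

```python
def unique_network_name(base_name, name_exists):
    """Return base_name, or base_name plus the lowest digit that makes it unique.

    name_exists is a stand-in for a real directory lookup (e.g. an LDAP
    search against Active Directory): it takes a candidate name and
    returns True if an account with that name already exists.
    """
    if not name_exists(base_name):
        return base_name
    suffix = 1
    while name_exists(f"{base_name}{suffix}"):
        suffix += 1
    return f"{base_name}{suffix}"

# Test Case 1 as an executable check: JSmith and JSmith1 are taken,
# so John Smith's new network name should come out as JSmith2.
taken = {"JSmith", "JSmith1"}
assert unique_network_name("JSmith", lambda name: name in taken) == "JSmith2"
```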
With this use case documented and formally tested using the six steps listed above, this test case reduces the three-month 10% error risk by a small fraction. Based on the general guidelines below, documenting and testing 25 to 30 additional use cases should bring the three-month risk for this new functionality to less than 1%.
If you adhere to a formal testing strategy and follow the test-count guidelines below, your IDM project is far more likely to succeed. Your project could go from a 50% success rate to 95% just by changing how you prioritize and think about testing.
Baseline Testing Recommendations:
New Read-Only App – 10 to 15 test cases
New App with Provisioning – 25 to 50 test cases
Basic Workflow – 10 to 20 test cases
Advanced Workflow – 25 to 50 test cases
Assumed Risk Calculator:
Risk = Chance of an error occurring in the first 3 months
Rec# = Recommended number of test cases based on the type of functionality
X = Number of test cases you have formally tested
Risk = 10% – 9% × (X / Rec#)
Note: After you go beyond the Rec# of test cases, the ROI per test case drops and the above formula no longer necessarily holds.
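Putting the formula into code, here is the calculator as a small Python function; the Rec# of 30 in the usage lines is an assumption drawn from the baseline ranges above, chosen so the output lines up with the 10-test-case example given earlier.

```python
def assumed_risk(tested, recommended):
    """Estimated chance (%) of an error in the first three months.

    Implements the ballpark formula above: a 10% baseline, reduced by up to
    9 points as the count of formally tested cases approaches the recommended
    number. Capped at Rec#, per the note about diminishing ROI beyond it.
    """
    x = min(tested, recommended)
    return 10.0 - 9.0 * x / recommended

# Example: a new app with provisioning, assuming Rec# = 30 from the ranges above.
print(assumed_risk(0, 30))   # 10.0 -> no formal tests, baseline risk
print(assumed_risk(10, 30))  # 7.0  -> matches the 10-test-case example earlier
print(assumed_risk(30, 30))  # 1.0  -> full recommended coverage
```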
Conclusion
GCA can help you reduce the risk of your IDM initiatives. We have worked on identity management projects with the Micro Focus (NetIQ) and SailPoint product suites for organizations in healthcare, energy, finance, and technology.
We target a three-month risk score of less than 1% by executing the recommended number of test cases for each project. If your team needs help testing, or you're worried about your risk levels, don't hesitate to contact us today.