SDLC Metrics: An Approach for Ranking and Selecting Metrics
Not all metrics are good, and using poor metrics can have a serious negative impact on your organization. Maximize the value of your software development lifecycle (SDLC) metrics by using this thoughtful and judicious approach to ranking and selecting them.
In a metrics-fixated world, it is easy to fall prey to the siren song of “what gets measured gets done,” but the truth is that blind faith in metrics is a recipe for disaster. So how does IT leadership select and manage SDLC metrics effectively, maximizing their value while minimizing their risks? Follow these simple steps:
- UNDERSTAND the risks of metrics
- KNOW the rules of good metrics practices
- RANK AND SELECT the right metrics to meet your prioritized goals
This is the last in a series of three tech briefs on the topic of SDLC metrics. In the first brief (UNDERSTAND), we presented the dark side of metrics, and warned of the serious dangers associated with poor selection and management of metrics. In the second (KNOW), we presented a set of simple-to-follow rules to avoid common pitfalls associated with bad metrics. In this tech brief (RANK AND SELECT), we present a straightforward approach to helping your organization evaluate and select the most effective SDLC metrics for your needs.
The secret to effective selection of SDLC metrics is threefold: choose the fewest metrics possible, select those that maximize value and minimize the serious risks associated with bad metrics, and never use them for reward or punishment. Use them instead to help your development team(s) solve problems and achieve goals.
To do this, start by identifying your development team’s prioritized business-aligned goals (i.e., the goals the team needs to achieve to support the business’s goals, ranked in order of importance). Do this by working collaboratively with the business to understand their needs and translate them into a prioritized list of development team goals. Keep this list short (two to four goals as a guide) because your team can only effectively focus on a few goals at once. As an example, below is a hypothetical set of prioritized team goals that have been aligned to the business:
With your prioritized goals in hand, it’s time to identify potential metrics for use by your team. Begin by compiling a list of potential metrics to be used (ignore any metrics that are not relevant to achieving your prioritized business-aligned goals). For each potential metric, capture a name and brief description, then identify how the metric would be gathered and reported (think carefully about how to gather each metric in a way that reduces the risk of things like gaming and unintended consequences). As an example, the table below captures five potential SDLC metrics (in practice, your team should identify 10-15 potential metrics to be evaluated):
Now it’s time to evaluate and rank your list of potential SDLC metrics so that you can then short list the best metrics to meet your team’s needs. To do this, your team will score each metric against a set of defined “pros” and “cons.” We recommend you use the following pro/con criteria (but feel free to adjust these to the individual needs of your organization):
Now, using a five-point scale from low to high (low, med-low, medium, med-high, and high), have your development team score each of the potential metrics against the pro/con criteria. In the example below, we have scored each of the five metrics relative to one another (we used our prioritized business-aligned goals from above to determine the Value to Team scores). Note that we have used traffic-light color coding (i.e., green is “good,” red is “bad”) to help visually compare scores:
Once scoring is complete, have your development team roughly sort the list of potential metrics from best to worst (this sorting has already been carried out in the table above). You can “eyeball it” based on colors, or come up with a simple formula, such as a weighted average of the pro/con scores for each metric (you don’t need to be precise here, as long as the metrics are roughly ordered from “best” to “worst”).
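If your team prefers a formula over eyeballing, the sorting step can be sketched in a few lines of code: map the five-point scale to numbers, average the pro scores minus the con scores for each metric, and sort from best to worst. The metric names, criteria, ratings, and equal weighting below are purely illustrative assumptions, not values from this brief; adjust them to your own pro/con criteria and scores.

```python
# Map the five-point rating scale to numeric values (an assumed mapping).
SCALE = {"low": 1, "med-low": 2, "medium": 3, "med-high": 4, "high": 5}

# Hypothetical scored metrics: each criterion is ("pro" or "con", rating).
metrics = {
    "Escaped defects": {
        "value to team": ("pro", "high"),
        "ease of gathering": ("pro", "medium"),
        "risk of gaming": ("con", "low"),
    },
    "Lines of code": {
        "value to team": ("pro", "low"),
        "ease of gathering": ("pro", "high"),
        "risk of gaming": ("con", "high"),
    },
}

def score(criteria, pro_weight=1.0, con_weight=1.0):
    """Weighted average where pros add points and cons subtract them."""
    total = 0.0
    for kind, rating in criteria.values():
        points = SCALE[rating]
        total += pro_weight * points if kind == "pro" else -con_weight * points
    return total / len(criteria)

# Sort metric names from best (highest score) to worst (lowest score).
ranked = sorted(metrics, key=lambda name: score(metrics[name]), reverse=True)
print(ranked)
```

Tuning `pro_weight` and `con_weight` lets the team emphasize value over risk (or vice versa); the point is only a rough ordering to guide discussion, not a precise measurement.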
Now, working from top to bottom, have your team openly discuss each metric, compare it to the others, and decide which to include in your team’s new set of SDLC metrics. You won’t necessarily take the very top-scoring metrics on the list, because some of them could overlap in some way. We suggest reducing your list to the three to five best SDLC metrics for the team (feel free to adjust this range for your organization’s specific needs, but remember that each new metric you add to the list will diminish the value of every other metric, because your team can only focus on a few things at a time).
At this point, your team will have collaboratively settled on the smallest set of SDLC metrics they believe will help them the most in achieving their prioritized business-aligned goals. This buy-in from the team is critical to effective adoption and use of SDLC metrics. Remember never to use these metrics for punishment or reward (which will only undermine their effectiveness). Your team can now put these metrics into place.
As one last step, schedule a follow-up SDLC metrics discussion with the team (three to six months later is suggested, but do what works best for your circumstances) to review the metrics for effectiveness and decide whether any changes are needed (whether it is changes to which metrics are gathered or how they are gathered). Remember that the team’s needs and goals can change over time, and so your SDLC metrics should be reviewed and updated regularly to maintain their effectiveness.
If you would like to know more about this approach to metrics, speak to your Info-Tech Engagement Representative about the soon-to-be-released SDLC Metrics blueprint.
Want to Know More?
Thor, the Norse God of Thunder, tells Jane Foster, the woman he’s trying to impress, that on his home world of Asgard, the realm eternal, science and magic are two sides of the same coin. Had Jane been a part of the operations teams at Google (or other mature online service providers), she would have immediately realized we have a similar technology right here on good old Earth. We call the science site reliability engineering (SRE), and service level objectives (SLOs) are the magic behind it. SRE is a powerful concept for organizations that are serious about keeping their customers happy. It is therefore important for them to develop well-thought-out SLOs and make certain that management is intellectually equipped to derive valuable business perspectives from them.
Hell hath no fury like a customer unable to access an online service when they want to. Customers expect online services to always be on, always be accessible, and always treat them like there’s no one else in the world who matters more. Thank heavens, then, that these online services can use site reliability engineering (SRE) to keep their customers happy, engaged, and, most importantly, feeling valued.
Info-Tech members moving to Agile are frequently unsure of the role of PMs and the PMO in an Agile environment. Any organization used to traditional (Waterfall) project management will need to make adjustments in support of Agile or risk losing the benefits.
GitHub has announced that, effective April 14, 2020, all of its core features will be free for everyone. This will include private development within organizations that have previously paid for some subscription plans.
Many Info-Tech members are wrestling with how to best manage their software development productivity while working from home, especially for teams using a Waterfall approach. Sprinkling some Agile practices into their normal routine could improve transparency and show continuous value delivery.
Deciding how to license your products or components doesn’t start with debating open- vs. closed-source code. It starts with asking simple questions about your overall goals.
RPA projects fail more often than one would expect. The ease with which RPA tools let teams design and implement workflows makes it easy to skip building a strong foundation for the work being done.
We live in a metrics-fixated world where having more metrics is always thought to be better than having fewer, and Software Development Life Cycle (SDLC) metrics are no exception. But the truth is that any badly chosen or managed metric will do more harm than good to your organization. To avoid these pitfalls, take ownership of SDLC metrics away from managers and put it into the hands of those who can best manage it: your development teams.
Robotic process automation (RPA) success depends on choosing the right business processes to automate. Blueprint helps identify the right places to apply RPA with its Enterprise Automation Suite.