I’ve coached folks through several thousand strategy, engineering, or life decisions over the past 20 years. In most cases, we have used a simple weighted-scoring method to evaluate the relative “goodness” of alternatives against a set of criteria. Criteria are given weights that express their relative importance, typically on a 10-point scale. The estimated performance of each alternative is then scored against each criterion on another 10-point scale: 10 is assigned to the “best-fit” or best-performing alternative, and 0 to one that just meets a Threshold or Must limit (a firm requirement or stakeholder walkaway point) and offers no additional “margin”.
A weighted score (the sum of weight × score across all criteria) is then computed for each alternative; this is normalized so that the overall best alternative scores 1.0. Such results are typically presented in the form of an evaluation matrix using Excel or more advanced decision analysis tools. An alternative with a normalized weighted score of 0.85 is deemed to be 15% less effective than the 1.0 alternative, all things considered.
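The computation above can be sketched in a few lines of Python. The criteria, weights, and scores here are hypothetical judgments invented for illustration, not data from any real evaluation:

```python
# Minimal sketch of the weighted-scoring method described above.
# Weights (1-10) express relative importance of criteria; scores (0-10)
# express each alternative's judged performance. All values are
# hypothetical examples.

weights = {"cost": 9, "reliability": 7, "ease_of_use": 4}

scores = {
    "Alternative A": {"cost": 10, "reliability": 6, "ease_of_use": 8},
    "Alternative B": {"cost": 7, "reliability": 10, "ease_of_use": 5},
}

# Weighted score = sum of weight x score over all criteria.
weighted = {
    alt: sum(weights[c] * s[c] for c in weights) for alt, s in scores.items()
}

# Normalize so the overall best alternative scores 1.0.
best = max(weighted.values())
normalized = {alt: ws / best for alt, ws in weighted.items()}

for alt in scores:
    print(f"{alt}: weighted={weighted[alt]}, normalized={normalized[alt]:.2f}")
```

The normalized scores are what an evaluation matrix would display; the raw weighted totals are only meaningful relative to one another.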
When training these folks, I’ve always emphasized the appropriate role that numbers play in this type of analysis. They simply express a human being’s judgment concerning the relative importance of criteria or the relative effectiveness of alternatives. However, in the presence of a spreadsheet, numbers seem to take on a life of their own. This can lead to a false sense of precision in a decision. Debates break out over 1 point in a weight or score. Weighted scores are computed to 3+ decimal places (0.886 vs. 0.894) to break a tie between alternatives.
Weighting and scoring schemes seldom merit this level of trust. Weights and scores are usually only good to 1 significant figure; their product has no more precision than either of its inputs. So 0.886 and 0.894 are just 0.9 for all practical purposes (as is 0.852).
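This point can be made concrete with a small rounding helper. The function below is a hypothetical illustration (not part of any tool mentioned in this post) that rounds a value to a given number of significant figures:

```python
# Illustrates why 0.886, 0.894, and 0.852 are indistinguishable once
# you respect the 1-significant-figure precision of the inputs.
# round_sig is a hypothetical helper written for this example.
import math

def round_sig(x: float, figs: int = 1) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, figs - 1 - exponent)

for ws in (0.886, 0.894, 0.852):
    print(ws, "->", round_sig(ws))  # all three collapse to 0.9
```

Seen this way, a 3-decimal-place tiebreak between 0.886 and 0.894 is just noise: both values carry the same single significant figure.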
In the Decision Driven® Solutions Framework (DDSF), the numbers are hidden to keep the focus on judgment and communication among stakeholders. Although the tool can support 10-point weighting and scoring scales, a simpler scheme of 5 weighting bands and 5 scoring bands is implemented. Neither the total weighted score nor the normalized weighted score is displayed. Instead, the relative advantages and disadvantages are displayed graphically in the Evaluate Alternatives and Compare Alternatives canvases. The numbers are still there, but de-emphasized so users aren’t tempted by spreadsheetitis.
I think this level of precision is more than adequate for 99.982% (oops, I got carried away) of the decisions that you will face in business, engineering, or life. It frees you to think carefully about what you value and which alternatives may best deliver that value, without thinking more highly of your analysis than is justified.
I’ll write more on “keep-it-simple” weighting and scoring concepts in coming posts. In the meantime you can use the scoring features of the Decision Driven® Solutions Framework (DDSF) to evaluate DDSF capabilities against other decision analysis tools that you currently use. Please contact the Decision Driven® Solutions team at firstname.lastname@example.org or email@example.com to start your free trial of DDSF.