The Communications and Information Directorate had a strong data collection program, but analysis and use of the information collected was limited. Although each metric was intended to measure an important or problem area, the number of metrics continued to grow while analysis remained almost nonexistent.
A plan was created to validate the purpose of each metric. Rather than modify existing metrics, the metrics program needed an overhaul. Many of the cost, schedule, and performance metrics were relevant because they directly measured the mission; however, the process for collecting and analyzing this information required updating. We defined an overall metrics philosophy as an adjunct to the strategic plan and required that each new metric carry basic descriptive information, making it useful to the right people at the right time.

Figure 2 is a form we used to collect this information in a single, neat package so everyone from collectors to decision makers could understand their purpose in collecting, reporting, and making decisions based on each metric. Although simple, this broad overview causes people in the organization to think before creating or approving a metric. It also records the conditions under which the metric will remain useful. This simplifies the semiannual review of the metrics, because the criteria are spelled out and metrics that have outlived their usefulness can be retired.
Figure 2. Metric description form:
- Metric title
- Brief description
- Link to goals/objectives
- Decision(s) based on analysis
- Who makes decision(s)
- Who collects data
- How is data collected
- How often is data collected
- Who reports data
- How and to whom is data reported
- How often is data reported
- Who analyzes data
- How is data to be analyzed (formulas and factors)
- Lowest acceptable values
- Highest acceptable values
- Expected values
- At what point will you stop collecting this metric
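The Figure 2 form is essentially a schema for metric metadata, and teams automating such a program could capture it as a record type. Below is a minimal sketch in Python; the class, its field names, and the `is_due_for_review` helper are illustrative paraphrases of the form, not part of the original program, and the 182-day review interval is an assumption based on the semiannual review described above.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class MetricDescription:
    """One metric's 'single, neat package' of collection and use information,
    with fields paraphrased from the Figure 2 form."""
    title: str
    description: str
    linked_goals: list            # link to goals/objectives
    decisions_supported: list     # decision(s) based on analysis
    decision_maker: str           # who makes decision(s)
    collector: str                # who collects data
    collection_method: str        # how data is collected
    collection_interval_days: int # how often data is collected
    reporter: str                 # who reports data
    reporting_method: str         # how and to whom data is reported
    reporting_interval_days: int  # how often data is reported
    analyst: str                  # who analyzes data
    analysis_method: str          # formulas and factors
    lowest_acceptable: float
    highest_acceptable: float
    expected_value: float
    retirement_condition: str     # at what point collection stops
    approved_on: date = field(default_factory=date.today)

    def is_due_for_review(self, today: date,
                          review_interval_days: int = 182) -> bool:
        """Flag metrics not re-approved within the (assumed semiannual)
        review interval, so stale metrics surface for retirement."""
        return today - self.approved_on > timedelta(days=review_interval_days)
```

Because every field is required (except the approval date), the type itself enforces the rule that a metric cannot be created without naming its decision makers, collectors, and retirement condition.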