An Audit Report on Fiscal Year 2001 Performance Measures at 14 Entities
November 2002
Report Number 03-008
Overall Conclusion
Serious deficiencies exist in the methods 12 of the 14 audited entities use to collect, calculate, and report key performance measures. The deficiencies stem from a combination of inadequate controls over the performance measurement process, entities' not following the definitions of the measures, and entities' inability to support their results. (See Figure 1 on the next page.) These deficiencies compromise the usefulness of the State's performance measure system as a decision-making tool. For 53 percent of the audited measures, all of which were key measures, decision makers cannot rely on the results the 14 entities reported for fiscal year 2001. This is the highest rate of unreliability since the State Auditor's Office began certifying performance measures in 1994. (A measure is reliable if it is certified or certified with qualification; it is unreliable if it is inaccurate or if factors prevent certification.)
Key Points
- Gaps in Control Processes Continued to Contribute to Inaccurate Results
Inadequacies in entities' performance measure control processes continue to be one of the main causes of inaccurate results. None of the entities had documented processes for all of their audited measures to ensure that employees collected and reported the data consistently and accurately. Many entities do not formally review their calculations for accuracy before entering them into the Automated Budget and Evaluation System of Texas (ABEST). For example, one measure was inaccurate because the entity did not use the correct year's data in its calculation. If this entity's performance measure procedures had required a formal review, the error likely would have been corrected before the results were entered into ABEST.
- Ten of the Entities Did Not Follow the Measure Definition for One or More of the Audited Measures
Ten of the 14 entities did not calculate one or more of their audited measures according to the definition approved by the Legislative Budget Board. Of the measures we audited at these entities, 65 percent were unreliable for this reason. The most common ways entities deviated from the definitions were to exclude required data from their calculations or to include extra data in them. Some entities chose not to follow the definition because they believed the measure would be more meaningful if calculated differently. In such cases, the entities should work with the Legislative Budget Board to consider changing the measure definition.
- Several Entities Did Not Have Adequate Support for Their Results
For 13 percent of the audited measures, we were unable to verify the results the entities reported because the entities' supporting documentation was unavailable. In some cases, the entities did not retain supporting documentation. In other cases, supporting documentation was not available because of conversions to new automated systems. In addition, some entities relied on source documents from third parties, such as contractors, without verifying the data. When an entity relies on third-party data for a measure, it is not enough for the entity to process the data; the entity must also ensure that the source documentation is accurate. Of the 14 audited measures that depended on third-party documentation, the results for 10 were unreliable.