How can federal agencies objectively rate their vendors?

Let the numbers tell the story

By Diane Weeks, Information Assurance Consultant and Project Manager, TalaTek

The above chart shows calculations for the Acceptable Quality Level (AQL) for CPARS based on possible and actual weighted rankings.


It sounds easy at first glance. How hard can rating a vendor be? In fact, it involves more than meets the eye, especially when a vendor is excelling at its work. Contracting Officers and Contracting Officer's Representatives (CORs) must apply fair measurement to all contractors and are under increasing pressure to justify exceptional ratings in their contractors' annual Contractor Performance Assessment Reporting System (CPARS) evaluations.

CPARS measures vendor performance across several areas: the contractor's record of conforming to requirements and to standards of good workmanship; forecasting and controlling costs; adherence to schedules, including the administrative aspects of performance; reasonable and cooperative behavior and commitment to customer satisfaction; reporting into required databases; integrity and business ethics; and business-like concern for the customer's interests.

While CPARS provides the framework, TalaTek provides additional metrics to help create a fair and accurate measurement of its performance. (TalaTek's goal is to be evaluated at an exceptional level, and it wanted to provide the metrics to justify that rating.) TalaTek created measurement standards and metrics for every aspect of its work, including quality and timeliness. It also created a "Deliverable Tracker" that lets federal clients rate their vendors against predefined performance metrics for quality and timeliness. The Tracker lists all client deliverables, ranging from annual management reports to weekly meeting minutes.

Each item is given a weight of high, moderate, or low based on its importance and complexity within the program. For TalaTek, an annual management report would carry a high weight, whereas weekly meeting minutes would carry a low weight. The weight determines how much each deliverable contributes to the final calculation of quality and timeliness.

To gauge timeliness, we created fields for due date, actual delivery date, and government delay. If a government delay caused a missed deadline, for example, the miss doesn't count against the timeliness calculation for that deliverable. And, of course, the high, moderate, or low importance of the deliverable also figures into that calculation.
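The timeliness rule described above can be sketched as a small function. This is an illustrative sketch only; the field names are assumptions, not TalaTek's actual tracker fields.

```python
from datetime import date

def is_on_time(due: date, actual: date, government_delay: bool) -> bool:
    """Return True if a deliverable counts as on time.

    A deliverable only counts as late if it missed its due date for
    reasons other than a government-caused delay (field names are
    illustrative, not taken from the actual Deliverable Tracker).
    """
    if government_delay:
        # Government-caused delays are excluded from the timeliness calculation.
        return True
    return actual <= due

# A deliverable submitted late because of a government delay still scores on time.
print(is_on_time(date(2023, 3, 1), date(2023, 3, 5), government_delay=True))   # True
print(is_on_time(date(2023, 3, 1), date(2023, 3, 5), government_delay=False))  # False
```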

To assess quality, TalaTek uses a rating system of 1 to 5 based on the Quality Assurance Surveillance Plan (QASP) associated with its contract. In this case, deliverable quality ranges from 5, where there were “minor issues and highly effective resolution” to 1, where there were “substantial issues and inadequate resolution.”

For an ongoing summary of timeliness or quality status, the tracker calculates the total number of points possible for all deliverables and compares it with the points actually earned; the ratio of actual to possible points, expressed as a percentage, is presented in a summary table.
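The summary calculation might be sketched as follows. The point values assigned to the high, moderate, and low weights here are assumptions for illustration, not TalaTek's actual figures; the structure (weighted actual points as a percentage of weighted possible points) follows the description above.

```python
# Assumed point values per weight tier -- not TalaTek's actual figures.
WEIGHTS = {"high": 3, "moderate": 2, "low": 1}

def summary_percentage(deliverables):
    """Compute actual points as a percentage of possible points.

    deliverables: list of (weight_label, score) pairs, with score
    on the 1-to-5 QASP quality scale described above.
    """
    possible = sum(WEIGHTS[w] * 5 for w, _ in deliverables)  # max score is 5
    actual = sum(WEIGHTS[w] * s for w, s in deliverables)
    return 100.0 * actual / possible

# One high-weight report scored 5, one low-weight deliverable scored 4,
# one moderate-weight deliverable scored 5.
items = [("high", 5), ("low", 4), ("moderate", 5)]
print(round(summary_percentage(items), 1))  # 96.7
```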

For both timeliness and quality, TalaTek's QASP rates anything above 95% as exceptional and anything above 90% as very good, with lower tiers descending to 70%, below which performance is considered unsatisfactory.
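Mapping a summary percentage onto those tiers might look like this. Only the 95% (exceptional), 90% (very good), and 70% (unsatisfactory) thresholds come from the text; the intermediate label and exact boundary handling are assumptions for illustration.

```python
def qasp_rating(pct: float) -> str:
    """Map a timeliness or quality percentage to a QASP rating tier.

    Only the 95%, 90%, and 70% thresholds come from the QASP described
    above; the middle-band label and boundary treatment are assumed.
    """
    if pct > 95:
        return "Exceptional"
    if pct > 90:
        return "Very Good"
    if pct >= 70:
        return "Satisfactory"  # assumed label for the middle band
    return "Unsatisfactory"

print(qasp_rating(96.7))  # Exceptional
print(qasp_rating(68.0))  # Unsatisfactory
```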

The tracker allows the COR to monitor the contractor's performance on a regular basis and to provide monthly, quarterly, annual, or ad hoc statistics on timeliness and quality to management. Performance reviews are then based on objective data that can easily be shared with other groups within an agency, and the contracting vendor can rest easy knowing it is being evaluated fairly and impartially.

TalaTek uses this approach to help its clients assess their success, and so can you.