
SERVING THE AMERICAN PUBLIC:
BEST PRACTICES IN
Performance Measurement

Benchmarking
Study Report

June 1997


Contents

Executive Summary
Introduction
Summary of Best Practices in Performance Measurement
Section 1: Establishing and Updating Performance Measures
Section 2: Establishing Accountability for Performance
Section 3: Gathering and Analyzing Performance Data
Section 4: Reporting and Using Performance Information
Strategies for Successful Performance Measurement
Appendices
Acknowledgments
Study Participants
Site Visit Survey Responses
Glossary
Relevant Government Publications
Agency Contacts and Other Sources

Executive Summary

. . . chart a course for every endeavor that we take the people's money for, see how well we are progressing, tell the public how we are doing, stop the things that don't work, and never stop improving the things that we think are worth investing in.

--President William J. Clinton, on signing the Government Performance and Results Act of 1993

All high-performance organizations, whether public or private, are, and must be, interested in developing and deploying effective performance measurement and performance management systems, since it is only through such systems that they can remain high-performance organizations. When President Clinton signed the Government Performance and Results Act of 1993 (GPRA) into law, this commitment to quality was institutionalized. Federal agencies were required to develop strategic plans for how they would deliver high-quality products and services to the American people. Under GPRA, strategic plans are the starting point for each federal agency to (1) establish top-level agency goals and objectives, as well as annual program goals; (2) define how it intends to achieve those goals; and (3) demonstrate how it will measure agency and program performance in achieving those goals.

It was also in 1993 that President Clinton and Vice President Gore initiated the National Performance Review (NPR) to reinvent government. One of NPR's reinvention initiatives has been to foster collaborative, systematic benchmarking of best-in-class organizations, both public and private, to identify best practices in a wide range of subjects vital to the success of federal agencies in providing high-quality products and services to our principal customer: the American people.

In February 1997, NPR published the Benchmarking Study Report, Best Practices in Customer-Driven Strategic Planning, which documents and details the in-depth processes and approaches of those best-in-class organizations that excel at incorporating their customers' needs and expectations into their strategic planning processes. This study provided public and private leaders and managers with world-class practices and formulas for success in developing and deploying agency strategic plans and goals. To complement this strategic planning study, NPR commissioned the first-ever intergovernmental benchmarking consortium involving not only U.S. federal agencies, but also local governments and the government of Canada in a collaborative study of performance measurement.

This report documents the Performance Measurement Study Team's findings and will be a useful tool for public and private leaders and managers in identifying and applying best-in-class performance measurement and performance management practices. Used in conjunction with the Customer-Driven Strategic Planning Study, it will give federal agencies a framework for success in meeting the Administration's expectations not only for "doing the right things" but for "doing them right" as well.

Study Findings

Leadership is critical in designing and deploying effective performance measurement and management systems. Clear, consistent, and visible involvement by senior executives and managers is a necessary part of successful performance measurement and management systems. Senior leadership should be actively involved in both the creation and implementation of its organization's systems. In several public and private organizations studied, the chief executive officer not only personally articulated the mission, vision, and goals to various levels within the organization, but was also involved in the dissemination of both performance expectations and results throughout the organization.

A conceptual framework is needed for the performance measurement and management system. Every organization needs a clear and cohesive performance measurement framework that is understood by all levels of the organization and that supports objectives and the collection of results. Some of the benchmarking partners used a balanced set of measures methodology to organize measures and align them with their overall organizational goals and objectives. The majority had a uniform and well-understood structure setting forth how the process worked and a clear calendar of events for what was expected from each organizational level and when.

Effective internal and external communications are the keys to successful performance measurement. Effective communication with employees, process owners, customers, and stakeholders is vital to the successful development and deployment of performance measurement and management systems. It is the customers and stakeholders of an organization, whether public or private, who will ultimately judge how well it has achieved its goals and objectives. And it is those within the organization entrusted with and expected to achieve performance goals and targets who must clearly understand how success is defined and what their role is in achieving that success. Both organization outsiders and insiders need to be part of the development and deployment of performance measurement systems.

Accountability for results must be clearly assigned and well-understood. High-performance organizations clearly identify what it takes to determine success and make sure that all managers and employees understand what they are responsible for in achieving organizational goals. Accountability is typically a key success factor, but one with multiple dimensions and multiple applications.

Performance measurement systems must provide intelligence for decisionmakers, not just compile data. Performance measures should be limited to those that relate to strategic organizational goals and objectives, and that provide timely, relevant, and concise information for use by decisionmakers at all levels to assess progress toward achieving predetermined goals. These measures should produce information on the efficiency with which resources are transformed into goods and services, on how well results compare to a program's intended purpose, and on the effectiveness of organizational activities and operations in terms of their specific contributions to program objectives. Many of our partners cautioned against repeating their initial mistake: collecting data simply because the data were available to be collected, or because having large amounts of data "looked good." Instead, organizations should choose performance measures that can help describe organizational performance, direction, and accomplishments; and then aggressively use these to improve products and services for customers and stakeholders.

Compensation, rewards, and recognition should be linked to performance measurements. Most partners link performance evaluations and rewards to specific measures of success; they tie financial and nonfinancial incentives directly to performance. Such a linkage sends a clear and unambiguous message to the organization as to what's important.

Performance measurement systems should be positive, not punitive. The most successful performance measurement systems are not "gotcha" systems, but learning systems that help the organization identify what works and what does not so as to continue with and improve on what is working and repair or replace what is not working. Performance measurement is a tool that lets the organization track progress and direction toward strategic goals and objectives.

Results and progress toward program commitments should be openly shared with employees, customers, and stakeholders. While sensitive competitive financial and market share information generally must be protected, performance measurement system information should be openly and widely shared with an organization's employees, customers, stakeholders, vendors, and suppliers. Many of our partners maintained information on their performance objectives and specific progress toward these objectives on their organizations' Internet and intranet sites for real-time access by various levels of management, teams, and sometimes individuals. Most used periodic reports, newsletters, electronic broadcasts, or other visual media to set forth their objectives and accomplishments.

Not an End, But a Beginning . . .

This report is not the end of our performance measurement benchmarking study, but rather creates a platform for a wide range of beginnings. The approaches identified in this report will be given life by being shared, debated, and implemented in the context of organizational realities. Then, where appropriate, they must be used and improved upon. One of the consistent themes from our benchmarking partners, both public and private, was that effective performance measurement systems take time: time to design, time to implement, time to perfect. Performance measurement must be approached as an iterative process in which continuous improvement is a critical and constant objective.

Another hallmark of the successful organizations we studied was the use of benchmarking to establish performance targets as part of a continuous improvement process. Our partners first detailed their own processes through such practices as process mapping; they then compared these processes with those of organizations, both public and private, considered to be the best. Through this organizational self-analysis and comparison against the best, our benchmarking partners have learned what needs to be changed as well as the processes, methodologies, approaches, and practices that can help them continuously improve. We urge leaders throughout the federal community to establish their own planning and measurement networks and working groups to share their best practices and process improvements with each other.

Introduction

Leading-edge organizations, whether public or private, use performance measurement to gain insight into, and make judgments about, the effectiveness and efficiency of their programs, processes, and people. These best-in-class organizations decide on what indicators they will use to measure their progress in meeting strategic goals and objectives, gather and analyze performance data, and then use these data to drive improvements in their organization and successfully translate strategy into action.

For decades, the federal government as a whole has demonstrated a keen interest in performance measurement. Specifically, it has considered ways of measuring government performance and using these results in the budget process. Thus, the Hoover Commission of 1949 proposed performance budgeting, President Johnson implemented a planning-programming-budgeting system, and the Carter Administration advocated a zero-based budgeting system. All of these efforts looked to better define government program objectives and to link program accomplishments to the means of achieving them.

Today, several pieces of landmark legislation, including the Chief Financial Officers Act of 1990, the Government Performance and Results Act of 1993, the Government Management Reform Act of 1994, and the Information Technology Management Reform Act of 1996, require that federal agencies:

To help agencies respond to this new challenge, Vice President Gore's National Performance Review (NPR) has assembled a group of process experts to identify how some of the best organizations, public and private, are implementing results-oriented performance measurement and performance management. In this first-ever intergovernmental benchmarking study, we have tried to identify the processes, skills, technologies, and best practices that can be used by government to link strategic planning with performance planning and measurement by:

Study Design

Context. NPR sponsors and organizes benchmarking studies aimed at making government work better and cost less. This effort is championed by the President's Management Council, which is made up of the Deputy Secretaries and their equivalents in the major federal agencies.

The present performance measurement benchmarking study builds on and extends the findings contained in the February 1997 NPR report Serving the American Public: Best Practices in Customer-Driven Strategic Planning. Further, we found that the best performance measurement and management systems and practices work within a context of strategic planning that takes its cue from customer needs and customer service.

Participants. The Performance Measurement Study Team comprised representatives from 14 U.S. federal agencies, six Canadian government agencies, the United Kingdom, and two local governments in the United States (see Appendix B for a list of team members). The high level of Canadian participation on the team reflects that country's recent commitment to performance management. In 1995, Canada's Expenditure Management System adopted a strategic, multiyear perspective in planning and results reporting. Accordingly, Canadian departments, like their U.S. counterparts, must report on their performance to ensure the effective use of appropriated resources.

The intergovernmental benchmarking team worked with 32 study partners drawn from more than 100 organizations considered best-in-class in the area of performance measurement (study partners are listed in Appendix A). These best-in-class organizations:

Terminology. An important early task for the team was to define our terms. Not surprisingly, given the relatively recent and widespread usage of performance measurement practices, we found a broad range of definitions for key terms. For our purposes in this study, we defined our basic terminology as follows:

Other terms related to performance measurement and management used in this report are defined in the glossary (Appendix D).

Performance measurement process model. Another early task for the team was to try to build a model of the performance measurement process used in the federal context. To this end, we analyzed performance measurement and management as practiced by the various public agencies represented among our members. Developing this model gave us a good understanding of the steps, phases, and considerations involved in performance measurement, and an appreciation for the broad variety of ways in which it is approached. Our model (not reproduced here) is descriptive, rather than prescriptive, illustrating the basic stages and flow of the process. The model provided us with a useful frame of reference as we began our study of performance measurement in best-in-class organizations.

Survey of best practices. The study team next designed a structured site visit instrument to survey best practices in performance measurement and management among the partners. A matrix of partner responses appears as Appendix C; highlights of our findings are presented in the following four sections.

Report Overview

The next four sections describe how high-performing organizations develop, communicate, and constantly improve their performance measurement and management systems. They highlight a broad array of successful processes, approaches, tools, and practices used in:

Finally, we present strategies in performance measurement and management specifically aimed at the public sector. This information is drawn from our research and from the survey of our partners.

Summary of Best Practices in Performance Measurement

Implementing and maintaining a performance measurement system represents a major commitment on the part of an organization. Following are some of the basics of philosophy and methodology that facilitate the performance measurement development process. With these in place, an organization can generally establish a successful performance measurement and management system.

Executive involvement. In most of our partner organizations, the performance measurement initiative was originally introduced, and continually championed and promoted, by the top executives. In many of the organizations we studied, leadership commitment to the development and use of performance measures was a critical element in the success of the performance measurement systems.

Sense of urgency. The impetus to move, or to move more aggressively, to a new or enhanced performance measurement and performance management system is generally the result of a cataclysmic event, most frequently a circumstance threatening the organization's marketplace survival. One of several scenarios may precede initiating a performance measurement system within an organization: (1) a newfound leadership commitment to performance measurement; (2) the desire of a high-performance organization to keep its competitive edge; (3) the need to link organizational strategy and objectives with actions; or (4) the natural outcome of existing quality programs.

Alignment with strategic direction. Performance measurement systems succeed when the organization's strategic and business performance measures are related to, that is, are in alignment with, overall organizational goals. Top leaders convey the organization's vision, mission, and strategic direction to employees and external customers clearly, concisely, and repeatedly. Moreover, organizational objectives are shared with employees in several different formats, both visual and verbal. For example, one partner published and distributed a booklet to show each employee what matters at the corporate level, what affects the division level, and how everything aligns within the corporation. This information sets the stage for the development of useful performance measures, since the more clearly goals are communicated, the easier it is for employees to see and decide on what needs to be accomplished.

The most common thread among the organizations benchmarked was the linkage/alignment between their corporate strategy and their performance measurement system. One participant noted that this linkage allowed his company to operate with very optimistic "stretch" performance goals and measures. We also found that partners with a vibrant linkage between corporate goals and performance plans were easily able to align the contributions of customers, external partners, stakeholders, and in one case even volunteers.

Conceptual framework. An organization's performance measurement system should be integral to its overall management process and directly support the achievement of the organization's fundamental goals. In fact, in some cases, the performance measurement system is its management process. Examples of a conceptual framework for organizing measurement systems include the use of:

Communication. Communication is crucial for establishing and maintaining a performance measurement system. It should be multidirectional, running top-down, bottom-up, and horizontally within and across the organization. Our partners communicate internally by way of interactive, group-oriented mechanisms (town hall meetings, business update meetings, and focus groups); various forms of print media (newsletters, reports, and publications); advanced computer technology (e-mail, video conferencing, and on-line Internet/intranet systems); and other highly visible means, such as the routine placement of progress charts in appropriate work areas. For example, one of our partners holds a breakfast every two weeks with 40 different employees to review where the organization is going and how it is doing.

Employee involvement. Involving employees is one of the best ways to create a positive culture that thrives on performance measurement. When employees have input into all phases of creating a performance measurement system, buy-in is established as part of the process. As with other concepts described here, the level and timing of employee involvement is individually tailored by the partners depending on their size and structure.

In sum, to undertake performance measurement successfully, an organization must:



Section 1:
Establishing and Updating Performance Measures

"Each organization must create and communicate performance measures that reflect its unique strategy."
--Dr. Robert S. Kaplan, Harvard Business School

World-class organizations use performance measurement systems to determine whether they are fulfilling their vision and meeting their customer-focused strategic goals. Their performance measures must therefore meet the following criteria:

Ensure a narrow, strategic focus. The measures and goals an organization sets should be narrowly focused to a critical few. It is neither possible nor desirable to measure everything. In addition, mature performance measurement systems are linked to strategic and operational planning.

World-class organizations know where they're headed through effective customer-driven strategic planning. They know where they are by measuring performance against corporate goals and objectives. The organizational strategy, correctly developed and modeled by senior management, provides a framework within which business units, teams, and individuals can implement a performance measurement system.

Our study partners concentrate their measurement efforts on items that can be traced through business unit performance plans to the entity's strategic vision. If a measure and its corresponding data requirements cannot be linked back to strategic planning, they are immediately considered for deemphasis or elimination. This frees organizations from "rescue initiatives" in areas that produce little value and, equally importantly, avoids data overload.

Measure the right thing. Before deciding on specific measures, an organization should identify and thoroughly understand the processes to be measured. Then, each key process should be mapped, taken apart and analyzed, to ensure (1) a thorough, rather than assumed, understanding of the process; and (2) that a measure central to the success of the process is chosen. In some cases, targets, minimums, or maximums are defined for each measure.

Be a means, not an end. In a best-in-class organization, employees and managers understand and work toward the desired outcomes that are at the core of their organization's vision. They focus on achieving organizational goals, by using performance measures to gauge goal achievement, but do not focus on the measures per se. Performance measurement is thus seen as a means, not an end. Several study participants reminded us to "focus on the goal, measure the end results, and don't focus on the measurement."

What to Measure?

Regardless of size, sector, or specialization, organizations tend to be interested in the same general aspects of performance. Attention to, and establishment of, measurements in these areas is thus a significant part of a successful performance measurement system.

In the private sector, the principal measure of successful performance is profit. Public agencies, on the other hand, have no such universal and widely accepted performance measure of success. For public sector organizations, performance must be judged against the goals of their programs and whether the desired results and outcomes have been achieved. Success is often viewed from the distinct perspectives of various stakeholders, such as legislatures, regulators, other governmental bodies, vendors and suppliers, customers, and the general public. Therefore, it is extremely important that the measures of performance used by a public organization be created with as much input and consultation from these constituencies as is feasible, so as to reach as much consensus as is possible regarding what is expected of the organization.

While a publicly owned corporation may ultimately be held accountable by its stockholders, and a public entity by the taxpayers, most of the best-in-class organizations place customer satisfaction above all else. "Customers are a source of goals," noted one partner representative, and many others weigh customer concerns heavily when developing strategic goals.

Measures are often tied to corporate strategy; this typically requires negotiation between a team of senior executives and business unit managers. A small manufacturing partner describes its process as "not really top-down or bottom-up," because it involves reconciling strategic requirements with the reality of its factory capabilities. More than one participant noted that their organization's measures cascade both up and down.

Determining a Baseline and Goals

Once an organization has decided on its performance measures, the next step in the process is to determine a baseline for each of the measures selected. Once data are collected for the first time on a particular measurement, the organization then has baseline data.

Determining appropriate goals for each measure after these baseline data are collected can be accomplished in several ways. Most partners use various statistical analysis techniques as well as benchmarking to set goals for future performance.

A common practice is to set goals that will force the organization to "stretch" to exceed its past performance. By benchmarking measures, an organization can validate the fact that the goals are still attainable. For example, a goal of 100 percent customer satisfaction may be an admirable goal for any organization. However, if industry standards have been at 80 percent, a goal of 100 percent may not be realistically attainable. Setting a 100 percent goal anyway can easily demotivate employees by giving them an essentially impossible target. In this regard, one partner representative noted that setting a quality standard with zero tolerance for human error undermines morale and makes goals appear unattainable. Organizations should instead set goals that excite an employee's interest and elicit commitment.
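To make the benchmark-validation idea concrete, here is a minimal sketch in modern Python of one way to derive a baseline and a stretch goal from historical results, capping the stretch goal at the benchmarked best-in-class level. The figures, the function name, and the 10 percent stretch factor are hypothetical illustrations, not practices reported by any study partner.

```python
# Minimal sketch: derive a baseline and a benchmark-checked "stretch"
# goal from historical data. All numbers and names are hypothetical.
from statistics import mean

def baseline_and_stretch(history, benchmark_best, stretch_factor=1.10):
    """Baseline is the historical average; the stretch goal improves on
    it by stretch_factor but is capped at the benchmarked best-in-class
    level so the target stays realistically attainable."""
    baseline = mean(history)
    goal = min(baseline * stretch_factor, benchmark_best)
    return baseline, goal

# Customer satisfaction scores (percent) for the past four periods.
history = [74.0, 76.5, 78.0, 79.5]
baseline, goal = baseline_and_stretch(history, benchmark_best=80.0)
print(f"baseline {baseline:.1f}%, stretch goal {goal:.1f}%")
# baseline 77.0%, stretch goal 80.0% (capped at the industry best,
# rather than an unattainable 100%)
```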

To this end, it is important to provide information on performance goals and results to employees. Many partners provide information on key goals and measures to all employees through their intranets, newsletters, and bulletin board displays. This increases employee understanding of the organization's mission and goals and unifies the workforce behind them. It also helps emphasize a team philosophy rather than foster individual competition.

Reviewing Measures

An important aspect of performance measurement is its iterative quality. Organizations should continually assess whether their current measures are sufficient or excessive, are proving to be useful in managing the business, and are driving the organization to the right result. This review lets the organization make sure that it is maintaining the right measures. When measures become obsolete, they should be discarded, and possibly replaced with something else. One partner representative noted that measures should be dropped if they were no longer needed or if no change occurred in a measure after much attention. Many partners found that they began with too many measures and needed to reduce the array of measures tracked at each organizational level.

Performance analysis also lets organizations change the priority of specific measures over time. Some performance goals, for instance, are intended to influence behavior and should be deemphasized once target performance is achieved. Some other goals may change due to the nature of the business, market conditions, or regulatory requirements. One partner regularly develops employee change teams to look at the measures and determine whether they might need adjusting.

Refining and changing measures is healthy and necessary, but our partners cautioned that frequent changes will cause confusion and may affect accountability.

Continuous and regular review of measures, as they relate to the corresponding goals and the organization's strategic plan, is key to success in performance measurement. It not only helps in deciding the right things to measure, but provides needed information to assess progress toward reaching goals at all levels within the organization. Performance measurement has no purpose if data are not used to improve organizational performance.



Section 2:
Establishing Accountability for Performance

"What gets measured, gets done."
--A Study Participant
Establishing viable performance measures is critical for organizations; making those measures work is even more important. Once the performance measurement system is created, then, the next step is to implement it within the organization. One partner representative provided the following insight: "the key issue with performance measurement is deployment: success is 20 percent approach, 80 percent deployment."

And successful deployment appears to be strongly related to developing a successful system of accountability: that is, of making managers and employees alike "buy in" to performance measurement by assuming responsibility for some part of the process. Among our study partners, we found the following general areas of responsibility/accountability:

Following are successful strategies used by our partners for establishing employee and management accountability for the success of the organization's performance measurement system.

Empowerment

Employees are most likely to meet or exceed performance goals when they are empowered with the authority to make decisions and solve problems related to the results for which they are accountable. In many ways, accountability is analogous to a contract between manager and employee, with the manager providing a supportive environment and the employee providing results.

The performance goals of an organization represent a shared responsibility among all its employees, each of whom has a stake in the organization's success. A critical challenge for private and public organizations alike is ensuring that this shared responsibility does not become an unfulfilled responsibility. Accountability helps organizations meet this challenge.

According to one participant, "the system is a closed loop . . . responsibility is attached to authority resulting in accountability." Another partner representative commented that "you can only hold employees accountable if they have control." A third participant believed that measures over which organizations have no control (external measures) should also be included.

Underlying employee empowerment is management's view of its employees as an asset rather than a resource. One partner representative stressed the use of the term "asset" because it implies that employees are to be valued and cared for, while a "resource" is something that is used up and replaced. In many leading organizations, the process of performance measurement has led to a better understanding of how individual employees or teams of employees contribute to the performance goals of an organization. The contributions of individuals and teams are a starting point for enumerating the results for which they are accountable.

Owner Identification

Most managers from best-in-class organizations hold an appropriate individual accountable for each performance measure. Most organizations therefore identify a measurement owner: an assigned individual who is accountable and responsible for a particular measure.

One study partner formally documents who is responsible for each performance target within a business unit. A single matrix identifies the business unit's goals and measures, the accountable individuals, and those individuals and organizations that have a collateral responsibility for meeting the performance target.

Another partner uses a matrix to identify and document roles that must be played to achieve organizational performance targets. This matrix allows the business unit to emphasize business goals rather than internal process outputs.
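The responsibility matrices described above might be represented in data along the following lines. This is a hypothetical sketch; every measure, target, and name in it is invented for illustration and is not drawn from any study partner.

```python
# Hypothetical sketch of a responsibility matrix: each measure has one
# accountable owner plus parties with collateral responsibility.
# Measures, targets, and names are all invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MeasureAssignment:
    measure: str                                   # performance measure
    target: str                                    # performance target
    owner: str                                     # accountable individual
    collateral: List[str] = field(default_factory=list)  # shared roles

matrix = [
    MeasureAssignment("On-time delivery rate", ">= 95 percent", "J. Rivera",
                      collateral=["Logistics team", "Supplier liaison"]),
    MeasureAssignment("Complaint cycle time", "<= 2 days", "K. Osei"),
]

for row in matrix:
    print(f"{row.measure}: owner {row.owner}, target {row.target}")
```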

Rewards and Incentives

More than half of our benchmarking partners link pay or rewards to their performance measurement systems. In other cases, managers ensure that performance goals are met by rating individual contributions to performance goals in individual appraisals. One partner links corporate values with performance measures for determining management pay.

An example of how this linkage of performance measurement and employee incentives works follows. One of our partners feels so strongly about a training measurement that it mandates a specified number of hours for training annually for every employee as part of its incentive program. And if only 1 of its 9,000 employees does not make this training minimum, the amount of the incentive is reduced accordingly for all employees. This demonstrates the organization's commitment to training, stresses the importance of each individual to the team, and creates a commitment to training at all levels of the organization to achieve a goal.

Incentives need not always be financial. For example:

Other rewards for exceptional performance include acknowledgment in newsletters and other publications as well as annual awards.

In addition to rewarding achievement, many organizations also address a pattern of chronic substandard performance by linking job performance to pay. One participant noted that "adverse actions tend to be rare because nonperformers are weeded out during the probationary period . . . a supportive culture of excellence tends to result in excellence."

Culture and Communication

Within most of our partner organizations, failure to meet performance goals results in a comprehensive review of problems and solutions. As one partner explained, "the culture is based on understanding the reality of human error and striving to improve . . . employees do not fear admitting mistakes." Periodic meetings allow staff to review progress and strategize about solving problems. Several participants emphasized that the focus is on corrective action, not blame. A number of our partners have established policies that institutionalize problem-solving approaches for failures and substandard performance.

Generally, organizations have a formal written plan describing how performance measures will be implemented. In many cases, the plan details the measurements, goals, objectives, and the common alignment to the organizational strategy. In addition, it is a common practice to identify one individual who will be responsible and accountable as a respective measurement owner.



Section 3:
Gathering and Analyzing Performance Data

"Without a yardstick, there is no measurement; without measurement, there is no control."
--Anonymous
Data are collected and then analyzed for each performance measure to determine if and how well goals are being met. It is very easy for the data collection and analysis phase of performance measurement to get out of hand. Advanced technology facilitates this tendency: It is tempting to take advantage of the myriad data resources available via Internet and intranet. Best-in-class organizations remember that data collection and analysis are not a research activity conducted for its own sake. Rather, data are collected and analyzed to get answers.

Our partners collect data at all levels of their organizations through any number of mechanisms, at both regular intervals and on an ongoing basis. Through it all, they remain focused on the questions they are trying to answer. This focus on strategic alignment makes data collection a dynamic and vital, rather than tedious and never-ending, exercise.

Gathering the Data: Principles

Keep it focused. One participant commented that "our company was data rich and insight poor." Keeping data gathering focused is very much a senior leadership responsibility. This focus ensures that the right data, and only the right data, are collected; that repetitious or tangential compilations are avoided; and that the questions originally posed by the performance measures are being answered.

Keep it flexible. In best-in-class organizations, data are collected from a variety of sources and through a variety of media. No single system is necessarily right or wrong. Although using automation is preferable, world-class organizations also use manual systems when these are needed and cost efficient.

Keep it meaningful. Useful and relevant data can be gathered if the correct measures were set up in the first place. One partner representative observed that a few basic, well-aligned measures taken seriously are better than a number of complex measures. That's because with simple measures, it's clear what data need to be collected; with well-aligned measures, it's easy to see the data's relevance. On the other hand, it's possible to carry simplicity too far. A recurring challenge to effective performance measurement is to overcome, in the words of one participant, a "long-lived work culture of transactional auditing which causes a focus on checklist-type, as opposed to results-oriented, trending." In other words, data collection must be tailored and thoughtful, not derived from a "one-size-fits-all" master checklist.

Keep it consistent. Data collection should be based on a set of agreed-upon definitions. These definitions need to be universally understood by employees, managers, partners, suppliers, and even customers. Data collected within a common framework of understanding can be easily compared and analyzed, allowing subsequent evaluations to be "apples to apples."
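One way to enforce such agreed-upon definitions at collection time is sketched below; the measure identifiers, units, and definitions are hypothetical examples, not the partners' actual data dictionaries.

```python
# Minimal sketch: a shared data dictionary that every collected record
# must match, keeping later comparisons "apples to apples." The measure
# identifiers, units, and definitions are hypothetical.
DEFINITIONS = {
    # measure id             (agreed unit, agreed definition)
    "on_time_delivery":     ("percent", "delivered by the promised date"),
    "complaint_cycle_time": ("days",    "receipt to final resolution"),
}

def validate(measure_id, value, unit):
    """Reject any record that does not match the shared definition."""
    agreed_unit, _definition = DEFINITIONS[measure_id]
    if unit != agreed_unit:
        raise ValueError(f"{measure_id}: expected {agreed_unit}, got {unit}")
    return value

validate("on_time_delivery", 94.2, "percent")   # accepted
# validate("on_time_delivery", 0.942, "ratio")  # would raise ValueError
```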

Gathering the Data: Responsibilities

Each business unit and hierarchical level of an organization will have different needs for the data gathered. These differences should be reflected in the collection process.

Line supervisors and employees. The data focus for line supervisors and employees relates to daily operations and customer service as these are aligned with the organization's vision and strategic planning. Thus, line supervisors and employees collect operational performance data. These data are often best gathered as part of the employees' interface with the customer. One of our partners uses advanced technology that automatically records every time a work product changes status. The updated status information is accessible to anyone in the company and to company business partners and customers. The capture of performance data is thus seamlessly embedded in the business transaction.

Business unit managers. Business unit managers need data that can be used to measure customer satisfaction, dissatisfaction, or indifference. These data are usually collected via customer surveys administered by a third party or in-house office.

Another kind of data in which the business unit manager is interested involves program costs. These data come from the organization's accounting and cost accounting systems, which record expenses and revenues. Armed with these data, a manager can not only react to costs after the fact, but can also institute proactive measures to reduce unnecessary costs.

Best-in-class business units also measure the health of their organizations. They survey employee morale and, where appropriate, employee safety. They look for skill deficiencies and try to be a continuous learning organization.

Executive management. Senior managers need to determine whether their organizations are meeting or exceeding the expectations defined in their customer-focused strategic plans. Generally, they target a vital few measures as critical to their responsibilities. Rather than immersing themselves in day-to-day details, executives look for trends.

Transforming Data Into Information

Data analysis in performance measurement is the process of converting raw data into performance information and knowledge. The data that have been collected are processed and synthesized so that organizations can make informed assumptions and generalizations about what happened. They can then compare actual results to what they had expected to happen, decide why there might be a variance, and determine what corrective action might be required. This last set of activities is the subject of the next section, reporting and using performance information. As in data collection, the organization must keep this next step perpetually in mind. Though interesting, analysis is not undertaken for its own sake. Following are some principles of data analysis drawn from best-in-class organizations.

Everyone needs information. But not everyone knows what to do with raw data. So, frequently in world-class organizations, in-house quality staff or outside contractors analyze the data used to measure performance. Some organizations provide data directly to managers, or to the relevant business unit, for analysis. At one world-class organization, data analysis takes the form of "cross talks" between organizational units jointly reviewing performance results.

To ensure that everyone can use and understand data and their analysis, some organizations train their workforces in rudimentary analytical methods. Such an evaluative culture is promoted by engaged executive leadership, and often nurtured by a cadre of analysts helping business units understand and interpret their data.

One partner established skilled measurement coordinators within each operating area of its organization, trained them in measurement and analysis techniques, and charged them with the responsibility for educating team leaders and employees.

In some organizations, central analytical staffs collect and analyze data related to corporate strategic goals and prepare concise reports for executives or agenda material for high-level business review meetings. At this senior level, it is essential to provide information rather than merely show data. Graphics and other visuals are used to focus senior management on points needing immediate attention.

User information needs differ. Different levels of an organization will use different pieces of the analyzed data. The critical users of performance information are decisionmakers, both on the front lines and in the executive suite. One benchmarking partner representative stated that the goals of its analysts are not always synchronized with the goals of the decisionmakers. As a rule, decisionmakers need information that is timely, relevant, and concise; analysts tend to value products that are thorough, objective, and professionally acceptable. The successful resolution of this internal tension is a sign of a world-class performance information system.

Good analytic tools are available. Many tools for effective performance analysis are readily available in today's marketplace. Off-the-shelf software packages can perform straightforward aggregation/disaggregation, statistical analysis, linear programming, trend analysis, charting, quality control, operations research, process cost analysis, and forecasting. More sophisticated packages can also perform advanced quality control functions and econometric modeling.
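As a small illustration of the straightforward trend analysis such packages perform, the following sketch computes a least-squares trend slope over quarterly results. The data and function name are invented for illustration.

```python
# Illustrative sketch of simple trend analysis: the least-squares slope
# of a measure over successive periods. Data are invented.
def trend_slope(values):
    """Ordinary least-squares slope of values against 1, 2, ..., n."""
    n = len(values)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    covariance = sum((x - mean_x) * (y - mean_y)
                     for x, y in zip(xs, values))
    variance = sum((x - mean_x) ** 2 for x in xs)
    return covariance / variance

quarterly_cycle_time = [52, 47, 41, 36, 30]   # days; shrinking is better
print(f"trend: {trend_slope(quarterly_cycle_time):+.1f} days per quarter")
# trend: -5.5 days per quarter
```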

Over time, analysis can become more sophisticated. Analytic approaches and dependence on performance information become stronger as performance measurement processes mature. The business units of one partner organization that used advanced statistical techniques to analyze data tended to do better than those units that didn't.

A similar maturation occurs over time regarding the complexity of the measures used. For example, several partners began with straightforward workload/output measures. As they needed to conduct more sophisticated analysis, however, they moved to complex results-oriented measures of effectiveness and efficiency.

Several partners have developed mainframe applications that upload data from operating systems, process them overnight, and prepare performance activity reports tailored to different organizational units. Properly maintaining the mainframe or host computer environment reinforces data consistency and definitions.

The partners tend to apply a reasonableness test of the cost and complexity of their performance data analysis systems. One partner representative observed that one of the organization's primary tools is common sense.

A picture is worth a thousand words. Performance data can be displayed in a wide variety of ways, including graphic presentations such as histograms, bar charts, pie charts, and scatter diagrams. Most organizations use some form of spreadsheets and databases to organize and categorize their performance data. Information technology advances, particularly in electronic communications, will provide still more options for data display and dissemination.



Section 4:
Reporting and Using Performance
Information

You have to be a learning organization: learn from your failures, so you don't repeat them; learn from your successes, so you can replicate them.
--A Study Participant

High-performing organizations do not measure things just for the sake of measurement. Rather, they report, evaluate, and use performance information as integral parts of their performance measurement systems to:

These same high-performing organizations see performance data as empirical information about the operation of their organizations and their customer or stakeholder requirements and preferences. Whether applied over the longer term or for short-term corrective actions, performance information is reported, evaluated, and used as an underpinning for the continuous improvement of overall management and strategic planning processes.

Report Information

Performance information should be disseminated quickly. Putting useful information into the hands of an organization's decisionmakers promptly and efficiently is critical.

Many communication devices can be used to meet this objective, including meetings, reports and newsletters, charts placed in work areas, e-mail, publications, and videoconferencing. Intranets are also being used to give entire organizations access to performance data summaries; this gives them the opportunity to be proactive about issues or adverse trends. Another performance reporting objective is to keep employees at all levels "in the loop," interested, and motivated. To this end, many partners use sophisticated communication systems so that all staff receive performance measurement status repeatedly in many forms.

In several cases, scorecards are posted in work areas throughout the organization, enabling everyone to know how they personally contributed to corporate performance. Employee newsletters, "Employee Recognition Day," and regular daily feedback are other useful communication techniques.

Some partners use a weekly newsletter that contains updated information about the different branches, new employees, operating results, the business economy, and the company's training schedule. Once each quarter, a more elaborate newsletter containing more detailed articles is sent to each employee's home.

At one partner organization, employees and executive staff share information with one another through a unique "recognition days" program. Once each year, executive staff members, together with workers from various company sites, visit each branch of the company to find out how things have been going.

Another partner uses a system of icons representing each of the six key performance measures used within the business unit. These icons are posted widely throughout the plant to focus employees' attention on the measures. This clever and effective deployment strategy serves to educate employees about the measures themselves as well as the status of their performance.

Evaluate Performance

Organizational performance evaluations are conducted periodically to best meet an organization's individual management information needs; they are typically scheduled on a monthly or quarterly basis. Depending on the types of activities and the organization, the frequency of evaluation could range from daily or weekly to semiannually. In many cases, organizations use a combination of reviews at various intervals. For example, one partner uses a combination of a monthly office review, a six-month review, and an annual review. Others rely on quarterly senior management reviews.

In one organization, reviews are done monthly to assess budget results and key project milestones, quarterly for customer satisfaction results, and annually for individual performance. In several instances, organizations undergo specific, externally mandated, six-month evaluations as part of their participation in "ISO 9000," an international standard-setting and certification process.

In addition, unscheduled events such as customer feedback; industry mergers; or changes in contracts, technology, or the market can all trigger a performance evaluation.

While evaluation is done at various levels of the organization, evaluation results usually flow up to a senior-level person, chief executive, or some type of senior executive committee for review. Based on the evaluation, senior management determines whether corrective actions or changes are necessary in the performance measurement system, the measures themselves, or the organization's goals.

There are many management tools and techniques available for conducting this type of top-level review and evaluation of performance information; one useful approach is known as "story boarding." This approach is based on a managing-for-results and management-by-fact "story board." The story board (depicted in an exhibit not reproduced here) compares annual objectives and plan targets with year-to-date performance and identifies any gaps. Staff members, generally those involved in either planning or quality who report directly to senior management, conduct either a "gap analysis" or a "root cause analysis." They develop recommendations to senior management as countermeasures or solutions. They also make recommendations as to who should be accountable for, the current status of, and the milestones related to, the countermeasures.
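The gap-identification step of the story board could be expressed as simply as the following sketch; the measures, targets, and year-to-date figures are hypothetical, not drawn from any partner's story board.

```python
# Hypothetical sketch of the story board's gap-identification step:
# compare year-to-date results with annual plan targets and flag gaps
# for root cause analysis. Measures and figures are invented.
targets      = {"customer_satisfaction": 85.0, "on_time_delivery": 95.0}
year_to_date = {"customer_satisfaction": 81.5, "on_time_delivery": 96.1}

for measure, target in targets.items():
    gap = target - year_to_date[measure]
    if gap > 0:
        print(f"{measure}: {gap:.1f}-point gap -> root cause analysis")
    else:
        print(f"{measure}: on plan")
```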

One partner has developed a new aspect to the story board process: a countermeasure outlook. This includes an assessment of whether the countermeasure was capable of closing the identified gap, whether the proper resources had been allocated, and a prediction for performance improvement.

Use Performance Information

Feed it into resource allocation decisions. There are important linkages among resource allocation, strategic planning, and performance measurement. A high-performing organization's strategic planning process is directly related to, and may drive, the process for allocating its resources to carry out goals and objectives. An organization's strategic plan is also directly related to what the organization decides to measure in terms of performance and outcomes. However, the relationship between performance measurement and resource allocation is less clear.

Many of our study partners factor performance information into resource allocation decisions involving personnel or budget. Generally, they do not rely solely on such information. Resource allocation decisions are likely to be based on tactical and/or strategic considerations related to new initiatives, specific markets, technologies, or other factors.

Use it in employee/management evaluations. Most high-performing organizations have developed some means of linking accountability with incentive compensation or wage increases based on performance. Several partners hold managers accountable, factoring performance measurement results into their bonus plans. Most best-in-class organizations link performance measures in some way to pay.

People are also empowered and rewarded for making process changes based on performance results. One company provides people with incentives for achieving performance results based on doing things a certain way. Quality success stories are shared two or three times a year. The chair and senior officers review individual and team applications for significant improvement above and beyond the call of normal duty. A percentage of the savings is then shared with award recipients.

Other organizations use a multisource feedback appraisal process for managers that provides for evaluation by superiors, their employees, and their peers. In the case of one organization, this approach is used to assess "organizational vision, team participation, integrity and dignity, job knowledge and skills, and continuous improvement." Other organizations combine a similar feedback appraisal process with an approach that evaluates not only performance, but also criteria (or values) addressing individual behavior. The values one company uses to conduct its reviews include "respect for each other, integrity, trust, credibility, and continuous improvement and personal renewal." These multisource feedback reviews are often administered by an outside, third-party organization.

Most of our benchmarking partners have a recognition or rewards system linked to their performance measures. These organizations provide financial and nonfinancial incentives for successful performance (see section 2).

Use it to determine gaps between goals and reality. Performance results can be used, as discussed above, to determine gaps between specific strategic objectives and/or annual goals and actual achievement. The root causes of these gaps are analyzed, and countermeasures developed and implemented. Whenever there is a gap between current results and an organization's objectives, it is an opportunity for process improvement.

Use it to drive reengineering. Several of our benchmarking partners use reengineering in response to the identification of gaps between objectives and achievement. Among the areas our partners reengineered were cycle time, organizational structure, outsourcing, information technology, programs, and benefits.

A good example of how performance measurement drives reengineering is the case of one partner that recently focused on addressing customer complaints. This partner achieved significant improvements over a 12- to 15-month period by measuring whether complaints received by 3 p.m. were addressed the same day. This focus drove efforts to improve the process and to add technicians and resources.

One partner did not want to add staff simply to meet a high volume of calls received during a 30-minute period every day. Instead, it reengineered the process completely so as to reduce the number of "abandoned" calls.

Use it in benchmarking. Our study revealed that most of our partners use benchmarking as a methodology for improving their organizations, developing their performance measurement systems, validating their operational position, and maintaining world-class performance. Our partners primarily use external benchmarking and competitive benchmarking, where they compare their operations with those of organizations outside their companies.

A few of our partners used internal benchmarking, where an internal business unit would compare itself with similar business units within the same organization. One partner uses the same performance measures across business units to facilitate internal benchmarking.

At least two of our partners regularly participate in benchmarking consortia where participants from various industries meet to benchmark processes. These benchmarking consortia regularly use performance measures to discover best practices.

Use it to improve organizational processes. Our study showed that managers are most often the ones empowered to make process changes. One organization used a multivariable testing technique to discover how process improvements can be made. Management set up trial and control processes in such a way that employees could try various process improvements in a controlled manner and selectively identify changes that would improve process performance.

One partner created a roll-out group to get out of the "stove pipes" of individual teams and departments. This group, described as "a place to try new processes and to address process issues," meets every two months and is empowered to decide on how situations are to be handled. As a result of its efforts, cycle time for a particular product has been reduced from 52 to 29 days. An employee survey has been administered, identifying the need for a better training program. Also, lack of sales growth has resulted in a major reorganization, including the development and implementation of a team structure.

Use it to adjust goals. In most cases, if performance goals are not met, corrective action is implemented. Conversely, if goals are exceeded, the "bar is reset to establish stretch goals."

One way of adjusting goals and the approach to their achievement is to form partnerships with other entities. Through these partnerships, organizations can combine resources and adjust their part of the overall goal accordingly. An example of this type of partnership was demonstrated when one organization teamed with another local organization to negotiate with potential new industries wishing to establish facilities in the overall area. Through this partnership, the organization was able to achieve desired community growth scenarios, as well as more aggressive revenue goals.

Use it to improve measures. One organization displayed performance measurements on bar charts and used raw data in its first year of implementation. In the second year, it validated and normalized the data. In the third year, its bar chart included the normalized data with a trend line, a simple five-year moving average. In the fourth year, it used a logarithmic trend line to obtain a better fit.
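
As a rough illustration of that progression, the following sketch (with invented annual results) computes a five-year moving average and then fits a logarithmic trend line using NumPy.

    import numpy as np

    # Hypothetical annual results for one performance measure.
    results = np.array([62, 65, 71, 70, 74, 79, 78, 83, 86, 88], dtype=float)

    # Simple five-year moving average, defined once five years of data exist.
    moving_avg = np.convolve(results, np.ones(5) / 5, mode="valid")

    # Logarithmic trend line: fit result = a * ln(t) + b, where t = 1, 2, ...
    # counts years from the start of the series.
    t = np.arange(1, len(results) + 1)
    a, b = np.polyfit(np.log(t), results, deg=1)

    print("Five-year moving average:", np.round(moving_avg, 1))
    print(f"Logarithmic trend: result = {a:.1f} * ln(t) + {b:.1f}")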

Another organization originally used percentage data to measure performance. However, as volume grew, it became apparent that actual counts should be used instead: a 99.5 percent successful on-time delivery rate still meant 1.5 million failures per year.
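
The arithmetic behind that example is worth making explicit: taken together, the two figures imply an annual volume of roughly 300 million deliveries, as this fragment shows.

    # A 99.5 percent on-time rate sounds excellent, but the complementary
    # 0.5 percent becomes a large absolute number at high volume.
    on_time_rate = 0.995
    failures_per_year = 1_500_000

    implied_volume = failures_per_year / (1 - on_time_rate)
    print(f"Implied annual volume: {implied_volume:,.0f} deliveries")  # 300,000,000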

One process owner claimed his process was performing as well as it could: 99 percent on time. He had defined "on time" in terms of when the order came in, however, which meant that even items sent back were counted as on time. The term was redefined.

One partner provided an example from its community service work. It measured pollution on beaches by counting the number of times, rather than the length of time, a beach was closed because of pollution. For increased precision, it now counts the number of "beach blocks," measured in terms of lifeguard towers closed and the number of days closed. Based on what customers reported they would tolerate, zero beach closures was set as the goal.

Several participants stressed the need to analyze significant movements before acting. Organizations must recognize and understand that variation occurs in many selected measures, and that there are both normal and special causes for such variations. Two partners develop upper and lower statistical limits around a performance target. If actual performance falls within the limits, the partners normally take no action.

The bar keeps getting raised. The question is: how fast are you getting better?

--A Study Partner

Strategies for Successful Performance Measurement

The public sector is under intense pressure to improve its operations and deliver its products and services more efficiently and at the least cost to the taxpayer. Performance measurement is a useful tool in this regard, since it formalizes the process of tracking progress toward established goals and provides objective justifications for organizational and management decisions. Thus, performance measurement can help improve the quality and cost of government activities.

The 1993 passage of Public Law 103-62, the Government Performance and Results Act (GPRA), represents a federal commitment to performance measurement. GPRA aims to improve the management of federal programs through the use of strategic planning and performance measurement systems. Under GPRA, all federal agencies will, beginning with the fiscal year 1999 budget cycle, be required to draft and submit five-year strategic plans with clearly stated strategic goals, annual performance plans describing how they will carry out these strategic plans and meet their goals, and annual reports on their progress.

Meeting GPRA's requirements constitutes a dramatic shift in focus and approach for many agencies. For example, GPRA requires the development of performance data that are aligned with agencies' strategic goals. It emphasizes outcomes rather than inputs and outputs. It looks to identify goals at given resource levels and to hold managers more accountable for program results. Any one of these precepts might represent a paradigm shift for a given agency. The following paragraphs therefore describe key strategies drawn from our research for successfully implementing a performance measurement system that meets GPRA objectives.

Why Measure?

Performance measurement yields many benefits for an organization. One benefit is that it provides a structured approach for focusing on a program's strategic plan, goals, and performance. Another is that measurement provides a mechanism for reporting on program performance to upper management. As the preceding section described, our partners also use measurement information to close performance gaps, drive reengineering, benchmark against other organizations, improve processes, adjust goals, and refine the measures themselves.

Customer-Driven Strategic Planning

Although agencies rely on Congress and other stakeholders to clarify their mission and agree on their goals, they also, like private sector organizations, must address customer needs. Many tools are available to help agencies gauge these needs and obtain stakeholder input for strategic planning.

This customer-driven strategic planning process should result in "stretch" strategic goals and focused objectives. Agency managers should then identify owners for each goal and objective, develop strategies, and allocate the necessary resources for performance.

Getting Started

Three elements are useful in developing and implementing a performance measurement system: managing the change, involving employees, and providing training. Each is discussed below.

Change management is ultimately the responsibility of senior leadership. Management implements a plan by using techniques to align the organization's people and culture with changes in business strategies, organizational structures, and systems. For example, one of our partners initiated a change process to focus greater attention on performance measurement. Concurrently, however, the organization experienced numerous changes in senior leadership positions within a relatively short time frame. Because the organization was relying on a process, rather than on people alone, to institutionalize performance measurement, the new approach gradually evolved and eventually became ingrained in the corporate culture. There were growing pains, but people became more comfortable with the approach each year.

Employees should be involved in performance measurement as members of the organization's "team," although the specific degree of involvement will vary by organization. At one end of the spectrum, for example, some organizations use a cross-functional or matrix team of employees (and sometimes of stakeholders, too) representing all areas involved in or responsible for the performance measurement system (e.g., program areas, planning, and budget). These teams use various techniques to brainstorm, discuss, clarify, and prioritize ideas related to the development of performance measures.

If individuals have not worked on teams before, they should receive training so they can learn to function in and as a team. Similarly, if managers have never used performance measurement information before, they should receive training on how to understand and use such information. One partner provides a "how to" book for managers on writing specific objectives so they can more effectively communicate their organization's destination. The key here is for the organization to identify gaps in knowledge and experience at whatever level and provide targeted, just-in-time training to address these.

Establishing and Updating Performance Measures and Goals

For each goal and objective, performance measures, baselines, and performance targets need to be established both organizationwide and for each contributing program/process. Thus, managers can work with multidisciplinary teams, focus groups, and/or stakeholders to develop measures from goals and objectives. Next, they should establish baseline data to help them understand their current status. They can use benchmarking, competitive comparisons, gap analysis, and past experience to establish these targets.

A conceptual framework can help in deciding what to measure. For example, measuring organizational performance can be linked to the strategic planning process. Or you can use a balanced set of measures that ensures that senior leaders can get a quick, comprehensive assessment of the organization in a single report. A family of measures can be used to align measurements across levels of the organization.
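
For instance, the balanced scorecard framework (defined in the glossary, Appendix D) groups measures under four perspectives. A minimal sketch, with purely illustrative measures, might look like this:

    # Group measures under the four balanced scorecard perspectives so that
    # senior leaders see one comprehensive report. Measures are illustrative.
    scorecard = {
        "financial": ["cost per unit of output", "budget variance"],
        "customers": ["customer satisfaction index", "complaint resolution time"],
        "internal business processes": ["cycle time", "defect rate"],
        "learning and growth": ["employee satisfaction", "training hours per employee"],
    }

    for perspective, measures in scorecard.items():
        print(f"{perspective.title()}:")
        for measure in measures:
            print(f"  - {measure}")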

Regardless of which framework is used to design and implement a system for measuring organizational performance, several criteria need to be addressed in creating good measures.

Above all, a good measure drives appropriate action. The same characteristic marks a successful performance measurement system as a whole: it prompts the organization to act on what the measures reveal.

By creating an operational definition for each measure, you can ensure that these measures are understood by everyone in the organization. A typical definition includes (1) a specific goal or objective; (2) data requirements, such as the population the metric will include, the frequency of measurement, and the data source; (3) the calculation methodology, including required equations and precise definition of key terms; (4) reports in which the data will appear and the graphic presentation that will eventually be used to display the data; and (5) any other relevant rationale for the measure.
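
As a minimal sketch (the field names and the sample measure are illustrative, not drawn from any study partner), the five components of an operational definition could be recorded as follows:

    from dataclasses import dataclass, field

    @dataclass
    class MeasureDefinition:
        goal: str                  # (1) the goal or objective the measure supports
        population: str            # (2) data requirements: what is included
        frequency: str             #     how often it is measured
        data_source: str           #     where the data come from
        calculation: str           # (3) methodology and precise definition of terms
        reports: list = field(default_factory=list)  # (4) where the data will appear
        rationale: str = ""        # (5) any other relevant rationale

    on_time = MeasureDefinition(
        goal="Deliver 95 percent of orders by the promised date",
        population="all customer orders shipped during the month",
        frequency="monthly",
        data_source="order-tracking system",
        calculation="orders delivered by promised date / total orders shipped",
        reports=["monthly operations review chart"],
        rationale="drives action on the delivery process rather than assigning blame",
    )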

Some of our partners explored an inventory of common measures to cut across all business units. For example, one partner developed an economic value-added index to measure its financial performance; a customer value-added index to measure the satisfaction of its customers relative to that of its competitors; and a people value-added index to measure employee perceptions of senior leadership, overall job satisfaction, and diversity practices.

Once measures are in place at the highest level of the organization, they should be cascaded to lower levels. One organization we studied demonstrated a seven-level cascade, moving from the vice president to the hourly worker, with an associated measurement priority at each level. Another organization suggested using policy deployment to plan and execute organizationwide, customer-focused performance improvement.

We found that successful entities maintain a strong focus on customer and market expectations, as well as on profit and production. This focus is reflected in the measures selected. For example, one partner operates with fewer than 20 percent of its performance measures related to financial, bottom-line categories. Another partner ranks employee safety and customer satisfaction above corporate profits in its measurement hierarchy.

Establishing Accountability for Performance

An organization needs to establish who is responsible for performance measurements. Someone must be responsible for getting the information needed and for reporting it in a timely manner. Others need to be responsible for the actual outcomes on the measurements. Some partners have team-level measurement experts who are responsible for helping team members understand the significance of the performance data collected and who guide the team in using data at weekly goal meetings. Another partner has people responsible for training employees on what the data mean and how to interpret the data.

Both organizational and individual responsibilities need to be identified for the performance measures. For example, one partner uses a "role/responsibility" matrix to formalize the process and identify ownership of each measure. This matrix lists measures along the vertical axis; the horizontal axis includes the present goal, the future goal, each business process, and the associated process owner. Status is indicated within the matrix using the letters "R," responsible; "A," accountable; "C," consult; and "I," inform.
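
A minimal sketch of such a matrix follows; the measures and roles shown are hypothetical, but the R/A/C/I coding is the one described above.

    # Role/responsibility matrix: measures on one axis, roles on the other,
    # with RACI codes in the cells. Measures and roles are hypothetical.
    RACI = {"R": "responsible", "A": "accountable", "C": "consult", "I": "inform"}

    matrix = {
        "on-time delivery": {"process owner": "A", "operations team": "R",
                             "planning office": "C", "senior leadership": "I"},
        "customer satisfaction": {"process owner": "A", "survey group": "R",
                                  "operations team": "C", "senior leadership": "I"},
    }

    for measure, roles in matrix.items():
        for role, code in roles.items():
            print(f"{measure:22} {role:18} {RACI[code]}")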

Through a variety of techniques, the goal owner establishes goal targets. For example, at one organization, the target does not become official at the corporate level until it is agreed to through a negotiation process between the office of the chairman and the goal owner. This ensures a high degree of integrity in the process and the people involved.

A powerful message is sent to employees and to the organization as a whole by formally linking executive compensation to organizational performance, as well as by judging individual performance by the achievement of strategic objectives. At several organizations, managers could not only earn a certain percentage of salary in annual bonuses, but could actually lose bonuses by not meeting goals. Ownership holds even when results are partly outside the owner's control. For example, when one senior official could not meet a 25 percent volume increase because the market went down and interest rates fell, the goal did not change. Instead, because the goal owner is seen as responsible for making the best of a bad situation, that manager forfeited a certain amount of money tied to the achievement of that goal.

One partner representative stated that "an effective performance measurement system is a servant of the business, not its master." The primary purpose of measuring performance is to develop, deliver, and improve world-class products and services, not to audit or find fault. For government offices and business units, the essential responsibility is to provide services to citizens, not to monitor the behavior of employees.

Data Collection and Reporting

Performance measures must be timely, easy to implement, and clearly defined. Speed is essential in both data collection and distribution. Our study partners try to collect data as work is done rather than through separate collection and maintenance tasks. Performance measurements tend to be simple. According to one partner, simple and clear nomenclature should be used, measures should be user-friendly, and the data collection effort should not be overly structured. A standard data definition helps business units throughout an organization use and understand measures uniformly.

A clear data collection plan helps streamline the data collection process (a sketch of such a plan follows the steps below):

  1. Identify how much data need to be collected, the population from which the data will come, and the length of time over which to collect the data.
  2. Identify the charts and graphs to be used, the charting frequency, the type of comparison to be made, and the calculation methodology.
  3. Identify the characteristics of the data to be collected: attribute data are things that can be counted; variable data are things that can be measured.
  4. If the performance measure is new, try to identify existing data sources or create new sources. All data sources need to be credible and cost effective.
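
The sketch below records the four plan elements above in a single structure; the field names and example values are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class DataCollectionPlan:
        sample_size: int        # step 1: how much data to collect
        population: str         #         from which population
        duration_months: int    #         over what period
        chart_type: str         # step 2: charts/graphs to be used
        charting_frequency: str #         how often they are updated
        comparison: str         #         type of comparison to be made
        calculation: str        #         calculation methodology
        data_type: str          # step 3: "attribute" (counted) or "variable" (measured)
        data_source: str        # step 4: existing or newly created source

    plan = DataCollectionPlan(
        sample_size=200,
        population="customer service calls, all regions",
        duration_months=6,
        chart_type="run chart",
        charting_frequency="weekly",
        comparison="actual versus monthly target",
        calculation="calls resolved on first contact / total calls",
        data_type="attribute",
        data_source="existing call-tracking system",
    )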

Resources must be allocated for the data collection effort. This entails identifying who is responsible for collecting and reporting the data. Management must also identify who will ensure that the data collected are reliable, timely, and accurate, and that the process is confidential and access is rapid. In addition, management must describe how the data will be collected, including data entry, tabulation, and summarization methods. Managers should also identify any associated costs of collecting data.

Generally, our partners use information systems to support data collection and reporting. They use both automated and manual requests for periodic updates. Organizations should try to automate when possible to reduce the burden of data collection on the workforce. They should also centralize their databases, create an on-line data entry system, make sure it is flexible enough to respond to improvements/changes, and make it user-friendly. One partner uses automation in order to be proactive. This allows it to identify and fix failures well before customers indicate that something is wrong.

Analyzing and Reviewing Performance Data

Various processes can be used to analyze and validate performance data, including operations research, statistical analysis, quality control, and process cost analysis, among other techniques. A representative of one study partner said that the organization's business units that use advanced statistical techniques to analyze data tend to do better than those that don't. This partner often applies its expertise in advanced technical methods to improve performance at a lower level of the organization so as to effect results at a higher level. Another partner representative said the analyst needs to be able to explain to senior leadership how each measure was obtained and what the measures mean. Management then reviews results against expectations and makes mid-course corrections; in this way, the data are truly integrated into the management of the organization.

A very useful method for measuring performance is statistical process control (SPC), a scientific method of analyzing data and using the analysis to solve practical problems. The most common statistical tool in SPC is the control chart, which is used to detect differences in variation among numerical results obtained. Numbers always contain variation. There is variation in the way the numbers are generated, collected, and analyzed, as well as variation in the measurement process itself. The control chart filters out routine variation so exceptional values can be revealed. SPC helped one study partner, which began using the process about 12 years ago, determine which problems required follow-up actions. Feedback was provided to activity/process owners for continuous improvement.
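
The report does not prescribe a particular charting method, but the basic control-chart idea can be sketched simply. The fragment below (with invented data) uses the sample standard deviation in place of the standard control-chart constants: it computes a centerline and three-sigma limits from history, then flags values outside the limits as exceptional.

    import statistics

    # Historical observations of a measure, used to establish the limits.
    history = [50, 52, 49, 51, 50, 53, 48, 51, 50, 52, 49, 51]
    new_points = [50, 54, 47, 62, 51]

    center = statistics.fmean(history)
    sigma = statistics.stdev(history)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma  # control limits

    # Routine variation falls inside the limits; values outside reveal
    # special causes that warrant follow-up.
    for x in new_points:
        verdict = "routine variation" if lcl <= x <= ucl else "exceptional -- investigate"
        print(f"{x:5.1f}  {verdict}")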

Graphic presentation of key information is a critical element of the analysis and review process. Many partners use run charts to identify meaningful trends. They analyze indicator data at least annually against intended targets and look for trends. Based on these trends, they take appropriate action as required. In an effort to relate employee progress to its strategic plan without sharing inside information, one partner visually indicates targets without using actual data. Icons, such as an ice cream cone or a pot of gold, are used on charts to identify who is above, below, and on target. Other study partners color code their charts, using, for example, green for good, red for bad, and yellow for caution. The Pareto chart is another useful tool, showing the relative importance of different categories.
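
As a small illustration of the Pareto principle (the complaint categories and counts are invented), the fragment below ranks categories by frequency and reports each one's cumulative share, making the "vital few" stand out.

    from collections import Counter

    # Hypothetical complaint records, one entry per complaint received.
    complaints = (["late delivery"] * 46 + ["billing error"] * 28 +
                  ["damaged item"] * 14 + ["wrong item"] * 8 + ["other"] * 4)

    total = len(complaints)
    cumulative = 0
    for category, count in Counter(complaints).most_common():
        cumulative += count
        print(f"{category:15} {count:3}  cumulative {100 * cumulative / total:5.1f}%")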

Evaluating and Utilizing Performance Information

Performance information must be formally reviewed and acted upon to improve or simplify processes. Most of our partners incorporate a review of performance measurements into the strategic planning process in order to provide management feedback for adjusting future performance plans and resources and for confirming or modifying performance plans or targets. They use performance information to perform benchmarking and comparative analysis with best-in-class organizations or to identify opportunities for reengineering and resource allocation.

Most of our partners base rewards and recognition on results. One partner has several incentive pay programs that focus the hourly workers on accomplishing what is best for their customers and for the company.

Process owners use performance information for continuous improvement. A popular continuous improvement model used by our partners is the Shewhart cycle: plan, do, study, act. The "plan" states what is expected to happen for any selected action. The "do" is the execution of what was planned, often in the form of a pilot test. The "study" compares the results of what actually happened with the expected results. The "act" acts on those results: if the predictions hold true, execution can be standardized. The process for taking action varies, but generally includes the following steps: understand the results, establish and clarify priorities, generate recommendations, develop action plans, implement action plans, and monitor progress.
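
A minimal sketch of the cycle's control flow appears below; the function names, the 5 percent tolerance, and the stand-in planning and pilot functions are all illustrative.

    # Plan-do-study-act: predict, pilot, compare, then standardize or revise.
    def pdsa(plan_expected_result, run_pilot, max_cycles=5):
        for cycle in range(1, max_cycles + 1):
            expected = plan_expected_result()   # plan: what should happen
            actual = run_pilot()                # do: execute, often as a pilot test
            gap = actual - expected             # study: compare actual to expected
            if abs(gap) <= 0.05 * expected:     # act: prediction held
                print(f"Cycle {cycle}: prediction held; standardize the change.")
                return
            print(f"Cycle {cycle}: gap of {gap:+.1f}; revise the plan and repeat.")

    # Example with stand-in functions for planning and piloting.
    pdsa(lambda: 100.0, lambda: 96.0)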

Under GPRA, and indeed as a principle of good management, agencies must continually revise and improve their program and activity measures. The June 2, 1997, Federal Times cites several examples of agencies improving and refining performance measures.

Reporting on Performance to Customers and Stakeholders

Data should be reported and performance explained internally, and performance information should be consolidated across the organization. One partner introduced a new internal document on key performance indicators that uses color-coded graphs to show progress or trouble. The CEO of another study partner uses an internal television system to communicate information, including information on performance measurement, to all organizational locations worldwide on a quarterly basis. A third partner has over 5,000 data sites on its intranet, each of which is kept apprised of performance measurement data and findings.

Information should not only be shared internally, but also externally with customers and stakeholders through annual reports. One partner formed formal partnership agreements with industry associations in the United States so as to share performance information with customers and stakeholders. The aim is to employ quality management principles in a joint effort to enhance operations and offer innovative, nonregulatory approaches to problem solving.

Repeating the Cycle

Sharing performance information with customers and stakeholders facilitates the receipt of pertinent input from them for the planning process. Congress and management use this information to set priorities and make decisions. Further, the input influences the customer-driven strategic planning process, the multiyear goal setting and resource planning process, the annual performance planning process, and ultimately resource allocation. Customer/stakeholder feedback also influences the updating of performance measures and goals and the establishment of new ones. Thus informed, updated, and revised, the performance measurement process begins anew.



Appendices

Appendix A:
Acknowledgments

The Performance Measurement Study Team thanks the corporate and government partners that willingly shared their experiences and best practices with us. Special thanks are also due Harry Hatry, The Urban Institute; Robert Kaplan, Harvard University; Steve Klink, Federal Quality Consulting Group; Melissie Rumizen, National Security Agency; Joyce Thompson, Zenger Miller; the United Kingdom U.S. Embassy; and Nita Congress, editor, for her special touch.

STUDY PARTNERS

Lockheed Martin Tactical Aircraft Systems
Prince William County, VA
AT&T Telecommunications
AECL (Atomic Energy of Canada, LTD)
Pratt & Whitney
Canadian Intellectual Property Office
Trade Marks Office, Industry Canada
Halliburton Company
Xerox
Chevron
Custom Research, Inc.
Florida Power & Light
Multnomah County, Oregon
Department of Energy/University of California
Commonwealth of Virginia
DuPont
City of Scottsdale, Arizona
Granite Rock
City of Sunnyvale, California
Honeywell Air Transport
ADAC
Fannie Mae
St. Lawrence Seaway Authority, Canada
Department of Veterans Affairs
Saturn
British Telecommunications
Eastman Kodak
U.S. Coast Guard
Wainwright Industries
Federal Express Corporation
Her Majesty's Land Registry
BellSouth Telecommunications
City of Coral Springs, Florida



Appendix B:
Study Team

Study Sponsor
National Performance Review
Study Organizers
Wilett Bunton, NPR Benchmarking Team Leader
Lori Byrd, NPR Benchmarking Team
Linda Nivens, NPR Benchmarking Team
Study Team Leaders
James J. Cavanagh, Department of Energy
Lisa J. Roth, Patent and Trademark Office
Adel Shalaby, Treasury Board Secretariat, Canada
Gary A. Steinberg, National Aeronautics and Space Administration
Study Team Members
Linda Allen-Benton, National Science Foundation
Brian Andrew, Canadian Intellectual Property Office, Industry Canada
Linda Bailey, Department of Health and Human Services
Barbara Bova, Canadian Intellectual Property Office, Industry Canada
Wilett Bunton, National Performance Review
Lori Byrd, Department of Transportation
Jerry Chatham, Department of Veterans Affairs
Danna Chung, Department of Health and Human Services
Michel Côté, Canadian Heritage
Donna V. Davis, General Services Administration
Stephen J. Dienstfrey, Department of Veterans Affairs
Thomas Garin (Major), U.S. Air Force
Hap Hadd, Department of Health and Human Services
Gia Harrigan, Department of the Navy
Hamid Jorjani, Agriculture and Agri-Food Canada
Larry Juul, Department of Defense
Suneel Kapur, Department of Energy
Margo Kiely, Fairfax County, VA
Steve Lambing, National Aeronautics and Space Administration
Romeo Lavarias, Department of Housing and Urban Development
Steve LeNard, Department of Health and Human Services
Daryl Lucas, Department of Education
Curt Marshall, National Oceanic and Atmospheric Administration
John Scott McAllister, Department of Veterans Affairs
Jennifer McKay, Canadian Intellectual Property Office, Industry Canada
Dana Mellerio, National Aeronautics and Space Administration
Kathleen Monahan, Department of Housing and Urban Development
Carmen Nadeau, St. Lawrence Seaway Authority, Canada
Linda Nivens, Department of Labor
Jane Osborne, Department of Health and Human Services
Chris Peterson, Department of Housing and Urban Development
Peter Poulos, Department of Defense
Valerie Richardson, Patent and Trademark Office
Armeta Ross, Government of the District of Columbia
Michael Turner, Federal Aviation Administration
Mike Whitfield, AECL (Atomic Energy of Canada Ltd)



Appendix C:
Site Visit Survey Responses

Not Available

Appendix D:
Glossary

A word is not a crystal, transparent and unchanged; it is the skin of a living thought, and may vary greatly in color and content according to the circumstances and time in which it is used.
--Justice Oliver Wendell Holmes

The following terms have been defined as listed below for the purposes of this study.

Advance planning: That part of the planning process in which organizational leaders, together with the strategic planning staff, define the planning process; establish membership, roles, and responsibilities for the process; clarify expectations for process outputs and outcomes; and provide the necessary resources to ensure its success.

Balanced scorecard: A management instrument that translates an organization's mission and strategy into a comprehensive set of performance measures to provide a framework for strategic measures and management. The scorecard measures organizational performance across several perspectives: financial, customers, internal business processes, and learning and growth.

Baseline data: Initial collection of data to establish a basis for comparison.

Benchmark: A standard or point of reference used in measuring and/or judging quality or value.

Benchmarking: The process of continuously comparing and measuring an organization against business leaders anywhere in the world to gain information that will help the organization take action to improve its performance.

Core process: The fundamental activities, or group of activities, so critical to an organization's success that failure to perform them in an exemplary manner will result in deterioration of the organization's mission.

Customer: The person or group that establishes the requirement of a process and receives or uses the outputs of that process, or the person or entity directly served by the organization.

Environment: Circumstances and conditions that interact with and affect an organization. These can include economic, political, cultural, and physical conditions inside or outside of the organization.

External customer: An individual or group outside the boundaries of the producing organization that receives or uses the output of the process.

Government Performance and Results Act (Public Law 103-62): A law that creates a long-term goal-setting process to improve federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction.

Internal customer: An individual or group inside the boundaries of the producing organization that receives or uses the output from a previous stage or process to contribute to production of the final product or service.

Key performance indicator: Measurable factor of extreme importance to the organization in achieving its strategic goals, objectives, vision, and values that, if not implemented properly, would likely result in a significant decrease in customer satisfaction, employee morale, and effective financial management.

Measure: One of several measurable values that contribute to the understanding and quantification of a key performance indicator.

Metrics: The elements of a measurement system consisting of key performance indicators, measures, and measurement methodologies.

Mission: An enduring statement of purpose; the organization's reason for existence. The mission describes what the organization does, who it does it for, and how it does it.

Outcome measure: An assessment of the results of a program activity as compared to its intended purpose.

Output measure: Tabulation, calculation, or recording of activity or effort.

Performance goal: A target level of an activity expressed as a tangible measurable objective, against which actual achievement can be compared.

Performance management: The use of performance measurement information to help set agreed-upon performance goals, allocate and prioritize resources, inform managers to either confirm or change current policy or program directions to meet those goals, and report on the success in meeting those goals.

Performance measure: A quantitative or qualitative characterization of performance.

Performance measurement: A process of assessing progress toward achieving predetermined goals, including information on the efficiency with which resources are transformed into goods and services (outputs), the quality of those outputs (how well they are delivered to clients and the extent to which clients are satisfied) and outcomes (the results of a program activity compared to its intended purpose), and the effectiveness of government operations in terms of their specific contributions to program objectives.

Stakeholder: Any person, group, or organization that can place a claim on, or influence, the organization's resources or outputs; is affected by those outputs; or has an interest in or expectation of the organization.

Strategic direction: The organization's goals, objectives, and strategies by which it plans to achieve its vision, mission, and values.

Strategic goal: A long-range change target that guides an organization's efforts in moving toward a desired future state.

Strategic objective: A broad, time-phased, measurable accomplishment required to realize the successful completion of a strategic goal.

Strategic planning: A continuous and systematic process whereby guiding members of an organization make decisions about its future, develop the necessary procedures and operations to achieve that future, and determine how success is to be measured.

Vision: An idealized view of a desirable and potentially achievable future state: where or what an organization would like to be in the future.



Appendix E:
Relevant Government Publications

Several useful publications are listed here for your reference in initiating or improving your organization's performance measurement and performance management processes.

Agencies' Strategic Plans Under GPRA: Key Questions to Facilitate Congressional Review. GAO/GGD-10.1.16. General Accounting Office, version 1, May 1997.
Best Practices: The IRS Research Project on Integrating Strategic Planning, Budgeting, Investment and Review. Office of Economic Analysis, Internal Revenue Service, May 1996.
Criteria for Developing Performance Measurement Systems in the Public Sector. Department of the Treasury, September 1994.
Executive Guide: Effectively Implementing the Government Performance and Results Act. GAO/GGD-96-118. General Accounting Office, June 1996.
Focusing on Results: A Guide to Performance Measurement. Industry Canada, March 1995.
The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven. GAO/GGD-97-109. General Accounting Office, June 1997.
Guidelines for Performance Measurement. DOE/G/120.1-5. Department of Energy, June 1996.
Guidelines for Strategic Planning. DOE/PO-0041. Department of Energy, January 1996.
A Handbook for Strategic Planning. Publication No. 94-02. Total Quality Leadership Office, Department of the Navy.
Implementation of the Government Performance and Results Act. Chief Financial Officers Council, GPRA Implementation Committee, May 1995.
In Their Own Words. Executive Summary of Strategic Management Interview Data. Total Quality Leadership Office, Department of the Navy.
Integrating Performance Measurement Into the Budget Process. Chief Financial Officers Council, GPRA Implementation Committee Subcommittee Project, January 1997.
Managing for Results: Analytic Challenges in Measuring Performance. GAO/HEHS/GGD-97-138. General Accounting Office, May 1997.
NASA Strategic Management Handbook. National Aeronautics and Space Administration, October 1996.
The National Highway and Traffic Safety Administration Case Study: Strategic Planning and Performance Measurement. National Highway and Traffic Safety Administration, August 1996.
OMB Circular A-11, Part 2: Preparation and Submission of Annual Performance Plans, Section 220. Office of Management and Budget, revised May 1997.
Performance Budgeting: Past Initiatives Offer Insights for GPRA Implementation. GAO/AIMD-97-46. General Accounting Office, March 1997.
Performance Measurement Guide. Department of the Treasury, Financial Management Service, November 1993.
Performance-Based Management: Eight Steps to Develop and Use Information Technology Performance Measures Effectively. General Services Administration, Office of Governmentwide Policy, December 1996.
Program Performance Measures: Federal Agency Collection and Use of Performance Data. GAO/GGD-92-65. General Accounting Office, May 1992.
Reaching Public Goals: Managing Government for Results - a Resource Guide. National Performance Review, October 1996.
Serving the American Public: Best Practices in Customer-Driven Strategic Planning. Benchmarking Study Report. National Performance Review, February 1997.
Strategic Information Management (SIM) Self-Assessment Toolkit. General Accounting Office, Accounting and Information Management Division, Exposure Draft, Version 1.0, October 1994.
Strategic Management for Senior Leaders: A Handbook for Implementation. Publication No. 96-03. Total Quality Leadership Office, Department of the Navy.
Strategic Planning and Strategic Management Within NASA: A Case Study. National Aeronautics and Space Administration, June 1996.
Strategic Planning: Charting a Course for the Future. Video. Document No. T012-00-0000150. National Aeronautics and Space Administration, October 1996.
Strategic Planning: Selecting the Leadership Team. Total Quality Leadership Office, Department of the Navy. May 1992.
Toward Useful Performance Measurement: Lessons Learned From Initial Pilot Performance Plans. National Academy of Public Administration, November 1994.



Appendix F:
Agency Contacts and Other Sources

Many individuals and organizations can provide valuable assistance as you and your organization move forward in improving your performance measurement process. Listed below are phone numbers and e-mail addresses for the organizations represented in this study and some other helpful points of contact.
U.S. Air Force
SAF/ST
Thomas Garin
Phone: 703-808-3837
e-mail: garint@sgate.com

Canadian Intellectual Property Office, Trade Marks Office, Industry Canada

Mike Whitfield
Phone: 819-997-2469
e-mail: whitfieldm@aecl.ca

Department of Defense
Phone: 703-614-9163
Department of Education
Daryl Lucas
Phone: 202-401-8547
e-mail: daryl_luca@ed.gov

Department of Energy
Office of Policy and International Affairs
Suneel Kapur
Phone: 202-586-0110
e-mail: suneel.kapur@hq.doe.gov

Fairfax County, VA
Margo Kiely
Phone: 703-324-7533
e-mail: mkielye@ffxvm1.co.fairfax.us

Federal Aviation Administration
Michael Turner
Phone: 405-954-0425
e-mail: mturn1@ibm.net

General Services Administration
Carole A. Hutchinson
Director of Performance Management
carole.hutchinson@gsa.gov
(202) 501-0325

Office of Governmentwide Policy
Donna V. Davis
Phone: 202-401-8134
e-mail: donna.davis@gsa.gov

Government of the District of Columbia
Phone: 202-727-6554

Department of Health and Human Services
Phone: 202-205-5766

Department of Housing and Urban Development
Phone: 202-401-8800, ext. 2706

Department of Labor
Linda Nivens
Phone: 202-219-7357
e-mail: linda.nivens@dol.gov

National Aeronautics and Space Administration
Gary Steinberg
Office of Policy and Plans
Phone: 202-358-4552
e-mail: gary.steinberg@hq.nasa.gov

National Oceanic and Atmospheric Administration

Office of Policy and Strategic Planning
Curt Marshall
Phone: 202-482-5181
e-mail: curt.l.marshall@noaa.gov

National Performance Review
Wilett Bunton
Phone: 202-632-0150/0367
e-mail: wilett.bunton@npr.gsa.gov

National Science Foundation
Phone: 703-306-1104

Department of the Navy
Naval Undersea Warfare Center (Newport)
Gia Harrigan
Phone: 401-841-6167
e-mail: harrigan@c80.npt.nuwc.navy.mil

Patent and Trademark Office
Office of the Comptroller
Valerie Richardson
Phone: 703-305-8161
e-mail: vrichard@uspto.gov

Department of Transportation
Lori Byrd
Phone: 202-366-6068
e-mail: lori.byrd@ost.dot.gov

Treasury Board Secretariat, Canada
Adel Shalaby
Phone: 613-957-2493
e-mail: shalaby.adel@tbs-sct.gc.ca
Internet: www.tbs-sct.gc.ca

Department of Veterans Affairs
Phone: 202-273-5084
Other Sources

National Performance Review
Managing for Results Web Page
Internet: www.npr.gov

Office of Management and Budget
Phone: 202-395-4840/5670
Internet: www.whitehouse.gov/WH/EOP/OMB/html/ombhome.html

Inter-Agency Benchmarking and Best Practices Council
Internet: www.va.gov/fedsbest/index.htm
