Presenting a "balanced scorecard" of management information | Behavioral Healthcare Executive

Presenting a "balanced scorecard" of management information

August 20, 2012
by Dennis Grantham, Editor-In-Chief

Hughes Johnson, Director of Performance Improvement at Youth Villages, sees the development of quality measurement and performance management systems as a process of “building the airplane while you’re flying it.” While this sounds like a risky proposition, his 20 years of experience at the major Tennessee-based provider, which offers an array of services in eleven states, have convinced him that there’s no other way.

“In order to build a system of measurement, you have to choose a place to start and then start with what you have, however imperfect,” says Johnson.

“At Youth Villages, we break our performance measures into three areas: those that define our clinical programs, those that provide key operational measures, and those that track the success of children and families served by Youth Villages.”

Of course, the magic of this approach, as Johnson will explain in one session of September’s upcoming Behavioral Healthcare Leadership Summit, is in “breaking down these three components of measurement into presentable pieces, into bits of information that give a definition to each of those parts in a way that will support their implementation.” Johnson’s presentation therefore dives into three levels of measuring and managing performance.

The first essential level of measurement involves clinical programs. “It is essential to define the key elements of a clinical program or treatment model so that you can measure model fidelity or drift over time.” In his Summit presentation, Johnson will highlight and share a detailed example: Youth Villages’ clinical adherence model for residential treatment.

“With this and any model, it is important to define not only the clinical interventions involved, but the specific patterns required to execute them.” At the clinical level, Youth Villages defines these models, trains its clinical staff in them, and has that staff keep clinical-level measures of performance, quality, and treatment fidelity. Every six months or so, key clinical indicators are measured for each program, and senior management reviews each program’s performance and its ongoing adherence to the program model.

The next level of measurement involves operational information from diverse areas, including operations, finance, human resources, and customer service. These, along with a third set of measures—key organizational metrics—are presented to Youth Villages’ managers and executives in a “balanced scorecard” format. This format is designed to do two things: first, to present meaningful, actionable information suited to the responsibilities and needs of each level of management; and second, to provide “directional” information that offers insight into the overall emphasis, direction, and success of the greater organization.

“Balancing” the presentation of management information in the scorecard format helps managers keep things in perspective and avoid “tunnel vision” about a narrow set of personal, group, or departmental objectives. The scorecard used by Youth Villages categorizes management information into five areas: clinical, financial, operational, human resources, and customer service.
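As a rough illustration only—this is not Youth Villages’ actual system, and the metric names, targets, and values below are invented—the five-category scorecard idea can be sketched as a simple data structure that keeps every area visible at once:

```python
from dataclasses import dataclass, field

# The five category names come from the article; everything else
# (metric names, targets, values) is hypothetical.
CATEGORIES = ("clinical", "financial", "operational",
              "human resources", "customer service")

@dataclass
class Metric:
    name: str
    value: float
    target: float

    def on_target(self) -> bool:
        return self.value >= self.target

@dataclass
class Scorecard:
    metrics: dict = field(
        default_factory=lambda: {c: [] for c in CATEGORIES})

    def add(self, category: str, metric: Metric) -> None:
        if category not in self.metrics:
            raise ValueError(f"unknown category: {category}")
        self.metrics[category].append(metric)

    def summary(self) -> dict:
        # One entry per category, so no single area can crowd out
        # the others -- the "balance" in balanced scorecard.
        return {c: (sum(m.on_target() for m in ms) / len(ms) if ms else None)
                for c, ms in self.metrics.items()}

card = Scorecard()
card.add("clinical", Metric("model fidelity review rate", 0.92, 0.90))
card.add("financial", Metric("days cash on hand", 45, 60))
print(card.summary())
```

The summary deliberately reports a value (or an explicit gap) for every category, which mirrors the article’s point: a manager scanning the scorecard sees all five areas, not just the one closest to their own objectives.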

Capturing the precise mix of information to be included on a particular scorecard is challenging, but then again, Johnson says it’s important to begin somewhere and perfect the process as you go along. “Obviously, a CEO is likely to be looking at the organization—and at those five categories of metrics—from a different perspective than an operational, clinical, or HR manager,” says Johnson. But together, he says, the organization knows that “if these key organizational and operational metrics are working well—we should be able to manage good outcomes for kids.”

While Johnson says that Youth Villages’ quality and performance measurement programs have benefited from the organization’s deployment of an electronic health records system—as well as other organizational computing technology—these aren’t essential. “It is possible to implement tools like these without technology in a way that works. Our effort, in fact, started with an Excel spreadsheet.”