Performance Evaluation: Two Tools

PEER staff
6/30/2017

What am I going to talk about?

Performance-based budgeting is widely acclaimed – but it requires specific tools to work!

I'm here to talk about two of those tools:

  • The 7 Elements of Quality Program Design
  • The Measuring Mississippi Data Analysis Tool

The 7 Elements

The first, and much better-developed, tool is conceptual: The 7 Elements of Quality Program Design.

You may be familiar with it; it's already gotten some national attention.

The 7 Elements

There's a detailed set of questions that go with each element, but the essentials are easy:

  • Premise of new activity
  • Needs assessment
  • Description of new activity
  • Research and evidence filter
  • Implementation plan
  • Fidelity plan
  • Measurement and evaluation

The 7 Elements

Program premise

  • What is this program trying to achieve?
  • In what way are its efforts novel?
  • Is it part of agency mission and state goals?

Needs assessment

  • What is the extent of the problem, and to what extent does the program address it?
  • This should be rigorously quantified and sourced. Show your work!

The 7 Elements

Program description

  • What will you actually do in the program?
  • How much will it cost, over the short and long term?
  • What are the expected benefits of the program?
  • Again, these should be quantified and sourced.

The 7 Elements

Research and evidence filter

  • Why should we believe the program will work?
  • This is where the math and science come in.
    • “Little Timmy” stories: Not good enough!
    • This element is often a stumbling block.
  • MISS. CODE ANN. §27-103-159 specifies a definition of 'evidence' and other terms.
    • This is useful to get everybody on the same page!

The 7 Elements

Implementation plan

  • What resources and activities will the program require to get off the ground?

Fidelity plan

  • What's your plan to ensure fidelity to the program design?
    • This element relates closely to the research base!

Measurement and evaluation

  • What is your empirical criterion of success for this program?
    • There's a tradeoff here with the research element!
    • As always, show your work.

The 7 Elements: Takeaways

Takeaway one: Everybody's children are above average.

  • That is, responses to the seven elements will naturally (and generally blamelessly) emphasize a program's strong points and glide over the weak.
  • But responses can't just be treated as a hoop to jump through; they must be evaluated by staffers competent in the math and research!
  • Without such independent evaluation, the point is lost.

The 7 Elements: Takeaways

Takeaway two: These may seem complicated and unusual; they're actually rigorous and intuitive.

  • The seven elements, at base, are just asking about things we already want to know:
    • Does this program work?
    • How much does it cost?
    • Are its effects a priority for us, compared to financially equivalent options?
    • et cetera!

More on takeaway two

ANSWERING these questions can sometimes get technical, but ASKING them is just common sense! We already want to know this stuff.

More on takeaway two

And the math and science involved aren't optional; they're necessary for answering some basic questions we have about programs.

  • Basic numeric comparisons without rigorous empirical and mathematical context are useless!

More on takeaway two. With pictures!

Imagine two programs doing the same thing.

  • One of them achieves 40 units of effect. Boo.
  • The other achieves 80 units of effect. Yay!

Here's what we probably imagine...

[Figure: the two programs compared at face value]

But here's what could be happening!

[Figure: what could actually be happening behind those numbers]
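The original figures aren't reproduced here, so here is a minimal R sketch of the same point. The data are an assumption made purely for illustration: Program B's headline total is double Program A's, but its per-case outcomes are far more variable.

```r
# Illustrative sketch only -- assumed data, not the figures from the slides.
set.seed(1)
program_a <- rnorm(100, mean = 4, sd = 2)    # steady, modest per-case effects
program_b <- rnorm(100, mean = 8, sd = 10)   # bigger on average, far noisier

# What we probably imagine: a simple comparison of headline totals (~40 vs. ~80)
barplot(c(A = sum(program_a) / 10, B = sum(program_b) / 10),
        ylab = "Units of effect", main = "Headline comparison")

# What could be happening: heavy overlap once per-case variation is shown
boxplot(list(A = program_a, B = program_b),
        ylab = "Per-case effect", main = "Same programs, with variation shown")
```

Whatever the original plots showed, the takeaway is the same: the single-number comparison hides the context needed to judge it.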

Takeaway three:

The 7 Elements are primarily intended for use with intervention programs – and they're primarily for looking forward!

Intervention programs are easiest to define by contrast:

  • Programs we need for their own sake (non-intervention programs)
  • Programs that do something for us (intervention programs)

The distinction can get fuzzy, but not hopelessly so.

So what if we need to evaluate something else?

Well, it's a pretty basic rule:

To know how you're doing, you need to know what you've done!

Which means that any (non-piecemeal) solution to the problem of backward-looking performance-based budgeting starts with a program inventory.

  • The inventory needs to be highly specific – to the level at which dollars turn into discrete results by virtue of discrete activities.
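To make that concrete, here is a purely hypothetical sketch in R of what one highly specific inventory record might look like. The field names and values are my assumptions for illustration, not the pilot's actual schema.

```r
# Hypothetical inventory record: each row ties dollars to one discrete
# activity and the discrete result that activity is supposed to produce.
# Field names and values are illustrative, not the pilot's actual schema.
inventory <- data.frame(
  agency     = "Corrections",
  program    = "Reentry services",
  activity   = "Pre-release job-skills training",
  fy_dollars = 1250000,
  output     = "Participants completing training",
  outcome    = "Employment rate 12 months after release",
  stringsAsFactors = FALSE
)
```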

Program inventory

Mississippi is in the midst of a four-agency inventory pilot:

  • Corrections
  • Education
  • Health
  • Transportation

Program inventory

The project seems simple, but is surprisingly complex!

It's also necessary for several reasons:

  • Accountability for money spent
  • Accountability for performance
  • Transparency for all stakeholders

Program inventory

It's not enough to have an inventory; we also have to make that inventory available!

But here we face a problem: Balancing the needs of public users and power users.

Program inventory

A word of caution

There's much left to be done on this project!

  • Improve the current product
  • Expand the inventory
  • Change the culture
    • At the agency level
    • At the state level

A gesture of optimism

But a project like this, carried out well, enables whole new kinds of performance evaluation.

For instance: the backward-looking evaluation mentioned earlier.

  • If performance numbers aren't good for an evidence-based program, investigate fidelity!
  • Over time, establish meaningful baselines and find connections.

What else could we do?

With enough data, we've even got the key to evaluating the non-intervention programs I mentioned earlier!

  • There are two basic relationships we can expect between spending on non-intervention programs and performance.
  • Each relationship indicates an optimization strategy!

They will look something like this...

[Figure: first spending-performance relationship]

Or like this!

[Figure: second spending-performance relationship]
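The original figures aren't reproduced here either. As one hedged illustration, suppose the two relationships are diminishing returns and a threshold effect; those shapes are an assumption on my part, but each one does suggest its own optimization strategy.

```r
# Illustrative sketch only -- the shapes are assumptions, not the original figures.
# Two plausible spending-performance relationships and the strategy each suggests:
#   * diminishing returns: add dollars until the marginal gain flattens out
#   * threshold: fund up to the critical level, but not far beyond it
spending <- seq(0, 100, by = 1)

diminishing <- 100 * (1 - exp(-spending / 25))
threshold   <- ifelse(spending < 40, 10, 90)

par(mfrow = c(1, 2))
plot(spending, diminishing, type = "l",
     xlab = "Spending", ylab = "Performance", main = "Diminishing returns")
plot(spending, threshold, type = "s",
     xlab = "Spending", ylab = "Performance", main = "Threshold")
```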

That's all, folks!

Questions?