Three Tips for Spending COVID-19 Funds in Evidence-Based Ways

Frederick M. Hess
3 min read · Jul 22, 2021

As schools race to spend the billions of dollars in COVID-19 aid that Washington has provided, I’m troubled by the number of emails I receive from district staff and board members saying things like, “Please don’t tell anybody I shared this, but you won’t believe how we’re spending X million dollars . . .” I think part of the problem is that, even where school and system leaders are intent on making good, “evidence-based” decisions, there’s a lot of uncertainty about just what that means in practice.

After all, districts know they’re supposed to make a particular show right now of spending money in accord with the “evidence.” The American Rescue Plan stipulates that 20 percent of the dollars going to local districts be used to measure the impact of lost instructional time or to pay for “evidence-based interventions” that will help compensate for it. Meanwhile, any number of vendors, advocates, and consultants are busy telling school and system leaders that they’ve got just the (evidence-based) thing to help. All of this can make it tough to sort the wheat from the chaff.

As leaders seek out evidence-based practices that will actually be useful, here are a few suggestions, drawn from a piece I recently wrote for Educational Leadership, that they should keep in mind.

First off, the evidence in question is frequently less definitive than we might imagine. Even medical researchers, with their deep pockets and fancy lab equipment, change their minds with distressing regularity on things like the risks of cholesterol, the virtues of flossing, or the effects of alcohol. The fact that a study concludes something doesn’t mean that the conclusion is necessarily “true.” This is why research shouldn’t be followed slavishly.

Consider: In 2015, an attempt to replicate 97 studies with statistically significant results found that more than one-third couldn’t be duplicated. That is, when researchers reran those studies, the original results didn’t hold up. And a confidential survey a few years back found that more than 90 percent of psychology researchers confessed to at least one behavior that might compromise their research, such as stopping data collection early because they liked the results or failing to disclose all of a study’s conditions. The bottom line: There are lots of potential pitfalls when we just “follow the research.” Not all “evidence” is reliable, and determining what to trust requires judgment, acumen, and patience.

Second, the research that gets pursued is often dictated less by educators than by policymakers and funders, who can provide resources, platforms, and professional opportunities that practitioners just can’t match. Researchers are perfectly willing to tap into big state data sets that allow them to tackle broad policy questions — like whether pre-K or charter schooling boosts student achievement — but which don’t say much about how a given district or school should spend its funds.

Third, even when you’ve found solid, relevant research, it’s not always evident what it means to “follow” it. After all, learning that research says something “works” and actually making it work successfully oneself can be two completely different things. In health care, for instance, if a vaccine “worked” when patients got 200-milligram doses exactly 24 days apart, that’s how doctors will administer it. If those instructions aren’t followed, the treatment won’t work the way it’s supposed to.

In schooling, implementation is rarely that disciplined — for reasons ranging from contracts to coaching to culture. Yet it’s vital to understand precisely how a practice must be implemented for it to work, and then to actually implement it that way. When research finds significant effects from a particular practice, coaches or vendors need to be crystal clear on what that means and what it entails — and school and district leaders need to be willing to execute accordingly. Remember: When an “evidence-based” practice winds up only loosely modeled on the research, it’s no longer evidence-based.

Given all that, school and system leaders need to ask hard questions about what purportedly “works.” Even when analyses are actually relevant to a district’s needs, they are often imprecise about what programs require, how they’re structured, or how staff are trained. If COVID funds targeted to adopt or expand evidence-based practices are going to deliver, it’s crucial to be clear on what evidence is actually helpful, what it actually says, and how to use it well.

This post originally appeared on Rick Hess Straight Up.


Frederick M. Hess

Direct Ed Policy Studies at AEI. Teach a bit at Rice, UPenn, Harvard. Author of books like Cage-Busting Leadership and Spinning Wheels. Pen Ed Week's RHSU blog.