Tuesday, September 22, 2009

Research as Innovation

Many of us heard Jim Shelton, the Department of Education's Assistant Deputy Secretary for Innovation and Improvement, speak to the education publishing industry last week about the $650 million fund now called “Investing in Innovation” (i3). Through i3, Shelton wants to fund the scaling up of innovations that already have some evidence they are worth investing in. These i3 grants could be as large as $50 million.

With that much at stake, it makes sense for government funders to look for a track record of scientifically documented success. The frequent references in ED documents to “continuous improvement” as part of innovations suggest that proposers would do well to supplement the limited evidence for their innovation by showing how scientific evidence can be generated as an ongoing part of a funded project, that is, how mid-course corrections and improvements can be made to the innovation as it is being put into place in a school system.

In his speech to the education industry, Shelton complained about the low quality of the evidence currently being put forward. Although some publishers have taken the initiative and done serious tests of their products, there has never been a strong push for them to produce evidence of effectiveness.

School systems usually haven’t demanded such evidence, partly because there are often more salient decision criteria and partly because little rigorous evidence exists, even for programs that are effective. Moreover, district decision makers may find studies of a product conducted in schools unlike their own to have marginal relevance, regardless of how “rigorously” the studies were conducted.

The ED appears to recognize that it will be counter-productive for grant programs such as i3 to depend entirely on the pre-existing scientific evidence. An alternative research model based on continuous improvement may help states and districts to succeed with their i3 proposals—and with their projects, once funded.

Now that improved state and district data systems are increasing the ability of school systems to quickly reference several years of data on students and teachers, i3 can start looking at how rigorous research is built into the innovations it funds, not just the one-time evaluation typically built into federal grant proposals.

This kind of research for continuous improvement is an innovation in itself—an innovation that may start with the “data-driven decision making” mode in which data are explored to identify an area of weakness or a worrisome trend. But the real innovation in research will consist of states and districts building their own capacity to evaluate whether the intervention they decided to implement actually strengthened the area of weakness or arrested the worrisome trend they identified and chose to address. Perhaps it did so for some schools but not others, or maybe it caught on with some teachers but not with all. The ability of educators to look at this progress in relation to the initial goals completes the cycle of continuous improvement and sets the stage for refocusing, tweaking, or fully redesigning the intervention under study.

We predict that i3 reviewers, rather than depending solely on strong existing evidence, will also look for proposals that include a plan for continuous improvement as part of how the innovation ensures its success. In this model, research need not be limited to the activity of an “external evaluator” that absorbs 10% of the grant. Instead, routine use of research processes can be an innovation in itself, one that builds the internal capacity of states and districts for continuous improvement.
— DN

Friday, September 11, 2009

Easton Sets a New Agenda for IES

John Easton, now officially confirmed as the director of the Institute of Education Sciences, gave a brief talk July 24th to explain his agenda to the directors and staff of the Regional Education Labs. This is a particularly pivotal time, not only because the Obama administration is setting an aggressive direction for changes in the K-12 schools, but also because Easton is starting his six-year term just as IES is preparing the RFP for the re-competition for the 10 RELs. (The budget for the RELs accounts for about 11% of the approximately $600 million IES budget.)

Easton made five points.

First, he is not retreating from methodological rigor, the hallmark of his predecessor, Russ Whitehurst. This simply means that IES will not fund poorly designed research that lacks the proper controls to support the conclusions the researcher wants to assert. Randomized controlled experiments are still the strongest design for effectiveness studies, although weaker designs are recognized as having value.

Second, there has to be more emphasis on relevance and usability for practitioners. IES can’t ignore how decisions are made and what kind of evidence can usefully inform them. He sees this as requiring a new focus on school systems as learning organizations. This becomes a topic for research and development.

Third, although randomized experiments will still be conducted, there needs to be a stronger tie to what is then done with the findings. In a research and development process, rigorous evaluation should be built in from the start and should be linked directly to the needs of the practitioners who are part of that process.

Fourth, IES will move away from the top-down dissemination model in which researchers seem to complete a study and then throw the findings over the wall to practitioners. Instead, researchers should engage practitioners in the use of evidence, understanding that the value of research findings comes in their application, not simply in their being released or published. IES will take on the role of facilitating the use of evidence.

Fifth, IES will take on a stronger role in building capacity to conduct research at the local level and within state education agencies. There is a huge opportunity presented by the investment (also through IES) in state longitudinal data systems. The combination of state systems and local district systems makes it much easier to gather the data needed to answer policy questions and questions about program effectiveness. The education agencies, however, often need help in framing their questions, applying an appropriate design, and using the appropriate statistics to turn the data into evidence.

These five points form a coherent picture of a research agency that will work more closely through all phases of the research process with practitioners, who will be engaged in developing the findings and putting them into practice. This suggests new roles for the Regional Education Labs in helping their constituencies to answer questions pertinent to their local needs, in engaging them more deeply in using the evidence found, and in building their local capacity to answer their own questions. The quality of work will be maintained, and the usability and local relevance will be greatly increased.
— DN