
Friday, September 11, 2009

Easton Sets a New Agenda for IES

John Easton, now officially confirmed as the director of the Institute of Education Sciences, gave a brief talk on July 24 to explain his agenda to the directors and staff of the Regional Education Labs. This is a particularly pivotal time, not only because the Obama administration is setting an aggressive direction for changes in K-12 schools, but also because Easton is starting his six-year term just as IES is preparing the RFP for the re-competition for the 10 RELs. (The budget for the RELs accounts for about 11% of the approximately $600 million IES budget.)

Easton made five points.

First, he is not retreating from the methodological rigor that was the hallmark of his predecessor, Russ Whitehurst. This means that IES will not fund poorly designed research that lacks the controls needed to support the conclusions the researcher wants to assert. Randomized controlled trials remain the strongest design for effectiveness studies, although weaker designs are recognized as having value.

Second, there must be more emphasis on relevance and usability for practitioners. IES cannot ignore how decisions are made and what kinds of evidence can usefully inform them. He sees this as requiring a new focus on school systems as learning organizations, which itself becomes a topic for research and development.

Third, although randomized experiments will still be conducted, there needs to be a stronger tie to what is then done with the findings. In a research and development process, rigorous evaluation should be built in from the start and should relate directly to the needs of the practitioners who are part of the R&D process.

Fourth, IES will move away from the top-down dissemination model in which researchers complete a study and then throw the findings over the wall to practitioners. Instead, researchers should engage practitioners in the use of evidence, understanding that the value of research findings lies in their application, not simply in being released or published. IES will take on the role of facilitating the use of evidence.

Fifth, IES will take on a stronger role in building capacity to conduct research at the local level and within state education agencies. There is a huge opportunity presented by the investment (also through IES) in state longitudinal data systems. The combination of state systems and local district systems makes it much easier to gather the data needed to answer policy questions and questions about program effectiveness. The education agencies, however, often need help in framing their questions, applying an appropriate design, and using appropriate statistical methods to turn the data into evidence.

These five points form a coherent picture of a research agency that will work more closely through all phases of the research process with practitioners, who will be engaged in developing the findings and putting them into practice. This suggests new roles for the Regional Education Labs in helping their constituencies to answer questions pertinent to their local needs, in engaging them more deeply in using the evidence found, and in building their local capacity to answer their own questions. The quality of work will be maintained, and the usability and local relevance will be greatly increased.
— DN

Monday, April 6, 2009

The Advantages of Research on Local Problems

The nomination of John Q. Easton as the new head of IES highlights a long-running debate. As Donald Campbell noted in the early 1970s in his classic paper “The Experimenting Society” (updated in 1988), “The U.S. Congress is apt to mandate an immediate, nationwide evaluation of a new program to be done by a single evaluator, once and for all, subsequent implementations to go without evaluation.” In contrast, he describes a “contagious cross-validation model for local programs” and recommends a much more distributed approach that would “support adoptions that included locally designed cross-validating evaluations, including funds for appropriate comparison groups not receiving the treatment.” Using such a model, he predicts that “After five years we might have 100 locally interpretable experiments” (p. 303).

The work of the Consortium on Chicago School Research, which Easton has led, has a local focus on Chicago schools consistent with the idea that experiments should be locally interpretable. Elsewhere, we have argued that local experiments can also be vastly less expensive; thus having 100 of them is quite feasible. Such experiments can also be completed in a more timely manner; it need not take five years to accumulate a wealth of evidence. We welcome a change in orientation at IES from organizing single large national experiments to the more useful, efficient, and practical model of supporting many rigorous local experiments. –DN

Campbell, D. T. (1988). The experimenting society. In E. S. Overman (Ed.), Methodology and epistemology for social science: Selected papers (p. 303). Chicago: University of Chicago Press.