
Tuesday, April 19, 2011

A Conversation About Building State and Local Research Capacity

John Q. Easton, director of the Institute of Education Sciences (IES), came to New Orleans recently to participate in the annual meeting of the American Educational Research Association (AERA). At one of his stops, he was the featured speaker at a meeting of the Directors of Research and Evaluation (DRE), an organization composed of school district research directors. (DRE is affiliated with AERA and was recently incorporated as a 501(c)(3).) John opened his remarks by noting that for much of his career he had been a school district research director himself and felt a strong affinity with the group. He outlined the directions IES is taking, especially in its approach to working with school systems, and spent most of the hour fielding questions and engaging in discussion with the participants. Several interesting points about the roles of researchers who work for education agencies came out of the conversation.

“...in parallel to building a research culture in districts, it will be necessary to build a practitioner culture among researchers.”

Historically, most IES research grant programs have been aimed at university and other academic researchers. It is noteworthy that even in a program for “Evaluation of State and Local Education Programs and Policies,” grants have been awarded only to universities and large research firms; there is no expectation that researchers working for the state or local agency will be involved beyond implementing the program. The RFP for the next generation of Regional Educational Laboratory (REL) contracts may help change that: it expects the RELs to work closely with education agencies to define their research questions and to assist alliances of state and local agencies in developing their own research capacity.

Members of the audience noted that, as district directors of research, they often spend more time reviewing research proposals from students and professors at local colleges who want to conduct research in their schools than answering questions initiated by the district. Funded researchers treat the districts as the “human subjects,” paying incentives to participants and sometimes paying for data services, but the districts seldom participate in defining the research topic, conducting the studies, or benefiting directly from the reported findings. The new mission of the RELs to build local capacity will be a major shift.

Some in the audience saw reasons to be skeptical that this REL agenda is achievable. How can capacity be built while research and evaluation departments across the country are being cut? In fact, very little is known about the number of state or district practitioners whose capacity for research and evaluation could be built with REL resources. (Perhaps a good first research task for the RELs would be a national survey to measure the existing capacity.)

John made a good point in reply: IES and the RELs have to work with district leadership, not just the R&E departments, to make this work. The leadership has to take a more analytic view: it needs to see the value of an R&E department that goes beyond test administration and can produce evidence to support local decisions. In a district that cultivates a research culture, evaluation can be routinely built into new program implementations from the beginning, and the value of the research is demonstrated in the improvements that result from informed decisions. Without a leadership team that values research to find out what works for the district, internal R&E departments will not be seen as an important capacity.

Some in the audience pointed out that, in parallel to building a research culture in districts, it will be necessary to build a practitioner culture among researchers. It would be straightforward for IES to require that research grantees and contractors engage district R&E staff in the actual work, not merely have them review the research plan and sign the FERPA agreement. Practitioners ultimately hold the expertise in how programs and research can be implemented successfully in the district, and engaging them improves the overall quality and relevance of the research.
—DN

Friday, March 14, 2008

Making Way for Innovation: An Open Email to Two Congressional Staffers Working on NCLB

Roberto and Brad, it was a pleasure hearing your commentary at the February 20 Policy Forum “Using Evidence for a Change” and having a chance to meet you afterward. Roberto, we promised you a note summarizing the views expressed by several on the panel and raised in the question period.

We can contrast two views of research evident at the policy forum:

The first view holds that, because research is so expensive and difficult, only the federal government can afford it and only highly qualified professional researchers can be entrusted with it. The goal of such research activities is to obtain highly precise and generalizable evidence. In this view, practitioners (at the state, district, or school level) are put in the role of consumers of the evidence.

The second view holds that research should be made a routine activity within any school district contemplating a significant investment in an instructional or professional development program. Since all the necessary data are readily at hand (and, used internally, free of FERPA restrictions), it is straightforward for district personnel to conduct their own simple comparison group study. The result would be reasonably accurate local information on the program’s impact in that setting. In this view, practitioners are producers of the evidence.
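
To make concrete how simple such a study can be, here is a minimal sketch in Python of the core analysis: comparing test-score gains in pilot schools against comparison schools with Welch’s t-test. The gain values and group sizes below are invented placeholders; a district analyst would substitute de-identified records from the local assessment system.

```python
# Minimal sketch of a local comparison group analysis (hypothetical data).
# A district analyst compares fall-to-spring score gains for students in
# schools piloting a program against students in comparison schools.
from scipy import stats

# Hypothetical score gains, one value per student.
pilot_gains = [12.0, 8.5, 15.0, 10.2, 9.8, 13.1, 7.4, 11.6]
comparison_gains = [9.1, 7.0, 10.4, 6.8, 8.2, 9.9, 5.5, 8.7]

mean_pilot = sum(pilot_gains) / len(pilot_gains)
mean_comp = sum(comparison_gains) / len(comparison_gains)

# Welch's t-test does not assume the two groups have equal variances.
t_stat, p_value = stats.ttest_ind(pilot_gains, comparison_gains, equal_var=False)

print(f"Pilot mean gain:      {mean_pilot:.1f}")
print(f"Comparison mean gain: {mean_comp:.1f}")
print(f"Difference:           {mean_pilot - mean_comp:.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```

A real study would also check that the two groups were comparable at baseline and, where possible, use matching or statistical controls, but none of that requires resources beyond a typical district data office.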

The approach suggested by the second view is far more cost-effective than the first, and more timely as well. It is also driven directly by the immediate needs of districts. While each individual study pertains only to a local implementation, hundreds of such studies could be collected and published by organizations like the What Works Clearinghouse or by consortia of states or districts. Turning practitioners into producers of evidence also removes the brakes on innovation identified in the policy forum: with practitioners as evidence producers, schools can adopt “unproven” programs as long as they do so as a pilot that can be evaluated for its impact on student achievement.
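
As an illustration of how such local results might be combined, the sketch below pools per-district effect estimates using standard inverse-variance (fixed-effect) weighting, so that more precise studies count for more. The effect sizes and standard errors are invented for illustration, and nothing here represents an actual What Works Clearinghouse procedure.

```python
# Sketch of pooling effect estimates from many local studies using
# inverse-variance (fixed-effect) weighting. All numbers are hypothetical.
import math

# (effect_size, standard_error) pairs from hypothetical district studies.
local_studies = [
    (0.25, 0.10),
    (0.10, 0.08),
    (0.40, 0.15),
    (0.05, 0.12),
]

# Weight each study by the inverse of its squared standard error.
weights = [1.0 / se ** 2 for _, se in local_studies]
pooled = sum(w * d for (d, _), w in zip(local_studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")
```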

A few tweaks to NCLB will be necessary to turn practitioners into producers of evidence:

1. Currently, NCLB implicitly takes the “practitioners as consumers of evidence” view in requiring that scientifically based research be conducted prior to a district’s acquisition of a program. We have already published a blog entry analyzing the changes to the SBR language in the Miller-McKeon and Lugar-Bingaman proposals and how minor modifications could remove the implicit “consumers” view. One such tweak would change a phrase that calls for:
“including integrating reliable teaching methods based on scientifically valid research”
to a call for
“including integrating reliable teaching methods based on, or evaluated by, scientifically valid research.”

2. Make clear that a portion of program funds is to be used for piloting new programs so they can be evaluated for their impact on student achievement. Consider a provision similar to the “priority” idea that Nina Rees persuaded ED to use in awarding its competitive programs.

3. Build in a waiver provision, such as the one proposed by the Education Sciences Board, to remove some of the risk to a failing district in piloting a promising new program. This “pilot program waiver” should suspend the consequences of failure for the participating schools for the duration of the pilot. It should also waive requirements that NCLB program funds be used only for the lowest-scoring students, since those requirements would preclude forming the control group needed for a rigorous study.

The view of “practitioners as consumers of evidence” is widely unpopular. Decision-makers see it as inviting the inappropriate construction of an approved list, as was revealed in the Reading First program, and as restricting local innovation by requiring compliance with the proclamations of federal agencies. In the end, science is reduced to a check box on the district requisition form. If education is to become an evidence-based practice, we have to start with the practitioners. —DN