Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific culture and educational research. Educational Researcher, 31(4), 4–14.
U.S. Department of Education (2003). Identifying and implementing educational practices supported by rigorous evidence: A user-friendly guide. Available at http://ies.ed.gov/ncee/pubs/evidence_based/evidence_based.asp.
Reiser, B. J. (2013). What professional development strategies are needed for successful implementation of the Next Generation Science Standards? Paper prepared for the K–12 Center at ETS Invitational Symposium on Science Assessment, Washington, DC. http://www.k12center.org/rsc/pdf/reiser.pdf.
Feuer, Towne, and Shavelson (2002) were clearly at odds with the policy emphasis captured in the Department of Education's (2003) “user-friendly guide,” though they remained open to increasing the use of randomized controlled trials: “Although we strongly oppose blunt federal mandates that reduce scientific inquiry to one method applied inappropriately to every type of research question, we also believe that the field should use this tool in studies in education more often than is current practice…. We have also unapologetically supported scientific educational research without retreating from the view that the ecology of educational research is as complex as the field it studies and that education scholarship therefore must embody more than scientific studies.” While they leave the field open to many different communities of inquiry, the DOE report narrows the focus to just one. This narrowing of the range of inquiry, in my view, is short-sighted and extremely limiting in three ways.
First, as we learned in Organizing Schools for Improvement, change takes time. It often takes five years for a new program or community to be built and show results. There can also be an implementation dip, where the disruption of change actually makes things worse initially. As we learned at Waukesha STEM this week, the first six months of their new idea of “connect time” was sheer chaos, with teachers ready to get rid of it immediately. Now it is one of the pillars of their shift to student-centered learning.

Second, narrowing the focus to one kind of method, as the DOE report suggests, means that fewer questions can be asked. For example, there is no ethical way to use a randomized controlled trial to understand the experience of homeless students in schools. As Feuer, Towne, and Shavelson state, “The question drives the methods, not the other way around. The overzealous adherence to the use of any given research design flies in the face of this fundamental principle.”

Finally, it is increasingly clear that a diversity of ideas drives innovations and solutions: “the presence of numerous disciplinary perspectives (e.g., anthropology, psychology, sociology, economics, neuroscience) focusing on different parts of the system means that there are many legitimate research frameworks, methods (Howe & Eisenhart, 1990), and norms of inquiry” (Feuer, Towne, & Shavelson, 2002). We need multiple Discourses (Gee, 1990) in educational research.
The Department of Education report is meant to address the gap between research and practice. Feuer, Towne, and Shavelson quote a National Research Council report that said, “Educators have never asked much of educational research and development, and that’s exactly what we gave them.” What I found compelling about Reiser’s (2013) paper on professional development for the Next Generation Science Standards was that it seamlessly wove together theory and practice, describing the cultural shift in one-line messages, giving examples of the way practice looks now, and describing what it should be. For example, Reiser writes about the “shift from learning about… to figuring out,” and notes that “Inquiry is not a separate activity—all science learning should involve engaging in practices to build and use knowledge.” Further, Reiser outlines key principles for professional development, lists a series of recommendations, and includes practical examples, like the suggestion that “One fruitful way to engage teachers with records of practice is for teachers to analyze video cases of teaching interactions.” In the frame of distributed leadership, changing systems of practice happens through changing routines, and this paper clearly brings research to bear on precisely what is being done in the classroom.
(Somewhat more philosophically, it is ironic that just as the Next Generation Science Standards shift toward describing phenomena first and then trying to explain them, the Department of Education clings to an old model of scientific inquiry that dictates rigid positivist methods.)
What are the implications for school leaders? I see the appeal of a one-size-fits-all, tried-and-true, what-works solution, but I think most educators know that nothing with kids (or teachers, for that matter) works that way. Yet when faced with a field of educational research that seems to have considerable internal conflict about what counts as “rigorous” research, what do you do first, on Monday, when the kids show up? I think this is why the ideas of design and professional community are appealing as ways of improving educational systems. Design, to me, is not about realizing one fixed answer; rather, it is a constant process of listening and testing, embedded in local context rather than seeking to minimize it. Similarly, focusing on professional community builds the capacity of people and context, rather than seeking to minimize them. Just as inquiry is not a separate activity when learning science or doing educational research, it is not a separate activity for leaders, either.