A week before this semester began, I was asked to teach ELPA 875, Theory and Practice in Educational Planning. This post is a summary of and brief reflection on that experience, along with how I might improve the course in the future. The image above was a focal diagram that we returned to throughout the semester. The PDSA image comes from
Broadly, this is a class about planning for and effecting change in an organization. Through the lens of trying to effect change, we considered:
Scale – district, building, classroom, learner. For example, we explored how defining problems at a large scale, such as the achievement gap, can make them feel insurmountable and consequently disconnected from daily work. We worked to define problems in ways that were connected to our daily work and aligned with organizational goals.
Design – we discussed a design strategy of starting with small, iterative tests rather than large-scale changes all at once; seeing the system that produces undesired results; and including different perspectives – not just to be “midwest nice” – but because no one person can see all the pieces of a system.
Change as relational – bringing people together to solve common problems can be the work that builds a positive culture where people trust and support each other.
Next up on the read-it-and-return-it list is Change Leader: Learning to Do What Matters Most, by Michael Fullan (2011). The book was cited in something I read, and when I happened to be wandering the education stacks (yes, some people still do this), I picked it up. I was curious about the first chapter: Practice Drives Theory: Doing is the crucible of change. Definitely in my court.
“All the best concepts [need] to be deeply experientially grounded” (p. xi). This book comes after Fullan worked on whole-system reform, engaging with practitioners and policymakers to change large, complex education systems. “The most effective leaders use practice as their fertile learning ground. They never go from theory to practice or research evidence to application. They do it the other way around: they try to figure out what’s working, what could be working better, and then look into how research and theory might help.” (p. xii)
“Doing is the crucible of change” (p.3)
“Effective change leaders … walk into the future through examining their own and others’ best practices, looking for insights they had hitherto not noticed” (p.11)
Adaptive challenges (require new discoveries and behavioral change) vs. technical problems (the answer is known; the solution just needs to be applied) (pp. 17-18)
“balance between capacity building and accountability interventions” (p.19)
Halverson, E., & Sheridan, K. (2014). Arts Education and the Learning Sciences. Chapter 31 in Learning Sciences (pp. 626-646).
Halverson, E., Lowenhaupt, R., & Kalaitzidis, T. (under review). Towards a Theory of Distributed Instruction in Creative Arts Education. Journal of Technology and Teacher Education.
Arts educators and researchers seem to spend a lot of time justifying themselves and their work, trying to demystify what it is and its value. Halverson and Sheridan (2014) note that the “inability to objectively assess arts production is what has destined the arts to remain peripheral in schools” (p.638). Many teachers and administrators are unlikely to have experienced a strong arts program in their own education, and most have no training in this area. How many art teachers go on to become principals? Even those who believe in the arts may not know how to go about implementation. Personally, I know that I never identified as someone who “got” art class: I could never discern the rules of the game. For this reason, what I appreciated most about Halverson and Sheridan’s (2014) chapter on arts education and the learning sciences was that it made each component clear and understandable. I think there is still a leap to how instruction would be designed and assessed, but that is where Halverson, Lowenhaupt, and Kalaitzidis (under review) pick up.
The idea of distributed instruction definitely resonates with my experiences. As a science teacher, I mentored all my students through the science research process every year. I would act as both instructional designer, setting up deadlines and templates, and content mentor, answering questions, delivering mini-lectures, or recommending further resources on everything from wind turbine shape to bacteria incubation to oscillating chemical reactions. I felt like my varied science background was a resource, and I loved getting to learn with the students about all these different areas. The process was exhilarating and exhausting. Once I became technology coordinator, one of my favorite things to do was go into the science classes and serve only as mentor, engaging with students about their projects without worrying about how they were meeting requirements. I see a lot of potential for the idea of distributed instructional design, particularly in the personalized learning model, as a way to understand what happens in practice and what that practice reveals about the designer’s conceptual model of teaching and learning.
Finally, I was thinking back to our early discussion about Discourses (Gee, 2001) and their relationship to identity, and to conversations with leaders of schools that are adopting a personalized learning model. Like the kids in art class who “get it,” some teachers seem to just “get it”: they co-teach and flex as needed in order to orchestrate student-centered inquiry, all without formal training in how to do this. These skills are increasingly seen as valuable and scarce, so if we want to shift both teachers and students into this way of thinking about learning, we need a way forward, a way that arts-based education already knows. In particular, arts education addresses identity and culture, which is crucial through the lens of Discourses. Furthermore, Gee (2001) writes, “one crucial question we can always ask about identities of any type is this: What institution or institutions, or which group or groups of people, work to construct and sustain a given Discourse?” (p.111) We have different “institutions” within our buildings fighting to construct and sustain Discourses, with literacy and STEM currently in charge and the arts at the periphery. I see the articulation of arts-based education and distributed instruction as leading the way in preparing the teachers needed for these alternative, in-school environments, rather than perpetuating the myth of the teachers or learners who just “get it.”
Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific Culture and Educational Research. Educational Researcher, 31(4), 4-14.
U.S. Department of Education (2003). Identifying and implementing educational practices supported by rigorous evidence: A user-friendly guide. Available at http://ies.ed.gov/ncee/pubs/evidence_based/evidence_based.asp.
Reiser, B. J. (2013). What professional development strategies are needed for successful implementation of the next generation science standards? Paper prepared for K12 center at ETS invitational symposium on science assessment. Washington, DC. http://www.k12center.org/rsc/pdf/reiser.pdf.
Clearly, Feuer, Towne, & Shavelson (2002) were at odds with the policy emphasis captured in the Department of Education’s “user-friendly guide” (2003), though they were open to increasing the use of randomized controlled trials: “Although we strongly oppose blunt federal mandates that reduce scientific inquiry to one method applied inappropriately to every type of research question, we also believe that the field should use this tool in studies in education more often than is current practice…. We have also unapologetically supported scientific educational research without retreating from the view that the ecology of educational research is as complex as the field it studies and that education scholarship therefore must embody more than scientific studies.” While they leave the field open to many different communities of inquiry, the DOE report narrows the focus to just one. This narrowing of the range of inquiry is, in my view, short-sighted and extremely limiting in three ways.
First, as we learned in Organizing Schools for Improvement, change takes time. It often takes five years for a new program or community to be built and show results. There can be an implementation dip, where the disruption of change actually makes things worse initially. As we learned at Waukesha STEM this week, the first six months of their new idea of “connect time” was true chaos, with teachers ready to get rid of it immediately. Now it is one of the pillars of the way they have changed to student-centered learning. Second, narrowing the focus to one kind of method, as the DOE report suggests, means that fewer questions can be asked. For example, there is no ethical way to use randomized controlled trials to understand the experience of homeless students in schools. As Feuer, Towne, & Shavelson state, “The question drives the methods, not the other way around. The overzealous adherence to the use of any given research design flies in the face of this fundamental principle.” Finally, it is increasingly clear that a diversity of ideas drives innovation and solutions, and “the presence of numerous disciplinary perspectives (e.g., anthropology, psychology, sociology, economics, neuroscience) focusing on different parts of the system means that there are many legitimate research frameworks, methods (Howe & Eisenhart, 1990), and norms of inquiry” (Feuer, Towne, & Shavelson, 2002). We need multiple Discourses (Gee, 1990) in educational research.
The Department of Education report is meant to address the gap between research and practitioners. Feuer, Towne, and Shavelson quote the National Research Council: “Educators have never asked much of educational research and development, and that’s exactly what we gave them.” What I found compelling about Reiser’s (2013) paper on professional development for the Next Generation Science Standards was that it seamlessly wove theory and practice, capturing the cultural shift in one-line messages, giving examples of the way practice is now, and describing what it should be. For example, Reiser writes about the “shift from learning about… to figuring out,” and “Inquiry is not a separate activity—all science learning should involve engaging in practices to build and use knowledge.” Further, Reiser outlines the key principles for professional development, lists a series of recommendations, and includes practical examples, like the suggestion, “One fruitful way to engage teachers with records of practice is for teachers to analyze video cases of teaching interactions.” In the frame of distributed leadership, changing systems of practice happens through changing the routines, and this paper clearly brings research to bear on precisely what is being done in the classroom.
(Somewhat more philosophically, it is ironic that just as the Next Generation Science Standards shift toward describing phenomena first and then trying to explain them, the Department of Education clings to an old model of scientific inquiry that dictates rigid positivist methods.)
What are the implications for school leaders? I see the appeal of a one-size-fits-all, tried-and-true, “what works” solution, but I think most educators know that nothing with kids (or teachers, for that matter) works that way. Yet when faced with a field of educational research that seems to have a lot of internal conflict about what counts as “rigorous” research, what do you do first, on Monday, when the kids show up? I think this is why the ideas of design and professional community are appealing as ways of improving educational systems. Design, to me, is not about realizing one fixed answer, but rather a constant process of listening and testing, embedded in local context rather than seeking to minimize it. Similarly, focusing on professional community builds the capacity of people and context, rather than seeking to minimize them. Just as inquiry is not a separate activity when learning science or for educational researchers, it is not a separate activity for leaders, either.
How do we use information? This is a really broad question and might not seem on topic for this week, but I’ll get there. First, I returned to Chris Thorn’s “Knowledge Management for Educational Information Systems: What Is the State of the Field?” (2001), which we read last semester. He defines knowledge and its relationship to information and data: data is facts; information is facts + context; knowledge is facts + context + experience, judgment, intuition, and values. (These are actually definitions from Epson, 1999, that Thorn cites.) There is thus a progression from data to knowledge as facts are brought into a Discourse. Two different Discourses might take the same data and come out with different knowledge. Thinking about it in this way led me back to what we have discussed the last two weeks about how administrators have the power to bring a policy into their Discourse (if they have established one, of course).
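Thorn’s layering, and the idea that two Discourses can turn the same data into different knowledge, can be sketched as a toy example. (This is purely illustrative: the names, values, and interpretations below are my own, not Thorn’s or Epson’s.)

```python
# Toy sketch of the data -> information -> knowledge progression.
# All names and values are hypothetical illustrations.

data = 62  # a bare fact: percent of students proficient on a state reading test

# Information = fact + context
information = {
    "value": data,
    "context": "Grade 5 reading, School A, 2013; state average is 70",
}

# Knowledge = fact + context + experience, judgment, intuition, values.
# Two different Discourses can take the same information and
# produce different knowledge (and different implied actions).
def accountability_discourse(info):
    # Values test performance itself; judges against the average.
    return "School A is underperforming; intervene with targeted test prep."

def improvement_discourse(info):
    # Values instructional routines; treats the gap as a question to study.
    return ("School A is 8 points below the state average; examine which "
            "instructional routines differ from stronger schools.")

print(accountability_discourse(information))
print(improvement_discourse(information))
```

The point of the sketch is only that the interpretive layer lives in the Discourse, not in the data: the dictionary is identical in both calls, but the knowledge each function returns is not.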
But returning to the question of how we use information: what I find so exciting about Authentic Intellectual Work (AIW) is that it is an information-gathering tool that seeks to measure what I would call the “good stuff” of teaching and learning: the conversations, the higher-order thinking, the student interest, the social support. What’s more, the implementation framework actually establishes a Discourse around the use of the information, changing the way educators interact and centering the conversation on the empirically gathered information – not on thoughts, intentions, feelings, etc. Teachers are coached on how to see and understand the information that is already in their classrooms.
In a different turn on how we use information, Organizing Schools for Improvement uses data to show relationships in a way that I had never seen before. It was the first time I had seen a quantitative analysis of systems that even attempted to show synergistic effects, such as Figure 4.11 (p. 114), showing that schools strong on two supports did substantially better than those strong in just one or the other support. While I have struggled with accepting the use of math and reading scores as measures of “achievement,” I think the way it was used here has merit. Since the schools deemed “improving” were the ones in the top quartile, it does seem that this would represent genuine learning. It seems it would be hard to exclusively teach to the test and get into the highest quartile.
Halverson, R. (201?). Systems of Practice: How Leaders Use Artifacts to Create Professional Community in Schools.
Spillane, J., Halverson, R., Diamond, J. (2004). Towards a theory of leadership practice: a distributed perspective. Journal of Curriculum Studies. 36 (1): 3-34.
I’ve read so much in the past four months (more than I have in the past 10 years) that sometimes I lose track of what I wanted to come to graduate school to study in the first place. The readings this week took me right back to questions I wrote in my personal statement: “How can a system-wide professional growth model be designed that inspires a professional culture? Is cultivating passionate and engaged teachers enough to shift an institution? What other structures or leadership opportunities do teachers need to feel connected?”
The distributed perspective for leadership presented by Spillane et al. proposes that leadership is never limited to one person alone. It is distributed across actors, tools, and context. The relevant level of analysis for the practice of leadership is the tasks that leaders do. Changing the practice of leadership thus begins with changing the tasks and routinizing these new tasks, eventually rooting the changes into the norms or culture of the organization. What resonates most with me is the fact that distributed leadership is both a way to see leadership and a way to change it.
Halverson, R.; Grigg, J.; Prichett, R.; Thomas, C. (2007). The New Instructional Leadership: Creating Data-Driven Instructional Systems in School. Journal of School Leadership. 17: 159-193.
Thorn, C. A. (2001, November 19). Knowledge Management for Educational Information Systems: What Is the State of the Field? Education Policy Analysis Archives, 9(47). Retrieved September 5, 2007, from http://epaa.asu.edu/epaa/v9n47/.
Unit of analysis. That seemed to be the thing that kept popping out at me this week. Thorn (2001) states this directly: if the student is the unit of interest, then the data gathered should be attributes of the student. Student Information Systems, however, tend to be designed to produce reports for district-level analysis, not for the classroom. Halverson et al. (2007) found a mismatch between the data in the district’s high-tech storage systems and the low-tech data collected and stored locally. The logical goal of the proposed data-driven instructional system is thus to link the results of summative data with formative information systems that teachers can use to improve instruction. The goal of practical measurement, as proposed by Yeager et al. (2013), is that “educators need data closely linked to specific work processes and change ideas being introduced in a particular context” (p.12).
In the past, I have associated data-driven decision making with context-blind work whose sole purpose was to improve standardized test scores. The readings this week, as well as the networked improvement communities (Bryk et al., 2010) from a few weeks ago, have given me a different perspective on what it means to use data to inform instruction, design, and communities.