An Evidence-Driven Approach to Fostering Innovation in Classrooms and Communities
Teach For All’s mission is to develop collective leadership to improve education and expand opportunity for all children, so they can shape a better future for themselves and the world around them. This raises a lot of challenging questions. How would our education systems be designed differently if the purpose of education was fulfilling students’ potential as leaders of a better future? What does it take to support students in developing their agency, well-being, awareness, connectedness and mastery? How can we develop teachers with the motivations, mindsets, and skills to foster these outcomes?
This year, with support from the Jacobs Foundation, Teach For All’s global organization is beginning a journey to answer these important questions by empowering school practitioners to experiment and learn as they try to make a difference in classrooms and communities around the world. Our Research and Evaluation Team is piloting lean, agile, and adaptive research approaches to enable Teach For All’s global network to generate the evidence and feedback that supports the innovation we need.
We are beginning by infusing this evidence-driven approach into Teach For All’s teacher development work through a partnership with our colleagues in the Global Learning Lab, who are currently developing a new Teaching as Collective Leadership framework. The framework will offer actionable and locally customizable guidance for teachers, teacher coaches, and program designers oriented towards this different view of the purpose of education. Several weeks ago, the Global Learning Lab shared the latest draft of the framework with practitioners from across our network, who will begin to adapt and engage with it in new ways. This provides a unique opportunity to generate feedback and evidence about the framework itself, about how best to implement it with teachers and coaches, and about how shifts in teacher development might lead to changes in teacher mindsets and classroom practices.
In designing our evidence generation and learning strategy, we’re guided by a number of ideas that we believe are relevant not only to our team’s research on projects such as Teaching as Collective Leadership, but to innovation processes more broadly:
- Recognize and embrace uncertainty: Innovation inherently involves replacing uncertainty with new knowledge. During the early phases of developing something new (for example, a new program, process, or framework), uncertainty tends to abound. Embracing, naming, and prioritizing where you are uncertain can be the key to designing a learning strategy and figuring out what kinds of feedback and evidence generation are likely to be most valuable in guiding decision-making and pivots in your strategy. This can be a tough mindset to embrace, since monitoring and evaluation has long been primarily about accountability to funders, not about enhancing the learning of the stakeholders leading the process.
- Ensure you have all the necessary components to get to impact: Distinct areas of uncertainty within new initiatives should be considered separately: the theory of change (for example, when your understanding of the problem, context, or potential solutions is incomplete); the design of the program and how best to fit it to the context; and/or the implementation or delivery of the program. Another way to think about it is: sound theory of change + strong program design + strong delivery → impact. You may have more uncertainty about one of these aspects than another, depending on existing evidence or experience.
- Think about learning as an iterative and adaptive process in which your research methods become more rigorous as you learn: When uncertainty is high, you are as likely to find things that don’t work at all as things that don’t work as originally planned or designed. Quick testing, learning from experience, and iterating by eliminating obviously flawed assumptions can be extremely beneficial in the early stages of an innovation process. As you become more certain about your theory of change, the design of your idea, and how best to deliver it, you can layer in more types of feedback. More rigorous testing (for example, using rapid or nimble RCTs) can help you test critical assumptions or design features of your initiative. Even where an experimental design is not possible, you can often usefully combine quantitative and qualitative feedback.
We’re currently putting these principles to work, starting with uncertainty in the design of Teaching as Collective Leadership and its resources. Last November, we organized beta-testing (an approach popularized in the social sector through the book The Lean Startup) of key resources from the Teaching as Collective Leadership framework to gather concrete user feedback from teachers and teacher coaches on their experiences using new reflective tools for new and novice teachers. With that feedback in hand, we’ve now embarked upon a new study that brings in more structured feedback. We’ve recruited 50 teachers and 25 coaches from 10 of our network partners to try out the framework in their classrooms. The study will investigate changes in teacher mindsets and observe these teachers’ classroom practices to understand how they support student leadership development.
Depending on what we learn from this global study, we hope to launch additional experiments with network partners as a next phase of this work to better understand how other changes to teacher development, coaching, and support enable student leadership and learning.
It's still early days on all of this, and we look forward to sharing more as we go! We hope that in doing so, we'll achieve two goals: generating actionable insights about how to develop teachers and students as leaders of a better future, and learning how to better empower practitioners in the Teach For All network with programmatic improvement research.