It was so encouraging to read Dr Chris Brown’s evaluation of the impact of our commitment to Evidence Informed Practice:
Evaluating the Oaks Fed model for research engaged self-improvement
Dr. Chris Brown
The Oaks CE learning federation is a family of three small Church infant schools based in Hampshire that work closely together under the leadership of the federation Headteacher and Governing Body. One of the federation’s improvement plan objectives is for it to become an evidence-informed federation in which the schools collaborate to rigorously evaluate the quality of the education they offer, understand what they need to do to improve, take appropriate evidence-informed action and evaluate the impact of that action, enabling them to achieve together. To meet this objective, the executive headteacher of the federation devised a model of professional learning in which (as of 2016) four of the statutory staff professional development (inset) days allocated to schools in England were dedicated solely to evidence-informed professional development. Using a cycle-of-enquiry approach, the aim of the model is to enable teachers to engage collaboratively with research, to identify new practices, to trial these practices, to measure their impact and then to roll out the most successful within and across schools in the federation. This report provides an evaluation of the approach in relation to three key questions: 1) did the model help teachers become engaged with research; 2) did the model help teachers use the research to develop approaches to feedback with clear pathways to impact; and 3) have these new approaches had impact?
The report is based on interviews with 15 teachers and school leaders, and on pre- and post-intervention surveys relating to teachers’ use of research. External observation is provided by OFSTED, England’s school accountability body, since an OFSTED inspector also visited one of the three schools involved towards the end of the project.
Did the activities undertaken help participants engage with the research and relate it to their context/setting and area of practice that requires improving?
Respondents suggested that the activities helped them engage effectively with the research in the following ways: 1) by providing access to research where previously this had been difficult: “[previously] that’s the bit that I’ve found hardest with the inquiry, is accessing that kind of material… knowing more where to go and accessing [research]. So having access to that and time to read through things was really helpful” (respondent #3); 2) this first quote also highlights the value placed on having time to engage with research. Other comments about the model providing the time needed to do research included: “having those inset days made all the difference this year. You know, when we were trying to fit it in, sometimes it didn’t happen, and we’d grab half an hour and it didn’t have the momentum it had this year” (respondent #3; respondents #5, #8, #9, #10, #13 and #14 made similar points); 3) the approach to research engagement was seen to have two key components. First, participants enjoyed the collaborative, discursive nature of the activities: “I’m not one to sit and read through reams of research, but actually when we did the, everyone read a little bit and then fed back and discussed it. I found that a much easier way to engage with the research … to go through and talk about, or to analyse together” (respondent #2); “the communication and working as part of a team is important, if you can sit down with [research] and unpick [its meaning] together. I think that’s better than trying to work in isolation” (respondent #7; similar points were also made by respondents #10, #11, #12, #13 and #14).
Second, the structured and facilitated approach to research engagement meant that participants felt they were able to engage more meaningfully with the literature (respondents #2, #5, #9, #13 and #14); 4) respondents also appreciated that they were being encouraged to experiment and take risks: “I think for me, it was the knowledge that it was okay to get it wrong. That didn’t matter, because it’s not necessarily finding the answer” (respondent #6). Likewise, respondent #9 noted of the federation leader that “she is always reassuring us that ‘if you trialled it and it didn’t work, that’s fine’”.
Effective engagement with research also requires that teachers can understand the strengths and limitations of different research methods, can contextualise research findings (i.e. see how research findings can be applied to one’s own setting and practice) and can engage in learning conversations using research as part of a collaborative approach to designing new teaching strategies. These three requirements are reflected in survey questions 1, 2 and 3 in table 1 below. In all three areas it can be seen that, over the course of the project, respondents typically believed that they had improved their knowledge and skills, with average scores moving from below the mid-point score of 3 (‘average’) at the start of the project to closer to 4 (‘above average’) by its end.
Table 1: Pre and post survey questions and responses.
| Question* | Pre response (average) | Post response (average) | Difference (average) |
|---|---|---|---|
| 1) Knowledge of research methods | 2.8 | 3.6 | 0.9 |
| 2) Relating academic research findings to your practice | 2.8 | 3.8 | 1.0 |
| 3) Confidence around having conversations about academic research | 2.9 | 3.8 | 0.9 |
| 4) Confidence around interpreting academic research findings | 2.6 | 3.7 | 1.1 |
| 5) Using academic research to inform the design of teaching and learning strategies | 2.5 | 3.5 | 1.0 |
*Respondents were asked to rate their knowledge and skills against a five-point scale, with 5 equalling ‘high’, 3 equalling ‘average’ and 1 equalling ‘low/none’.
Correspondingly, it was felt that across the federation teachers were becoming research-informed as a result of the approach: “there is [now] evidence-informed professional conversation all the time. People have been far better about the idea of providing evidence for what they’re saying” (respondent #1); “[we’re] actually beginning to embed the fact that everything we do, should actually be shrouded in research… and that’s what we‘ve got to continue doing” (respondent #8). Furthermore, a school inspection undertaken by OFSTED towards the end of June 2017 provides an external assessment, suggesting teachers are now using research evidence to improve specific aspects of teaching and learning. In particular, the report notes that “leaders have embedded a research-based culture where strategies to improve teaching are investigated and evaluated in terms of outcomes for pupils. As a result, the whole school community is deeply dedicated to continuous improvement and sharing expertise to raise standards further”. This report thus lends further weight to the notion that the approach and activities used have been successful in helping teachers engage with research evidence and collaboratively develop research-informed teaching practices to tackle areas requiring improvement.
Did the activities help participants develop interventions with clear pathways to impact?
From analysis of the interview data it could be seen that all respondents could espouse a theory of action (ToA), or pathway to impact, for their developed intervention. In other words, respondents were able to state what their intervention was, the logic underpinning its design, how it was intended that the intervention be realised and the changes it was intended should result. An example of one such ToA is set out in table 2. As can be seen in the table, respondent #4 sets out in detail how they were able to deconstruct the nature of their intervention and its intended and actual changes in knowledge and practice, as well as evidence the impact on students that resulted. Respondent #3 suggested that the ToA approach had made her realise the importance of being systematic and rigorous in how interventions are developed, how baselines are established and how impact is assessed. Furthermore, the ToA approach meant that, if interventions were not delivering the desired impact, they could be tweaked and refined by reexamining the logic of the approach and whether its constituent parts were being implemented or supported effectively. This was also reflected by respondent #5, who noted that the ToA approach meant they were able to systematically explore “what is the problem? What am I doing about it? What’s changed?” In addition, it was also recognised that the ToA approach could be used more generally to explore and tackle issues of practice: “if you’ve got your theory of action, I find that you can then drop in a variety of questions, can’t you? And, it’s a similar process. I mean, once you’ve got the process of the research and that systematic approach and looking at it, then I feel that you can drop any question in [and explore how to address it]” (respondent #12).
Alternatively, the ToA approach can help refine or fix interventions that appear to be unsuccessful: “it also helps you address “Well, actually, it didn’t work, so where do I go now?” Or, to somebody else, they come back and say, “Well, it did work for me, but it didn’t work for B.” “It did work for you, why? Why? Was it your approach? Was it the cohort?” So, then it opens up another question on where you’re looking at” (respondent #12).
Interview data also suggests that the ToAs developed by respondents were fully grounded in the research they engaged with in workshop 1. In particular, three respondents could specifically identify the research underpinning their intervention: for example, see table 2 for respondent #4’s responses. Others could not recall the name of the research(er) but could describe what the research was about and its implications for practice. Furthermore, survey data too suggests that participants felt, by the end of the project, that they had developed the skills to interpret and then apply academic research to the design of new teaching and learning strategies. Survey questions 4 and 5 in table 1, for instance, indicate that over the course of the project respondents typically believed they had substantively more confidence than before in interpreting research findings. They also reported a higher ability to employ research effectively when developing new pedagogies. These responses reinforce the suggestion that the theories of action developed for interventions had a basis in the research concerned.
Table 2: One example of one respondent’s theory of action
| ToA element | ToA text (respondent #4) |
|---|---|
| Problem or driver for intervention | As a school we have been tasked with supporting more children to exceed expectations in writing. For our early years children we felt that this wasn’t going to be reached through more handwriting practice or more time sat at tables. Our previous observations and experience led us to believe that something else must happen before children would exceed in their writing. |
| The intervention | We had noticed over several years that many children were fearful of failure, getting things wrong or not being able to achieve something, and that this was inhibiting them from taking risks in their learning. They would keep doing what they could easily do rather than taking a risk with something new or tricky that might possibly go wrong. We felt that this may well be what was preventing our children from exceeding. Our intervention was informed by Carol Dweck and her work around growth mindsets. From this work we hypothesised that if we were able to change children’s feelings and attitudes towards failure/struggle and getting things wrong, then they would be more likely to take risks in their learning. |
| Activities and interactions | We have introduced the idea of being a ‘Brave Learner’. This has not just been applied to writing and maths but to all aspects of learning and being. We have created two Brave Learner characters and have identified the characteristics of being a Brave Learner. Children have been awarded a certificate when they have been a Brave Learner and their picture is added to our Brave Learner display board in school. |
| Learning | The teachers involved better understand the need to show children that getting it ‘wrong’ is part of the learning process and that only by having another go, changing strategies or practising will they improve: failure and getting things wrong are part of the learning process. They now also have an understanding of the need to give children a language to articulate their feelings while learning. |
| Changes in behaviour | When a child has been awarded a certificate, we now talk about how the child felt about the struggle they have had to be a Brave Learner. We now praise their effort, resilience and endurance, not whether they were successful in their quest. |
| Difference | Over the last six months we have seen a huge change in the attitudes of our children. They talk about being a Brave Learner and, when we, the adults, talk about needing to be a Brave Learner, they know what they have to do. They also talk about how they and others have been or need to be Brave Learners. We feel our Brave Learner programme has impacted positively on all children’s attainment in writing, especially for those for whom writing has been a struggle. The children have begun to understand that struggle is part of learning, not an indication they will never get there. |
Did participants develop interventions that made a difference to teaching and learning?
In all cases, respondents could readily attribute changes in learning, behaviours and outcomes for children to their interventions. An exemplar response is set out in its entirety in table 3. For other respondents I have sought to provide example vignettes that capture changes in practice and children’s outcomes, in order to illustrate what had been achieved as respondents journeyed along their ToAs. For example, respondent #2’s research question was “if they’re better risk takers, and they’re more willing to try things, are their reading levels coming up?” Respondent #2’s approach was to create “a small focus group [and worked with the group using] books and empathy of characters [to help them understand that] you can’t learn without being uncomfortable, and all those sorts of things. So, break down the barriers, and make them risk takers, and that linked with the empathy, because we’re all in the pit at different times. Bar one, the whole focus group did get to [working above expectations], so, it seemed to have been successful… but I’ve been doing it with all of them. I think it’s been, outside of that group, it’s been effective, as well”.
Table 3: One example of one respondent’s impact statement
| Impact domain | Impact text and data (respondent #11) |
|---|---|
| Learning | The aim was to improve teachers’ understanding of the effective characteristics of learning, and to establish whether this approach impacts on writing outcomes for summer-born children. Specific learning included: ‘the approach has changed our perspective on the importance of some core skills [and has led to an] improved understanding of why certain provision is important to specific groups and individuals. From our staff questionnaire, it is clear that teachers and teaching assistants all have a greater knowledge of the learning characteristics’. |
| Changes in behaviour | Changes in teacher practice noted by respondent #11 included: ‘changes to teachers’ planning activity – using characteristics of effective learning to move away from curriculum specific foci’; that ‘learning values are now driving teaching practice [rather than end of year goals]’; that teachers were ‘more actively looking for effective learning behaviours and planning activities to develop these behaviours’; and that across the school there was a more general focus on ‘getting children to use the language of learning, so reflecting on their own learning’. It was also noted that, depending on the cohort/class, ‘we have had to change the focus from role play writing opportunities to individual interests… we have also had to do much more fine/gross motor work’. In other words, teachers were also taking a differentiated, learning-centred approach, employing their understanding of the effective characteristics of learning. |
| Difference | Leuven scale data shows greater engagement in learning by children, and interview data with children suggests greater confidence and understanding. Parent questionnaires indicate that parents can see the differences in children’s writing. For example, one parent noted that ‘the forming of Jill’s letters and her interest in writing have both improved significantly’. Furthermore, the school’s writing data for 2015 highlighted that only 60% of summer-born children met their Year 1 Early Learning Goals for writing, compared with 83% of autumn-born children. Respondent #11 argued that the changes in practice noted earlier worked extremely well, ultimately leading to a rise in the number of summer-born children meeting their writing Early Learning Goals to 86% in 2016 and 82% in 2017: in other words, sustained improvements of over 20 percentage points. |
Respondent #5 noted of their project: “there were six boys who I was trying to get to age-related expectations for writing and at the beginning of the year they were predicted that they might not make it. Out of that four have made it, two haven’t, so I guess the data is saying that it’s more successful than not [in fact the data provided showed that the four pupils in question had exceeded expectations]. The Talk for Writing [an approach developed by Pie Corbett which research suggests is successful] works in particular for stamina of writing. When [the pupils] came in September, their stamina and confidence to write at length was zero. The Talk for Writing just gives them the toolkit to do that. They can regurgitate, shall we say, the story and it helps them think about actually the mechanics of the writing rather than, ‘I have to think what to write and then how to write it.’ It’s that stepping stone and it’s been a good scaffold for them. It has helped them grow in confidence and ability.”
Respondents #6 and #8 were working collaboratively on a feedback project. Here it was noted that “using the Leuven capture sheet, it was clear that our focus children were slow to settle to a given task. Having checklist prompt cards and strategy cards [derived from research by Gibbs and Simpson, 2011] have certainly made things quicker and the children are all now engaged positively with their writing. The quality of writing has improved and outcomes in reading and writing [according to the end of year learning goals] are now significantly above average” (respondent #8). Furthermore, data provided by these two respondents shows that the gap between the highest- and lowest-achieving pupils, in terms of meeting or exceeding age-related expectations, closed over the course of the project from 10% to 6%.
Finally, respondent #12’s project was to explore children’s understanding of mastery with the aim of helping them exceed age-related expectations in writing and maths. It drew on research by Yarker (2016) and Schumaker and Carraccio. Two focus groups of children were selected and learning conversations were held about notions of mastery. Subsequently a language of learning was introduced across Year 1 to help children see mistakes as part of the learning process rather than a setback, and to see that these mistakes could help them master their learning. Modelling of mastery language and skills was undertaken by the teachers and teaching assistants. End-of-year data shows that the number of children in Year 1 meeting their age-related expectations this year has risen in writing from 76% to 83% and in maths from 83% to 92%. In conclusion, then, it seems clear that the Oaks Federation’s new approach to professional development has enabled teachers to engage successfully with research evidence on effective pedagogic practices, in such a way that both teachers and children are benefitting as a result.