The five steps to EEF Toolkit success

The Education Endowment Foundation (EEF) Toolkit is an amazing resource for schools and academies. Freely available, regularly updated and designed for teachers and school leaders, it provides the latest summaries of research evidence on how factors such as feedback, meta-cognition, setting and school uniform impact on learning.

But how do you use this breadth of knowledge simply and effectively? Here, we show the five key steps, seen in this diagram and then detailed below.

[Diagram: the five steps to EEF Toolkit success, via evidencebased.education]

The five steps to success

Step 1: Decide what you want to achieve by identifying school priorities using internal data and professional judgement.

The conversations that occur at this step need to be informed by reliable and valid internal data. Reliable data, from assessments for instance, are consistent over time: you would get a broadly similar result from a reliable assessment whether a pupil took it in the morning or the afternoon of the same day. Valid data come from assessments which measure the specific area of interest, for instance mental arithmetic ability, and are fit for the purposes you intend, rather than measuring something unintended (easy to do when you are constructing a test).
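
To make the idea of reliability concrete, here is a minimal sketch in Python (the pupil scores are invented, and this is an illustration rather than anything taken from the Toolkit): a reliable assessment should rank pupils similarly across two sittings, which shows up as a high correlation between the two sets of scores.

```python
# A toy illustration of test-retest reliability using invented scores.
# A reliable assessment should give each pupil a broadly similar result
# across two sittings, which appears as a high correlation.

from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented scores for the same ten pupils, morning and afternoon sittings.
morning = [12, 18, 9, 22, 15, 11, 25, 17, 14, 20]
afternoon = [13, 17, 10, 21, 16, 12, 24, 18, 13, 19]

print(f"Test-retest correlation: {pearson_r(morning, afternoon):.2f}")
```

You would not normally code this yourself; the point is simply that reliability is something that can be checked, not just asserted.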

When it comes to professional judgement, beware the many forms of bias at play: confirmation bias (where we interpret things to fit with our previous experience, or the experience of the rest of the group) and recency bias (where we believe that things that have happened in the recent past will continue into the future) are two key forms to keep in mind.

Step 2: Identify possible solutions using evidence summarised in the EEF Toolkit.

The Toolkit is a great resource, but why?

Firstly, the research used to create the summaries is of the highest quality available: the summaries are often derived from robustly designed meta-analyses (studies which collate and report on the outcomes of multiple research investigations).
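
If you are curious what 'collating outcomes' looks like in practice, the sketch below is a deliberately simplified illustration: the study results are invented, and it uses a plain sample-size-weighted average rather than the formal inverse-variance weighting that real meta-analyses use.

```python
# Toy illustration of how a meta-analysis pools results: each (invented)
# study reports an effect size and a sample size, and larger studies
# count for more in the pooled estimate.

studies = [
    {"name": "Study 1", "effect_size": 0.30, "n": 120},
    {"name": "Study 2", "effect_size": 0.10, "n": 400},
    {"name": "Study 3", "effect_size": 0.45, "n": 60},
]

total_n = sum(s["n"] for s in studies)
pooled = sum(s["effect_size"] * s["n"] for s in studies) / total_n

print(f"Sample-size-weighted pooled effect size: {pooled:.2f}")
```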

Secondly, independent researchers at Durham University work behind the scenes of the Toolkit, so the findings are reliable. Remember, though, that all of the findings, such as "+8 months' progress", are averages, and an 'average' can include a very broad range of data points.

The vital point is not to expect that, simply by 'giving feedback' for instance, every pupil in your school will make eight months' more progress than the average. To get an idea of how we help schools 'get under the bonnet' of the Toolkit, read the 'Checklist Edufesto' post on the evidencebased.education blog, which discusses the merits of a research-based feedback checklist.
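
As a purely illustrative sketch (the figures below are invented, not Toolkit data), the snippet shows how two groups can share the same '+8 months' average while individual pupils experience very different outcomes.

```python
# Invented 'months of progress' figures for two groups of eight pupils.
# Both groups average +8 months, but the spread of individual outcomes
# is very different.

from statistics import mean, stdev

group_a = [7, 8, 8, 9, 8, 7, 9, 8]       # tightly clustered around the average
group_b = [0, 2, 16, 14, -1, 18, 3, 12]  # same average, very uneven gains

for name, gains in [("Group A", group_a), ("Group B", group_b)]:
    print(f"{name}: mean = {mean(gains):+.1f} months, "
          f"spread (sd) = {stdev(gains):.1f}, "
          f"range = {min(gains)} to {max(gains)}")
```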

Step 3: Give the idea the best chance of success by applying the elements of effective implementation.

So, you’ve identified a problem, and you’ve found a sensible, evidence-based approach that you think will provide the solution. What next? Well, implementation is next, and it raises some key questions:

  • What training will teachers need to implement the new approach?
  • What other resources will be needed?
  • How much is it going to cost?
  • What disruption to other ongoing work will be caused?
  • How will you make sure that what you plan to implement is actually implemented?

This last point is crucial. All too often, the execution of a plan drifts away from what was intended, so take this into account and put a few checks in place to make sure you stay true to your plans.

Step 4: Did it work? Evaluate the impact of your decisions and identify potential improvements for the future.

Evaluation is the important area that often gets forgotten. Like losing weight, eating healthier food and doing more exercise, evaluation is hard to get right and requires disciplined planning and execution. But, if you’re prepared to work hard and stay on track, the EEF’s DIY Evaluation Guide is there to help.

When we wrote it, I was keen for it to be rigorous but accessible, and its use in schools suggests that we succeeded. So download the Guide and make sure you plan your evaluation at the start of your project, not as a bolt-on at the end.
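
The Guide itself walks you through the design choices; purely as an illustration of the kind of calculation an evaluation ends with, here is a sketch with invented reading scores. Cohen's d is one common way of expressing an effect size, not necessarily the exact method the Guide prescribes.

```python
# Hypothetical post-test reading scores for an intervention group and a
# comparison group. Cohen's d expresses the difference in means in units
# of the pooled standard deviation, a common way of reporting impact.

from math import sqrt
from statistics import mean, stdev

def cohens_d(treated, comparison):
    n1, n2 = len(treated), len(comparison)
    s1, s2 = stdev(treated), stdev(comparison)
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (mean(treated) - mean(comparison)) / pooled_sd

intervention = [104, 98, 110, 101, 95, 108, 99, 103]  # invented standardised scores
comparison = [100, 96, 102, 97, 94, 101, 95, 99]

print(f"Effect size (Cohen's d): {cohens_d(intervention, comparison):.2f}")
```

Whichever measure you use, the key is deciding on it before the project starts, which is exactly what the Guide helps you do.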

Step 5: Secure and spread change by sharing the findings to inform the work of the school.

Finally, this step is just as crucial as the previous stages in the process. You began with a priority (maybe to increase reading ability as measured by a standardised assessment), then you identified a plausible solution and put it into practice, ensuring that you stayed faithful to your plans and avoided the temptation to expand the project beyond its initial parameters.

You evaluated your findings using the DIY Evaluation Guide and you now have important information, such as the impact you had on those reading scores. Maybe it was a positive impact, maybe it wasn’t. No matter what the results, share them. Present them to your colleagues and discuss what you’ve found. Only by taking this step will you be able to make real use of the five-step process. Without sharing what you’ve found, what’s the point?

In our experience of working with schools, a major criterion for the success of this five-step process is support at SLT level for disciplined inquiry. Where we’ve seen it work well, a headteacher has asked another senior leader to take responsibility for running the process, supported them in implementing the five steps, and remained open to whatever the findings turn out to be. More often than not in education, we don’t know the impact of the interventions we use in schools, so finding out by following this process enables senior leaders to make important decisions informed by both professional judgement and robust, contextualised evidence.

Brought to you via evidencebased.education

evidencebased.education is an organisation which provides training and support to schools, LAs and other educational bodies to help them use proven research evidence and evaluation techniques. They are a leading authority in training teachers and school leaders to interpret data from Durham University’s Centre for Evaluation and Monitoring (CEM), and have developed innovative, sustainable, and affordable methods of online and in-person training on the use of CEM assessment data.

evidencebased.education have also developed a training programme in line with all the best available evidence on CPD in schools, which takes teachers and senior leaders through the research use journey, step-by-step.

For further information, head to our experts page.
