“Data is the new oil” was first coined in 2006. The phrase highlights how valuable data is, and how it needs to be refined to get the value out of it. However, when you are thinking about data in a school, this can feel risky.
When it comes to young people’s data, we really need to think of it as an extension of them. Just as we must protect and nurture the young people in our care, we must do the same with their data.
Getting it right for every “data” child
By changing how we think of data we can make sure it is handled safely. If you were to imagine the data in your school was another child, how would you handle it?
Using the SHANARRI factors (Safe, Healthy, Achieving, Nurtured, Active, Respected, Responsible and Included) would be a good starting point:
Who has access to the data?
What’s the quality of the data?
Does your data help you achieve the goals in your school?
Does the data in your school develop to match your needs?
Is your data refreshed regularly enough?
Is the data respected and correctly used in your school?
Do you use your data responsibly?
Do you include your data in your school’s decision making?
Data can feel like an abstract set of numbers on a screen, but young people’s data in particular needs to be cared for and kept safe.
East Neuk Analytics is delighted to announce that we’ve been named a Finalist in the British Data Awards 2021 for ‘EdTech of the Year’.
The British Data Awards is an annual campaign that aims to uncover data success stories. Organisations taking part this year include FTSE 100 giants, tech unicorns, public sector bodies, newly launched not-for-profit organisations and everything in between.
The British Data Awards was launched to help discover and celebrate the most passionate data-led organisations, no matter their size. Some organisations named as Finalists have the potential to change the world, while others are having a much more local, but nonetheless important impact. And with some 149 entries received, competition to be named a 2021 Finalist proved to be particularly tough.
Jason Johnson, Co-Founder of Predatech and British Data Awards judge said: “We set off on a quest to find our Finalists back in February, and it’s been quite a journey! Judging the entries has been a challenge due to the number of high-quality entries, but it’s also been a great privilege. It’s reminded us of the sheer talent and ingenuity that makes the UK such a global powerhouse for all things data. All our Finalists should be very proud.”
Emma Nylk, Founder of East Neuk Analytics said: “I am delighted to be selected as a finalist for ‘EdTech of the Year’ at the British Data Awards. Schools have been under huge pressure over the last year and I am proud that I have been able to support them during this time.”
A total of 16 categories are available, including sector-specific awards such as ‘EdTech of the Year’ and ‘AI Company of the Year’. ‘Data for Good Initiative of the Year’ proved to be the most popular category overall, closely followed by ‘Start-Up of the Year’, helping to reinforce the UK’s well-earned reputation as a vibrant hub for new businesses. While every region of the UK is represented among our Finalists, organisations based in London, Yorkshire and the Humber, and Scotland account for some 63% of all Finalists.
The 2021 judges include: Neil Carden, COO of Forth Point, Jason Johnson, Co-Founder of Predatech, Mahana Mansfield, Data Science Director at Deliveroo, Tej Parikh, Chief Economist at the Institute of Directors, Harriet Rees, Head of Data Science at Starling Bank and Dr Jo Watts, CEO & Founder of Effini.
The winners of the British Data Awards 2021 will be announced on Tuesday 4th May.
As more children and young people return to face-to-face learning, there are many discussions going on related to ‘catching up’. How we measure the progress schools make in this area can be challenging. I hope these 3 tips will support your thinking around this.
What are you comparing against?
With any data analysis, every number needs a comparison to allow you to decide on the action required from it. If you had one child or young person in your class who had been receiving remote teaching for many months, you would be able to compare their progress against the rest of the class, but what do you do when every child has been remotely taught? Are the baselines or targets you used before the pandemic still fit for purpose?
What gets measured gets managed
Being able to tell whether a child or young person has the expected numeracy or literacy skills for their age is relatively simple, but what about the areas that are not being measured? We all know of children who have shown huge levels of resilience through the last year, but there will be some who have struggled and continue to struggle. Measuring the ‘health and wellbeing’ of children, young people (and the staff within schools) is difficult, but any area that is not measured risks not being managed.
Data is only one element
Data can be hugely powerful and helpful, but it is only one element. Data analysis becomes compelling when it is combined with the story of the data. If a group of children or young people need to ‘catch up’, understanding why this is the case and what actions are going to be taken to solve the issue is how you turn your ‘insight into action’.
I hope you have found this useful. If you would like to read more on data analysis within schools, please subscribe to my blog or sign up to my monthly e-newsletter.
About the author
Emma Nylk founded East Neuk Analytics in 2017 to support teachers in their understanding of the data related to the children in their schools. With over 10 years’ experience as an analyst, she brings her expertise from retail and financial businesses to help schools. She has been published in TES Scotland and is a member of the Data Advisory Group of Young Scot.
The impact of Covid-19 on the health and wellbeing of children can be felt by everyone involved in education. Being able to quantify this impact is not an easy task.
Schools are full of data: pupil attendance, SIMD, free school lunches, standardised tests, parental feedback, teacher assessments… the list goes on. All this data can be very powerful, but only when teachers have time to analyse and then, most importantly, act on it.
With teachers under huge pressure, any time spent gathering data must be justified against the end goal. This comes down to what action the school will need to take if the results of the data are better or worse than expected. If the data will not result in any change in action, is the data needed?
Clearly defining the question you are trying to answer before you start collating data will save you so much time and effort in the long run. Are you trying to validate teacher assessments? Or making sure parents are feeling connected to the school? Or trying to understand if children are feeling safe at school? Then, if you do not get the answer you expect (in a positive or negative way), do you have enough data to take action from the results?
I refer to this way of thinking about data analysis as “so what…?”. If a number has moved (or not), asking ‘so what?’ can focus the discussions. It allows you to dig into the root causes of data movements.
‘So what…?’ is one of my 5 principles of data analysis best practice. The others are:
Rubbish in = Rubbish out
What gets measured gets managed
The trend is your friend
Insight to action
If you would like to read more about these other principles, I have discussed them in my previous blog posts.
When time is so precious, making sure the data that guides us all is of the highest quality is worth the investment.
Many school improvement plans include measurements based on surveys of pupils, staff or parents, but do you need a response from every person to be confident in your results? Would the views of just 50% of your parents be a suitable sample?
Well, as with a lot of statistics, it depends on what action you want to take on the back of the results. In general, the bigger the starting group (e.g. number of pupils), the smaller the percentage of responses you will need to your surveys.
The other major factor is how accurate you need the results to be. For example, if you ask parents how happy they are with the communication they receive from the school – would you take different action if 90% said they were happy vs. 91.27%?
With surveys that don’t have a 100% response rate, you need to understand how the response rate will impact how confident you can be in the result.
How confident can you be in the result?
Low response rate: cheap and quick, but only gives an approximate answer.
High response rate: a reasonably robust figure, but more time involved.
100% response (census): every view is collected and can be analysed, but can be time consuming.
As with all analysis, when handling survey data I refer to my principles of data best practice.
What is your margin of error?
Unless you have 100% response rate, the results of your survey will have an “error margin”. Being aware of the range of the answers is very important when you are comparing surveys over time.
The graph below shows the % of people saying they were happy (in the blue dotted line) and the results of a survey of the same group of people (but with less than 100% response rate) in red.
The margin of error (pale red ribbon) contains the actual score and the surveyed score, even though they appear to be trending in different directions throughout the months. This is important because if you were taking action based on the trend of the red line, it might not have the result you expect.
For your survey score to have a smaller margin of error (and give you a sampled result closer to the true blue line), you could increase the % of people responding.
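As a rough sketch (assuming simple random sampling, a 95% confidence level and a 50% observed score, which is the most conservative case), the margin of error for a surveyed percentage can be calculated like this:

```python
import math

def margin_of_error(population, responses, z=1.96, p=0.5):
    """Approximate margin of error for a surveyed proportion.

    z=1.96 corresponds to a 95% confidence level and p=0.5 is the
    widest (most cautious) case. The finite-population correction
    narrows the margin as the response rate approaches 100%.
    """
    se = math.sqrt(p * (1 - p) / responses)                    # standard error
    fpc = math.sqrt((population - responses) / (population - 1))
    return z * se * fpc

# e.g. 400 parents: 200 responses gives roughly +/- 4.9 percentage
# points, while 300 responses narrows this to roughly +/- 2.8.
print(round(margin_of_error(400, 200) * 100, 1))  # 4.9
print(round(margin_of_error(400, 300) * 100, 1))  # 2.8
```

As the example shows, moving from a 50% to a 75% response rate roughly halves the margin of error for this group size.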
Remembering the data analysis best practice principles (‘so what…?’ and ‘insight to action’), if you were hoping that more than 90% of people are happy, and over the months of this measurement only between 45% and 60% of people were happy, the exact number doesn’t matter much – some action is needed.
How many responses do you need?
Imagine you wanted to know the percentage of pupils walking to school (in this case the answer is 50%). The graph below shows the survey response rate you need, based on:
asking every pupil (census)
a survey where you are confident in the result within +/- 3 percentage points (e.g. 47%-53%)
a survey where you are confident in the result within +/- 5 percentage points (e.g. 45%-55%)
As you can see, the larger the population (e.g. number of pupils or staff), the lower the response rate needed. Also, as the required precision goes down, the % of responses needed also goes down.
When setting up your survey and then analysing your results, making sure you are clear on how accurate you need the results to be can save you a lot of time and effort.
Now for the Maths
To calculate the number of responses you need, or how accurate the score from your survey is, you can use the standard sample-size formula.
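A sketch of the standard sample-size calculation for a proportion, with a finite-population correction (the usual approach for a fixed group such as a school roll – I am assuming a 95% confidence level and the conservative 50/50 split):

```python
import math

def responses_needed(population, margin, z=1.96, p=0.5):
    """Responses needed to estimate a proportion within +/- `margin`.

    n0 is the standard sample size for a very large population; the
    second step applies a finite-population correction, so smaller
    groups (e.g. a single school roll) need a proportionally higher
    response rate.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# A school of 400 pupils, result within +/- 5 percentage points:
print(responses_needed(400, 0.05))  # 197 (around a 49% response rate)
```

Loosening the precision to +/- 10 percentage points, or surveying a much larger group, brings the required response rate down sharply.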
Can I help?
I have a document that calculates the number of replies you need to your survey to be confident in the scores and a document that will provide you with the margin of error on a survey you have completed on my TES shop.
If you would like any support in setting up and/or analysing your schools surveys please get in touch.
It looks like a version of home learning is going to be around for a little while longer and I know lots of schools are surveying pupils, parents and teachers to understand how they are finding things.
A measure you might not have thought about is asking them how much “effort” is involved in home learning.
“Customer effort” has been around for over 80 years, but became popular within businesses about a decade ago after an article was published in the Harvard Business Review in 2010,
“A significant advantage of the CES [Customer Effort Score] approach is the ability to produce actionable data that can be used to help design customer experience”
Henley Business School – “Customer Effort – Help or Hype?” April 2013
Below is an example of a question that you could use:
As with any measure there are pros and cons, but it is one I would consider adding to any pupil, parent or staff survey. Adding a space that allows people filling it in to give examples of what is causing them issues will let you focus your time on making life a bit easier.
I have previously managed customer surveys and if you would like any more information on survey best practice, please get in touch.
Staying at home is feeling like the “new normal”. However, I know my daughter, like lots of children, is missing her friends and teachers.
We’ve been able to interact regularly with her school, but I am aware that not every child (for lots of reasons) is so lucky. When all children are able to return to school, will these differing levels of engagement increase the attainment gap?
Markers in the tracker
A number of schools I work with use an Attainment Tracker to follow how their children are achieving from nursery to S3.
Within the document they can compare the attainment of children by different measures, such as risk matrix and SIMD.
However, there could be benefit in flagging how much engagement there was during the lockdown. The options could be as simple as a high, medium or low engagement flag.
By combining this information with your existing data it could highlight any new groups of children that would benefit from additional support. Especially for children transitioning from P7 to S1.
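As a minimal sketch of the idea (the field names and values here are illustrative, not a real tracker schema), combining an engagement flag with an existing measure such as SIMD could look like this:

```python
# Hypothetical tracker extract: pupils, stage, SIMD quintile and a
# simple lockdown-engagement flag. These fields are made up for
# illustration only.
tracker = [
    {"pupil": "A", "stage": "P7", "simd_quintile": 1, "lockdown_engagement": "low"},
    {"pupil": "B", "stage": "P7", "simd_quintile": 4, "lockdown_engagement": "high"},
    {"pupil": "C", "stage": "S1", "simd_quintile": 2, "lockdown_engagement": "medium"},
    {"pupil": "D", "stage": "P7", "simd_quintile": 1, "lockdown_engagement": "low"},
]

# Surface pupils who combine low lockdown engagement with another
# risk marker (here, living in the two most deprived SIMD quintiles).
support = [p["pupil"] for p in tracker
           if p["lockdown_engagement"] == "low" and p["simd_quintile"] <= 2]
print(support)  # ['A', 'D']
```

The same filter could be narrowed to a single stage (e.g. P7) to focus on children transitioning into S1.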