
Reliability

How we make our data representative

Anyone can survey teachers on social media or ask the teachers on their mailing list to fill in a questionnaire. So why bother with Teacher Tapp?

Because our data is representative: it reflects the whole teaching population. That means your data is more likely to be accurate and to be believed by the press and policymakers.

How do we do this? In England we are lucky that the School Workforce Census tells us what the population of teachers looks like (albeit just in state-funded schools).

In the full analysis we conduct on the results, we create a set of post-stratification ‘weights’ that allow us to count the responses of some teachers more than others. If we have too many of a certain type of teacher, then we can rebalance the results to account for this. We can only adjust for teacher characteristics where we know their proportions in the teacher population and re-weighting cannot be carried out on too many factors at once. At the moment we re-weight by:

  • Gender
  • Age category
  • Senior leadership status
  • Government Official Region
  • Phase of schooling
  • Private versus state-funded

To give an example of how weights work, in a recent analysis we gave female primary classroom teachers in their 30s a weighting of 2.4x that of a typical respondent, as we had too few of them in the original sample. At the other end of the scale, male secondary senior leaders in their 30s were given a weighting of 0.2x the typical respondent (we have a lot of these!).
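For anyone who wants to see the arithmetic, here is a minimal sketch of how post-stratification weights can be calculated and applied (the group shares are hypothetical and this is an illustration, not our production code): each group’s weight is its share of the teacher population divided by its share of the respondents, and a weighted result then counts each response in proportion to that weight.

```python
# Illustrative sketch only: the group shares below are hypothetical and this is
# not Teacher Tapp's production code.

def post_stratification_weights(population_share, sample_share):
    """Weight for each group = its share of the population / its share of the sample."""
    return {group: population_share[group] / sample_share[group]
            for group in population_share}

# Hypothetical shares chosen so the weights come out close to the 2.4x and 0.2x
# figures mentioned above.
population_share = {
    "female primary classroom teacher, 30s": 0.12,  # assumed 12% of all teachers
    "male secondary senior leader, 30s": 0.01,      # assumed 1% of all teachers
}
sample_share = {
    "female primary classroom teacher, 30s": 0.05,  # assumed 5% of respondents
    "male secondary senior leader, 30s": 0.05,      # assumed 5% of respondents
}

weights = post_stratification_weights(population_share, sample_share)
# weights come out at roughly 2.4 for the under-represented group
# and roughly 0.2 for the over-represented one.


def weighted_yes_share(responses, weights):
    """responses: list of (group, answered_yes) pairs for one question.
    Returns the weighted proportion answering yes."""
    total = sum(weights[group] for group, _ in responses)
    yes = sum(weights[group] for group, answered_yes in responses if answered_yes)
    return yes / total
```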

Every teacher helps sharpen the picture we paint of education, but re-weighting makes sure the colours of the picture are correct!

If you would like more information please download our re-weighting report here.

“Due to the pandemic, we had a niche project for which we needed research that captured data in between annual poll results. We were able to obtain these results easily and our expectations were wonderfully met. Very pleased with both the service and results and will continue to enjoy working with them.”

Ian Macrory, Branch Head, Office for National Statistics

FAQs

Topics
  • Education Intelligence
  • Security
  • Privacy
  • Other

Education Intelligence

    • How many teachers are in your panel?
    • Over a week, around 10,000 teachers answer questions on Teacher Tapp. In England, the number of teachers for whom we hold enough information to include them in a final set of results varies from week to week, but is usually between 5,500 and 7,000. The sample is very diverse, with teachers across approximately 4,000 schools – giving great insight into school-level policies (e.g. technology, behaviour systems, reading schemes, etc.).
    • How much does it cost?
    • Our pricing varies. At the most expensive end, a last-minute one-off survey question will cost £1,300* but if you’re able to give us a bit of notice and want several questions then the costs drop to around £800* per question. (*All prices exclude VAT).
    • Do you offer discounts?
    • In order to help as many people as possible access education research, we keep our prices reasonable by offering early bird discounts, loyalty bonuses, package offers and occasional discounts via our newsletter (so make sure to sign up)! We always look in our extensive back catalogue to see if we already hold the data and can offer it at a much-reduced price. Also, because we only ask 3 questions each day, we try to reduce surveys to the essentials – it saves you money, and means our panel get more interesting questions. Win-win!

Security

    • How long does the process take?
    • Teacher Tapp asks questions every day of the year (including Christmas Day!). Our turnaround is usually very quick: responses come in for 24 hours and the results are then turned around within 48 working hours, so the whole process takes around 3 days.
    • How are the findings presented?
    • Your customised report will be specially created with graphs pulling out the most interesting findings, plus a series of PDF and Excel charts showing the results for each category. We are very protective of our users’ personal data, so we don’t share individual results. But we do provide lots of cross-tabs, so you can see how the results of each question break down by gender, school type, phase, subject, etc.
    • How many questions can I ask the teachers and school staff?
    • As many as you’d like! But there are some conditions as to how you do this. On any given day we only ask 3 questions to our whole panel. If you would like to ask more than 3 questions, we have two options: split them across several days, or ask them as an optional question set. Although the latter sample size is smaller, you do get the added benefit of the results being hidden from the panel too!

Privacy

    • Why use Education Intelligence?
    • We are a reputable, friendly company with unparalleled speed at getting a representative sample of teachers. We are experts in what we do: so our questions make sense to teachers and don’t need to be asked multiple times to get to the heart of the matter. We try to save you money where we have data already in our back catalogue. And, crucially, Teacher Tapp is there to help teachers get better at their jobs – so we aren’t an inconvenience to teachers. The aim is to help teachers feel that they are part of a research community, as opposed to just being experimented on. By commissiong Teacher Tapp you are supporting that quest!
    • How many clients have you worked with?
    • We’ve worked with over 100 education and research organisations, including The Sutton Trust, Office for National Statistics, Teach First, the Anna Freud Mental Health Centre, Microsoft, MyTutor and The Gatsby Foundation. If you would like to see any of our testimonials, please let us know – we love sharing them!
    • Do you tell panellists who commissioned the research?
    • Not unless you choose to tell them yourself. If we said in advance who had commissioned a question, it could introduce bias into the results. Secondly, and more importantly, we are fussy about the questions that come onto the app: we only ask questions that are fair, that genuinely seek an answer (not a pre-defined one) and that meet our editorial standards. We do not engage in push polling. For that reason we consider all questions to be ‘Teacher Tapp’ ones when they are asked. Should you wish to make results public, you are welcome to do so. Should you publish any results, we reserve the right to counter-publish the survey results as provided to you, so the public can be assured you have used the data faithfully. In doing so, trust in the results is preserved for us both!
    • Do you ensure everyone who signs up is a teacher?
    • No. Not every teacher is employed at a school – some work as supply teachers and are employed by agencies. And not every teacher has qualified teacher status – some of our teachers are unqualified trainees on School Direct, others are unqualified and not on teacher training routes – so verification through that route is tricky too. Instead, on sign-up, we ask teachers to select their school or to pick ‘not a teacher’, and the latter group are removed from results. If you’re just being nosey, it’s much easier to click the ‘not a teacher’ box! We also periodically ask questions to check whether a user’s answers clash with their earlier answers, which would point to an unlikely teacher (a purely illustrative sketch of that kind of check is shown below). New users aren’t counted in our analysis for a short while, until we are more assured of their responses, and we have ways of spotting new sign-ups acting in unusual ways, to avoid ‘flood polling’ where lots of people sign up to sway the response to a particular question. We are always open to improving this system. If you have any ideas for how to do so without making sign-up onerous or asking for masses of personal data, do let us know via hello@teachertapp.co.uk
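As a purely illustrative sketch (the question names and rules below are hypothetical, not our actual checks), a consistency check of that kind might look something like this:

```python
# Illustrative sketch only: hypothetical question names and rules,
# not Teacher Tapp's actual checks.

# Pairs of answers that are unlikely to both be true for a genuine teacher,
# e.g. reporting working in a primary school while also reporting teaching A-level classes.
CONTRADICTORY_ANSWERS = [
    ("phase", "Primary", "teaches_a_level", "Yes"),
    ("role", "Classroom teacher", "line_manages_heads_of_department", "Yes"),
]


def looks_inconsistent(answers):
    """answers: dict mapping a question id to this user's response."""
    return any(
        answers.get(q1) == v1 and answers.get(q2) == v2
        for q1, v1, q2, v2 in CONTRADICTORY_ANSWERS
    )

# A flagged user would be reviewed (or held out of results) rather than trusted blindly.
```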

Other

    • Can we target research or adverts at specific groups of teachers?
    • Yes! We will advise in advance how big the group is in the sample, and make sure it is suitably large. We’ve targeted questions specifically at maths, science, primary, female – all kinds of teachers! (Maternity leave questions are a prime example of where targeting is needed.)
    • Do you have teachers in Scotland, Wales and Northern Ireland in the panel?
    • Yes, but not enough to get reliable results for them. By design, our questions mainly cover primary and secondary schools in England. Because of this, teachers in other parts of the UK have to skip a lot of questions and tend to drop out of the panel. We are keen for them to stay, but they rarely do. We therefore don’t offer results for any other part of the UK, though some clients are happy to receive the data that we do have available (even with the caveats on reliability).
    • How come your panel retention rate is so high?
    • We are really proud that over two-thirds of people who sign up to Teacher Tapp stay with us for at least 3 months. The more questions a teacher answers, the more accurate our picture of education becomes, so it makes a real difference. The key ingredients are that we make it easy and quick to answer on phones, and that teachers get to see the results the next day – meaning they learn something new every day. We’re also lucky that teachers want to be involved in research and enjoy being part of our community.