How to write surveys for social care research: part 1

Eleni Romanou advises social care researchers on how to use cognitive testing to help devise effective questionnaires

Researchers or evaluators working in social care often use surveys to collect information and opinions from service users, families or social care professionals. We might include standardised questions or psychometric measures, but there are times when we need new questions which have to be designed from scratch. How can we make sure people interpret these questions the way we intended and give responses which are thoughtful and accurate?

I’m going to explain how a cognitive testing approach can help and over the course of 2 blogs I’ll share ideas on how to craft questionnaires which are effective, engaging and tailored to research participants.


Challenges associated with conducting surveys in a social care setting

Qualitative approaches such as in-depth interviews are often well suited to research with young or vulnerable people or about sensitive topics. But if we want to find out how widespread a particular experience or view is, only a survey will give us the reach we need. As researchers we need to think hard when it comes to asking questions of potentially vulnerable people. How can we go about crafting questionnaires which will get us the data we need whilst still protecting our participants?

I have often grappled with this challenge in my experience as a social researcher. Applying the principles of good questionnaire design has helped me avoid common pitfalls, such as vague wording or multiple-choice options which overlap, both of which can leave respondents confused and unable to select an answer. But I still make a common mistake: I assume that, as long as questions are clearly worded for the context, age group or profession my survey is targeting, respondents will be receptive to my questions and ready with accurate information and candid opinions.

The reality is not so simple. If respondents find questions too conceptually or emotionally challenging they may become distressed, misunderstand the question and give incongruous answers, or disengage from the process. Examples of this include asking respondents to estimate how frequently they do something – depending on what that ‘something’ is, this can be a very difficult thing to do – or asking questions which bring up traumatic memories.

Common pitfalls

There are times when the risk is glaringly obvious. Some time ago I helped design a questionnaire for the NSPCC exploring the prevalence of child maltreatment in the UK (Radford et al, 2011). Because of the nature of the survey we needed to ask children, young people and parents some questions which could be seen as intrusive and which used explicit wording. It was clear from the outset that we would need to explore whether people would be prepared to answer questions about their personal experiences openly.

But there’s always some risk that people might misinterpret or struggle to make sense of questions. A striking instance of this involved a survey where children were asked if they agreed with the typical opinions of their peers. The interviewer read statements beginning with “People my age often think…” aloud and we expected the children to say whether they agreed or disagreed. But in reality several children thought they had to try and guess the age of the interviewer! Elsewhere, the word “attitude” has been misunderstood to mean “giving attitude” rather than “opinion” and terms such as “your local area” can be interpreted very differently by different audiences.

How cognitive testing can help

Cognitive testing can minimise the risks above. I frequently use it before launching a survey to find out how my intended audience understands my questions and to identify anything which might be too emotionally or conceptually challenging.

To test a survey I select a small number of respondents and ask them to paraphrase the questions in their own words. I explore any hesitations or confusion I observe, checking how easy they found it to recall the information or situations they needed to give a particular opinion or answer. I might encourage respondents to talk me through their thinking as they read the question, pointing out any words or concepts which they find ambiguous and explaining how they reached their answer.

The cognitive testing process is invaluable for identifying problems in questionnaires. It always gives me ideas for how to improve and customise questions so that they are more engaging and more likely to generate the data I need for my research. 

Here are my top tips for anyone considering using cognitive testing on a questionnaire. First, ask yourself:

  • Does the survey target a vulnerable or minority population?
  • Is the content very sensitive or intrusive?
  • Are you asking respondents to make difficult mental calculations (e.g. estimating time or resources spent on a programme) or nuanced judgements (e.g. about the benefits they gained from a service)?

If the answer to any of these questions is yes, cognitive testing could steer you away from inadvertently distressing your respondents or collecting inaccurate data.

As with all qualitative research you need enough participants to map the range of variation in your target population. But I would advise starting off with relatively small numbers of participants. If interesting issues come up which need further investigation you can recruit more.

Cognitive testing should draw out the types of problems associated with a questionnaire. Focus your efforts on detecting the full range of problems, not counting how many times these problems crop up.

Carry out face-to-face testing so you can observe respondents’ non-verbal cues and probe any difficulties you spot.

Conduct cognitive testing together with a colleague and collate your observations. Having more than 1 viewpoint can help you differentiate between minor and major issues and assess how these are likely to affect the quality of data you’re trying to collect. It also helps you think about your respondents’ experience of the survey more objectively. 

Review and test

There is much to gain from reviewing and testing new survey questions. This both helps ensure you collect accurate data and gives participants the best possible experience of taking part in research. I urge anyone designing a questionnaire to consider using cognitive testing to make your questions effective, tailored and engaging. 

References

  1. Radford, L. et al. (2011) Child abuse and neglect in the UK today. London: NSPCC.