Recently, so many people have asked me to review their questionnaires and surveys that I thought I’d update a document I first created several years ago, which sets out some essential best practices for creating good questionnaires. Although written for training evaluation, the guide applies to surveys of any kind.
1. Ask: “Why are we doing this?”
- What do we need to know?
- Why do we need to know it?
- What do we hope to do when we find out?
- What are the objectives of the survey?
2. Ask: “What are we measuring?”
In training evaluation, what you measure can be influenced by the learning objectives of the course or curriculum you are evaluating:
- Knowledge
- Skills
- Attitudes
- Intentions
- Behaviours
- Performance
- Perceptions of any of the above
Your questions, and possibly your survey methods, will differ accordingly.
3. Be aware of respondent limitations.
- Where possible, pilot your questionnaire with a sub-group of your target audience.
- The complexity of your questionnaire and its language should take into account the age, education, competence, culture, and language abilities of respondents.
4. Guarantee anonymity or confidentiality.
- Confidentiality lets you follow up with non-responders and match pre- and post-study responses.
- Confidentiality must be guaranteed within a stated policy.
- Anonymity prevents you from doing follow-ups or pre-post studies.
5. Select a data collection method that is appropriate.
Consider the speed and timing of your study, the complexity and nature of what you are measuring, and the willingness of respondents to make time for you. Options:
- E-mail – fast, inexpensive, not anonymous, requires that all respondents have e-mail.
- Telephone – time consuming, not anonymous, may require skill, has to be short.
- Face-to-face interview – slow, expensive, requires skill, best for small samples, qualitative studies.
- Web-based – fast, inexpensive (if you use services like Zoomerang), can be anonymous, best for large surveys.
6. Write a compelling cover note.
Where appropriate, introduce your questionnaire with a brief but compelling cover note that clarifies:
- The purpose of the study and why it is worth giving time to.
- The sponsor or authority behind it.
- Why you value the respondent’s input.
- The confidentiality or anonymity of the study.
- The deadline for completion.
- How to get clarification if necessary.
- A personal “thank you” for participating.
- The signature or e-mail signature of the survey manager (or, ideally, of the sponsor).
- If sending an e-mail, have it come from someone in authority who will be recognised, use a strong subject line that cannot easily be ignored, and time it to arrive early in the week.
7. Explain how to return responses.
If not obvious, make it clear how and by when responses must be returned.
8. Put a heading on the questionnaire.
State simply what the purpose is, what the study is about, and who is running it.
9. Keep it short.
- State how long completion should take, and make sure it takes no longer than that.
- Make questionnaires as brief as possible within the time and attention constraints of your respondents (personal interviews can go longer than self-completion studies).
- Avoid asking questions that deviate from your survey purpose.
- Avoid nice-to-know questions that will not lead to actionable data.
10. Use logical structure.
- Group questions by topic.
- Grouping questions by type can get boring and cause respondents to skim through.
- Number every question.
- Where possible, in web-based surveys put all questions on one screen, or allow respondents to skip ahead and backtrack.
11. Start with engaging questions.
Many questionnaires are abandoned after the respondent answers the first few questions.
- Try to make the first questions non-intimidating, easy, and engaging, to pull the respondent into the body of the piece.
- Try to start with an open question that calls for a very short answer, and ties in to the purpose of the questionnaire.
12. Explain what to do.
Provide simple instructions, if not obvious, on how to complete a section or how to answer questions (circle the number, put a check mark in the box, click the button, etc.).
13. Use simple language.
- Avoid buzz words and acronyms.
- Use simple sentences to avoid ambiguity or confusion.
- If necessary, provide definitions and context for a question.
14. Place important questions at the beginning.
- If a question requires thought or should not be hurried, put it at the beginning. Respondents often rush through later questions.
- Leave non-critical or off-topic questions, such as demographics, to the end.
15. Select scales for responses.
- Keep response options simple.
- Use scales that provide useable granularity.
- Make response options meaningful to respondents.
- Make it obvious if open-ended responses should be brief or substantial by using an appropriate answer-box size.
16. Fine-tune questions and answer options.
- Keep response options consistent where possible: don’t use a 5-point scale in one question and a 7-point scale in the next unless absolutely necessary; don’t put negative options on the left in one question and on the right in another.
- Be precise and specific – avoid words that have fuzzy meanings (“rarely” or “often” or “recently”).
- Do not overlap response options (use 11-20 and 21-30, not 10-20 and 20-30).
- If you use a continuum scale with numbered answer options, anchor the top and bottom of the scale with a clear concept (instead of “On a scale of 1 to 5, how good is it?” with options 1-2-3-4-5, use a scale running from 1 = very bad to 5 = very good).
- Use scales that are centred – don’t have one “bad” answer option and four shades of “good”.
- Don’t force respondents into either/or answers if a neutral position is possible.
- Allow for “not applicable” or “don’t know” responses.
- Edit and proofread to make sure that answer choices flow naturally from the question.
17. Avoid leading or ambiguous questions.
- Don’t sequence your questions to lead respondents to answer in a certain way.
- Avoid questions that contain too much detail or may force respondents to answer “yes” to one part while wanting to answer “no” to another (e.g. “How confident do you feel singing and dancing?”).
- Minimise bias by piloting your questionnaire before it goes live.
18. Use open-ended questions with care.
- Open responses are difficult to consolidate, so use them sparingly.
- They often provide really useful data, so don’t avoid them completely.
- Doing a pilot or running a focus group before rolling out a survey can provide useful insight for creating more structured closed questions.
- Provide at least one open question so respondents can express what is important to them.
19. Thank the respondent.
- Thank the respondent once again. Reiterate why you value the input.
- If you intend to feed back results, explain when and how respondents can expect to receive them.
- If you have offered an incentive, specify what the respondent has to do to claim or be eligible for it.