This is one of a series of posts about a real-life attitudinal audience segmentation project. See other posts in this series.
Once the research plan detailing who you’re going to survey and what you need to ask them has been finalized, the next step is development of the survey itself.
Before I go further, let me stress how critical it is that you are careful in the design of your survey. If you don't know how to craft a survey, find someone who does. Bad survey design doesn't just irritate your respondents and fail to get you the information you need; it can make you think you have good information when you don't. There are a lot of ways to introduce bias into your survey, even unintentionally: in what you ask, how you ask it, when you ask it and what options you give respondents for answering the questions you pose.
Your survey needs to be as short as possible while still getting what you want, as easy as possible for your respondents to understand and fill out, and as well-organized as possible to make it easy for you to pull your data together afterwards. Above all, it needs to get the truth from your respondents, not color their input with your own bias and expectations.
OK, enough of my “good survey/bad survey” soapbox. Back to our project.
For the SCU initiative, the group had already agreed to include all relevant groups in one survey so that we could segment the entire data set together. While this definitely has the benefit of allowing insight across all groups about relevant messaging and key motivations, it also means that the survey will be more complex than if a more homogeneous population were being questioned.
In this situation, we really had a matrix of groups based on three key considerations:
- Student Type: Traditional undergraduate, adult taking undergrad coursework, or adult taking graduate courses
- Status: Current student, prospective student who has applied, or simply “non-student”
- Location: On-campus student, online learner or student at one of the school’s offsite locations
Each of these characteristics brings a different set of available course options at the university, different degree opportunities and different housing options, and, most importantly, EXCLUDES OTHERS. To avoid dragging respondents through a long list of questions or choices that don't apply to them (which would frustrate people and drive down response rates), we had to map out a survey that would ask relevant questions based on some early response choices, while also capturing some data that would apply to everyone.
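To make the routing problem concrete, here is a minimal sketch of that respondent matrix in Python. The segment labels and the exclusion rule are my own inventions for illustration; the real survey's categories and exclusions were more involved.

```python
from itertools import product

# Hypothetical segment labels (not the survey's actual wording).
STUDENT_TYPES = ("traditional_undergrad", "adult_undergrad", "adult_grad")
STATUSES = ("current", "applied", "non_student")
LOCATIONS = ("on_campus", "online", "offsite")

def is_valid(student_type, status, location):
    """Invented exclusion rule, purely for illustration: suppose
    traditional undergraduates only study on campus."""
    if student_type == "traditional_undergrad" and location != "on_campus":
        return False
    return True

# Enumerate the segments the branching survey must route correctly.
segments = [combo for combo in product(STUDENT_TYPES, STATUSES, LOCATIONS)
            if is_valid(*combo)]
print(len(segments))  # 21 of the 27 raw combinations survive this one rule
```

Even with a single exclusion rule, 21 distinct segments remain, which is why mapping the flow before programming the survey pays off.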
So how do you lay out something so complex and ensure that you're (a) minimizing the amount of time it will take a respondent to go through the survey, (b) keeping questions relevant, and (c) not repeating yourself or creating multiple data fields containing the same information from different respondents that you'll have to consolidate later?
For me, the answer is — MIND MAPPING SOFTWARE.
That’s right — those solutions designed to help you organize your thoughts can be incredibly valuable when trying to work through a complex survey flow, and have a number of other benefits that a simple document or online survey preview simply can’t provide.
Take a look at the complete map of the SCU survey: SCU Survey Mind Map
Benefit 1: Flexibility
My favorite thing about this approach is that mind maps are eminently flexible. If you need to add a node, just do it — you don’t have to reformat the whole thing. If you need to move something or connect it to another path, it’s a matter of a few clicks and drags.
Benefit 2: More Than One Path
As you can see from the map, the SCU survey required a number of branches based on responses from each survey taker — and then required that those branches converge for other questions. Mind-mapping software can do what an outline or online survey printout can’t — help you work through what branches need to diverge, and when, and where they need to go after that.
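The diverge-then-converge pattern can be sketched as a toy skip-logic engine: each question carries an optional predicate deciding whether it applies to a given respondent, so branches diverge via predicates and converge simply by returning to unconditional questions. All question text and field names below are invented for illustration, not taken from the SCU survey.

```python
# Each entry: (question id, question text, applicability predicate or None).
# None means the question is asked of everyone (a convergence point).
QUESTIONS = [
    ("Q1", "Are you currently enrolled?", None),
    ("Q2", "Which campus housing option do you use?",
     lambda r: r.get("location") == "on_campus"),   # on-campus branch
    ("Q3", "How do you access your online courses?",
     lambda r: r.get("location") == "online"),      # online branch
    ("Q4", "How likely are you to recommend SCU?", None),  # branches converge
]

def path(respondent):
    """Return the question IDs this respondent would actually see."""
    return [qid for qid, _text, applies in QUESTIONS
            if applies is None or applies(respondent)]

print(path({"location": "online"}))     # ['Q1', 'Q3', 'Q4']
print(path({"location": "on_campus"}))  # ['Q1', 'Q2', 'Q4']
```

A mind map is essentially a visual version of this table: each node is a question, each branch is a predicate, and the map makes it obvious where the paths must rejoin.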
Benefit 3: Visual Flow
I’m a visual thinker. A mind map lets me follow a respondent’s path through the survey visually — and if you’ve ever tried to follow skip logic in a word-processing outline, or figure it out from a printout of your survey from an online service, you’ll know just what an amazing and rare gift this is for the survey designer.
Benefit 4: Clarity For The Client
Most of my clients don’t design surveys for a living, and a complex document full of skip logic instructions can be so daunting that it completely destroys any chance they have of ensuring that the survey makes sense and covers what they want to cover. Often they know more about their audiences than I do, and having them weigh in on language, specific items for specific groups and other elements is critical. This format also illuminates things we’ve missed so we can add them. Working out the kinks at this stage is incredibly helpful, because it takes far less time than working them out once a survey has been programmed online. Especially when there’s a complex dance of respondent types, special questions and different areas of inquiry, this opportunity is a huge time-saver.
Benefit 5: Clarity For Me
When it comes time to actually program the survey itself, I now have an approved flow to follow. Ensuring that skip logic is set up properly is also easy, because it’s shown right on the map. After the survey is programmed, I go back and add the question numbers to each node on the map so I can easily find the item in the survey if I need to tweak something.
Next: Data, data, data! How we look at what we’ve got.