This post continues my review of the online course, Using Data in the Health and Social Care Environment. The first part of Section 3 is about surveys.
As before, “Task 1” is just an introduction. It’s followed by some information about collecting data and types of survey questions.
Collecting your own data
Task 2 in this section starts with some of the same peculiarities noted previously. Data such as official statistics is called “secondary data”, even though it’s the first data anyone researching a topic is likely to look at. Data that you collect yourself is called “primary data”, even though once you use it to write a report, anyone reading that report didn’t collect the data themselves, so from their point of view it’s secondary.
There are a couple of references to one of the other courses, Facilitation and Engagement in the Health and Social Care Environment, implying that it’s best to complete that course before the present one, but without really explaining why.
A section on designing a survey uses the coloured-box technique again, making the limited amount of text harder to read.
The first box (blue, for no apparent reason) seems to define the term “format” in a strange way. Telephone and face-to-face surveys are said to have the same format, but paper and online surveys are different formats. I still don’t know what, if anything, the author of the course means by “format”. SurveyMonkey and Google Forms are mentioned.
The last box (again blue, for no apparent reason) tells you that it’s important to pilot your survey, without describing what piloting really is. Weirdly, it ends by suggesting that it’s a good idea to provide incentives, without mentioning that doing this can cast doubt on your results.
A couple of paragraphs describe why baseline data is useful for making before-and-after comparisons. This is a bit strange on a page about surveys, because of course baseline data doesn’t just apply to surveys. There’s no mention at all of the related concept of benchmark data.
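To pin down what a before-and-after comparison actually is, here is a minimal sketch of my own (the figures are invented, not from the course): you measure the same thing before and after a change, and report the difference from the baseline.

```python
# Invented figures: a satisfaction rate measured before and after a change.
baseline = 0.62    # baseline survey, before the change
follow_up = 0.71   # follow-up survey, after the change

# The before-and-after comparison is simply the change from baseline,
# usually reported in percentage points.
change = follow_up - baseline
print(f"Change from baseline: {change:+.0%}")  # Change from baseline: +9%
```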
Finally, a few paragraphs describe some fairly random ethical and consent issues. Things like data protection and capacity to give consent just aren’t mentioned.
Types of survey questions
Some types of survey questions are listed, again in coloured boxes. The terminology used is odd and not very helpful. Some examples are given, but not consistently, and some of the examples that are given are unhelpful.
For example, the first box describes an “open question”, which Google Forms calls either text or paragraph text. In reality you can ask a closed question and provide a text box for the answer. The term “open question” usually means something different.
The next box describes a “list”, which Google Forms calls either checkboxes or “choose from a list”. The example shows a weird combination of a bulleted list and checkboxes. Worse, using the term “list” in this way confuses a list of checkboxes with a listbox, which is a different thing that isn’t mentioned.
The next one, “category”, is what Google Forms calls multiple choice. The idea is that you can only choose one of the options. The example, however, is another bulleted list with checkboxes. You can choose as many of them as you want. The normal way to do this is to use radio buttons, but they aren’t mentioned.
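To make the distinction the boxes muddle up explicit, here is a small sketch of mine (the option names are invented) of the validation rule each question type implies: a checkbox question accepts any subset of the options, while a multiple-choice (radio-button) question accepts exactly one.

```python
OPTIONS = ["GP surgery", "Pharmacy", "Walk-in centre"]

def valid_checkbox_answer(selected: list) -> bool:
    """Checkboxes: zero or more options, each from the list, no repeats."""
    return len(set(selected)) == len(selected) and all(s in OPTIONS for s in selected)

def valid_radio_answer(selected: list) -> bool:
    """Radio buttons (multiple choice): exactly one option from the list."""
    return len(selected) == 1 and selected[0] in OPTIONS

print(valid_checkbox_answer(["GP surgery", "Pharmacy"]))  # True
print(valid_radio_answer(["GP surgery", "Pharmacy"]))     # False
```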
After a few more coloured boxes with similar problems, a grid is mentioned even though it’s not really a type of question in its own right, just a compact way of presenting multiple questions.
There are a few “dont’s” [sic] with examples, but not all of them are entirely satisfactory.
For example, the first is:
“[DON’T] Ask leading questions, i.e. one [sic] that presumes the answer”
But the example doesn’t presume the answer:
“Why do you think services are so poor?”
The presumption is that services are poor, but there’s no presumption as to why they are. There’s nothing really wrong with this question, if it appears in a context where it has already been established that the person taking the survey thinks services are poor.
There’s only one “do”, and that’s to help people navigate the survey. The example is, “If NO go to question (10).” Nothing wrong with that, except that it’s weirdly described as a “filter”.
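The routing behind that kind of instruction is simple enough to sketch. This example is my own, with invented question numbers: answering NO to question 4 skips the follow-up questions and jumps to question 10, otherwise the survey proceeds in order.

```python
# My own sketch of the routing behind "If NO go to question (10)".
# Question numbers are invented for illustration.

def next_question(current: int, answer: str) -> int:
    if current == 4 and answer == "NO":
        return 10           # skip the follow-up questions
    return current + 1      # otherwise proceed in order

print(next_question(4, "NO"))   # 10
print(next_question(4, "YES"))  # 5
```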
Collating and organising the data
The text ends with some mostly unhelpful remarks about collating and organising data, for example:
“Once you get your survey results back, you will need to gather (collate) all your data together, in order to organise, sort and analyse it.”
That’s no doubt true in a meaningless kind of way.
SurveyMonkey is mentioned again, but Google Forms is left out this time.
Spreadsheets are mentioned, without any explanation of what they are. An example claims to show a spreadsheet that could be used to analyse and then present some data, but there’s no example of the analysis or the presentation.
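As a sketch of what collating might actually involve (my own example, not the course’s; the column names and responses are invented), here the responses are read as CSV, the kind of export survey tools typically produce, and tallied into counts ready to paste into a spreadsheet:

```python
import csv
import io
from collections import Counter

# Invented raw responses, one row per respondent, in the CSV format
# a survey tool might export.
raw = """respondent,q1_rating
1,Good
2,Poor
3,Good
4,Fair
"""

# Collate: read every row, then tally the answers to question 1.
rows = list(csv.DictReader(io.StringIO(raw)))
tally = Counter(row["q1_rating"] for row in rows)
for rating, count in sorted(tally.items()):
    print(rating, count)
```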
This part of the course ends with an assignment, which is to create a survey. There’s no guidance on how to go about it.
I’ve made online forms before, so I started with that method. I’ve seen a lot of SurveyMonkey forms, so I ignored that method, but I tried out Google Forms and a PDF form for the first time.
One advantage of an online form is that you, the form designer, have complete control over how the form looks and behaves. It’s not too hard to make a form that works nicely on desktop computers, tablets and smartphones.
Also, any user who doesn’t like those options can print the form and post it. Visually impaired users can change the contrast and the font size to suit, or use a screen reader to read it out. Users who don’t know much English can get a free online translation.
A disadvantage is that it’s difficult to be certain that your form works consistently on all web browsers. For example, I discovered when doing this assignment that Safari on iOS currently has a bug that makes radio buttons hard to use.
Google Forms provides all the above advantages for users, but the design of the form is limited. For example, in my survey when you tick a certain checkbox in question 3 you should skip straight to question 8. That was easy in my own online form, and also in my PDF, but I didn’t find an obvious way to do it in Google Forms. The script editor might provide a way, but I didn’t investigate it.
Overall, I was quite impressed by Google Forms and I might use it again for a simple form where the data is not sensitive, so that people won’t mind sending their information to Google Inc.
PDF forms are very flaky, and I wouldn’t use them again. The software I used to create the form (Scribus) didn’t support radio buttons, but I found a way to implement fake radio buttons.
Different PDF viewers handle forms in different ways. My normal viewer, PDF X-Change, handled the survey well but displayed an error message after submitting the data to my web server. On iOS, Safari’s PDF viewer doesn’t seem to handle forms at all, and Adobe Reader made a horrible mess of the survey form. Adobe Reader was OK on my old Windows desktop, but it crashed after submitting the data. Adobe doesn’t make a version of Reader at all for my new laptop.
Maybe if I kept trying I’d have found a PDF viewer that works properly, but that’s not the point. If lots of people download a survey, I have no control over what PDF viewers they’re all going to use. PDF forms are unworkable.
The next, and last, part of this depressing course will be about presenting data and writing a report.