Blue Sky Blog

Setting Yourself Up for Success with eLearning Feedback Surveys

I recently attended a webinar on mentoring and induction programs for new PK-12 teachers. A survey popped up at the end, and among the questions was this feedback form favorite – “What did you learn?” I answered honestly, “I learned strategies for mentoring and new teacher induction.”

Feedback Survey Fails

Most adult learning sessions, whether conducted in person or online, include participant feedback surveys. All too often, though, learning designers create these as an afterthought, long after the content is complete. Part of the problem lies in the fact that feedback surveys feel very easy to compose. After all, they’re just a quick set of questions we pose to participants about how they liked the session, what they learned, and how they plan to use the material, right?

Feedback surveys can be much more than that, but the reality is that seemingly straightforward questions like the one above can pose problems for respondents – the participants who complete the survey. When that happens, the results can be rendered virtually useless: poor-quality data that lacks meaning and doesn’t inspire action.

By attending to the WHY, WHAT, and HOW of survey design you can set yourself up to collect rich, useful data that will effectively inform your efforts to continuously improve eLearning.

Know WHY you are creating a feedback survey

  • Identify a purpose for the survey. What are you hoping to learn from participants?
  • Develop evaluation questions. These are like research questions – broad questions that inform the design of the survey. Examples are questions such as “How do participants experience the learning?” and “What aspects of the course appear to be most/least effective for participants’ learning?”
  • Know your audience for the results. Is the survey designed just for the course instructor, or will results be shared at different levels of the organization? Consider everyone’s information needs when designing the questions.
  • Know what decisions rest on the results. What actions might be taken as a result of what is measured? Will the instructor make improvements in the course? Will additional courses be added? Is there a need for a more advanced or more basic level course?

Know WHAT a survey can measure

  • Know what surveys can and cannot measure.
    Surveys can capture the following types of information:
    • Attributes (e.g., demographic characteristics such as age, ethnicity, or gender)
    • Behaviors (e.g., what people do, such as shop, exercise, engage in hobbies)
    • Abilities (e.g., knowledge or skills)
    • Thoughts (e.g., attitudes, beliefs, feelings, awareness, opinions, or preferences) (Robinson & Leonard, 2018, p. 39).

It’s important to recognize that surveys cannot truly measure behavior. Rather, they measure people’s perceptions of behavior, or behavior they choose to report. If I tell you how often I exercise each week, that may or may not agree with data you might collect if you spend a week observing me!

Know HOW to design quality questions that will yield meaningful data

  • Compose questions that are comprehensible. If respondents don’t understand what a question is asking, they won’t be able to offer good quality data.
    • Use plain language.
    • Be aware of respondents’ literacy levels and language.
  • Know that seemingly straightforward questions can hide a multitude of interpretations. For example: “How many times in the last week did you read a magazine?” Consider which of these behaviors might count or not count as “read a magazine”:
    • Glanced through most of the pages
    • Read one article
    • Read part of one article and glanced at a few other pages
    • Flipped through each page cover to cover and read most of one article
  • Ensure question stems and responses are aligned. If the question asks “how useful” an aspect of the course is, the response options (answer choices) should feature degrees of “usefulness.”
    • Example:
      • WRONG: How useful were the handout materials?
        • Excellent
        • Good
        • Fair
        • Poor
      • RIGHT: How useful were the handout materials?
        • Not at all useful
        • Somewhat useful
        • Very useful
  • Ask questions respondents can answer. “What did you learn?” sounds like a perfectly good question, but scores of respondent answers over the years have shown that people struggle with it. They may have trouble articulating what they learned; they may be rushed and not feel like answering an open-ended question in detail; or they may not know exactly what the question is asking. Does the instructor want to know only what new information a participant learned, or what the person remembers about what was covered in the course, regardless of whether it was new?

Know HOW to engage respondents to get the best response rate

Completing a feedback form may, of course, be compulsory for participants to receive credit or compensation, but compulsion is not the best way to get high-quality data.

  • Include a brief invitation or introduction that tells respondents why you need the data, and why it’s so important to get their feedback.
  • Respect respondents’ time by keeping the survey as brief as possible.
    • Don’t ask what you don’t need to know. If you’re not going to analyze how people of different racial/ethnic subgroups experienced the learning, don’t ask their race/ethnicity. The same goes for other demographics (e.g., age, gender, income).

Bonus tip: Consider follow-up surveys to measure what you can’t measure at the close of the course – e.g., if/how participants are using what they learned, in what ways, and with what degrees of success.

About the Author:

Sheila B. Robinson, Ed.D., of Custom Professional Learning, LLC, is an educator, consultant, and program evaluator. She facilitates workshops on program evaluation, survey design, data visualization, and presentation design, and works with clients to design surveys, presentations, and professional development courses. She is the author of Designing Quality Survey Questions (SAGE Publications, 2018).
