Updating Evaluations Based on “Performance-Focused Smile Sheets”

In the spirit of sharing my work, I wanted to walk through how we updated our smile sheets. We knew our evaluation forms needed some help. They were long, tedious, and didn’t really give us the information we needed. So, we looked to Will Thalheimer’s book, Performance-Focused Smile Sheets, and I set to work creating new evaluation questions for our courses.

Note: These questions were for immediate evaluations only, not delayed ones.

How the New Evaluation Questions Were Written

Rewriting our smile sheets started with taking a look at our old questions and dissecting them. One of them was:

The course was given the correct amount of time.

  • Strongly Agree
  • Agree
  • Neutral
  • Disagree
  • Strongly Disagree

If the average answer was “Disagree” – what does that mean? Was the course too long or too short? Why? Obviously, as observers we could take a good guess, and maybe a couple of people would have written in a comment. But the point still stands – why not make the answers themselves give us real data? Getting more meaningful data out of your smile sheets is much of what Will suggests in his book.

We changed this question to:

Which of the following is true about the timing of the course?

  • The training felt dragged out and could be shorter.
  • The training felt dragged out yet not enough time was given to certain topics.
  • The training felt rushed and more time is needed.
  • The training was generally given the right amount of time.
  • The training was generally given the right amount of time, but certain topics were given significantly too much or too little time.

You will also notice that four of the five answers are not “good” answers. They are not arranged along a scale of feeling – it’s really four “unacceptable” answers and only one “acceptable” answer.

In addition to improving the quality of the questions and answers themselves, Will also has a list of evaluation goals – not unlike learning objectives. These goals include “Learner Engagement” and “Effectiveness of Cognitive Frameworks.” I selected seven of these goals as required within each evaluation, with seven optional goals that we include only if needed – such as for a pilot training where we may want more information. Each goal has one or two associated questions to see whether we met it. The example question above falls under the “Course Well-Organized” goal and is an optional addition to the smile sheets.

The grades for each response are Unlikely Result, Superior Result, Acceptable, and Unacceptable. Each question has at least one Acceptable response, and often many Unacceptable ones. Sometimes there are no Unlikely Result or Superior Result responses at all. The goal is to keep the responses unbiased – not leaning in our own favor – while still accounting for all of the answers learners might reasonably give. That is why there are often many more Unacceptable responses: there are several different reasons why something could be unacceptable, as shown in the example above.

Unlikely Result responses are ones such as “I feel like an expert after taking this course.” In some scenarios, such as learning a simple systems process, that could genuinely be the case. But if the topic is a softer skill – why was the person even in the class? Do they feel like they no longer need practice? Though Unlikely Result responses sound good, they open up a whole other world of questions.

Acceptable responses are just that – acceptable. We often struggle with this, wanting everything to be more than acceptable. If something is rated lower than four stars on Amazon, we may consider it unacceptable, yet four stars actually means “I like the product.” Isn’t that a good thing? It’s the same as wanting to hear “exceeding expectations” rather than “meeting expectations,” even though there is absolutely nothing wrong with meeting expectations. If you follow this model for creating your own evaluation questions, keep that in mind throughout.
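As a side note, here is a minimal, hypothetical sketch (in Python – not from the book, and not our actual tooling) of how the answer options for the timing question above could be mapped to their grade categories and tallied across respondents. The mapping and the sample responses are assumptions for illustration only.

```python
# Hypothetical sketch: map each answer option for the timing question to the
# grade category assigned to it, then count grades across respondents.
# The grade assignments and sample responses below are illustrative only.

from collections import Counter

GRADES = {
    "The training felt dragged out and could be shorter.": "Unacceptable",
    "The training felt dragged out yet not enough time was given to certain topics.": "Unacceptable",
    "The training felt rushed and more time is needed.": "Unacceptable",
    "The training was generally given the right amount of time.": "Acceptable",
    "The training was generally given the right amount of time, but certain topics "
    "were given significantly too much or too little time.": "Unacceptable",
}

def summarize(responses):
    """Count how many responses fall into each grade category."""
    return Counter(GRADES[answer] for answer in responses)

# Example: three learners pick the acceptable option, two pick unacceptable ones.
sample = [
    "The training was generally given the right amount of time.",
    "The training was generally given the right amount of time.",
    "The training felt rushed and more time is needed.",
    "The training was generally given the right amount of time.",
    "The training felt dragged out and could be shorter.",
]
print(summarize(sample))  # Counter({'Acceptable': 3, 'Unacceptable': 2})
```

However the tallying is actually done, the idea is that each answer option carries an explicit grade, so results can be reported against a standard rather than averaged into a single score.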

How The New Questions Were Received by Stakeholders

I presented the new smile sheets in a document with this explanation at the beginning:


Goals of Evaluation Questions
The questions drafted in this document are intended to be sent to instructor-led or e-learning participants immediately upon completing a course.

Goals of immediate evaluations
The following should be known or supported by the answers to the drafted evaluation questions:

  1. Primary Goal: Training Effectiveness
    1. Secondary Goal – Understanding
      1. Tertiary Goal – Learner Engagement
        1. (Optional) Quaternary Goal – Learners Motivated to Learn
        2. (Optional) Quaternary Goal – Instructors Credible and Engaging
        3. (Optional) Quaternary Goal – Environment Conducive to Learning
      2. Tertiary Goal – Effectiveness of Cognitive Frameworks
        1. (Optional) Quaternary Goal – Course Well-Organized
        2. (Optional) Quaternary Goal – Materials Signal Attention Hot Spots
        3. (Optional) Quaternary Goal – New Content Aligned to Prior Knowledge
    2. Secondary Goal – Motivation to Apply
      1. Tertiary Goal – Belief in Value of Concepts
      2. (Optional) Tertiary Goal – Resilience

Goals of delayed evaluations
The following are goals that would not be met by immediate course evaluations, but support the primary goal of training effectiveness and should be considered in delayed evaluations:

  1. Secondary Goal – Remembering
    1. Tertiary Goal – Realistic Retrieval Practice
    2. Tertiary Goal – Spaced Repetitions
    3. Tertiary Goal – Situation-Action Triggering
  2. Secondary Goal – After-Training Follow-Through
    1. Tertiary Goal – Reminding Mechanisms
    2. Tertiary Goal – Job Aids
    3. Tertiary Goal – Supervisor Follow-Up

These goals will not be covered in this document.


In the email, I also explained that the questions were based on the book and its underlying research. I had already developed a reputation for being research-focused, so I was fortunate that my director didn’t even read it over – “I’ll trust the experts.”

So, I have absolutely no tips on how to sell this to your management. If you do – please comment below!

Example Questions for Performance-Focused Smile Sheets

Here are some sample questions, in addition to the ones already available in the book:


Goal: Environment Conducive to Learning

Generally, which of the following were true about the learning environment? (Select all that apply)

  • I generally felt that my opinions, questions, and responses were respected.
  • I generally was able to ask questions and share experiences without fear of judgement or retaliation.
  • I am able to apply the knowledge and/or skills in the training immediately or in the near future.
  • I felt the material covered in the training was relevant to me.
  • I generally felt the environment allowed me to be engaged.
  • I generally did not want to participate because I felt I would be judged negatively.
  • The material and/or skills covered in the training are irrelevant to me.
  • I am unable to apply the knowledge and/or skills in the near future.
  • The time for me to apply the knowledge or skills had already passed by the time I took the training.
  • I generally felt the environment resulted in disengagement.

Note: I based the responses on Dr. Knowles’ research.


Goal: New Content Aligned to Prior Knowledge

Which of the following is true regarding the attention given to your previous knowledge and experience in the training?

  • I generally felt that my prior knowledge and experience rendered me too advanced for the training.
  • I generally felt that I needed additional knowledge or support and felt inadequately prepared for the training.
  • I generally felt that my prior knowledge and experience was accounted for in the training.

Goal: Effectiveness of Cognitive Frameworks

Which of the following is true about the opportunities to apply what you learned? (Select all that apply)

  • I was given almost no practice.
  • I was given an inadequate amount of practice.
  • I was given too much practice.
  • I did not receive adequate feedback during or after practice.
  • Generally, the time between learning and practice was too long.
  • Some of the practice was irrelevant to my job.
  • Generally, the time between learning and practice was adequate.
  • I generally received sufficient and helpful feedback during or after practice.

Any feedback for the above? Would love to hear it!

Smile Sheets in Practice

…you will have to wait and see! We are “piloting” the questions from October through December.

Overall

The task of updating the evaluations felt a little daunting, but there is some relief in reducing the overall number of questions we ask. Implementing the framework for the questions was a little difficult at first, especially for the goals that do not have example questions, but I found my groove with time. The reception among my instructional design team has been positive, and we will see in December how well the new sheets perform!

 

Performance-Focused Smile Sheets by Will Thalheimer, PhD

