People often say that evaluation of learning is difficult. It can be hard to measure the impact of training using business metrics.
So why is evaluation difficult? One reason is that we accept the goals of the training simply because the topic sounds important. Perhaps you have heard a request like this: “We need training that develops agility, determination, grit, and innovation.” The person making the request often adds, “I just read a book that says these (insert XYZ) things are important in the ever-changing digital world.”
It is hard to argue with a person who asks for training to increase diligence, determination, and drive. These skills all sound necessary for success.
One reason we don’t, or can’t, evaluate a training program is that we skip right to determining the modality, timing, and audience.
So let’s take a step back during intake and think about how we can determine success. Ask yourself and the person making the request, “What will people do differently after this training?” If the answer is that people will be more agile, determined, and innovative, then we need to dig deeper.
What we need to do is take the goal, or desired skills, and convert them into behaviors and outcomes. Think of your employees’ behaviors as the on-the-job actions that they perform and that you can observe. For example, how many consecutive times did a person follow the new process? That behavior shows diligence and determination. Outcomes, by contrast, are nouns: orders, decisions, reports, completed projects, proposals, and relationships. Each can be counted, for example, how many orders were filled correctly in an hour, how many decisions were made, how many projects were completed, or how many prospective customers were contacted.
We now have a list of behaviors and outcomes that we can measure before and after the training. Or we can compare the behaviors and outcomes of people who complete the training against those who don’t. Either way, we can answer the question, “Did the training have an impact on business metrics?” We know the answer because we chose behaviors and outcomes that either are business metrics or link directly to one. Another benefit of this approach is that it produces relevant, applicable training. Learners appreciate training when they can see exactly how to apply it on the job.
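The two comparisons described above can be sketched in a few lines of code. This is a minimal illustration only; the metric (orders filled correctly per hour) and all the numbers are hypothetical, not data from the article.

```python
def average(values):
    """Mean of a list of metric observations."""
    return sum(values) / len(values)

# Hypothetical observations: orders filled correctly per hour.
before_training = [12, 14, 11, 13]   # same group, measured before the course
after_training = [16, 15, 17, 14]    # same group, measured after the course

trained_group = [16, 15, 17, 14]     # people who completed the training
untrained_group = [12, 13, 11, 14]   # comparison group who did not

# Comparison 1: before vs. after for the same group.
before_after_lift = average(after_training) - average(before_training)

# Comparison 2: trained vs. untrained groups at the same point in time.
group_difference = average(trained_group) - average(untrained_group)

print(f"Before/after lift: {before_after_lift:.1f} orders per hour")
print(f"Trained vs. untrained: {group_difference:.1f} orders per hour")
```

Because the behavior was defined as a countable, observable action, the evaluation reduces to simple differences in averages rather than a debate about whether people “became more determined.”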
So the next time you are asked for training on communication, problem-solving, or creativity, take the time to dig deeper into what learners will do on the job, and then you will be able to evaluate training effectiveness.
Way back when I was a newspaper photographer, I really wanted to know the who, what, when, where, and why of the story I was assigned to. I loved finding out more information so I could be in the right place at the right time to get the best photograph. The more information I had, combined with my personal experience, the better prepared I was to take an impactful photograph. My journey to learning analytics follows the same path of asking questions and finding the right tools.
When I started working in Learning and Development as an instructional designer, I was always curious about what learners were going to do with the training on the job. Oftentimes, the SME would respond that the new knowledge would simply change behavior on the job. I guess I am a little cynical about the magic of training. Just wave the magic wand, attend the training, view the WBT, and your problems will be solved. I did not know the questions to ask to ensure that the training would be applied on the job, but my leaders noticed that I was curious and liked to ask questions. They asked whether I would like to be a performance consultant. After they told me what a performance consultant does, I said that it sounded great. Who wouldn’t want to solve business and performance problems with a series of interventions?
It was during my time as a performance consultant that I learned the right questions to ask to get to outcomes, and, in turn, I became fascinated with metrics. My favorite questions are still these: Can you tell me more about the problem? What have you already tried to solve it? What would it look like once the problem is solved? What metrics or data do you have that show there is a problem?
I became data driven: find the causes of problems, then track the solutions to see whether we were moving the needle. The tools used to find the root cause of a problem are the same tools used to see whether training is being applied on the job. I use interviews, focus groups, observations, checklists, and surveys to find out what is causing a problem, and then I use the same tools to find out what is happening after training and whether it is making an impact on business outcomes.
I would say that learning analytics and photography are similar in that you need to plan with the end in mind to collect the right information in order to tell a story and make an impact.
By Scott Weersing