We know the practices to measure formal learning, but what about measuring and evaluating informal learning?
Formal learning is often sequential, linear, or compliance training delivered in the classroom, the virtual classroom, or web-based eLearning. We use an LMS to schedule and track completions.
But what about informal learning? I define informal learning as everything else, which includes micro-learning, performance support, social learning, peer-to-peer learning, and even learning through a handheld or mobile device.
Then there are blended approaches. Is a MOOC formal or informal learning? Is gamification formal or informal learning? They are methods that can be used in both.
Whether you're using a blended or informal approach, here are a few guidelines for measuring learning:
1. Categorize your learning.
2. Outline the questions you want answered:
- What is the impact of the learning? You have to break this question down into smaller, bite-size questions:
- Who is using it?
- What is being shared?
- What is the learning contributing?
- How is the learning applied on the job?
- What are the performance, promotion and retention goals?
3. Choose an informal learning strategy or program to evaluate.
4. Decide what tools you can use to collect data.
5. Plan for setbacks; you may have to choose another program to evaluate, one that can provide the right data.
6. After reviewing the data, map the insights back to your initial questions. Did you get the appropriate answers?
7. Use the intelligence gathered to improve your informal learning programs.
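The steps above can be sketched in code. This is a minimal, hypothetical illustration, not a real tool: the data sources, names, and records are all stand-ins for whatever systems and collection methods your organization actually uses.

```python
# Hypothetical sketch of steps 2-7: pair each evaluation question with a
# data source, reduce the raw records to counts, and map the results back
# to the original questions. All names and data here are illustrative.
from collections import Counter

# Step 2: the questions we want answered, keyed to a data source (step 4)
questions = {
    "Who is using it?": "login_events",
    "What is being shared?": "forum_posts",
    "How is it applied on the job?": "manager_checklists",
}

# Step 4: stand-in records pulled from each (hypothetical) tool
collected = {
    "login_events": ["ana", "ben", "ana", "cho"],
    "forum_posts": ["job aid", "checklist", "job aid"],
    "manager_checklists": ["applied", "applied", "not yet"],
}

def summarize(source):
    """Reduce raw records from one source to counts we can interpret."""
    return Counter(collected[source])

# Step 6: map each insight back to the question it answers
insights = {q: summarize(src) for q, src in questions.items()}

# Step 7: review the answers and look for gaps to improve the program
for question, counts in insights.items():
    print(question, dict(counts))
```

The point of the sketch is the mapping itself: every piece of data you collect should trace back to one of the questions you outlined up front, so nothing is gathered without a reason.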
Way back when I was a newspaper photographer, I really wanted to know the who, what, when, where, and why about the story I was assigned to. I loved to find out more information so I could be in the right place at the right time in order to get the best photograph. The more information I had, along with personal experience, prepared me to take an impactful photograph. My journey to learning analytics follows the same path of asking questions and finding the right tools.
When I started working in Learning and Development as an instructional designer, I was always curious about what the learners were going to do with the training on the job. Oftentimes, I would get a response from the SME that the new knowledge would just change behavior on the job. I guess I am a little cynical about the magic of training. Just wave the magic wand, attend the training, view the WBT, and your problems will be solved. I did not know the questions to ask to ensure that the training would be applied on the job, but my leaders noticed that I was curious and liked to ask questions. They asked whether I would like to be a performance consultant. After they told me what a performance consultant does, I said that it sounded great. Who wouldn't want to solve business and performance problems with a series of interventions?
It was during my time as a performance consultant that I learned the right questions to ask to get to outcomes, and, in turn, I became fascinated with metrics. My favorite questions are still as follows: Can you tell me more about the problem? What have you already tried to solve the problem? What would it look like after this problem is solved? What metrics or data do you have that show there is a problem?
I became data-driven to find the causes of problems and then track the solutions to see if we were moving the needle. The tools for finding the root cause of a problem are the same tools for seeing whether the training is being applied on the job. I use interviews, focus groups, observations, checklists, and surveys to find out what is causing a problem, and then I use the same tools to find out what is happening after training and whether it is making an impact on business outcomes.
I would say that learning analytics and photography are similar in that you need to plan with the end in mind to collect the right information in order to tell a story and make an impact.