We all want to make an impact on business outcomes with our learning programs, and we know big data is within our reach. But how do we choose the right data? It starts with developing the right questions and properly sorting the data so that our learning programs make an impact.
During a recent webinar hosted by Training Industry, I shared examples of how different Learning and Development organizations have harnessed the power of big data to prove, improve, and predict the impact of the learner experience and learning programs.
If you missed the webinar, a recording is now available for you to watch online. However, if you are looking for the abbreviated version, I want to offer a quick look at some of the key takeaways:
- Remember the learner experience and the voice of the learner.
- Ask the right questions to guide your data collection.
- Focus on efficiency, effectiveness, and outcomes to show the impact of learning.
During the presentation, several great questions came up from the audience, and I want to share them with you. Below are those questions and my best answers. This is an ongoing conversation, and I encourage you to keep the questions coming in via the comments section at the bottom of this page.
Q: Can you explain more about xAPI?
A: xAPI (the Experience API) records learning activity as statements made up of an actor, a verb, and an object. For example, John (actor) liked (verb) Megan’s post (object). We can use xAPI to capture learner activity across the learning environment and then query the data to answer questions like who is most active, which activity (verb) is most common, and which content (object) is used the most. We can also use this data to personalize learning based on what learners engage with.
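To make the actor-verb-object idea concrete, here is a minimal sketch in Python. The statements are simplified to plain triples (real xAPI statements are JSON with IRIs for verbs and activity IDs), and the names are illustrative, but the queries mirror the questions above: who is most active, which verb is most common, and which content is used the most.

```python
from collections import Counter

# Simplified xAPI-style statements: each is an actor/verb/object triple.
# Names and activities here are made up for illustration.
statements = [
    {"actor": "John",  "verb": "liked",     "object": "Megan's post"},
    {"actor": "Megan", "verb": "completed", "object": "Safety course"},
    {"actor": "John",  "verb": "completed", "object": "Safety course"},
    {"actor": "John",  "verb": "viewed",    "object": "Megan's post"},
    {"actor": "Anna",  "verb": "viewed",    "object": "Megan's post"},
]

# Who is the most active learner?
most_active = Counter(s["actor"] for s in statements).most_common(1)[0][0]

# Which activity (verb) is most common?
top_verb = Counter(s["verb"] for s in statements).most_common(1)[0][0]

# Which content (object) is used the most?
top_object = Counter(s["object"] for s in statements).most_common(1)[0][0]

print(most_active, top_verb, top_object)
# → John completed Megan's post
```

In a real deployment these statements would live in a learning record store (LRS), and the same counting queries would run against its reporting API instead of an in-memory list.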
Q: Do you have any suggestions on how to pitch the idea of becoming more data driven? On getting buy-in and resources from leadership?
A: I believe the pitch should start with showing the value of the data and the analytics. I would present the benefits of learning analytics as a future state. For example, imagine if we could show learning’s impact on goals and performance. Imagine if we had data to help us decide which learning programs to offer and which to discontinue. I think of becoming data-driven as climbing a mountain: with some effort and resources, we reach the top step by step. From the top we can see what is out there, and the trail behind us becomes the trend lines we use to show progress.
You can also partner with operations to become data driven. They are already using practices to collect data on the performance of their units, and you can pitch the idea of measuring the impact of learning programs using the data that they are already collecting. You can also partner with the quality department of your company to use the data from audits to determine the impact of learning programs.
Q: What would you recommend for further reading or sources for deciding how to prove training is effective?
A: I recommend the following books:
- Learning Analytics: Measurement Innovations to Support Employee Development by John Maddox
- Developing Human Capital: Using Analytics to Plan and Optimize Your Learning and Development Investments by Gene Pease, Bonnie Beresford, and Lew Walker
- The Business Case for Learning: Using Design Thinking to Deliver Business Results and Increase the Investment in Talent Development by Patti Pulliam Phillips and Jack J. Phillips
I also recommend the Center for Talent Reporting’s TDRp Measures Library.
Q: I really like what you said about being selective with what we measure. Can you restate the importance of not measuring everything and speaking to the importance of measuring the impact of learning solutions that are expensive, highly visible, and tied to strategic business initiatives and goals?
A: We have limited resources and time to collect and sort through the data that is out there. So we need to establish criteria for what we are going to focus our efforts on. I would recommend evaluating programs that are costly, visible, and strategic. You will need to identify which learning programs are the costliest to deliver, which programs are seen by the most people, and which programs are aligned to business goals.
Q: How would you tell senior leaders/business partners that developing analytics is a long-term process when they’re looking for results/read-outs close to real time? Are short term analytics true indicators?
A: Developing analytics takes time because you need experience in consulting, collecting the right data, sorting it, and then developing the story to go along with it. While we can now collect data in real time, we have to remember that a real-time measure is only one point in time. We live in an ever-changing world, and even when a measure looks good at one point in time, we need time to see whether it stays consistent before we know our programs are making an impact.
Q: What business intelligence tools help with gathering data? I’ve used Tableau but am curious about other programs.
A: Tableau is great for communicating the answers to your questions. The key is using the right tools to collect the data in the first place. You can use surveys, observation checklists, tests, and focus groups to collect data. There are tools that help with sending out and collecting surveys, including Metrics That Matter and forMetris, and SurveyMonkey also works well. Excel is a good tool for analyzing data. Unfortunately, there is no magic black box where you can enter all your data and be told what is and is not working in your learning programs. Instead, you need to form good questions, which will help you select the right business intelligence and data collection tools.
Q: When you work in sales training, the Holy Grail for analytics is to show that your training increased sales, but there are many more variables that affect sales. Do you have any recommendations for looking at different KPIs for proving a program’s value?
A: The key to linking learning programs to business outcomes is to build a causal chain. If the goal is to increase sales, then we are looking for outputs or behaviors that lead to increased sales. For example, it could be other data like leads generated, the number of presentations to clients, or follow-ups on leads. We could also track other data such as product knowledge, how well the learner answers questions in a role play, and time to follow up on a lead.
Q: What are your preferred or best-practice metrics and/or collection data to be implemented in the management area and also in a classroom?
A: In the classroom, I would recommend having learners complete an online survey at the end of class. An online survey is better than a paper survey because no time is spent transcribing the data, and completing it in class is better than waiting until learners return to the job, where distractions can prevent completion. There are several ways to collect data in the management area. One is to ask managers’ direct reports to provide feedback on their performance via a 360-degree survey. Another is to compare managers using the on-the-job production and quality data of the people who report to them.
Q: If we don’t have an LMS for content but want to get the type of data derived from xAPI, what other choices do we have?
A: You can create eLearning in Articulate and host it on a website or a portal rather than an LMS. With some simple programming, you can then have the course emit xAPI data that is stored in an LRS (learning record store) or even fed into an Excel spreadsheet.
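As a rough sketch of what "simple programming" can look like, the Python below assembles a minimal xAPI statement and shows how it would be posted to an LRS over HTTP. The endpoint URL, course ID, and credentials are placeholders you would replace with your LRS's values; the statement fields and the `X-Experience-API-Version` header follow the xAPI specification.

```python
import json
import urllib.request

# Placeholder endpoint -- substitute your LRS's statements URL.
LRS_ENDPOINT = "https://example-lrs.com/xapi/statements"

def build_statement(email, verb_id, verb_name, activity_id, activity_name):
    """Assemble a minimal xAPI statement (actor, verb, object)."""
    return {
        "actor": {"mbox": f"mailto:{email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

def send_statement(statement, auth_header):
    """POST the statement to the LRS (not executed in this sketch)."""
    req = urllib.request.Request(
        LRS_ENDPOINT,
        data=json.dumps(statement).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",
            "Authorization": auth_header,  # credentials issued by your LRS
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example statement: "John completed Safety 101" (illustrative IDs).
statement = build_statement(
    "john@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/courses/safety-101", "Safety 101",
)
```

If you skip the LRS entirely, the same `statement` dictionaries can simply be appended to a CSV file and opened in Excel for analysis.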
Q: What data can we use in a nonprofit government environment?
A: You can look for data in three areas: capacity, activity, and impact. These three areas enable a nonprofit to measure its success in mobilizing its resources, its staff’s effectiveness on the job, and its progress in fulfilling its mission. The key is to determine what good looks like and then find data that shows how close each area of the organization is to the goal.
Q: What’s the biggest challenge you’ve seen in getting learning professionals to change their mindset/behaviors to support learning analytics?
A: The biggest challenge is that learning professionals want to help their business partners by delivering creative, fun, and innovative learning solutions that use the newest technology. Focused on bright and shiny objects, they can be distracted from talking with their partners about business outcomes and problem solving. Learning professionals need to see themselves as problem solvers and consultants rather than learning artists. The mindset can begin to change just by asking, “What problem will this solution solve?” and “How will we know we have solved the problem?” (“What is the data?”)
Q: Can you provide the title again of the second resource you mentioned regarding CTR?
A: That was the Center for Talent Reporting’s TDRp Measures Library, mentioned above.
Way back when I was a newspaper photographer, I really wanted to know the who, what, when, where, and why of the story I was assigned to. I loved finding out more information so I could be in the right place at the right time to get the best photograph. The more information I had, combined with personal experience, the better prepared I was to take an impactful photograph. My journey to learning analytics follows the same path of asking questions and finding the right tools.
When I started working in Learning and Development as an instructional designer, I was always curious about what the learners were going to do with the training on the job. Often, the SME would respond that the new knowledge would simply change behavior on the job. I guess I am a little cynical about the magic of training: just wave the magic wand, attend the training, view the WBT, and your problems will be solved. I did not know the questions to ask to ensure that the training would be applied on the job, but my leaders noticed that I was curious and liked to ask questions. They asked me whether I would like to be a performance consultant. After they told me what a performance consultant does, I said it sounded great. Who wouldn’t want to solve business and performance problems with a series of interventions?
It was during my time as a performance consultant that I learned the right questions to ask to get to outcomes, and, in turn, I became fascinated with metrics. My favorite questions are still these: Can you tell me more about the problem? What have you already tried to solve it? What would things look like after the problem is solved? What metrics or data do you have that show there is a problem?
I became data driven to find the causes of problems and then track the solutions to see if we were moving the needle. The tools to find the root cause of a problem are the same tools to see whether the training is being applied on the job. I use interviews, focus groups, observations, checklists, and surveys to find out what is causing a problem, and then I use the same tools to find out what is happening after training and, in turn, making an impact on business outcomes.
I would say that learning analytics and photography are similar in that you need to plan with the end in mind to collect the right information in order to tell a story and make an impact.
Latest posts by Scott Weersing:
- Evaluation Isn’t Hard When You Focus on Outcomes - September 25, 2018
- Don’t Assume Younger Learners Prefer to Learn Differently - June 5, 2018
- How Virtual Reality Can Make an Impact on Performance - May 24, 2018