SAP Data Services

How SAP’s Data Services Can Enable Your Learning Integrations

Where’s my learning data?

The push to SAP’s SaaS Learning module has been successful for nearly everyone, but it does come with some downsides. By design (and to support the stability of the platform), the SAP suite doesn’t make bulk exports of large amounts of data easy, or even possible in some cases. There are common and practical reasons for this in a shared computing environment like the SAP cloud, but knowing the reasons doesn’t solve the challenges.

Out-of-box solutions provide options but they all have their drawbacks:

  1. The APIs allow users to retrieve data in a manner designed for integration (one or two records at a time, in response to specific API “questions”), not for bulk data export. They are designed to answer questions like “show me this student’s learning plan,” not questions like “show me all the students that have this item on their learning plan.”
  2. Some tools that are presently available and are candidates for a bulk export, like Integration Center, do not yet support Learning data. While adding this data source is on the roadmap, the date is not yet firm.
  3. Some of the previous “cheat” methods, like using a custom report, are fraught with risk, as SAP will be sunsetting the PRD tool sometime soon. Even if you ignore the risk of tool removal, reports can be clunky, slow, and fragile.

All of this makes it extraordinarily difficult to feed a hungry business intelligence tool, drive Learning-triggered integration events, or extract a large portion of the data in any meaningful way, since none of these are presently supported by the SAP Learning API.

Enter data services

SAP has a separate offering that can satisfy the need for large-scale data output from the Learning module: data services. Rather than trying to feed the data through a small pipe (like the APIs), data services delivers flat-file exports of the table deltas from the Learning module data model. The data is delivered via Secure File Transfer Protocol (SFTP). This allows far more flexibility in the size of the delivered data and puts significantly less strain on the Learning module.

While you get access to a significantly larger volume of data, the trade-off is that the client end is responsible for more of the effort. There needs to be a place for the data to land, and the client must build and maintain the process to consume the data in a meaningful manner. There is no “pre-digestion” of the data: you get raw flat export files that you must manage yourself. With this responsibility, though, comes ultimate flexibility. Even though you have to do everything to manage the incoming data, you can do whatever you need to make it fit your consumption needs.

Important bits about Learning data services:

  • The tables are bundled into three groupings (SAP calls them “packages”). Package A contains the most commonly used tables, Package B the somewhat lesser-used tables, and Package C the least commonly used. Pricing is based on the packages you select.
  • This is an out-of-box offering from SAP. There is no real customization possible to the offering except the ability to select the table packages.

What do I get?

Once you’ve signed up, you will receive these files via SFTP delivery:

  • A full extract of all requested tables in your selected packages. This will allow you to “prime the pump” on your receiving side to have a set of base data to work with.
  • A set of delta files (typically delivered nightly) that are an extract of the data rows produced for each table in your selected packages over the course of the delta period.
  • A set of key files (also typically delivered nightly) containing just the columns that comprise the primary key of each table in your packages. These allow you to detect deletions, because the delta files do not include action flags, so deletions would not appear there.
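Because the delta files carry no delete flags, deletion detection boils down to a set difference between the primary keys you hold locally and the keys listed in the nightly key file. Here is a minimal Python sketch of that comparison; the sample keys and CSV layout are invented for illustration and are not the actual SAP file format:

```python
import csv
import io

def detect_deletions(key_file_text, known_keys):
    """Return primary keys present locally but absent from the key file.

    key_file_text: contents of a nightly key file (assumed CSV here).
    known_keys: set of primary-key tuples already held in your store.
    Any local key missing from the key file was deleted at the source,
    since delta files carry no delete action flags.
    """
    source_keys = {tuple(row) for row in csv.reader(io.StringIO(key_file_text)) if row}
    return known_keys - source_keys

# Hypothetical example: three enrollments known locally, but tonight's
# key file lists only two, so one was deleted upstream.
key_file = "STUD001,ITEM-100\nSTUD002,ITEM-100\n"
local = {("STUD001", "ITEM-100"), ("STUD002", "ITEM-100"), ("STUD003", "ITEM-100")}
print(detect_deletions(key_file, local))  # {('STUD003', 'ITEM-100')}
```

The same pass that computes this difference can issue the corresponding deletes against your local store before the next delta is applied.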

How does it work?

SAP will conduct a mini-project with you to ensure that the data is delivered properly and that your consumption process is working. While they do not help with the development of the receipt side of the process, they are available to answer questions about the delivery aspects. Once you have signed off on their delivery you are ready to go into production:

  1. You will request a full extract of the tables you have paid for by raising a ticket with SAP support.    
  2. Your SAP project team will enable nightly delta and key files for your feed from your production instance.
  3. Your data receipt process picks up the files from the SFTP delivery site and consumes them.
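Step 3, consuming the files, typically amounts to applying each delta row as an upsert keyed on the table’s primary key. A small Python sketch of that idea, assuming CSV-formatted delta files with a header row (the real layout is defined by SAP’s delivery specification, and the column names here are invented):

```python
import csv
import io

def apply_delta(table, delta_csv_text, key_cols):
    """Upsert delta rows into an in-memory table keyed on the primary key.

    Delta files contain the latest state of each changed row, so an
    insert-or-replace by key is enough; deletions are handled separately
    via the nightly key files.
    """
    for row in csv.DictReader(io.StringIO(delta_csv_text)):
        pk = tuple(row[c] for c in key_cols)
        table[pk] = row  # insert new rows, overwrite changed ones
    return table

# Hypothetical delta: one existing enrollment updated, one new row added.
table = {("STUD001",): {"stud_id": "STUD001", "status": "ENROLLED"}}
delta = "stud_id,status\nSTUD001,COMPLETE\nSTUD002,ENROLLED\n"
apply_delta(table, delta, ["stud_id"])
print(table[("STUD001",)]["status"])  # COMPLETE
```

In practice the “table” would be a database staging table rather than a dictionary, but the upsert-by-key pattern is the same.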

Why should I worry?

As always, there are risks to consider for any integration:

  • Deltas can get out of sync with your main data destination. This can be remedied in different ways depending on your consumption process and its implementation, though you would be wise to retain the delta files for a week or two so you can catch up if necessary. The emergency fix is to enter another support ticket for a full extract, which lets you reset your baseline.
  • Remember, your data will be delivered exactly as it is from the database. If your data contains special characters or carriage returns it will be present in the extracts. Be sure your consumption process can handle them (and has robust error handling in general).
  • In that same vein, full extracts are delivered only on demand via ticket request. They are typically an exception (or at most an occasional occurrence) and usually need to be handled differently than your standard delta files.
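On the special-character point above: a field containing an embedded carriage return or a delimiter will survive parsing only if the extract quotes its fields and your parser honors that quoting. A short Python illustration with the standard csv module; the sample record is invented, and whether your extracts actually arrive quoted depends on the delivery format SAP configures for you:

```python
import csv
import io

# Invented raw extract line: the title field contains an embedded
# newline and a comma. With proper quoting, csv keeps it as one record
# instead of splitting it into two broken rows.
raw = 'STUD001,"Safety course, part 1\nSee notes",2024-01-15\n'

rows = list(csv.reader(io.StringIO(raw)))
print(len(rows))   # 1 -- still a single logical record
print(rows[0][1])  # the field retains its embedded newline
```

A naive `line.split(",")` over the same input would produce garbage, which is exactly the fragility the bullet above warns about.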

What’s next?

Wrangling the large amount of data your Learning module produces can be difficult with the normal out-of-box tools. Using the Learning data services offering can be a reasonable answer to data-starved integrations, reporting tools, and other interfaces.

Based on my experience, working with a preferred and experienced vendor will help make your transition to Learning data services much easier and will help you to avoid some of the pitfalls. If you are interested in learning more about Learning data services or have more questions, visit https://www.gpstrategies.com/ or leave a comment below.

About the Authors

Chris Olive

“Specialization is for insects.” – Robert Heinlein, Time Enough for Love

My parents were both technology-focused teachers when I was a kid, and my father was a high school math teacher, so in the ’80s he was the “computer teacher” too. This meant I was lucky enough to have an actual computer in our house, an Apple //e. I clearly remember my father bringing home the sample programming exercises (in BASIC!) for me to practice. From there, I moved on to entering the example programs from the back of computer magazines myself and tinkering with them. It was this that made me realize that the flexibility of the computer made it the ultimate tool to solve problems.

I also read a lot of science fiction when I was a kid, mostly the “classics” of the genre from the 1950s or so. I clearly remember Heinlein’s exhortation, and that the main characters were inevitably able to do just about anything they needed to do to survive with their wide-ranging knowledge of multiple subjects. They knew metallurgy, biology, survival skills, hunting, cooking, sewing, and more. I resolved to do the same. While I have not come close to mastery in all these skills, the notion of the “Renaissance Man” drove my early desire to learn as much as possible about as many things as possible. This led me to the Jesuits and their idea of “care for the whole person,” and eventually to my undergraduate education at Loyola College in Baltimore (now Loyola University Maryland).

Combining both technology and the humanities helped put perspective on how computers and other technology really are just tools, and that we are solving people problems even when we use technology to do so. That remains true even now with the clients I help. They are just trying to overcome an obstacle, and the method to fix it is immaterial. I happen to use technology as a tool much of the time, but there are human fixes that can be applied to human problems too.
At the bottom of every request there is a person trying to do their job. Sometimes they don’t really know what the source of their frustration is, which is why I always start by asking, “What is the problem you’re trying to solve?” I figured if it was a good enough starting place for Richard Feynman, it would be good enough for me. Once these problems are clearly outlined, they are solvable. I enjoy helping people and solving problems. These things all work together.