The push for a new learning management system (LMS), or for an upgrade to the existing one, usually stems from an enterprise need to assign training at larger volumes, or to design and develop more complex and comprehensive training programs. The larger and more complex an organization’s training program, the larger and more varied the volume of training data it generates. Analyzed correctly and at a pre-set frequency (via LMS reports), this data can reveal the effectiveness of the training program, learner engagement, trends, requests, and gaps.

Here are some of the ways in which LMS data can be analyzed into reports. Not every LMS on the market offers these reports as default features. However, the data needed to create them is always generated as a by-product of using the LMS; it is just a question of retrieving that data and arranging and analyzing it into reports according to pre-set formulae.

So, if you identify the need for one or more of these reports, discuss with your LMS vendor whether it can be built from the available data.

1. Current Training Report: An individual learner can use this report to see details of their currently assigned training.

2. Training History Report: An individual learner can use this report to see details of training they have already completed.

3. Team Training Assignment Report: A snapshot of team training status for managers. The following information is shown: User ID, Name, E-mail ID, Learning Category, Training Delivery Method, Training Topic/Training Item, Learning Type, Score (if the learning type is an assessment), Completion Status, Completed on Time? (Y/N), Language in which the training was completed, Assigned/Registered on, Classroom Start & End Date, Suggested Training Duration & Actual Time Spent, User’s Region, User’s Country, User’s Cost Center, etc.

4. E-Test Report: Comparative data on e-test completion status across regions/countries/org units/cost centers: total number of learners registered, passed, yet to pass, and % passed.
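The core of this report is a pass-rate roll-up over the raw completion records. A minimal sketch in Python, assuming the LMS exports one record per learner with a region and a pass flag (the field names `region` and `passed` are illustrative, not a specific LMS schema):

```python
from collections import defaultdict

def etest_pass_rates(records):
    """Aggregate e-test results into per-region pass statistics.

    Each record is a dict with illustrative keys 'region' and 'passed'.
    Returns {region: {'registered': n, 'passed': n, 'pct_passed': float}}.
    """
    stats = defaultdict(lambda: {"registered": 0, "passed": 0})
    for rec in records:
        s = stats[rec["region"]]
        s["registered"] += 1
        if rec["passed"]:
            s["passed"] += 1
    for s in stats.values():
        s["pct_passed"] = round(100 * s["passed"] / s["registered"], 1)
    return dict(stats)

report = etest_pass_rates([
    {"region": "EMEA", "passed": True},
    {"region": "EMEA", "passed": False},
    {"region": "APAC", "passed": True},
])
# report["EMEA"] -> {'registered': 2, 'passed': 1, 'pct_passed': 50.0}
```

The same grouping key can be swapped for country, org unit, or cost center to produce each variant of the report.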

5. Catalog Overview Report: Shows the number of learning items per category at a given point in time.

6. Learner Training Time Report: Shows the amount of time spent by all assigned employees per category for each of the learning types: classroom sessions, e-courses, e-tests, documents, videos, and podcasts.

7. Catalog Level Training Time Report: Aggregates the same time-spent data at the catalog level, broken down by category and learning type. This data can be very useful for comparing learning metrics across locations, teams, job roles, etc. For example, certain countries might come across as learning champions because they always complete their training on time, have the highest scores, and so on. Similarly, there may be regions or teams that lag behind; in the latter case, this should be investigated further to see whether any organizational issues are contributing.
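The location comparison described above amounts to a grouped sum of time spent. A minimal sketch, assuming each LMS record carries a country, a learning type, and minutes spent (all field names illustrative):

```python
from collections import defaultdict

def time_by_location(records):
    """Total minutes spent, keyed by (country, learning_type).

    Each record is a dict with illustrative keys:
    'country', 'learning_type', 'minutes_spent'.
    """
    totals = defaultdict(int)
    for rec in records:
        totals[(rec["country"], rec["learning_type"])] += rec["minutes_spent"]
    return dict(totals)

totals = time_by_location([
    {"country": "DE", "learning_type": "e-course", "minutes_spent": 40},
    {"country": "DE", "learning_type": "e-course", "minutes_spent": 20},
    {"country": "IN", "learning_type": "video", "minutes_spent": 15},
])
# totals[("DE", "e-course")] -> 60
```

Replacing the country key with a team or job-role field yields the other comparisons mentioned above.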

8. User Data Report: Data dump of all registered users at a given time or for a date range.

9. Classroom Attendance Report: Shows overview information for all sessions of a single/several/all classroom courses in a given category. Data tracked: total % attended, total registered, total attended, and no-shows.
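The four tracked figures are related by simple arithmetic, which a report builder can compute from just the registered and attended counts. A minimal sketch (function and field names are illustrative):

```python
def attendance_summary(registered, attended):
    """Roll up classroom attendance for a session (or a sum of sessions).

    Derives no-shows and % attended from the two raw counts.
    """
    no_show = registered - attended
    pct = round(100 * attended / registered, 1) if registered else 0.0
    return {
        "registered": registered,
        "attended": attended,
        "no_show": no_show,
        "pct_attended": pct,
    }

summary = attendance_summary(registered=40, attended=34)
# summary -> {'registered': 40, 'attended': 34, 'no_show': 6, 'pct_attended': 85.0}
```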

10. Session Attendance Report: Follow-up report for a particular session of a particular classroom course.

11. Global Sessions Report: Overview information of ALL sessions held globally for a given classroom course.

12. Hall of Fame Report: At the end of a pre-set interval (e.g., the end of each quarter), global admins can pull a report with data for that quarter on the following:

  • Top 10 training topics/training items
  • Employees with the highest % of completed training items
  • Star Learners (employees with a completion % between 88 and 100; thresholds can be customized)
  • Learning Champions (average assessment score between 88 and 100)
  • Star regions (regions ranked in order of % training completion)
  • Star countries (all countries with an average completion status between 70% and 100%)
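Rankings like these can all be derived from the same completion data with a filter and a sort. A minimal Python sketch of the "Star Learners" selection, with the customizable thresholds mentioned above passed as parameters (field names are illustrative):

```python
def hall_of_fame(employees, star_min=88, star_max=100):
    """Select 'Star Learners': completion % within a configurable band.

    employees: list of dicts with illustrative keys 'name' and
    'completion_pct'. Returns names sorted by completion %, highest first.
    """
    stars = [
        e for e in employees
        if star_min <= e["completion_pct"] <= star_max
    ]
    stars.sort(key=lambda e: e["completion_pct"], reverse=True)
    return [e["name"] for e in stars]

names = hall_of_fame([
    {"name": "Asha", "completion_pct": 95},
    {"name": "Ben", "completion_pct": 72},
    {"name": "Chen", "completion_pct": 100},
])
# names -> ['Chen', 'Asha']
```

Swapping the metric for average assessment score, or grouping by region or country before ranking, produces the other leaderboards in the list.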

13. Survey Scorecard: The rating learners give a specific course. This can be a numeric scorecard, the number of ‘Likes’ per course, etc.

14. Survey Response Report: This is qualitative feedback provided for a specific course. Usually, this data is collected at the end of a classroom course.

15. Instructor Ratings: The rating learners give the instructor of a specific classroom session.

16. Number, Frequency, and Average Duration of Visits: These can be measured per user and then aggregated to gauge the workforce’s engagement with the LMS.
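The per-user figures come straight from the session log. A minimal sketch that computes visit count and average duration per user (frequency additionally needs a date range, omitted here for brevity; field names are illustrative):

```python
from statistics import mean

def visit_metrics(sessions):
    """Per-user LMS engagement from session logs.

    sessions: list of (user_id, duration_minutes) tuples, one per visit.
    Returns {user_id: {'visits': n, 'avg_duration': minutes}}.
    """
    by_user = {}
    for user, minutes in sessions:
        by_user.setdefault(user, []).append(minutes)
    return {
        u: {"visits": len(d), "avg_duration": round(mean(d), 1)}
        for u, d in by_user.items()
    }

metrics = visit_metrics([("u1", 10), ("u1", 20), ("u2", 5)])
# metrics["u1"] -> {'visits': 2, 'avg_duration': 15}
```

Averaging these per-user figures across the workforce gives the aggregate engagement measure described above.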

17. Complaint Logs: It is important to collect feedback to identify gaps in the training program: for example, requests for training on new topics, or for training to be made available in local languages.

18. Technical Logs: These logs capture technical details that are important for auditing purposes, for example: server outage time, server speed, number of concurrent users supported, peak times, number of downloads, virus attacks (if any), the geolocation of each log-in, etc.

19. Data Synchronization Report: Oftentimes, the LMS is closely synchronized with other applications used by the organization, e.g., the e-mail system, the compensation/benefits database, the HR database, etc. The synchronization report shows how well data is kept in sync across these applications.

20. Job Impact: Ultimately, this data can measure the effect of training on individual performance. If an employee is promoted over another, the training data for both employees can be compared to see whether there is any relationship (correlation, not necessarily causation) between training completed, training scores, and performance improvement (i.e., the promotion).
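The comparison described above can be as simple as a side-by-side delta of the two employees' training metrics. A minimal sketch, with illustrative field names; as noted above, the output indicates correlation only, not causation:

```python
def training_comparison(emp_a, emp_b):
    """Side-by-side training metrics for two employees.

    Each argument is a dict with illustrative keys
    'completed_items' and 'avg_score'. Returns the deltas (a minus b).
    """
    return {
        "completed_delta": emp_a["completed_items"] - emp_b["completed_items"],
        "score_delta": round(emp_a["avg_score"] - emp_b["avg_score"], 1),
    }

delta = training_comparison(
    {"completed_items": 12, "avg_score": 91.0},  # promoted employee
    {"completed_items": 9, "avg_score": 84.5},   # peer
)
# delta -> {'completed_delta': 3, 'score_delta': 6.5}
```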