Features of the Open Source Learning Locker® LRS

Learning Locker® Version 2

With the release of OS Version 2.0, we’ve rebuilt the platform from the ground up, rewriting our core codebase to deprecate PHP in favour of NodeJS and bring the Open Source Edition in line with Learning Locker® Enterprise.

Using Node’s event-driven model and huge ecosystem of libraries, Learning Locker® is now in a much better position to grow, both in how much data it can handle and in how quickly we can iterate new versions – with faster speeds for everyone.

Key Improvements In Learning Locker® Open Source V2

The new version brings a number of key improvements, alongside a completely overhauled UI.

Customized Dashboard Builder

A complete visualisation tool to draw a wide range of graphs, with customized series and axes, all of which are embeddable on shareable dashboards.

Personas

Multiple actors for a single person can be grouped under a single persona; available in both the UI and on each statement, for easy querying and reporting.
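As a sketch of the idea (the structure below is illustrative, not Learning Locker®’s internal schema), grouping several xAPI actor identities under one persona might look like this:

```python
# Illustrative sketch: grouping multiple xAPI actor identifiers under one
# persona. One person may appear as an mbox actor in one system and as an
# account actor in another.

actors = [
    {"mbox": "mailto:jo@example.com", "name": "Jo Smith"},
    {"account": {"homePage": "https://lms.example.com", "name": "jsmith42"}},
]

# A persona collects every identifier known to belong to the same person,
# so reports can query one key instead of each raw actor.
persona = {
    "name": "Jo Smith",
    "identifiers": actors,
}

def belongs_to(statement_actor, persona):
    """Return True if a statement's actor matches any identifier in the persona."""
    return statement_actor in persona["identifiers"]

print(belongs_to({"mbox": "mailto:jo@example.com", "name": "Jo Smith"}, persona))  # True
```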

Query Builder

Learning Locker® maintains a database of all query-able values it has ever encountered; this is used to make searching for statements really easy.
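The idea can be sketched as an index of every distinct value seen, built up as statements arrive (the names here are illustrative, not Learning Locker®’s implementation):

```python
from collections import defaultdict

# Illustrative sketch: record each query-able value the first time it is
# seen, so the UI can offer autocomplete instead of scanning every stored
# statement at query time.

seen_values = defaultdict(set)

def index_statement(statement):
    # Capture the verb and activity identifiers from an incoming statement.
    seen_values["verb"].add(statement["verb"]["id"])
    seen_values["activity"].add(statement["object"]["id"])
```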

Faster xAPI Processing

Through our experience of hosting some of the world’s largest Learning Record Stores (such as that used by Jisc), we have made significant speed improvements across all xAPI processing.

Statement Forwarding

Built-in statement forwarding (which includes filters) allows you to send your data to another LRS, or to a 3rd party service, in the style of a WebHook.
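A minimal sketch of the filter-then-forward idea, assuming a hypothetical receiving URL and a simple verb filter (this is not Learning Locker®’s own configuration format):

```python
import json
import urllib.request

# Illustrative sketch: match statements against a filter, then POST them on
# to a third-party endpoint, webhook-style. The URL below is a placeholder.

FORWARD_URL = "https://example.com/hooks/xapi"  # hypothetical receiver

def matches(statement, verb_filter):
    """Only statements whose verb is in the filter set get forwarded."""
    return statement["verb"]["id"] in verb_filter

def forward(statement):
    req = urllib.request.Request(
        FORWARD_URL,
        data=json.dumps(statement).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)  # deliver the statement downstream
```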

Multi-Tenant, Multi-LRS

Data has been restructured into Organizations and Stores, which enables finer control and reporting across multiple stores within a multi-tenant environment.

All Of The APIs

Everything in the UI is driven by Learning Locker®’s own APIs. If you can do it in the UI, you can do it outside the UI too – amazing news for integration with your own products.
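For example, a plain HTTP client can store a statement directly against the xAPI interface. The host and credentials below are placeholders; the /statements resource and version header come from the xAPI specification:

```python
import base64
import json
import urllib.request

# Illustrative sketch: storing an xAPI statement over plain HTTP.
LRS = "https://lrs.example.com/data/xAPI"     # placeholder host
KEY, SECRET = "client_key", "client_secret"   # placeholder Basic auth pair

statement = {
    "actor": {"mbox": "mailto:jo@example.com", "name": "Jo Smith"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-GB": "completed"}},
    "object": {"id": "https://lms.example.com/course/101"},
}

def post_statement(stmt):
    auth = base64.b64encode(f"{KEY}:{SECRET}".encode()).decode()
    req = urllib.request.Request(
        f"{LRS}/statements",
        data=json.dumps(stmt).encode(),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",  # required by the xAPI spec
            "Authorization": f"Basic {auth}",
        },
    )
    return urllib.request.urlopen(req)  # response body holds the stored id(s)
```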

Automated Installation

A cross-platform installation script is now available to automate the installation of Learning Locker®, its dependencies and the server configuration required to get up and running.

Learning Locker® is 100% xAPI Conformant

When choosing a new Learning Record Store, one of the most important considerations for many is whether or not it meets ADL’s xAPI Conformance Requirements.

ADL’s LRS Test Suite evaluates whether an LRS correctly implements the mandatory xAPI server-side requirements by automating HTTP requests to an LRS and evaluating its responses. The Test Suite evaluates over 1,300 LRS testing requirements, derived from the xAPI specification as well as community input from the Policies and Procedures for Conformance Testing Group.

Learning Locker® is now 100% conformant with the ADL’s requirements. This goes for both our current Enterprise and Open Source Editions, and you’ll now find Learning Locker® listed on ADL’s ‘Conformant LRSs’ Registry.

Frequently Asked Questions About The Learning Locker® Open Source LRS

Often, the first decision you’ll need to make is whether you want to host your own LRS, or use an online service.

An online service will be quicker to set up, probably cheaper in the short term (unless your labour cost is zero) and will be tried and tested. However, you will need to be comfortable with data storage/ownership responsibilities, and with the medium-to-long-term costs of continually paying for a service.

It’s really easy to imagine that little bits of JSON containing xAPI statements won’t add up to much data. And, in the singular, you’d be right.

Most xAPI statements are around 2KB in storage size (though, we have seen statements 10x this size!), meaning you’ll be able to store around 500,000 statements per GB; equivalent to 1,000 learners generating 500 xAPI statements per month.

But, if you are making a million xAPI statements a day, this is going to add up to GBs of data in a matter of moments.
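The back-of-envelope arithmetic behind those figures:

```python
# Storage arithmetic from the figures above: ~2KB per statement.
STATEMENT_KB = 2
statements_per_gb = (1024 * 1024) // STATEMENT_KB
print(statements_per_gb)       # 524288, i.e. roughly 500,000 per GB

# 1,000 learners x 500 statements/month fills about one GB per month.
monthly = 1_000 * 500
print(monthly)                 # 500000

# A million statements a day fills a GB in roughly half a day.
gb_per_day = 1_000_000 / statements_per_gb
print(round(gb_per_day, 2))    # 1.91 GB per day
```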

Whilst cloud-based storage is generally very cheap, the technology required to run an LRS at scale does not run well on very basic equipment. (For example, the SaaS version of Learning Locker® actually runs on 12 virtual machines at a minimum).

There is a trade-off between the quantity of data you collect and the level of detail you require in your xAPI statements. Storing more than you absolutely need can be wasteful.

Don’t forget archiving…

Because xAPI statements are immutable (you can’t edit or delete a stored statement), your data set is only going to grow. To keep costs down and keep servers running efficiently, you’ll need to develop a process for archiving old data. If you don’t, the LRS is going to get pretty big in years 2, 3, 4…
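An archiving pass can be as simple as partitioning statements on a retention cutoff; the function below is an illustrative sketch, not a Learning Locker® feature:

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch: split statements into a live set and an archive set
# based on a retention window, so old data can move to cheaper cold storage.

RETENTION_DAYS = 365

def partition(statements, now=None):
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    keep, archive = [], []
    for s in statements:
        ts = datetime.fromisoformat(s["timestamp"])
        (keep if ts >= cutoff else archive).append(s)
    return keep, archive
```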

For reference, Learning Locker® SaaS LRS is priced per GB of stored data, so you pay $115 for each GB up to 10GB, after which the price per GB decreases. This model allows you to start with a hosted LRS and a modest budget whilst you see how much you use. You can always switch to an on-premise deployment later if cost becomes a factor.

Data security remains a hot topic and your Learning Record Store is no exception. You should consider how your xAPI data is secured whilst in transit to the LRS, and how it is secured at rest in the database.

There is no excuse not to use SSL whilst sending xAPI data to the LRS, and most cloud providers (including Learning Locker®) will offer you encryption at rest.

When implementing xAPI, you should become hyper-aware of your organization’s data protection and privacy policies, and of the geographic differences that might apply.

For example, does your organization require data to be stored in a particular geographic location? Or, perhaps more likely, are there particular areas of the world you are required to avoid storing your data-at-rest?

The LRS quickly becomes a key part of your infrastructure. If you are storing all organizational learning data on it, then you really can’t afford for it to go down, or worse, to lose data.

The killer question for redundancy and backup is always: how much data can I afford to lose?

Of course, the preferred answer is ‘none’, but that tends to be unrealistic in the face of cost/benefit analysis.

In the worst-case scenario, how much data could I afford to lose and still hope to recover normal operating practice?

For many circumstances, putting in a failover mechanism and doing off-site daily backups is enough. But in high-risk or testing environments, even that might not be enough.

How can I do backups more than once a day? How can I achieve this without breaking the bank?

If you’re self-hosting, backups more than once a day are achievable. If you’re on cloud hosting, you can ask us to run backups more than once a day. To reduce costs, we recommend deleting old backups that are no longer needed.
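A minimal sketch of a backup retention policy, assuming date-stamped backup files (the directory layout and naming convention here are hypothetical):

```python
from pathlib import Path

# Illustrative sketch: keep only the most recent N backup files to control
# storage costs. Files named backup-YYYYMMDD.gz sort oldest-first by name.

def prune_backups(backup_dir, keep=7):
    backups = sorted(Path(backup_dir).glob("backup-*.gz"))
    for old in backups[:-keep]:
        old.unlink()  # delete everything except the newest `keep` files
    return [p.name for p in backups[-keep:]]
```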