Working with scorm API locally
by sean callahan - Wednesday, 16 August 2017, 3:35 PM
 

Hi,

I'm attempting to set up a local test environment for getting SCORM data from spoor (via pipwerks) and sending it to an analytics platform. Here's what I've done so far: made sure spoor is installed and enabled with logging turned on, and run grunt server-scorm to launch the course in SCORM mode. The output is now displayed on scorm_test_harness.html and I can see SCORM data in the console. The wiki page for spoor says:

If you need to get access to the SCORM API in your own code, /js/scorm/wrapper.js exposes a number of functions for your use (via a singleton object called ScormWrapper). Requiring /js/scorm/scorm.js will return the instantiated wrapper for you.

I'm wondering what the best way to require scorm.js is. Is this something I should do with the scriptLoader.js file? Correct me if I'm wrong, but I assume that once the require is set up, that file will be in the build folder and I can link to it.

Once I have access to the wrapper, what is the recommended way to query and manipulate the data? Is this intended to be done from the scorm_test_harness.html page with JavaScript, or is there another file intended for this purpose?

I may also need to record custom data, depending on how clever I can be about associating and formatting the data Adapt already records, but that's a whole other topic. Just a basic explanation of how to work with SCORM data in a local test environment would be much appreciated.

Re: Working with scorm API locally
by Matt Leathes - Wednesday, 16 August 2017, 4:31 PM
 

Hi Sean

TBH that Wiki page is a bit out of date now. You might be better off using the offlineStorage API that was added to Adapt and implemented in spoor.

For example, if you wanted to store a bit of custom data in the suspend_data you would do something like:

Adapt.offlineStorage.set("propertyName", "propertyValue");

Similarly, to retrieve that data:

Adapt.offlineStorage.get("propertyName");

The only caveat is that you need to make sure you get/set only after spoor is ready to go; the 'app:dataReady' event should be good for most situations...
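
For example (rough sketch, untested):

Adapt.once('app:dataReady', function() {
  // safe to use offlineStorage from this point on
  Adapt.offlineStorage.set("propertyName", "propertyValue");
  console.log(Adapt.offlineStorage.get("propertyName"));
});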

For doing analytics tracking, you may not even need to go anywhere near spoor/SCORM - could be worth having a look at either https://github.com/KingsOnline/adapt-google-analytics or https://github.com/cgkineo/adapt-googleAnalytics

If you could give us a bit more detail on what you're trying to do we might be able to help more...

Re: Working with scorm API locally
by sean callahan - Thursday, 17 August 2017, 4:09 PM
 

Hi Matt,

Cool features, good to know! Basically, what I'm trying to do is get the number of articles in a course, the one the user is currently on, and how many have been completed out of the total. The plan is to package this data as JSON and pass it to Tealium for analytics via the Intellum LMS. I'm not yet sure whether the data has to be in a SCORM package for it to go through the LMS.

I was going to try to hook into whatever event fires when the location data is set at the block level and pass the data we want into JSON, while trying to be clever about associating the completion data with the block data. I know the SCORM data already includes some of the info we want, so I was also thinking I might be able to "steal" some of it. Unfortunately I haven't worked with Backbone.js or RequireJS much, so I'm still trying to get a basic grasp of how Adapt works at a deeper level.

I really don't want to modify any core functionality if I can avoid it. I'm just a little unclear as to where I should start. A few things I'm unsure about are:

 

- What's the best place to put this code within Adapt? Would it be better to make it an external script that gets required by Adapt, and if so, where should I add the require? Should this be a full-blown extension? Just a basic idea of where something like this is intended to go and how the scope would work would be great.

 

- If I do need to package this in SCORM, is there a way to add custom data? I thought I saw a function somewhere in the scormAPI.js file that might do this. If so, I'd still need to figure out how to access the SCORM API via pipwerks from wherever this script is running. I've seen components that use this code:

var scormWrapper = require('extensions/adapt-contrib-spoor/js/scorm/wrapper').getInstance();

I'm not clear on the scope, i.e. where this can be called from within Adapt, or whether it's possible to call it externally.

 

- If I do use the method you outlined above but need to package this data into SCORM, do I also need to configure Adapt to use the offline_API_wrapper.js file in the spoor extension for local testing?

 

I apologize for the wall of questions here. I'd be happy to contribute some documentation for some of this once the process becomes clear to me if that would be helpful to other developers.

 

Thanks,

Sean

Re: Working with scorm API locally
by Matt Leathes - Saturday, 19 August 2017, 12:46 AM
 

Hi Sean

I don't think the spoor extension is going to be of much use for what you're trying to do. Not only does it not track the data you're after (spoor looks at blocks completed, not articles), but the SCORM API doesn't really have anywhere to store that kind of data - not in a way that could be reported on, anyway.

I think your best bet is to create an extension to do all this. You might have a look at https://github.com/cgkineo/adapt-googleAnalytics/ to get an idea of what such an extension might look like.

Here's a code sample to get you started... the following will log something like '1 completed out of 4 total' every time an article is completed:

Adapt.articles.on('change:_isComplete', function() {
  var totalArticles = Adapt.articles.length;
  var completedArticles = Adapt.articles.where({_isComplete: true}).length;
  console.log(completedArticles, "completed out of", totalArticles, "total"); 
});
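
If it helps to see where that sort of listener would live, a bare-bones extension module might look roughly like this (a sketch only - the module path for Adapt differs between framework versions, e.g. 'coreJS/adapt' vs 'core/js/adapt', so check against your own codebase):

// e.g. src/extensions/adapt-myTracking/js/myTracking.js (the name is just an example)
define(['coreJS/adapt'], function(Adapt) {

  // wait until the course data is loaded before touching the collections
  Adapt.once('app:dataReady', function() {
    Adapt.articles.on('change:_isComplete', onArticleComplete);
  });

  function onArticleComplete() {
    var totalArticles = Adapt.articles.length;
    var completedArticles = Adapt.articles.where({_isComplete: true}).length;
    console.log(completedArticles, "completed out of", totalArticles, "total");
  }

});

You'd also need the usual plugin boilerplate (bower.json etc.) for the build to pick it up - the googleAnalytics repo is a good reference for that.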

Figuring out which article the user is currently on is a little trickier as you can have multiple articles to a page. Best to have a look in the bookmarking component as that essentially 'watches' the page to see what part(s) of it are currently in the viewport.
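
As a rough, framework-agnostic illustration of that 'watch the viewport' idea (this is not what bookmarking actually does - it uses Adapt's own in-view handling - and how you map an element back to its article model depends on the markup your version renders, so treat that part as a placeholder):

Adapt.once('pageView:ready', function() {
  // IntersectionObserver needs a reasonably modern browser
  var observer = new IntersectionObserver(function(entries) {
    entries.forEach(function(entry) {
      if (!entry.isIntersecting) return;
      // placeholder: read whatever id/data attribute your Adapt version puts on the article element
      console.log('article in view:', entry.target.id);
    });
  }, { threshold: 0.5 });

  $('.article').each(function() {
    observer.observe(this);
  });
});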

I'm away on holiday next week by the way but I'm sure if you get stuck others will help out if they can.

Re: Working with scorm API locally
by sean callahan - Tuesday, 22 August 2017, 3:28 PM
 

Great example, thanks! This will be very useful.

Re: Working with scorm API locally
by Vu Le - Saturday, 19 August 2017, 5:11 AM
 

Hi Sean, 

I'm working on Adapt courses that can send values to a Google Sheet through an LMS.

Re: Working with scorm API locally
by jPablo Caballero - Monday, 21 August 2017, 9:44 PM
 

Hi Sean,

Of course I don't know all the details of your needs, but you say you want to put some JSON data together and send it to an analytics platform (I'm not familiar with it; I guess it has an API). That sounds to me like you really need 'custom tracking' in Adapt, so I agree with Matt that SCORM is not going to help you (it's a very specific thing that works exclusively with LMSs). By the way, just to clarify: a SCORM package is about packaging the content, it's not about data.

That said, the Adapt extension adapt-trackingHub (disclaimer: I'm the author) was created precisely to provide an infrastructure that allows developers to create 'custom tracking' Adapt extensions. It provides the base functionality and coordinates the various 'trackingHub-compatible extensions' that may be active in a course. This framework has been used to create adapt-tkhub-xAPI, an xAPI extension for Adapt.

Getting to grips with the concepts behind this tracking ecosystem takes some reading (there's a Readme and a Wiki in the adapt-trackingHub repo).

I have created a 'starter' version of a trackingHub-compatible plugin to make it easier (hopefully) for developers to understand the structure of a custom tracking extension and to have something to get started with. It is called adapt-tkhub-starter. Its Readme file explains how to use it, and the code is heavily commented - there are detailed explanations throughout the main code of the extension (starterChannelHandler.js).

Again, this is just a starting point. Depending on the 'custom tracking' needed, there could be a fairly large amount of coding involved. The advantage is that this framework clearly defines which pieces of code need to be developed, and the 'integration' with Adapt is already done (the starter skeleton is a working Adapt extension; one just has to be careful not to break it when customising it).

What is absolutely necessary is to have a clear idea of the tracking you want to do, any APIs involved (e.g. if you send data to an analytics platform that provides an API you can send your JSON to, you will need to write the code that sends the data to that API - there's no way around it; that's why it's custom), where the course is deployed, how it is launched, how you are going to get the user's identity, and so on.
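
Just to make the 'send' part concrete: it often boils down to not much more than an HTTP request. Purely as an illustration (the endpoint and payload below are made up - your platform's API defines what it actually expects):

// made-up endpoint and payload, for illustration only
var payload = {
  course: Adapt.course.get('title'),
  totalArticles: Adapt.articles.length,
  completedArticles: Adapt.articles.where({_isComplete: true}).length
};

$.ajax({
  url: 'https://analytics.example.com/collect', // replace with your platform's real endpoint
  type: 'POST',
  contentType: 'application/json',
  data: JSON.stringify(payload)
});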

I hope it helps.

Kind Regards.

 

Re: Working with scorm API locally
by sean callahan - Tuesday, 22 August 2017, 3:27 PM
 

This looks awesome, and the documentation looks great! I'll probably use this as a springboard to add tracking functionality to. Thanks for taking the time to make such a useful extension. As far as the SCORM stuff goes, yeah, I know it's very limited in terms of both storage and functionality. I'm just not sure whether the data has to go through the LMS as a SCORM package, or whether I can bypass that and send it another way.