How can we evaluate responsive elearning courses?
by Luisa Kraft - Friday, 27 March 2015, 9:08 AM

Hi all,

As I'm working as a Learning Designer in Munich and at the same time writing my master's thesis on responsive elearning design, I hope to get some ideas and inspiration here.

At LearnChamp we are already designing courses with Adapt, and in doing so we have developed and discussed ideas and topics similar to those already mentioned in this community (by the way: it's great how many discussions are going on here already). During this process the question came up for me of how to prove the effectiveness of our ideas. How can we evaluate the interaction with the courses on multiple devices?

Have any of you done usability testing with responsive elearning, i.e. a course on different devices? How do you go about it? Which methods do you prefer?

I am looking forward to your ideas and advice!



Re: How can we evaluate responsive elearning courses?
by Paul Welch - Friday, 27 March 2015, 11:31 AM

Hi Luisa,

Welcome to the forum.

Your post is auspiciously timed, as I've just posted here that we're in the process of organising some user trialling workshops (to accompany a literature review on scrolling navigation across devices over the coming months). We've already secured some budget, so it's a case of getting these workshops booked in.

We have done something similar before, although not via an academic institution. Back when Adapt was in its infancy we carried out A/B comparisons of like-for-like courses with users and got their feedback on their preference between fixed and scrolling layouts and on the ease of use across multiple devices (scrolling won!), as well as recording issues with particular components and so on, which we've since addressed.

So much has happened since then that I felt it sensible to organise these user trialling workshops to make sure we're still delivering something that meets the needs of users. In addition, Adapt has been used to deliver hundreds of courses, and of course we get lots of feedback from clients that influences design and functionality. This has led to improvements to the framework. There's an obvious example of this in the impending v2.0 release, where we will be offering improvements to the button clusters on questions that will allow learners to reopen previously closed feedback, see the overall pass/fail state (as well as option-specific marking) and so on.
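For anyone wanting to run a similar A/B comparison, a minimal sketch of tallying that kind of preference data might look like the following. To be clear, the record shape and field names here are invented for illustration, not taken from our actual workshops:

```javascript
// Hypothetical A/B preference records: one row per participant, noting
// which layout they preferred and which device they tested on.
const responses = [
  { userId: 1, device: 'desktop', preferred: 'scrolling' },
  { userId: 2, device: 'tablet', preferred: 'scrolling' },
  { userId: 3, device: 'desktop', preferred: 'fixed' },
  { userId: 4, device: 'smartphone', preferred: 'scrolling' },
];

// Count how often each layout was preferred, overall and per device.
function tallyPreferences(rows) {
  const overall = {};
  const byDevice = {};
  for (const { device, preferred } of rows) {
    overall[preferred] = (overall[preferred] || 0) + 1;
    byDevice[device] = byDevice[device] || {};
    byDevice[device][preferred] = (byDevice[device][preferred] || 0) + 1;
  }
  return { overall, byDevice };
}

const result = tallyPreferences(responses);
console.log(result.overall); // { scrolling: 3, fixed: 1 }
```

The per-device split matters as much as the overall winner: a layout that wins on desktop but loses on smartphone tells you something different from a clean sweep.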

On a practical level, one of the things we do at Kineo is create a build (let's call it a functional proof of concept) which contains the theme, the art direction and a single example of every plugin the course will include. This goes through an internal team review before undergoing a full QA on all target devices for both functionality and usability. We then share this with the client, review it together and check it runs on all target devices when launched from the client's intended hosting solution. This is all done before the solution is delivered to the user, so it may not answer your question, which is how we evaluate how effective the course is when it is actually used on different devices.

From my experience to date, most evaluation has gone into which devices are being used to launch content rather than the effectiveness of the solution on a given device. Incidentally, the answer from the clients I spoke to was that the vast majority of users accessed the content first from PC, then from tablet, with smartphone a distant third. I wonder how much of this is down to the mobile-friendly design and ease of access of the sites containing the content, the behaviour of learners whilst learning in the workplace, or the design of the courses. It's probably a combination of all three, but the third is the one we can most easily do something about. I'd love to hear other views, ideas and experiences.
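If your LMS or hosting platform exposes access logs, a rough way to get those launch-by-device numbers is to classify user-agent strings. A minimal sketch follows; the regexes are a crude heuristic I'm assuming for illustration, not a production-grade device detector:

```javascript
// Crude heuristic: bucket a user-agent string into smartphone, tablet or PC.
// Real-world detection is messier; this is only a sketch.
function deviceCategory(userAgent) {
  if (/Mobi|iPhone|Android.*Mobile/i.test(userAgent)) return 'smartphone';
  if (/iPad|Tablet|Android/i.test(userAgent)) return 'tablet';
  return 'pc';
}

// Count course launches per device category from a list of user agents.
function countLaunchesByDevice(userAgents) {
  const counts = { pc: 0, tablet: 0, smartphone: 0 };
  for (const ua of userAgents) counts[deviceCategory(ua)] += 1;
  return counts;
}

const sampleLogs = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
  'Mozilla/5.0 (iPad; CPU OS 8_0 like Mac OS X)',
  'Mozilla/5.0 (Linux; Android 5.0; Nexus 5 Build) Mobile Safari',
];
console.log(countLaunchesByDevice(sampleLogs)); // { pc: 1, tablet: 1, smartphone: 1 }
```

Even rough numbers like these would let you compare "what devices learners launch from" against "what devices the course was designed around", which is where the effectiveness question starts.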

I think it would be great to share examples on the community, along with why we know (or, to get us started, at least think) that a given solution works well across different devices. I appreciate there may be some blockers due to commercial sensitivities, but I'm sure there would be workarounds.



Re: How can we evaluate responsive elearning courses?
by Luisa Kraft - Tuesday, 31 March 2015, 4:39 PM

Thanks, Paul, for your reply! This is very much what I hoped to hear.

About your last point: I'm pretty sure it is a combination of all three (content, learner behavior and the design of the courses), and I think we always have to design with this combination in mind. In my eyes, we should think about how to find out more about learner behavior (and intentions) and use this knowledge to adapt content and design.

In talks with clients it would be interesting to know in which contexts their learners used the courses. Did they use them at the workplace, at home or on the move? What was their intention in using them on a particular device? What needs would lead them to access the content more from mobile devices? If we find out about these contexts, I think we can adapt the content and make the design more learner-specific for the different devices. I know that's not the easiest way, but I'm sure it helps a lot. What do you think?



Re: How can we evaluate responsive elearning courses?
by x z - Thursday, 2 April 2015, 2:50 PM

Depending on how broadly you're using the term 'responsive', this may be of interest -- http://standards.dpi.wi.gov/sites/default/files/imce/cal/pdf/guiding-principles6.pdf