Friday, 4 April 2014

part 1: notes from #xAPI workshop #LearningAnalytics

In the past I wrote posts on the xAPI (the first one in 2011 launching a brainstorm, an update on the project in 2012, and one on the project getting up to speed in 2013). The xAPI is in continued development and seems promising for learner analytics, especially its mobile options. This set of notes is part 1 of 4; the other notes from this workshop can be found here: part 2/4 (some tools and a link to an xAPI demo site), part 3/4 (on connecting the xAPI with an LRS), part 4/4 (on learning design).

The project started at ADL (Advanced Distributed Learning), and you can find many resources on the Experience API (xAPI) page here. There is also a wiki with lots of working documents and status details.

For me, I am interested in the xAPI in combination with its mobile options for: tracking self-reported experiential learning, capturing informal learning, and third-party evaluation. Why? Because I am looking to build a research instrument that combines classic 'learning logs' (i.e. diaries in which learners note down what they have learned) with the 'qualtified self' (a portmanteau of the qualified and the quantified self, which I wrote a post about here).
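To make this concrete, the xAPI records each learning moment as an actor-verb-object statement. Below is a minimal sketch in Python of building such a statement for a self-reported learning experience; the email address, activity id and activity name are illustrative placeholders, and the verb id is the ADL "experienced" verb from the xAPI vocabulary.

```python
import json

def build_statement(learner_email, activity_id, activity_name):
    """Build a minimal xAPI statement for a self-reported learning experience.

    The actor-verb-object structure follows the xAPI specification;
    the concrete ids and the email address are placeholders.
    """
    return {
        "actor": {
            "objectType": "Agent",
            "mbox": f"mailto:{learner_email}",
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/experienced",
            "display": {"en-US": "experienced"},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Example: a learner logs an informal learning moment, diary-style.
stmt = build_statement(
    "inge@example.com",
    "http://example.com/activities/mooc-week1-reflection",
    "MOOC week 1 reflection",
)
print(json.dumps(stmt, indent=2))
```

Because a statement is just structured JSON, the same shape can carry self-reported diary entries, automated tracking events, or third-party observations.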

A good overview to start from is this slide set from Nikolaus Hruska:
http://www.adlnet.gov/what-should-i-track-in-my-learning-experiences-to-build-learning-analytics/

And a really nice Twitter implementation example of the xAPI is described by the wonderful Mayra Aixa Villar in her blog post #xAPI twitter chat - joint venture team - recap.

These notes are from the first speaker, Andy Wooler of Hitachi Data Systems.
He starts the day by framing learning data in general (look for the Hitachi bloggers on big data to keep on top of their ongoing data roadmap):
  • Making sense of your data (video)
  • What data is really important?
  • What does the technical architecture have to look like in the next 5 years?

Thinking about why I (Inge) want to use it:
  • My reason: map the triggers in MOOC learning across multiple devices (in a ubiquitous learning platform like FutureLearn)
  • Linking learning data to provide research data from the ground up
  • Looking for learning meaning across contemporary social data (social media, email, text files, machine log files)
  • Allowing serendipitous learning to occur (so creating an open research instrument, yet one based on the first findings coming out of the pilot that was performed)


Comparing with Bersin by Deloitte's Talent Analytics Maturity Model.

At Hitachi:
ARIES currently (Analytics and Reporting Integrated Enterprise System):
  • LMS
  • assessment tool, both feeding into an Oracle system


This enables various, though not ideal, dashboards for analysing learning data. Currently the dashboards on LMSs are not ideal as a UI.

Vision for the future at Hitachi (HDS Academy data future):
  • ARIES future
  • LMS
  • IT project database
  • HDS community
  • The loop
  • Help desk
  • CRM systems
  • LRS (Learning Record Store)
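All of those feeder systems would push their statements into the LRS over its REST interface. As a rough sketch (assuming Python; the endpoint URL and credentials are placeholders), a statement is POSTed to the LRS's /statements resource with the X-Experience-API-Version header required by the xAPI specification:

```python
import base64
import json
import urllib.request

def build_lrs_request(endpoint, username, password, statement):
    """Build (but do not send) an HTTP request that would POST one
    xAPI statement to an LRS. Endpoint and credentials are placeholders."""
    auth = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url=endpoint.rstrip("/") + "/statements",
        data=json.dumps(statement).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.1",
            "Authorization": f"Basic {auth}",
        },
        method="POST",
    )

req = build_lrs_request(
    "https://lrs.example.com/xapi",   # hypothetical LRS endpoint
    "key", "secret",                  # hypothetical basic-auth credentials
    {"actor": {"mbox": "mailto:learner@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced"},
     "object": {"id": "http://example.com/activities/demo"}},
)
print(req.full_url)  # → https://lrs.example.com/xapi/statements
```

Sending it would just be `urllib.request.urlopen(req)`; the point is that every system in the list above can feed the same store through this one small interface.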

The roadmap predicts advanced analytics tools towards 2018.

The concept of the never-ending course (CPD, continuing professional development, with added spaced learning) poses a problem for SCORM: capturing where people are at a given point in time in their learning journey.

(Look up 'the stairway to elearning heaven' by Fiona Leteney, @fionaleteney, who is also into Tin Can/xAPI.)

Reflection suggestion: what type of learning would be of interest to you, taking into account full personal and informal learning as well? What can be taken out of your community tools?

What do you need to think about when looking for meaningful data/analysis?
  • A learning strategy needs to be built so you know what you want to look at
  • Self-reporting depends on subjective meaning-making (cf. the classic learning diaries that learners provide by self-reporting)
  • Validation that actions were actually undertaken by the learner/participant
  • Coming to a mutually shared meaning of the self-reported vocabulary used
  • Iterating the tracking as you learn which data actually matters, how Tin Can actually works, and how to automate data tracking for self-reported and informal learning
  • Looking for higher-level analysis: actively linking the data with discourse research, networked learning and social learning research, and looking at the Big Five personality traits of a learner