October 31, 2014

Defining Learning Analytics and the Insights it Can Bring

2 comments
A Definition

My task here is to define Learning Analytics. What is it? I think the easiest and broadest answer is (my definition):
Learning Analytics is the analysis of any data that was created as part of a learning process.

This definition is really wide open. It allows, or even forces, the observer to interpret almost every aspect of what that might mean and how it might be useful. Where do you start? However, it is very non-judgmental. If you think any data is relevant to learning in any way and you want to conduct an analysis of that data, then go for it.

SOLAR came up with this definition:
(note: it is not easily found on their website)

Learning Analytics is the measurement, collection, analysis, and reporting of data about learners and their context, for purposes of understanding and optimizing learning and the environments in which it occurs.

The first part of this definition allows you to look at four initial things you can do with data:
    Measurement
    Collection
    Analysis
    Reporting

Measurement is about the size, length, or amount of something, as established by measuring. With the data that we find in the digital world today and what is coming in the near future, we will want to leverage many measurements of data that we probably haven't thought about before in the learning context, such as those being pushed by big data's volume, velocity, variety, and veracity.

Collection is about capturing the data in a format you can use. What is generating the data? Who is generating the data? How is it stored? How "clean" is the data?  Can you perform an analysis with the data you have?
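To make those collection questions concrete, here is a minimal sketch in Python (the records and field names are made up for illustration, not from any real LMS export) of cleaning raw activity data into something you can actually analyze:

```python
# Hypothetical raw LMS activity records -- real exports vary by system.
raw_records = [
    {"learner": "A101", "module": "intro", "minutes": "42"},
    {"learner": "A102", "module": "intro", "minutes": ""},    # missing value
    {"learner": "a101", "module": "Intro", "minutes": "42"},  # duplicate, messy case
    {"learner": "A103", "module": "quiz1", "minutes": "17"},
]

def clean(records):
    """Normalize case, drop incomplete rows, convert types, de-duplicate."""
    seen = set()
    cleaned = []
    for r in records:
        if not r["minutes"]:                 # can we analyze this row at all?
            continue
        key = (r["learner"].upper(), r["module"].lower())
        if key in seen:                      # same learner + module already kept
            continue
        seen.add(key)
        cleaned.append({"learner": key[0], "module": key[1],
                        "minutes": int(r["minutes"])})
    return cleaned

print(clean(raw_records))   # two usable rows survive from four raw records
```

Even this toy example surfaces the questions above: the duplicate exists because two systems recorded the same event differently, and the blank value forces a decision about what "clean enough" means.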

Analysis is about being able to pull insights from the data. What does it tell us? Are there patterns? Correlations to be made? Can items be extrapolated or variables isolated?
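As one small example of looking for a correlation, here is a sketch in Python (the numbers are invented for illustration) computing a Pearson correlation between time spent in a module and quiz score:

```python
import math

# Made-up data: minutes spent in a module vs. quiz score for five learners.
minutes = [30, 45, 60, 20, 50]
scores  = [70, 80, 90, 60, 85]

def pearson(x, y):
    """Pearson correlation coefficient: covariance / (std_x * std_y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(minutes, scores)
print(f"correlation between time spent and score: {r:.3f}")
```

A coefficient near 1 would suggest the two measures move together; whether that pattern means anything for learning is exactly the interpretive work analysis demands.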

Reporting is about sharing. Can you effectively communicate the insights that you found in your analysis?  Can you tell a story?  

The second part is about learners (the person or people learning) and the context (I am assuming this is everything about the learner and/or the learning).

The third part is about the purpose.  
for purposes of understanding and optimizing learning and the environments in which it occurs.

Maybe this will be discussed later in our MOOC, but I am not completely sure why Learning Analytics would have to be for the purpose of understanding or optimizing. I don't disagree with this being the grand noble cause, but as a definition, I think it is more part of the mission of SOLAR. I could measure learning data just to know how long a module took me to complete. Or how much it cost.

I'm not recommending changing it, but it is something to think about.

Insights

This one is pretty open. What insights would the measurement, collection, analysis, and reporting of data about learners and their context, for purposes of understanding and optimizing learning and the environments in which it occurs, provide the educator or the learner?

Much like having an open definition of learning analytics causes you to think broadly, the range of insights we might get from learning analytics is also very broad. I would start to build out different areas of focus, based on traditional learning analysis.

What do you know about the learner? Prior knowledge, prior experience, current level of performance, etc. However, in today's connected world, we can learn so much more. If our learners are engaged in the digital realm by using a smartphone or a tablet, engaged in social media, or using an organizational system such as a learning management system or a recognition system (think business here), then we start to gather a lot more information.

George talks about this in our MOOC video. He indicates that we can start to learn these things about the learner:
    • sentiment
    • attitudes
    • social connections
    • intentions
    • what we know
    • how we learn
    • and what we might do next


It is interesting that the "how we learn" topic continues to surface. If we really have insight into how you learn or your learning style, can we do anything about it? Can we create a design approach or even a smart system to individualize your learning?

While learning for learning's sake is awesome, my day job pushes me to have a particular interest in how learner experiences impact performance. This would help us start to have a story about how effective the learning activities were in helping someone complete a task (aka do their job).

Other insights might include data about the environment. The classic argument of classroom vs. online? When is the best time for learning? Is it better to play music when learners complete group activities? (All of these feel backward-looking.) Is it better to allow learners access to a smartphone throughout the learning experience?

Wrap Up

I am sure as we move forward we will be digging much deeper into these topics, but it is good to have a definition to build on. I am going to go with SOLAR's for now, and start considering all the different insights that we might get from learning analytics.

October 29, 2014

Areas of Focus for Learning Analytics Tools

2 comments
I am participating in a MOOC around learning analytics.   We are in week 2 and I am already a little behind, but one of the week 1 competencies was to be able to identify proprietary and open source tools commonly used in learning analytics.


We were provided a tool called Learning Analytics: Tool Matrix and our activity is to add to it. The tool identifies the following five areas:

  • Data Cleansing/Integration

    • Prior to conducting data analysis and presenting it through visualizations, data must be acquired (extracted), integrated, cleansed, and stored in an appropriate data structure. Given the need for both structured and unstructured data, the ideal tools will be able to access and load data to and from data sources including RSS feeds, API calls, RDBMS, and unstructured data stores such as Hadoop.

  • Statistical Modeling

    • There are three major statistical software options: SAS, SPSS (IBM), and R. All three are excellent for developing the analytic/predictive models that learning analytics relies on. This section focuses on R. The open source project R has numerous packages and commercial add-ons available that position it well to grow with any LA program. R is commonly used in many data/analytics MOOCs to help learners work with data. We opted for Tableau during weeks 1 & 2 due to ease of use and a relatively short learning curve.

  • Network Analysis

    • Network Analysis focuses on the relationship between entities.  Whether the entities are students, researchers, learning objects or ideas, network analysis attempts to understand how the entities are connected rather than understand the attributes of the entities.  Measures include density, centrality, connectivity, betweenness and degrees. 

  • Linked Data


    • If Tim Berners-Lee's vision of linked data (http://www.ted.com/talks/tim_berners_lee_on_the_next_web.html) is successful in transforming the internet into a huge database, the value of delivering content via courses and programs will diminish and universities will need to find new ways of adding value to learning. Developing tools that facilitate access to relevant content using linked data could be one way that universities remain relevant in the higher learning sector.

  • Visualization

    • The presentation of the data after it has been extracted, cleansed and analyzed is critical to successfully engage students in learning and acting on the information that is presented.  
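To make the network analysis measures above concrete, here is a minimal sketch in Python (a toy graph, not course data) computing density and degree centrality by hand; dedicated tools compute these and richer measures like betweenness at scale:

```python
# Toy undirected network: who interacted with whom in a discussion forum.
edges = [("Ann", "Bob"), ("Ann", "Cal"), ("Bob", "Cal"), ("Cal", "Dee")]
nodes = sorted({n for e in edges for n in e})

# Density: fraction of possible ties that actually exist.
n = len(nodes)
density = len(edges) / (n * (n - 1) / 2)

# Degree centrality: a node's ties divided by the maximum possible (n - 1).
degree = {v: sum(v in e for e in edges) for v in nodes}
centrality = {v: d / (n - 1) for v, d in degree.items()}

print(f"density: {density:.2f}")                      # 4 of 6 possible ties
print(f"most central: {max(centrality, key=centrality.get)}")
```

Note the point from the matrix: nothing here looks at attributes of the people, only at how they are connected.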

My next focus will be to identify key tools within each area.


October 22, 2014

Are you aligning Training with Performance?

0 comments

Focus Training on Performance

As a learning leader it is important to understand how to best align your training activities with the performance goals of your organization. To do this, we need to be sure that we always keep performance as the goal, communicate transparently and clearly, and think of learning as a process, not an event. This can be challenging, because a lot of what we do looks and feels like an event.

It feels like those of us who analyze roles, design and develop curriculum, facilitate a course, and evaluate the success of that course assume that we have "performance" as our main focus. But unfortunately, it can be elusive. We can find a great concept and really focus on delivering that concept, hoping the learner will take it back to the workflow they live in on a daily basis.

Measure the Learner's Performance

We also have to be careful that we don't get caught up in worrying about measuring ourselves. When we look at the data we have, we tend to want to focus on how the training activities went, how we did, and whether they liked us. But this takes the focus away from the performance of the employees. Can they perform the task(s) we taught them?

Communicate their Performance to the Field

We need to be able to communicate clearly to the business unit: "Here is how your staff is performing." And, if at all possible, we should be part of the performance conversation when they are back in the field.

Are you aligning your training with performance?