Showing posts with label motivation. Show all posts

Monday, 9 December 2019

#Learning monitoring in Belgium - based on #LearningDoctrine #synchronous

Just this morning I got a link to a video presenting a new learning technology used at IMEC. As I am looking into synchronous learning technology, this is of interest. But as I was watching the video, I felt a bit uneasy. This synchronous learning solution, WeConnect, is offered by Barco and is implemented at IMEC (which is connected to KULeuven, which will in the coming years become the major university in Belgium, as it is good at gaining and keeping established power).

Monitor the learner to push them into good followers
In this synchronous learning solution, online learners attending the synchronous classroom are monitored (facial expressions), psychophysiological data is captured (using wearables), engagement is measured (based on body movements) and interventions (quizzes, polls) are embedded in the lecture in order to keep the attention of learners. But again, this targets the biggest batch of learners, the 'normal' learners, those whose attention span lasts a full lecture. And it is aimed at lecture-based content (university content mainly), with of course a teacher dashboard indicating the engagement of the overall student population.

It is not about instruction, it is about stimulating creative thinking on subject areas of interest
I can see the benefits of this system, but it just annoys me intensely that it is again about instruction (absorbing information), not about actual learning (creating). For instance, if you use challenge-driven education and learners are working on their own projects... surely engagement and learning will skyrocket?

Do adult learners need a digital shepherd?
When a child is young (even up to 18 years old), I can imagine you want to learn how to learn, how to stay attentive and what that can provide you with... but once you are an adult, surely you know your own way forward? Surely there should be more ways for any intelligent young adult to open up their own world and live it the way they see fit?
Why are technologists so scared that a learner might not be attentive, stare outside, have something on their mind... and then zoom in again on the subject at hand? To me, if a learner is not interested enough in the lecture... so what? If a teacher cannot grab your attention, what of it? Should we pressure learners into learning patterns they never chose themselves?

Learning comes naturally
When you consider MOOCs, learners take them in their spare time. There is no 'optimization of learner posture'. People learn because they like the content, because they are intrinsically motivated, because they have a personal goal. I would think that tailoring content and delivery to nurture intrinsic motivation and personal goals is more useful, and more fulfilling from the learner's point of view? Learning is in our genes, which makes all learning unique yet natural in its uniqueness. With all of these technologies, I would think that human satisfaction would become a more interesting subject for innovative technologies than creating humans that learn alike, do alike, and follow digital indicators?

GDPR
Can a learner using this system decline being monitored, while still following the course or the lecture? Surely this should be the case? I would immediately ask not to be monitored. But then, that could just be me.

Quantum supremacy surely makes 'proper old-school learners' obsolete?
I would be very surprised if the future were all about the best learners (human society has never been about that either), rather than about those who can actually fill their spare time with actions that make them feel confident, useful, creative and... happy. Extracting new knowledge from data can become a processing-power-based activity done by, e.g., computers with the Sycamore chip, though granted, it will still take some years before that becomes fully functional for day-to-day actions. But still... shouldn't we focus on getting humans more actively involved in a less school-like higher education?

What do you think? Below is the link to the video that sparked the sigh-backed eye roll resulting in this blogpost. I will try to get my hands on it to use it for innovative learning.

LECTURE+ from imec on Vimeo.


Wednesday, 26 April 2017

Using #learningAnalytics to inform research and practice #educon17

Talk during Educon2017 by Dragan Gasevic, known for the award-winning work of his team on the LOCO-Analytics software, considered one of the pioneering contributions in the growing area of learning analytics. In 2014 he founded ProSolo Technologies Inc (https://www.youtube.com/watch?v=4ACNKw7A_04), which develops a software solution for tracking, evaluating, and recognizing competences gained through self-directed learning and social interactions.

He jumped up the stage with a bouncy step and was in good form to get his talk going.

What he understands by learning analytics is the following: shaping the context of learning analytics results in challenges and opportunities. Developing a lifelong learning journey automatically results in a measuring system that can support and guide the learning experience for individuals.
Active learning also means constant funding, to enable the constant iteration of knowledge, research and tech. But even if you provide new information, there are only limited means to understand who in the room is actually learning something, or not. So addressing the need for meaningful feedback on what is learned is the basis of learning analytics.
Beyond data from the learning system (e.g. an LMS), socio-economic details of individuals are also used.
No matter which technologies are used, the interaction with these technologies results in digital footprints. Initially, technologists used the digital footprints as a means to adjust the technology. But gradually natural language processing, learning, meaning creation… also came to be investigated using these footprints.

Actual applications of learning analytics are given: two well-known examples.
Course Signals from Purdue University: analysing student actions within their LMS (Blackboard), different student variables and outcome variables for student risk (high, medium, low risk) are provided by algorithms using the data from the digital footprints of each student. Teachers and students got 'traffic light' alerts. Students who used the signals had an increase of 10 to 20 percent in student success.
A content analysis of the use of Course Signals showed that summative feedback seemed much less related to student success, but formative (detailed, specific) feedback did have an immediate effect on learning success.
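A Course Signals-style "traffic light" could be sketched roughly as below. This is a minimal illustration only: the variables, weights and thresholds are my own assumptions, not Purdue's actual model.

```python
# Hypothetical sketch of a "traffic light" risk flag built from simple LMS
# footprint variables. All thresholds and weights are invented for illustration.

def risk_signal(logins_per_week: float, grade_pct: float,
                assignments_submitted: int, assignments_due: int) -> str:
    """Combine a few LMS footprint variables into a red/amber/green flag."""
    submission_rate = (assignments_submitted / assignments_due
                       if assignments_due else 1.0)
    score = 0
    if grade_pct < 60:          # struggling grades weigh heaviest
        score += 2
    elif grade_pct < 75:
        score += 1
    if logins_per_week < 2:     # low LMS activity
        score += 1
    if submission_rate < 0.7:   # missing assignments
        score += 1
    if score >= 3:
        return "red"            # high risk
    if score >= 1:
        return "amber"          # medium risk
    return "green"              # low risk

print(risk_signal(logins_per_week=1, grade_pct=55,
                  assignments_submitted=3, assignments_due=6))  # red
```

The point of the sketch is that the "algorithm" behind such alerts can be very simple; the hard part, as the talk notes, is whether the chosen indicators actually generalise across contexts.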
University of Michigan E2Coach (among the top two public universities in the US). They have large science classrooms, populated by students with very varied science grade backgrounds.
In the E2Coach project, they used the idea of 'better than expected': they looked at successful learning patterns. Successful students would be adaptive (trying different options to learn) and would self-organise in peer groups to enable content structuring.
Top-performing students were asked to give pointers on what they did to be successful learners. Those pointers were given to new students as feedback on how they could increase learning success, while at the same time giving them the option to learn (self-determination theory). This resulted in about 5 percent improvement in learner success.

Challenges of learning analytics
Four challenges:
Generalisability: we are seeing predictive models for student success, but they only extend to what can be generalised. The significance of these models did not hold across different contexts, so generalisability was quite low. Some indicators seem to be significant predictors in one context, yet in other contexts they are not. So what is the reason behind this? This means we are now collecting massive amounts of MOOC data to look for specific reasons. But this work is difficult, as we need to understand which questions we need to address.
Student agency is also a challenge. How many student decisions are made by the students themselves? The responsibility for learning is in their hands.
A common myth in learning analytics: the more time spent on tasks, the more students will learn. Actually, this is not the case; rather the reverse. Even time with educators frequently shows up as an indicator of poor learner success.
Feedback presentation: we felt that the only way to give feedback is visualisation and dashboards, and many different types of vendors involved in learning analytics look into dashboards. But sometimes these dashboards are harmful: when students compare themselves with the class performance, it results in less student engagement and learning. Students sometimes invested less time because the dashboards suggested they were doing well, and with less investment came less learning.
Investment and willingness to understand: http://he-analytics.com and the SHEILA project http://sheilaproject.eu/ investigated more than 50 senior leaders on their understanding of learning analytics. Institutions hardly provide opportunities to learn what learning analytics are really about. There is a lack of leadership on learning analytics, so in many cases leaders are not sure what it entails, or what to do with it. That results in buying a product… which does not make sense.
Lack of active engagement of all the stakeholders: students are mostly not involved from day one in the development of these learning analytics (no user-centred approach).

Direction for learning analytics
Learning analytics are about learning. So we need to fall back on what we already know about learning, and then design certain types of interventions using learning analytics. Learning analytics is more than data science. Data science provides powerful algorithms, machine learning algorithms, system dynamics… but we end up with a data-crunching problem, as we need theory (particular approaches: cognitive load, self-regulation); practice also informs where to go. We need to take into account whether the results make sense: which of the correlations are really meaningful, which make sense… At the same time we need to take into account learning design and the way we are constructing the learning paths for our students. We cannot ignore experimental design if we want meaningful learning analytics; we need to be very specific about study design. Interaction design is about types of interfaces, but these need to be aligned with pedagogical methods.

How does this translate to the challenges mentioned before?
Generalisability: if we want to address this, we need to accept that one size fits all will never work in learning. Different missions, different populations, different models, different legislation… down to the level of individual courses. Differences in instructional design mean different courses need different approaches; it is all about contextual information. So what shapes our engagement? Social networks work through what are called weak ties; networks with only strong ties restrain full learning success. Data mining can help us analyse networks: exponential random graph models (not sure here?). Machine learning transfer: using models across different domains; there are recent good developments addressing this.
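The weak-ties idea (Granovetter) is easy to illustrate on toy data: a tie that is rarely used often bridges otherwise separate learner groups. The data, names and threshold below are all invented for the sake of the sketch.

```python
# Toy illustration of weak ties in a learner interaction network.
# "Strength" here is simply the interaction count; the cutoff is an assumption.

interactions = [  # (learner_a, learner_b, n_messages) -- made-up data
    ("ann", "ben", 12), ("ann", "cem", 11), ("ben", "cem", 9),   # tight study group
    ("ann", "dea", 1),                                           # sparse contact
    ("dea", "eli", 10), ("dea", "fay", 8),                       # second group
]

THRESHOLD = 3  # below this many messages, call the tie "weak"

weak_ties = [(a, b) for a, b, n in interactions if n < THRESHOLD]
print(weak_ties)  # [('ann', 'dea')] -- the bridge between the two groups
```

In this miniature network, the single weak tie is exactly the edge connecting the two dense clusters, which is why networks with only strong ties can restrain learning: no bridges, no new information flowing in.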

Student agency: back to established knowledge (2006 paper: students use operations to create artefacts for recall, or to provide arguments or critical thinking). Student decisions are based on student conditions: prior knowledge, study skills, motivation… all of these conditions need to be taken into account. Sub-groups of learners can be identified based on algorithms. Some students are really active, but not productive. Some students were only moderately active, yet performed very well in terms of studying. Study skills change, priorities change during learning… so this means different learning agency. Desirable difficulties need to be addressed and investigated. There was no significant difference in success between the highly active and moderately active students, which needs to be studied to find the reasons behind it. Learner motivation changes the most during the day, as can be seen from the literature. So we need to focus on understanding these reasons, and set up interdisciplinary teams to highlight possible reasons while strongly grounding them in existing theory.

Analytics-based feedback: students need guidance, not only task-specific language indicators. This can be done by semi-automatic teacher triggers that provide more support and guidance, resulting in meaningful feedback used by students (look up research from Sydney; ask Inge for the reference). Personalised feedback has a significant effect (Inge: again, seen in MobiMOOC). http://ontasklearning.org
Shall we drop the study of visualisation? No, it is significant for study skills and for decision making on analytics, but we need to focus on which methods work and gradually involve visualisations to learn what works and what does not. Tailor it to specific tasks; design it in a different way than up till now.

Development of analytics capacity and culture: ethics and privacy concerns; very few faculty really ask students for feedback. What are the key points for developing culture? Think about discourse (not only technical specs). We need to understand data, we need to work with IT, using different types of models, not only data crunching, and start from what we know already. Finally, transformation: we need to step away from learning analytics as technology; we need to talk to our stakeholders to know how to act, how to do it, who is responsible for certain things… the process can be an inclusive adoption process (look it up). We need to think about questions and design strategies, and in working on the whole phenomenon we need to involve the students. Students are highly aware of the usefulness of their data.

If we want to be successful with learning analytics and make a significant difference or impact, we need to work together.

Tuesday, 7 June 2016

#CALRG Keynote Allison Littlejohn on professional and digital learning #liveblog @allisonl

Allison Littlejohn opened the CALRG conference day focusing on FutureLearn MOOCs. The keynote had two objectives: to showcase work from the OU, and to encourage contributions that add to the body of knowledge.

Professionals learn for present and future work.

Littlejohn & Margaryan (2013): technology-enhanced professional learning (a triangle with learning in the middle, and learning processes, work practices and tech use around it).
The drivers for learning are tasks and work processes.
Formal and informal learning: Eraut (2000–2004): learning can be intentional (formal, non-formal) and unintentional (recognised, unacknowledged).
Context, resources … and their impact on learning.
Self-agency, driving learning from your own perspective, is central to both self-directed and self-regulated learning. Learning at work is dynamic, so there is a distinction between learning as a student and learning as a professional.
SRL factors: self-efficacy, goal setting (adapting according to need), task strategy, task interest (motivation), learning strategy (ability to integrate new with existing knowledge), self-satisfaction and evaluation, help seeking, learning challenge (resilience to challenge).  
Learning opportunities such as workplace context influence learning activities.
Interesting in Littlejohn's study is the profile with negative help-seeking, overlapping with the individual learner profile.

Key factors in MOOC learning

Context counts (Introduction to Data Science): Hood, Littlejohn & Milligan (2015), 'Context counts'.

Motivation matters (Introduction to Data Science). External motivation goes with (self-perceived) low SRL, intrinsic motivation with (self-perceived) high SRL. The latter do not necessarily follow the course structure, but learn what they need in terms of learning goals. There is an emotional language difference in how they share their learning. Low-SRL learners tend to follow all the course elements, while high-SRL learners select more often. Help seeking: a qualitative difference between high SRL and low SRL, as high-SRL learners tend to be less present in forums, yet more goal-oriented in seeking help (inside AND outside the course), including network peers outside of the course, while low-SRL learners were active in forums, yet less focused.
Goal setting was different for low SRL and high SRL.
Milligan, Littlejohn & Hood, 'Learning in MOOCs: a comparison study', Proceedings of the European Stakeholder Summit on experiences and best practices in and around MOOCs (EMOOCs2016).


Integrate to innovate: we must integrate informal and formal learning (Tynjala, 2008). A Delphi study on MOOCQ – MOOC quality. Quality based on the learner experience is a unique experience, and a huge challenge in terms of quality measures (e.g. semantic analysis of how people discuss what they are learning, Helen Crump). From a government perspective, post-MOOC quality is important in terms of return on investment for society (employment, life quality…). The way quality is measured is also a power measurement, as quality perception is related to power dynamics.

Monday, 21 March 2016

Self-regulated learning for measuring motivation & self-esteem in #MOOC #motivation #SRL

For those interested in self-regulated learning, building upon the knowledge that has been created over the years, I gladly share a recently published paper, which is part of the eMOOCs2016 proceedings. The project is briefly explained, and in this paper we (the authors) also refer to the self-regulated learning instrument that is used to monitor young students (16–17 years old) while they follow MOOCs to enhance their personal interests. The goal of this project is to increase (online) lifelong learning skills. The paper includes a reference to a SAM scale for attitude and skills measurement, focusing on language skills (i.e. practical use of language: speaking, listening) and digital skills such as critical thinking.
The paper gives an update on a year-long project running at GUSCO, a large and innovative secondary school in Kortrijk, Belgium, for which I lead the research end of the project (in participative mode with the teachers and directors).
The paper can be read as part of the conference proceedings here, or downloaded from Academia here. The paper is entitled "Ensuring Self-Regulated Learning outcomes in a MOOC and CLIL (Content and Language Integrated Learning) in a K12 project".