Wednesday, 8 April 2020

How to transform F-2-F #exams into online exams #onlineAssignments #onlineExams #pedagogy

As #COVID19 looks set to disrupt our teaching and learning for a longer period of time, I wrote up a document, based on requests from colleagues, looking into the delivery of online exams and assignments and what the options are.
The document is entitled "Transforming face-to-face exams into online exams: considering proctoring tool security and creative pedagogical options".

The document is shared under a Creative Commons Share Alike license, which aligns with the EU directive on transparency and sharing: EIT InnoEnergy, for which I work, has a supporting role to the EU, and the EU promotes sharing to ensure mutual growth.

This document is an addition to a previous document I wrote with 10 fast tips to move from face-to-face learning to online learning (with a lot of English resources from the University of Cape Town, South Africa; Harvard Business, USA; the University of New South Wales, Australia; the EU Commission on education; … as well as tools).

(Disclosure: I don’t get or have any benefits suggesting any of the tools or solutions in these documents).
The following topics are discussed:
  • Comparing online proctoring tools which can be used to ensure safe online exams and assignments ('safe' meaning non-cheating here). Can we use proctoring tools for team exams, how secure are they, and what are some of the options and prices?
  • Limiting online exam costs by looking into the usefulness of group or team exams and open book exams in a digital learning world (a benefit, as all of us can transform some or more of our usual closed book exams (CBE) into open book exams (OBE)).
  • Using best online exam practices without proctoring tools, with only 1-on-1 audio/video and assignments (useful for those without the financial resources, or with limited internet access).
The full document (11 pages) can be found here.

Thursday, 19 March 2020

Sharing #oralAssignments and #OnlineExam #bestPractices to limit cheating

Request for expertise sharing on #online #exams #covid19 pro-active planning until the end of this academic year and offering #BestPractices for #OnlineExams below. 

The first rumors indicate that our international HigherEd students will not be asked to come to their guest universities to sit their exams at the end of the 2019-2020 academic year.

I am trying to find a solid online learning tool that can be used, but in the meanwhile, I want to share best practices that are already used at our and other institutes. Feel free to add any ideas or measures I might have missed when listing our guidelines.

Best practices for organising online exams and online tests

All of us working with international students scattered around the globe, as they have rejoined their families in their countries of origin, will probably be facing online examination needs. With this in mind, I am listing best practices, and in a second stage I will review #EdTech tools that might come in handy if you have multiple students linking up remotely for their exams (we are preparing for 382 students, which is a feasible number, yet one that demands a streamlined approach). I took my master's in education (M.Ed.) exams remotely myself (thank you @AthabascaUniversity), so I am sharing those best practices with some additions below.

Best practices using only camera and audio as technology:

Preparing the exam
  • Switch any written exam questions you might have to oral exam questions. These can include notes that need to be shared (ask contextualized questions, questions that show they understand the material yet can apply it to new contexts; e.g. ask short oral essay questions).
  • Create original exam questions, i.e. questions that are not available in educational textbooks (otherwise tech-savvy students will find them in no time :D).
  • Choose an online meeting tool that offers recording options (think legal discussions: you need to be able to show why you gave the examination points you gave), and a tool that allows for lengthy recordings at that (no one wants their exam to suddenly stop).
  • Choose a tool that enables sharing the screen (might come in useful for some short essays, designs, stats…).
  • Prepare an informed consent document and send it to the student, so they know their exam will be recorded and stored on the admin server space for X time. If possible, indicate the amount of time set aside for the exam.
  • Make a designated exam folder structured according to your admin.
  • Additionally: you might want to send out a 'code of conduct' to the students, so they know what is expected of them. This is where the penalties might be discussed: what is considered cheating, and what the penalty is for each type of cheating…

Once the exam starts
  • Introduce the student to the fact that their online session will be recorded (GDPR); check that the informed consent was signed and sent back to you. Start recording.
  • Indicate the overall guidelines of the exam: open book, closed book, time available, number of questions (if relevant). The student must be made aware of what they can expect. Ask whether they understood what you have just said.
  • Check identity: ask the examinee to show their passport, take a screenshot, and save that screenshot as part of the examination administration.
  • Ask them to show their desk and room, and to stay in view mid-torso with hands and keyboard visible (you know why 😊).
  • In case you choose to go with a closed book examination: ask them to share their full screen (look at the tabs that are open!). Of course, there is a workaround if they are tech-savvy, which is why exam questions should preferably be open book: it allows them some freedom, yet they still need to really understand how they arrive at a solution.
  • Only offer one question at a time.
  • Feedback is important… but: depending on the number of questions you prepared, you might want to choose a different feedback strategy. If you have different questions for each student: give feedback as you see fit. If you want to reuse questions: limited feedback is preferable. As we all know, students quickly inform each other on which type of exam questions they got, what the answers were, and what feedback they received. In that case, give feedback at the end.
  • Stop recording and make sure the recording is in the right folder.

What can you not address if you work with audio/video tools only?
  • Disabling the right-click button (copying and pasting options, with which students can quickly save questions). A reason to go to tailored questions per student, based on comprehension and creative thinking.

Single-function add-on tools
  • Use the Respondus LockDown Browser or a similar tool to ensure that students cannot look up answers.
A review of more designated tools such as ProctorU, ProctorExam, … will follow.

Wednesday, 4 March 2020

Free #Horizon2020 report out @educause good inspiration #learning #education

The 2020 Horizon Report (58 pages) was released by @educause on 2 March 2020, and it is an inspiring read for those of us looking at emerging learning designs and techniques. The report covers trends in the social, economic, political, technological and, of course, higher education realms, and new in this report is a nice contextualization of all the different trends and technologies using visual supports. Educause is North America-based, so most of the examples and projects it refers to are also North America-based.

This report is also more consciously covering multiple scenarios resulting from the interactions between all the different realms of society, which makes it a nuanced reflection of where learning can go in the near future. The report also links to additional reading and complementary material, e.g. articles on micro-credentials and experiential learning, [High on the higher ed agenda: alternative learning and ongoing increase of online education. High on the economic agenda: climate change and the green economy]

Download it now! Why? Because it has tons of interesting links with a great synopsis for each subject. See below to get an idea from only a handful of the information.

Emphasized learning technologies and practices this year:

  • Adaptive Learning Technologies 
  • AI/Machine Learning Education Applications 
  • Analytics for Student Success 
  • Elevation of Instructional Design, Learning Engineering, and UX Design in Pedagogy 
  • Open Educational Resources 
  • XR (AR/VR/MR/Haptic) Technologies 

Adaptive learning technologies are still hot news, as the search for effective personalized learning is still looking for practical outcomes. One of those listed in the report is the Alchemy tool by UBC (University of British Columbia, Vancouver, Canada). It is described as "a multi-featured online tool that supports teaching and learning in any circumstance that benefits from flexible, scalable, and feedback-rich learning alongside growing learning analytics capabilities", which basically shows where learning/teaching is moving in this ever more specialized-topic-driven learning world.
Another adaptive example I like is from Deakin University, called the professional literacy suite, where I especially like the focus on digital skills in the first year. This fits with the demand for data and AI savviness, communication skills, and so on: teach those early on in higher ed curricula.

AI and machine learning in practice still focus a lot on chatbots (which basically turn an FAQ into feedback offered by bots). The most interesting option mentioned is the Responsible Operations positioning paper by OCLC, the worldwide library cooperative (38 pages, great insights), which looks at how AI and ML are embedded in society, how this changes all parts of society, and which research agenda might address these challenges.
And the University of Oklahoma has set up PAIR (a global directory of AI projects in Higher education) ... nice!
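As a toy illustration of the FAQ-to-chatbot idea mentioned above, the matching step can be sketched as simple word overlap between a student's question and stored FAQ entries. The FAQ content and the scoring rule below are invented for illustration; they are not any specific product's implementation.

```python
# A minimal FAQ "chatbot": match a student question to the closest FAQ
# entry by counting shared words, with a fallback when nothing matches.

FAQ = {
    "When is the exam?": "The exam takes place in the last week of June.",
    "Is the exam open book?": "Yes, you may consult your notes and textbook.",
    "How do I submit my assignment?": "Upload it as a PDF to the course portal.",
}

def answer(question: str) -> str:
    """Return the answer of the FAQ entry sharing the most words."""
    words = set(question.lower().split())

    def overlap(faq_question: str) -> int:
        return len(words & set(faq_question.lower().split()))

    best = max(FAQ, key=overlap)
    if overlap(best) == 0:
        # No shared vocabulary at all: hand over to a human.
        return "Sorry, I don't know. Please contact the course admin."
    return FAQ[best]
```

Real chatbot platforms use far richer intent matching, but the principle of mapping free-text questions onto a curated FAQ is the same.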

The Analytics for Student Success examples are fairly similar to what we have seen before, but the report on Ethics in Learning Analytics (16 pages) by the International Council for Open and Distance Education is a good reference document to keep close at hand.

Elevation of learning design: pedagogy is always of interest to me. In a way, the learning design changes feel like small changes, but with big impact, as a growing number of teachers and learning-related professionals are picking up digital learning tools and embedding them into their curricula to address multiple learning challenges.
Carnegie Mellon has an open learning toolbox called OpenSimon (part of the Simon initiative), a great spot to explore tools, techniques, research projects and so on (e.g. the Tetrad project, which is an easy visualisation tool for data).

OER: we need more OER, but for those looking for new material, Mason OER metafinder is a great starting point.

XR technologies (extended reality) are increasingly on the rise, as just-in-time workplace learning is higher on the agenda of our rapidly changing world. It builds upon prior realizations and the need to simulate emergency actions for students, e.g. the augmented reality used with medical students at the University of Leiden, Netherlands.

It is a great read, good to get a fresh perspective on where we are all going.

Thursday, 16 January 2020

Free @eLearningGuild research on #generations in the #workplace @JaneBozarth

The inspiring eLearning Guild keeps disseminating great reports in relation to learning. One of their key authors is Jane Bozarth (= director of research @eLearningGuild), who is always an inspiration. You also know that a report will be of interest if she writes it.

In the eLearning Guild's latest report (19 pages), the focus is on Generations in the Workplace: how they see each other and why this is worth all of our attention. Interestingly, the report starts out with a clear framing of why all of us tend to have possible stereotypes confirmed when thinking about 'the other' in terms of age and what a person in a certain age group can do (see page 2 of the report). The report concludes that little comparable research in terms of learning outcomes is being done; that diversity and inclusion of age groups is a concern; that training might be affected by trainers encountering different attitudes towards their training; and that concepts used as indicators of a strong will to learn (e.g. lifelong learning) might be interpreted differently by different learners. The complete list of ideas and outcomes for L&D practitioners is pasted below.

Basically, learners of all ages are more similar than different! So, erase any potential stereotypes and move on :) Get the report, it is free (if you register with the eLearning Guild, also free) and it is a really good read.

It is difficult to find advice for L&D practitioners that is not rooted in often-unsubstantiated data from the popular press. A few takeaways from this review of the empirical literature base:
• It seems worth noting that while much research explores values and attitudes (Do people of different generations comply with rules? Do people of different generations like technology?), little of it compares outcomes. Is one generation more productive? Is one generation more prone to making errors? There is no evidence to suggest one person inherently performs better than another by dint of any grouping by birth year.
• Work has changed. More work is remote, more work is mobile, and people are becoming increasingly accustomed to finding quick answers and help. Workers need content that is available anytime, anywhere, and is accessible via any number of devices. The content may need to be less formal than it has been in the past. Workers may need help understanding how to access that content and how to use the devices.
• A number of researchers (among them Mencl & Lester, 2014; Urick et al. 2017; Woodward, 2015) tie concerns about generations in the workplace to the larger matter of diversity and inclusion. L&D workers involved in efforts in these areas may need to incorporate ideas about generations into that broader context.
• In their 2015 review of the literature, Woodward et al. found that younger generations placed greater emphasis on lifelong learning and personal development at work than older generations. Those in L&D might consider that similarities can take a number of forms. For instance, Mencl & Lester 2014 note that the Baby Boomer interested in “career advancement” and the Millennial interested in “lifelong learning” may be talking about very similar things.
• An interesting take comes from a 2018 Learning Solutions article by Joe Mayer, who reframes the conversation from “managing differences” to “avoiding generational bias”. Among his suggestions are using design thinking to take an empathetic view of the learner, to conduct extensive user-testing of your user base, to choose appropriate vehicles for content rather than try to accommodate some perceived generational preference for a particular medium or instructional approach, and to consider tenure rather than age.
• Finally, we should take care to check our own biases. In a lab experiment in which undergraduates taught a technical task (using Google chat) to learners of varying ages, researchers found that “ostensibly older trainees evoked negative expectancies when training for a technological task, which ultimately manifested in poorer training interactions and trainer evaluations of trainee performance” (McCausland, King, Bartholomew, Feyre, Ahmad & Finkelstein, 2015) p. 693. In short: Trainer stereotypes of learners based on age created a self-fulfilling prophecy. Older trainees received lower-quality training, which could ultimately affect job performance.
(p. 14, eLearning Guild, 2020) 

Monday, 9 December 2019

#Learning monitoring in Belgium - based on #LearningDoctrine #synchronous

Just this morning I got a link to a video presenting a new learning technology used at IMEC. As I am looking into synchronous learning technology, this is of interest. But as I was watching the video, I felt a bit uneasy. This synchronous learning solution, WeConnect, is offered by Barco and implemented at IMEC (which is connected to KU Leuven, which will in the coming years become the major university in Belgium, as it is good at gaining and keeping established power).

Monitor the learner to push them into good followers
In this synchronous learning solution, online learners attending the synchronous classroom are monitored (facial expressions), psychophysiological data is captured (using wearables), engagement is measured (based on body movements), and interventions (quizzes, polls) are embedded in the lecture in order to keep the attention of learners. But again, this caters to the biggest batch of learners, the 'normal' learners, those whose attention span lasts a full lecture. And it is aimed at lecture-based content (university content mainly), with, of course, a teacher dashboard indicating the engagement of the overall student population.

It is not about instruction, it is about stimulating creative thinking on subject areas of interest
I can see the benefits of this system, but it just annoys me intensely that it is again about instruction (absorbing information), not about actual learning (creating). For instance, if you use challenge-driven education and learners are working on their own projects… surely engagement and learning will skyrocket?

Adult learners need a digital shepherd?
When a child is young (even up to 18 years old), I can imagine you want them to learn how to learn, how to stay attentive and what that can provide them with… but once you are an adult, surely you will know your own way forward? Surely, there should be more ways for any intelligent young adult to open up their own world and live it the way they see fit?
Why are technologists so scared that a learner might not be attentive, stare outside, have something on their mind… and then zoom in again on the subject that is given? To me, if a learner is not interested enough in the lecture… so what? If a teacher cannot grab your attention, what of it? Should we pressure learners into learning patterns they never chose themselves?

Learning comes naturally
When you consider MOOCs, learners take them in their spare time. There is no 'optimization of learner posture'. People learn because they like the content, because they are intrinsically motivated, because they have a personal goal. I would think that tailoring content and delivery to nurture intrinsic motivation and personal goals is more useful and more fulfilling from the learner's point of view. Learning is in our genes, which makes all learning unique yet natural in its uniqueness. With all of these technologies, I would think that human satisfaction would be a more interesting subject for innovative technologies than creating humans that learn alike, do alike, and follow digital indicators.

Can a learner - using this system - decline being monitored? While still following the course or the lecture? Surely this should be the case? I would immediately ask to be non-monitored. But then this could be me.

Quantum supremacy surely makes 'proper old-school learners' obsolete?
I would be very surprised if the future were all about the best learners (human society has never been about that either), rather than about those who can actually fill their spare time with actions that make them feel confident, useful, creative and… happy. Extracting new knowledge from data can become a processing-power-based activity done by, e.g., computers with the Sycamore chip, though granted, it will still take some years before that becomes fully functional for day-to-day actions. But still… shouldn't we focus on getting humans more actively involved in a less school-like higher education?

What do you think? Below is the link to the movie that sparked the sigh-accompanied eye roll resulting in this blogpost. I will try to get my hands on it to use it for innovative learning.

LECTURE+ from imec on Vimeo.

Wednesday, 27 November 2019

Why is #AI useful to pro-actively prepare #learners in a changing world? #skills

Preparing for my talk today at Online Educa Berlin, after a great workshop-filled day yesterday (one of the workshops was on preparing for the 4th industrial revolution, guided by Gilly Salmon, and another was a wonderfully inspiring and idea-provoking workshop with Bryan Alexander looking at methods to predict parts of the future).

Below you can find my slides for the session at Online Educa Berlin looking at ways that Artificial Intelligence can be used to pro-actively prepare learners for the skills of the future.

It covers the steps we have tackled at InnoEnergy with the skills engine. In the talk I will share our approach, and how this differs from what was previously done. The slides are rather minimal, but if you download the talk, you can look at the notes in the slides to get the full picture.

Thursday, 3 October 2019

Yes a learning engine: demo is ready, but #AI and #Learning challenges ahead #TBB2019 @InnoEnergyCE

If you have ideas on ensuring continuity in pedagogy when clustering courses (research), on certifying across corporate and university learning (blockchain/bit of trust certification), on opening up industry academies to decrease L&D costs (HR and L&D), ... please think along and respond to the challenges mentioned at the end.

People in high and common places seem to agree that the world is in transition, especially workplace learning, as innovations keep changing what is possible. As I am working on one such innovation (the skill project of InnoEnergy), I am on the one hand very excited about the new opportunities it might open, yet at the same time concerned that the complexity is greater than expected.

First: have a look at the demo screencast here. It shows the overall idea, and ... this might immediately give rise to questions.

Today the Business Booster event (TBB) opens, and with it, the skill project demo is launched. The skill project (we still need a brand name for it) combines AI and learning for the sustainable energy sector. But in essence, once we get the sustainable energy sector mapped with this tool, others can follow.

AI and learning? What does it do: the project identifies industry needs (AI-driven), pinpoints emerging skill gaps in the sustainable energy sector (AI-driven), analyses the existing workforce to know where urgent skill gaps are situated (AI-driven), and then refers employees to a personalized learning trajectory addressing their skill gap (part AI, part human support). The goal of this project is to ensure that employees of the sustainable energy sector stay futureproof in a quickly changing working environment. Let's be honest, it sounds cool, but… the challenges are multiple.

The emergence of a Learning Engine
The skill project helps realize the emergence of a learning engine: an intelligent career-oriented engine which knows your skills and signposts you to where you want to go with your career by suggesting a personalized learning track.
In the Learning Engine you simply type in "goal: become Director of Innovation in offshore wind energy – which courses?" and the engine immediately returns a tailored, personalized learning track consisting of a variety of certified business training from both universities and corporate academies, open educational energy resources, and coaching options to send you on your way. This will allow professional learning to surpass the limits of classical, university-based learning.
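A minimal sketch of how such an engine could map a career goal onto a learning track: compare the skills a target role requires with the skills the learner already has, and return the courses that cover the gap. All role, skill, and course names below are invented placeholders, not InnoEnergy data, and the real system is AI-driven rather than a hand-written lookup.

```python
# Toy "learning engine": skill-gap analysis followed by course selection.

ROLE_SKILLS = {
    "Director of Innovation, offshore wind": {
        "wind turbine technology", "innovation management", "energy markets",
    },
}

COURSE_CATALOG = {
    "Offshore Wind Fundamentals": {"wind turbine technology"},
    "Leading Innovation": {"innovation management"},
    "Energy Market Economics": {"energy markets"},
}

def learning_track(goal: str, current_skills: set) -> list:
    """Return courses covering the gap between current skills and the goal."""
    gap = ROLE_SKILLS[goal] - current_skills
    # Keep every course that teaches at least one missing skill.
    return sorted(
        course for course, covers in COURSE_CATALOG.items() if covers & gap
    )
```

The interesting (and hard) parts of the real project sit in building `ROLE_SKILLS` and `COURSE_CATALOG` automatically from industry data, which is exactly what the AI-driven steps above address.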

In order to get our engine to come up with the best, most-tailored courses, we need access to industry academies as well as university courses.
  • Learning-to-learn capacities: once we signpost learners to a cluster of courses, they need to take them (the familiar 'take the horse to water' comes to mind). But even if the learners are taking the courses, do they have the learning-to-learn capacities to get through them successfully?
  • Granularity for course clustering: clustering courses to keep on top of your field of expertise is one thing, but then what is the granularity of those courses? Micro-learning is an option, and modular learning will become a clear necessity, as all learners have different existing knowledge, which means they all need different parts in order to upskill what they already know.
  • Ensuring pedagogical continuity: even the OU finds that a challenge. Great, so let's cluster modules. But then, how can we link these modules together? Do we believe in non-pedagogical support (e.g. the hole-in-the-wall project by Sugata Mitra already dates back 10 years), or do we need to find a solution that provides pedagogical continuity fitting this new assembly of short modules and courses coming from different sources (both university and industry)?
  • Certification across the learning ecologies: to blockchain or not to blockchain. Once we start learning across institutes, we need to keep track of what we learn, by keeping tabs on the actual learning: corporate academy learning, university modules, hands-on training, workplace learning… One solution is to embed blockchain in education to keep track of all learning. But this is easier said than done, and open standards and trust might be issues to consider (the Bit of Trust initiative offers good reading).

Feel free to send questions, comments, share your own projects... let's get together.

Thursday, 19 September 2019

LiveBlog #Ectel2019 Rose Luckin @Knowldgillusion Keynote #AI & #education mindset

Rose Luckin takes the stage with a headset and immediately gets into her talk. The talk was very informative, and to me it looked as though Rose is knowledgeable about such a range of topics that I got a bit curious and envious about how her mind works. [I heard – I do not know if this is correct, will ask her – that she only got into academic life later on in life.]

Key topic: develop the right AI mindset for businesses

A perfect storm: data mass plus computing power and memory enhancements, sophisticated algorithms ... this made AI part of our lives and education.

3 routes to Impact on Education

  • using AI ED to tackle some of the big educational challenges
  • educating people about AI so that they can use it safely and effectively
  • changing education so that we focus on human intelligence and prepare people for an AI world (hardest to do at the moment)

Working with select committee processes to try and take forward new developments. Debating the 4th industrial revolution and what it means for people to understand AI (it is not about coding, it is about humans and their understanding of the fundamentals of machine algorithms, awareness; it is a much higher order we need to engage people with).

Need for multidisciplinary teams with equal input
As change happens, we need to change our educational systems (Singapore). Be resilient to change, be adaptive.
The above are not separate routes; they interconnect, and these interconnections strengthen AI, which means we need to change and invest in our society using the emerging ideas and realities of these three buckets.
We need to build bridges between communities: all stakeholders (parents, communities, government, coders...).
Currently separated communities need to work together to build a credible, societally based AI solution.

Companies working with UCL EDUCATE
Not all companies are already using AI, but they want to understand more about it.
EDUCATE was from Europe, but turning into a global program from Jan 2020.
250 educational start-ups (each start-up has to have a link with London and some profile there, so most are UK-originated).
UCL provides training (labs, clinics, blended rooms, mentoring sessions)
It is free for the companies. (Years were spent figuring out the gaps between educational departments and industry; such bridges existed between the hard sciences and industry, but not for education.) A lot of the reason was that they did not know whom to talk to or where to start => the reason for starting with start-ups: embedding the educational mindset and understanding more about outcomes and validation of educational projects, i.e. what it means when we say 'it works' (complexities… this results in the golden triangle: edtech developers, teachers & learners, academic researchers).

Start-ups are pushed to build a logic model, with the change being the learning that they want to take place: the opportunities they have to analyze the data, and how they should demonstrate impact. We hope they will get to the last stage (see picture).
EdWards have been set in place (awards to prove evidence: 'evidence-applied' and 'evidence-aware' awards).
120 companies became evidence-aware, and 25 became evidence-applied (the latter being much more difficult to achieve).

EDUCATE for schools
objective: build capacity in schools to identify and evaluate edtech that meets the needs of their teaching, learning or environment.
This approach can work in different educational programs.
Sit down, get the head teacher in to pick two or three educational challenges – what they find tricky – then teachers are chosen to test it, to find out how the edtech works.
Currently this is under development:
  • all resources included in option 1; schools identify new or existing edtech to pilot
  • EDUCATE provides new resources to help schools plan their edtech pilot
  • EDUCATE provides video and document resources to walk schools through the pilot process
  • schools step through the piloting process and receive one hour of 1:1 video mentoring support
  • evaluate it (not sure I captured this last step correctly)

Century AI: AI and big data power personalised learning.
Quipper: video insight, smart study planner, knowledge base.
EvidenceB KidsCode: paths through materials, optimised paths through the material.

classic recommender systems (finding the right resources for the educator/student)
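The classic content-based recommender mentioned here can be sketched in a few lines: score each learning resource by how many of its topic tags overlap with a learner's interests. The resources and tags below are invented for illustration, not taken from any of the products named.

```python
# Tiny content-based recommender: rank resources by tag overlap.

RESOURCES = {
    "Intro to Machine Learning": {"ai", "statistics", "python"},
    "Classroom Assessment Basics": {"assessment", "pedagogy"},
    "Learning Analytics 101": {"analytics", "statistics", "pedagogy"},
}

def recommend(interests: set, top_n: int = 2) -> list:
    """Return up to top_n resources sharing at least one tag, best first."""
    ranked = sorted(
        RESOURCES,
        key=lambda r: len(RESOURCES[r] & interests),
        reverse=True,
    )
    # Drop resources with no overlap at all, then truncate.
    return [r for r in ranked if RESOURCES[r] & interests][:top_n]
```

Production systems add collaborative filtering and learner history on top, but matching resource metadata to a learner profile is the core idea.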

Chatterbox: refugees as expert native speakers, matched on background (e.g. an engineering background).
OyaLabs: a cloud-based monitor in the baby lounge that monitors interactions with the baby and its cognitive development, for language development.
MyCognition: algorithms automatically increase the number of training loops for the domains where you have the greatest need. If attention is your greatest need, you will receive more attention loops, building resilience in attention. As you progress, the loops become more challenging. It looks at your attention, actions… assessment and report, which powers AquaSnap and takes you to an underwater world (sea routes, fish names…) adapted to your own cognitive status.

Building an AI mindset
Important for any company that wants to get into AI:
What does it mean to have the right data?
Not just the tech team must understand the data and AI.
As an individual, it would be good to understand more about AI.

Working with OSTC / ZISHI: an example of AI-mindset collaboration. What they do: training for trading floors. They have to train everyone. They try to attract diversity into the workforce and recruit from less evident universities. ZISHI tries to use AI, AI for the financial sector.
The financial sector has used AI for some time: AI is used to assist in recruiting the best traders, assist in training the traders, help traders improve performance, and mentor the traders throughout their careers.

Understanding OSTC's performance metrics

  • how can training behavior be measured?
  • can we profile traders by their trading behavior?
  • how do these profiles relate to performance?
  • can we then create a tool to help recruitment, a tool to help traders, and a tool to help managers?

The CEO of OSTC started out on the post floor of Lloyds and moved up. Once he saw the lack of training, he got into training and set up OSTC. Fundamentally, what they try to do is create an AI mindset.

Much of what traders do is not easy or obvious to capture:

  • what others tell me that I do
  • what I think I do
  • what I really do
  • what my family thinks I do...

Nearly half their traders left less than one year in. So something was wrong, and the investment was too costly for the long-term results.
Modeling using machine learning techniques to profile traders and make predictions (recruitment data from tests, interviews and videos, trading history data from trading platforms, multimodal data from eye movements and button clicks, and behavioral data).
Masses of data from the tools used in the company.

Profiling 4 types of traders, based on four identified characteristics, using clustering techniques and data visualizations.
It turns out that the behavioral patterns relate to significantly different performance (risk management, bonuses...) and different cognitive abilities & traits (openness to experience, agreeableness...) [here my mind went off... must be something related to trader vocabulary?]
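The talk did not detail the clustering itself, but the idea of grouping traders into behavioral profiles can be sketched with a tiny k-means. Everything here is illustrative: the feature names (trades per day, risk per trade) and the synthetic data are my assumptions, not OSTC's actual pipeline.

```python
# Hypothetical sketch: clustering traders into behavioral profiles with a
# minimal k-means. Features and data are illustrative, not OSTC's pipeline.
import random

def kmeans(points, k, iters=50, seed=0):
    """Cluster tuples of floats into k groups; returns (labels, centers)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute each center as the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(dim) / len(cl) for dim in zip(*cl))
    labels = [min(range(k),
                  key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
              for p in points]
    return labels, centers

# Synthetic behavioral features per trader: (trades per day, risk per trade)
random.seed(1)
cautious = [(random.gauss(10, 2), random.gauss(0.5, 0.1)) for _ in range(50)]
aggressive = [(random.gauss(60, 5), random.gauss(2.0, 0.3)) for _ in range(50)]
labels, centers = kmeans(cautious + aggressive, k=2)
```

With two well-separated synthetic groups, the two recovered clusters correspond to the "cautious" and "aggressive" profiles; a real pipeline would of course use far more features (and, as in the talk, relate the clusters to performance and traits afterwards).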

Challenges to an AI mindset

  • collaboration: is everybody onboard?
  • getting rid of AI's sci-fi fantasies and fears
  • digging in rich soil will bring out stuff. Are we ready to act upon it?
  • the appetite comes with the first byte - be ethically prepared to diet
  • data is hard to collect, standardize, clean, #you-name-it

Opportunities for an AI mindset

  • map the organisation's data-information-knowledge-wisdom pyramid (and who is where)
  • identify data sources: what is ready to be picked, what still needs to be ripened or sown
  • what can we learn from previous (successful or failed) experiments or pilots? what hypotheses do they already have? what are their blind spots?
  • metrics - how do we know what success looks like?

OSTC - lessons

  • team members across different tiers need to embrace change
  • collect as much data as possible
  • the tech team in a company is not the same as the data team
  • new expertise is needed to digitize documents and learning content
  • develop coherent and consistent procedures in all offices across the globe despite cultural bias
  • track the daily activities through logs and multimodal data
  • develop tools

Developing an AI mindset

  • AI is set to transform education
  • three core types of interconnected work: using AI, understanding AI, changing education because of AI
  • multi-stakeholder collaboration can help achieve these three types of work
  • EDUCATE is an example of a multi-stakeholder collaboration to help develop a research mindset in Edtech developers and educators
  • for AI companies, or companies who want to use their data and AI we also need to develop an AI mindset (or perhaps initially a data mindset)
  • Academic research partners need to be put in this mix

Barclays placed somebody (an 'eagle') in branches who would help people use technology (from simple to complex), to get people engaged in using and thinking about technology, and in how they can get involved.

Wednesday, 18 September 2019

#ectel2019 #mlearn2019 keynote @GeoffStead on #informal learning at scale #languages #AI

Geoff Stead (@geoffstead ) takes the stage with a headset, a black shirt and walking like a fit Californian surfer (looking great).

As chief product person of the Babbel language company, he talks about informal learning at scale and offers insights: 750 people all working on one app, fully funded by individuals willing to pay small amounts of money to learn languages. It is mostly Euro-centric, coming from the organic growth of the organisation.

5000 courses => 64000 lessons (unique language pairs), focus on communicative confidence, light-hearted, diverse topics. Well over 1 million subscribers (of which I am one - Spanish).

Digital = scale and reach
Team of 10 people can start the magic of the web.
How can we ensure Quality?
Learner centric, otherwise what is the value of the application?

Using a learner journey to unite efforts and enable connections between learners: conceptual flows of individuals, used as the mantra to move the app forward.
See picture, where they also embed some spaced learning.
They work with patterns that are turned into fake personas, which are designed and modeled (a design-thinking approach), enabling developers and strategists to understand the different demographics. These personas are linked to learner journeys, which helps keep the focus on the learners.

Learning from the learners
What do they do? Analytics, A/B tests, behavioral segmentation (showing what you did, signposting what you did and what worked...), interviews, intercept surveys, wishboards, market surveys, UX research (asking permission to videotape parts of the learner journey and ideas), customer service, market research. No single method is representative, but with enough different angles they hope to get closer to the actual learning in all its complexity.

Dev at scale
20 different teams of people, a lot of independence, but only one product. So how likely is it that the releases are synchronizable as soon as they are launched by the teams? Tripping over each other, contradictions: it quickly becomes chaos. So it is self-driven and autonomous, but potentially disastrous for the learners. Marketing and money were the basis for scaling: stickers in planes and on poles in big cities, getting people to pay a bit of money.

How do you trade off freedom versus working together?
Teams are organised around the user journey: Experience Groups (XGs) are clusters of teams across Product & Engineering, uniting to enhance cross-functional collaboration around product ideas and speed up the development cycle: impressions, engagement, learning, learning media, platform and infrastructure (really interesting, this!).

Product department 
Product is made up of many specialist teams. Some teams are embedded within multi-function or engineering teams: didactics, product design, product management and QA, data engineering and analytics, quality and release management.

Towards "learning experience design"
A mixed multidisciplinary approach, but in larger companies teams are most of the time not set up as bridged teams in a multidisciplinary, cross-functional way.

Babbel meetups in Berlin every 2 - 3 months, welcome to come and have a look.

LXD basics
Digital learning is not content distribution; we are only a small slice of our learners' day, and we never really know what is going on. Learning Experience Design is all about this multidisciplinary nature.

Learner engagement
It only works for them if they use it. What is the science of pulling learners back in?
Weekly active paying users: returners. One of the key drivers = 7-day return to learning (it is this that most of the dev teams use to validate the short-term impact of new features and refinements): if people try a new release, do they come back within 7 days to use the newly released option? This simplifies discussions on what is important.
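The 7-day return metric described above can be sketched from an event log. This is my own minimal reading of the metric, not Babbel's actual analytics pipeline; the function name and log shape are assumptions for illustration.

```python
# Illustrative sketch (not Babbel's pipeline): the share of users active on
# a release day who come back to learn within the following 7 days.
from datetime import date, timedelta

def seven_day_return_rate(events, release_day):
    """events: iterable of (user_id, day) learning events."""
    active = {u for u, d in events if d == release_day}
    returned = {u for u, d in events
                if u in active and release_day < d <= release_day + timedelta(days=7)}
    return len(returned) / len(active) if active else 0.0

release = date(2019, 9, 1)
log = [
    ("ana", release), ("ana", release + timedelta(days=3)),   # returns in time
    ("ben", release), ("ben", release + timedelta(days=10)),  # too late
    ("cis", release),                                         # never returns
]
print(seven_day_return_rate(log, release))  # 1 of 3 active users returned
```

A single number like this is crude, but as the talk notes, its value is precisely that it gives twenty autonomous teams one shared yardstick for short-term impact.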

Obsessive focus on interpreting events: Tableau, Amplitude (big fat data stream).
Mixing art and science to understand the engagement ladder (to help learners focus): Hooked (N. Eyal), triggers and motivation (Fogg), Nudge (Thaler), flow state, spaced repetition, Babbel qualitative and quantitative data...
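Spaced repetition, mentioned in the list above, can be sketched with a minimal Leitner-box model. The box intervals below are illustrative defaults of my own, not Babbel's actual review algorithm.

```python
# Hypothetical Leitner-style spaced-repetition sketch: cards move up a box
# on a correct answer, back to box 1 on a mistake; higher boxes are
# reviewed after longer intervals. Intervals are illustrative.
INTERVALS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}  # days until next review, per box

def review(card, correct):
    """Update a card's box and schedule its next review."""
    if correct:
        card["box"] = min(card["box"] + 1, 5)  # promote, capped at box 5
    else:
        card["box"] = 1                        # demote to box 1 on failure
    card["due_in_days"] = INTERVALS[card["box"]]
    return card

card = {"word": "perro", "box": 1}
card = review(card, correct=True)   # box 2, due in 3 days
card = review(card, correct=True)   # box 3, due in 7 days
card = review(card, correct=False)  # back to box 1, due tomorrow
```

The point of the model is the widening gaps: items you know well resurface rarely, freeing review time for the items you keep getting wrong.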

Gamification: treat with care, some very useful tools, often used for trivial impact.

AI to make Babbel more human
AI is a very broad umbrella term for a wide range of very specific disciplines. Babbel uses 'narrow AI' to focus on very specific problems/opportunities: NLP, CL, ASR...
Making interfaces more human (hybrid human-AI): using NLP to make the automated feedback more human (e.g. "I understand what you meant").
Making guidance more useful: content recommendations based on other, related topics and level. Still very much in beta. Optimising for speed, and identifying opportunities.
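The talk only names the recommendation idea, so here is one possible reading: rank not-yet-taken lessons by topic-tag overlap with completed ones, filtered by learner level. The catalog, tags, and function are all hypothetical illustrations, not Babbel's system.

```python
# Illustrative sketch (assumption, not Babbel's recommender): suggest the
# next lessons by overlap of topic tags, filtered by learner level.
def recommend(completed, catalog, level, k=2):
    """Rank unseen lessons by shared tags with already-completed lessons."""
    seen_tags = {t for lesson in completed for t in catalog[lesson]["tags"]}
    candidates = [
        (len(set(info["tags"]) & seen_tags), name)
        for name, info in catalog.items()
        if name not in completed and info["level"] <= level
    ]
    return [name for score, name in sorted(candidates, reverse=True)[:k]]

catalog = {
    "ordering food":        {"tags": {"restaurant", "verbs"},     "level": 1},
    "at the market":        {"tags": {"food", "numbers"},         "level": 1},
    "restaurant smalltalk": {"tags": {"restaurant", "smalltalk"}, "level": 2},
    "business email":       {"tags": {"writing", "formal"},       "level": 3},
}
print(recommend({"ordering food"}, catalog, level=2))
```

Even this toy version shows the two filters the talk mentions working together: relatedness of topic (tag overlap) and appropriateness of level.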

Rose Luckin's golden triangle is used.
Tutorbot corpus (Kate McCurdy, Dragan Gasevic...)

Tuesday, 17 September 2019

#mLearn2019 workshop Urban safety and #smart civic #education

Liveblog from mLearn2019, so it consists of bits and pieces and notes written during the workshop.

Part 1 by Wim de Jong (OU Netherlands)
Smart solutions for urban problems (design solutions), governance for safety (prevention of crime, policing...) and systemic challenges (e.g. pollution...).

Can technology foster the fears it tries to combat? (perception and condition of city safety)
How can we counterbalance the bias in current perceptions of safety? (Question from Daniel Spikol).

Safe cities index (2019) here 
Sherlock app (citizens who can help and assist in crime-solving with police - Dutch)
OTT (where are the fights going on?)

Part2 Leadership in smart cities & Open innovation
New paradigm in industrial engineering. A new way to integrate a community for designing things.
Wicked problems (things are connected and affect each other): social instabilities, traffic accidents, environmental pollution, floods...
Need for innovative solutions
requiring input and expertise of a wide array of people

the innovative ecosystem
focal entity
combination bottom-up & top-down
value capture and creation = difficult and complex
importance of partner alignment => intrinsic motivation

[While following this talk, I see how the framework shared in pictures below can be relevant when looking at #AIED and citizen jury / citizen action ].