This was a very illuminating session by Ellen Wagner, as it provided real options to tackle educational challenges (at the level of institutions as well as learners) on the basis of common educational data retrieved from a wide variety of higher ed institutions. REALLY interesting.
Analytics are taking the world by storm.
parframework.org
Learning analytics: big data is at its vanguard.
Staggering revelations about big data: in just five years we will be looking at 40 zettabytes of information - that is HUGE.
All the data floating around are not being analysed as much as we think. It takes an amazing amount of time, tech talent, and human interpretation to find value in the data.
The full effect of data cannot be envisioned today, as it hits all of society.
Where are we heading?
Pushing the data from the LMS into comprehensive analytics is very complex.
The big data landscape is getting bigger every month, but virtually no big data company is involved in education.
There is a specific reason why: money, of course, but in education - despite the fact that we talk about it - we do not use big data yet. We cannot process it with normal analytical tools. Most of our data comes in spreadsheets, so there is a big step from our educational analytics up to the big data solutions.
While big data raise expectations, student data drive big decisions in .edu.
Because it is new, we do not really know where we will go.
Ellen shares some US cases to show what work is being done.
In the US there is an educational problem: there is more student debt than housing debt. This is unsustainable, and people sometimes cannot pay it back within their lifetime.
Some schools have 'open enrollment', which unfortunately results in very high drop-out rates.
So colleges are now given score cards, but this means there must be standards. The metrics at present focus on the first-time freshman... yet 85% of contemporary learners do not fit that profile.
Public funding for education has dropped dramatically. Performance metrics are the basis for funding, but as standards do not exist, it is tough to set expectations and hit targets.
So the score card metrics need to be reviewed.
Additionally, pedagogy is not mentioned on the score cards.
In California, the licensing of student textbooks must be tied to student performance, but this affects universities' lives.
Metrics have ramped up expectations of what analytics can do. But the challenge is to build metrics that are constructive for both students and educators.
Education is helping people to grow.
Prescriptive analytics: prescribing educational treatment.
Use case: the Predictive Analytics Reporting (PAR) framework
a national, non-profit, multi-institutional collaborative focused on institutional effectiveness and student success
a massive data analysis effort using predictive analytics to identify drivers related to student risk
PAR uses descriptive, inferential and predictive analyses to create benchmarks and institutional predictive models, and to inventory, map and measure student success interventions that have a direct positive impact on behaviors correlated with success.
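To make "institutional predictive models" a bit more concrete, here is a minimal sketch (mine, not PAR's actual model) of a retention-risk model: a plain logistic regression trained on a handful of simple, commonly available variables. All field names and numbers are invented for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical student records built from simple, commonly available fields.
students = pd.DataFrame({
    "credits_attempted": [12, 6, 15, 9, 12, 3, 15, 6],
    "credits_earned":    [12, 3, 15, 6,  9, 0, 12, 6],
    "prior_gpa":         [3.2, 2.1, 3.8, 2.5, 2.9, 1.8, 3.5, 2.7],
    "is_online":         [0, 1, 0, 1, 1, 1, 0, 0],
    "retained":          [1, 0, 1, 0, 1, 0, 1, 1],  # 1 = re-enrolled next term
})

features = ["credits_attempted", "credits_earned", "prior_gpa", "is_online"]
model = LogisticRegression().fit(students[features], students["retained"])

# Risk score per student = probability of NOT being retained; high scores flag
# learners who may need an intervention.
students["risk"] = 1 - model.predict_proba(students[features])[:, 1]
print(students.sort_values("risk", ascending=False)[["risk"] + features])
```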
If you - as an educator - do not know what happens with your learners, then how can we expect to change education for the better?
Pedagogy is important, but the emotional effects and the feel of education must be taken into account as well.
The privacy issues related to the learner data are staggering.
First meeting with the Bill and Melinda Gates Foundation: NO, we do not think you can do it. So she went on a school circuit to see what they wanted, could offer, and needed: 700,000 student records from multiple institutions. The data set was later upped to 8 million records... this got the YES from Bill and Melinda. Granted, the data were nothing compared to weather data or business data.
But descriptive benchmarks can now be produced, and for each institution predictive insights can be given based on its student records.
Something unexpected emerged: making a prediction is not enough. Finding out how it can address the challenges is the most important thing.
It is difficult to work in an open, transparent way due to the educational and ethical issues around student data.
First three years: building the data resources to get started with analytics.
Now: start the analysis.
The institutions varied: community colleges (rarely included in this kind of work), schools that were considered progressive, as well as 'old school' and competency-based universities.
We tried to collect data that was available for every student at every school: the simple things we could get our hands on => common data definitions.
All data is anonymised (both learners and schools), but the schools keep the encryption key to link data back to learners (VERY important, cf. the InBloom project for K-12 schools, which resulted in a big emotional issue around data).
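A rough sketch of how that anonymisation can work in practice: records that leave the school carry only a keyed pseudonym plus the agreed common fields, while the secret key that can link a pseudonym back to a real learner never leaves the institution. The field names and the HMAC scheme are my assumptions for illustration, not PAR's actual specification.

```python
import hmac, hashlib

SCHOOL_SECRET_KEY = b"kept-on-campus-never-shared"   # stays with the institution

def pseudonym(student_id: str) -> str:
    """Deterministic, keyed pseudonym; only the school can map it back to a learner."""
    return hmac.new(SCHOOL_SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

# Common, simple fields available for every student at every school (illustrative).
record = {
    "student_key": pseudonym("S-2014-00123"),  # no name or real ID leaves the school
    "institution_key": "inst_7f3a",            # schools are anonymised as well
    "credits_attempted": 12,
    "credits_earned": 9,
    "enrolled_next_term": False,
}
print(record)
```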
If solutions are found to improve learners' education, they would be delivered to all. So every action taken by the school is questioned for impact, making the actually impactful strategies visible.
So they used structured, readily available data, openly published under a CC license:
https://public.datacookbook.com/public/institutions/par
One of the things that happened: they can now draw comparable conclusions.
Great point on online versus f-2-f colleges (note to self: add movie)
Descriptive benchmarks (cross-institutional) and predictive insights (institution-specific), all with specific filters, e.g. to isolate subgroups.
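A small illustration of those two views, assuming a pooled table of anonymised records: a cross-institutional retention benchmark, and the same benchmark with a subgroup filter applied (here: online learners only). Column names and values are made up.

```python
import pandas as pd

records = pd.DataFrame({
    "institution_key": ["inst_a", "inst_a", "inst_b", "inst_b", "inst_c", "inst_c"],
    "is_online":       [1, 0, 1, 1, 0, 0],
    "retained":        [0, 1, 1, 0, 1, 1],
})

# Cross-institutional benchmark: retention rate per institution vs. the pooled rate.
benchmark = records.groupby("institution_key")["retained"].mean()
print("All students:\n", benchmark, "\nPooled:", records["retained"].mean())

# Same benchmark with a subgroup filter, e.g. isolating online learners.
online = records[records["is_online"] == 1]
print("Online only:\n", online.groupby("institution_key")["retained"].mean())
```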
Predictive models reduce the guesswork in finding students at risk.
For institutions that have open enrollment and attract all kinds of students, these predictive insights can be complemented by other factors that you - as an institution - can research: for instance the unpredictability of human life (a paycheck not arriving in time, death, health issues).
Putting it all together (see the sketch after this list):
determine student probability of failure
determine which students respond to interventions
determine which interventions are most effective
allocate resources accordingly
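A toy sketch of how those four steps could fit together, with invented risk scores, intervention effects and costs: rank students by predicted failure risk, pick the intervention with the best measured effect per dollar, and stop when the (hypothetical) budget runs out.

```python
# Hypothetical outputs of a predictive model: probability of failure per student.
students = [
    {"id": "s1", "p_fail": 0.82},
    {"id": "s2", "p_fail": 0.35},
    {"id": "s3", "p_fail": 0.67},
    {"id": "s4", "p_fail": 0.12},
]

# Measured effectiveness (risk reduction) and unit cost per intervention - illustrative.
interventions = {
    "advising_call": {"effect": 0.15, "cost": 20},
    "tutoring":      {"effect": 0.25, "cost": 60},
}

budget = 100
plan = []
# Highest-risk students first; most cost-effective intervention first.
for s in sorted(students, key=lambda s: s["p_fail"], reverse=True):
    name, best = max(interventions.items(), key=lambda kv: kv[1]["effect"] / kv[1]["cost"])
    if budget >= best["cost"]:
        plan.append((s["id"], name))
        budget -= best["cost"]

print(plan, "remaining budget:", budget)
```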
Now also (based on the work of John Campbell):
inventorying and categorizing student success interventions / supports using a common framework
based on known predictors of risk and success
in the context of the academic life cycle
addresses "now what?" by linking predictions to action
enables cross-institutional benchmarking
supports local and cross-institutional
[Ellen: very hard time to turn down a dare :-) ]