Twenty years is a long time. Twenty years ago, George W. Bush was president. The country was still reeling from the events of September 11, 2001. The number one song of the year was “How You Remind Me” by Nickelback, and Spider-Man was the top movie. Those of us in education may remember that the No Child Left Behind Act was signed into law that year.
It was also twenty years ago that SmarterServices was founded and began measuring online learner readiness using non-cognitive factors such as motivation, procrastination, and willingness to ask for help. Twenty years is a long time to do anything, which raises the question: "Does learner readiness still matter — twenty years later? Are we still measuring what matters?"
This question of whether learner readiness still matters is an important one. Check out the video below to learn more, or scroll down to read the video overview.
In 2002, distance learning was still in its infancy. In those days, most research about eLearning focused on whether courses delivered online were comparable to face-to-face instruction. Educators were asking whether a person could really learn online and whether a faculty member could really teach concepts online. Instructional design practices were still based on classroom models, and software such as learning management systems was still in early versions. Tools such as online proctoring had not yet even been invented. Little could we imagine that twenty years later almost all college students would be learning online due to the COVID-19 pandemic.
In these early days of eLearning, it was obvious that learning at a distance was a better fit for some students than for others. eLearning leaders were quickly recognizing that it was typically easier to recruit students to online programs than it was to retain them. Students were recognizing that convenience did not imply that the courses would be “easy.”
One such organization that recognized the need to measure learner readiness was the US Army. It had launched an aggressive program called eArmyU to educate soldier-students. After evaluating early reports of learner satisfaction and retention, the Army put out a request for proposal (RFP) for an assessment that would quantify non-cognitive factors for learner success. They wanted to measure “grit.” The specifications in this RFP were the early foundations for what would first be called READI — the "Readiness for Education At a Distance Indicator” — now known as the SmarterMeasure Learning Readiness Indicator.
Since then, the assessment has been taken by over six million students from over 1,000 higher education institutions.
Over the years, the core of the assessment has remained constant, but the questions have been updated to reflect changes in culture and technology. The assessment continues to first look at the learner internally through the scales of Individual Attributes and Learning Styles which measure constructs such as motivation, control over procrastination, willingness to ask for help, and locus of control.
We continue to help students understand their context through the Life Factors scale which measures variables such as availability of time, support from others, and having an appropriate place to study. The indicator still evaluates skills such as on-screen reading rate and recall, keyboarding rate and accuracy, technical knowledge and competency, and experience with a learning management system. In addition to these traditional non-cognitive factors, we also now offer a math readiness indicator and a writing readiness indicator.
Due to the pandemic, college enrollment has substantially declined. As reported by Inside Higher Ed in January 2022, “Since the pandemic began in spring 2020, enrollment has declined by 5.1 percent across the board, with 937,500 fewer students enrolled at American colleges and universities.” With almost a million fewer students now than before the pandemic, retention is more important than ever. As there are fewer students coming in the front door of the institution, we must make sure that fewer of them slip out the back door.
Over the past two decades, we have completed multiple research projects quantifying the relationship between the non-cognitive factors measured by SmarterMeasure and metrics of student success such as engagement, satisfaction, retention, and academic success. Statistically significant improvements in retention have been documented at several schools including Middlesex Community College, Argosy University, and Utah State University.
During the quarantine year of the pandemic (AY 2019/2020), just as more students studied online than ever before, more students completed the SmarterMeasure assessment than ever before. Over half a million students took the assessment during the academic year 2019/2020, a 25% increase over the prior year.
This high level of usage allowed us to compare levels of learner readiness during the pandemic to pre-pandemic levels. Our recently released National Means Report explored these findings. Here are three summary observations.
First, during the COVID-19 pandemic, many institutions closed their doors and moved much instruction online. This forced many students for whom online learning was not their first choice into an online learning modality. The resulting drop in learner readiness is evident in the fact that half of the scales we measure recorded their lowest means during the quarantine year of AY 19/20. The following scales revealed their lowest means during that year:
Secondly, mean scores on the Life Factors and Individual Attributes scales have consistently declined over the past four years by about two percentage points. This is a substantial observation since these scales measure learner attributes, not skills. As such, these measurements are not delivery system dependent, meaning that they are equally important for students studying online, hybrid, or face-to-face. It is paramount that institutions measure this data for all students and provide appropriate strategies for intervention and support since students are struggling more in these areas than in the past.
Finally, the mean scores for the Math Readiness and Writing Readiness scales have consistently declined over the past four years by 5 to 8 percentage points.
Because the SmarterMeasure Learning Readiness Indicator has been utilized for two decades, we are able to compare readiness data to that of prior years. The table below compares select data points from a decade ago. Compared to ten years ago, students are more diverse ethnically and by gender. They tend to be less social in their preferred learning style. While they do have a little more online learning experience, their levels of readiness have declined for individual attributes, life factors, reading recall, and technical knowledge.
|                            | 2011  | 2021  |
|----------------------------|-------|-------|
| Female                     | 72%   | 62%   |
| Caucasian                  | 62%   | 44%   |
| No prior online courses    | 55%   | 50%   |
| Traditional age student    | 28%   | 48%   |
| Social learning style      | 22%   | 19%   |
| Individual attributes mean | 78.09 | 76.80 |
| Life factors mean          | 79.30 | 78.57 |
| Reading recall mean        | 74.44 | 70.43 |
| Technical knowledge mean   | 72.44 | 68.88 |
At SmarterServices, we recognize that simply measuring learner readiness is only the beginning. That's why we support our client institutions and their students in many ways such as custom training sessions for academic advisors and faculty, informational webinars, and helpful handouts. Here are a few of the resources that we freely share to help students persist:
To view all of our free resources, click here.
I trust that, based on enrollment trends, research results, and pre/post-pandemic data, you concur that measuring learner readiness is even more important now than it was twenty years ago. While distance learning has become more sophisticated, students are still students, and in this challenging environment they need more support than ever to persist and succeed.
If your institution is not currently using the SmarterMeasure Learning Readiness Indicator, reach out today for a brief overview and a free login to try the assessment for yourself.