ABSTRACT
Using a dataset containing nearly 500,000 courses taken by over 40,000 community and technical college students in Washington State, this study examines how well students adapt to the online environment in terms of their ability to persist and earn strong grades in online courses relative to their ability to do so in face-to-face courses. While all types of students in the study suffered decrements in performance in online courses, some struggled more than others to adapt: males, younger students, Black students, and students with lower grade point averages. In particular, students struggled in subject areas such as English and social science, which was due in part to negative peer effects in these online courses.
Adaptability to Online Learning: Differences Across Types of Students and Academic Subject Areas
Di Xu and Shanna Smith Jaggars
February 2013
CCRC Working Paper No. 54
Thursday, February 21, 2013
online courses: "decrements" in performance for all students
9 comments:
I attend courses for my program synchronously online via webcam using Elluminate. Although I may knit while the teacher is expounding on a concept I already understand, I pay attention and take notes in class the same as I do when in an in-person class. Many of my younger classmates do not.
Here’s what I see:
Not showing up to lectures (even though roll is taken)
Playing on Facebook during class time
Google chats during class
I’m enjoying my current synchronous classes much more than asynchronous classes I’ve taken for a different program, but both styles of learning online take more internal motivation than learning in a classroom environment.
I would echo and add to K9Sasha's comment: I have taught both fully synchronous and fully asynchronous online courses, and they are not in any way the same species. (Nor, for that matter, is a MOOC the same as either of those.) Any study of online learning that does not distinguish between these different types of online courses is, as far as I'm concerned, not worth the time it takes to read the abstract. I find it astonishing that the authors of this "working paper" don't even acknowledge that there are different types of online courses; it is as if one were studying face-to-face coursework and did not distinguish between a 500-person lecture, a 20-person recitation, an 8-person seminar and an independent study.
If there are two ways of doing something and one has a lower barrier to entry than the other, the people who overcame the higher barrier will tend to outperform. People who commit to traveling to a college campus two or three times a week are, on average, more serious than those who sign up for an online course instead.
One would need an experiment where students are randomly assigned to in-person and online courses and the performance of the two groups is compared. Has this been done?
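The selection worry in the previous comment is easy to make concrete with a toy simulation (entirely made-up numbers, nothing from the CCRC data): if more motivated students disproportionately choose face-to-face courses, a naive comparison of average grades overstates the online penalty, while random assignment recovers it.

```python
# Hypothetical simulation of self-selection into online courses.
# All numbers are assumptions for illustration, not estimates from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

motivation = rng.normal(0, 1, n)          # unobserved student trait
true_online_penalty = -0.3                # assumed "true" effect of the online format

# Self-selection: less motivated students are more likely to choose online.
p_online = 1 / (1 + np.exp(motivation))
chose_online = rng.random(n) < p_online
grade_selected = motivation + true_online_penalty * chose_online + rng.normal(0, 1, n)

# Random assignment: modality is independent of motivation.
assigned_online = rng.random(n) < 0.5
grade_random = motivation + true_online_penalty * assigned_online + rng.normal(0, 1, n)

naive_gap = grade_selected[chose_online].mean() - grade_selected[~chose_online].mean()
rct_gap = grade_random[assigned_online].mean() - grade_random[~assigned_online].mean()
print(f"naive (self-selected) gap: {naive_gap:+.2f}")  # exaggerated beyond -0.30
print(f"randomized gap:            {rct_gap:+.2f}")    # close to -0.30
```

In a run like this the self-selected gap comes out noticeably larger in magnitude than the assumed -0.3 penalty, while the randomized gap sits near it, which is exactly why the experiment the commenter asks about would be informative.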
I've only read the abstract, but just glancing at the paper, I did find some interesting things.
One was that students want to take hard courses like math (that was the example) face to face and easy courses online, which reinforces the abstract.
I'm just incredibly tired of reading breathless accounts of the Coming Disruption.
Meanwhile we've got data showing that, for example, black students fare worse in online courses --- haven't we just spent a decade trying to fix that problem?
Skimming now ... they're modeling the population:
Importantly, the model is now effectively comparing between online and face-to-face courses taken by the same student.
Focusing on courses taken only during the first term may help deal with this type of selection; this is the time when students are least likely to sort between course modalities in reaction to their performance in online courses, because they know little about online courses within the college and their own potential performance in these courses.
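The "same student" comparison quoted above is a student fixed-effects design. Here is a minimal sketch, using fabricated data and a hand-rolled within estimator rather than anything from the paper, of how demeaning grades within each student strips out fixed traits like ability or motivation so the online coefficient comes only from students who took courses in both formats.

```python
# Toy illustration (fabricated data, not the paper's model or estimates) of a
# student fixed-effects comparison of online vs. face-to-face course grades.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_students, courses_each = 5_000, 6
student = np.repeat(np.arange(n_students), courses_each)
ability = np.repeat(rng.normal(0, 1, n_students), courses_each)

# Selection into modality: lower-ability students pick online more often.
online = (rng.random(student.size) < 1 / (1 + np.exp(ability))).astype(float)
grade = 2.8 + ability - 0.3 * online + rng.normal(0, 0.5, student.size)
df = pd.DataFrame({"student": student, "online": online, "grade": grade})

# Naive cross-section comparison is contaminated by selection on ability.
naive = df.loc[df.online == 1, "grade"].mean() - df.loc[df.online == 0, "grade"].mean()

# Within-student (fixed-effects) estimator: subtract each student's own means.
dm = df.groupby("student")[["online", "grade"]].transform(lambda x: x - x.mean())
fe = (dm["online"] * dm["grade"]).sum() / (dm["online"] ** 2).sum()

print(f"naive online gap:        {naive:+.2f}")  # biased away from -0.30
print(f"within-student estimate: {fe:+.2f}")     # close to the assumed -0.30
```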
Online courses don't have to be as good as in-person courses to heavily disrupt the system. Newspapers aren't collapsing because the average online article is superior to the average print article. They're collapsing because they have lost their monopolies (content and classifieds). The major TV networks didn't fade because cable shows were better. They faded because cable broke their monopolies, and now TV as a whole is fading, not because online content is "better," but because the Internet has broken TV's monopoly on "small-screen entertainment."
For a thousand years, universities were training centers for academic (incl. theological) careers, which were a very small percentage of careers, and places for rich kids to get to know one another before going to work and starting their actual career training. Most people skipped the college step. It had little to do with general employment and few employers looked for, or would pay much for, college credentials.
After a thousand years of this, it has only been within my lifetime that accredited universities have gained monopoly power over employment credentials. This is not the normal state. I think this monopoly is temporary and about to be broken as other credentials begin to gain acceptance among employers.
The growing variety of alternative learning venues and the rising cost of a college degree are a carrot and stick that will push increasing numbers of people who CAN handle online learning---the kinds of people employers want to hire---to opt for the alternatives. I expect this to produce a growing demand for rigorous credentials that can certify things employers care about: skill level and ability to stick with a long, difficult task. I think the market will meet this growing demand.
At first, these alternative credentials will be valued less by employers than traditional university degrees, because they will be unproven. PC compatibles from Compaq had to be cheaper than real IBMs for the first few years. But markets get better at valuing things, and after a while, employers will gain confidence in the value of the alternative credentials, and their market value will rise while that of the traditional degree will fall.
The traditional degree could remain more valuable and yet not justify the tuition premium, which would turn it from a sine qua non for employment to a luxury good---which is what it always was until recently. Universities won't go all the way back to what they were, of course, but I expect those without a luxury brand name (Harvard, Stanford) to sell will respond by selling fewer monolithic degrees and more a la carte services (non-degree certification prep classes, lab classes, joint research projects with industry, etc.)
"...push increasing numbers of people who CAN handle online learning---the kinds of people employers want to hire..."
I don't know if I agree with you on that, but I think it will be interesting to see how online degree credentialing will work for places like MIT and how employers will respond. I think it will depend on the content and testing rather than how the online material is delivered.
Having just gone to an MIT admissions session and tour yesterday, I can clearly see the advantage of being there, not for the content of individual courses, but for all of the personal development, student/professor interactions, and access to other opportunities. But how different (to the employer) are two students who both take the same courses, one at the school and one online? From a student's standpoint, what is better, a degree from a local community college or an online degree from MIT? Lots of students who couldn't even get close to being accepted can handle MIT classes. Community colleges could become support centers for online learning at all levels. There is a great potential for many students. I don't think the big name colleges have to worry about the students they admit deciding to save money and take the online version. Online courses only offer these schools more, but they will have to be careful how they do the credentialing.
Will colleges provide a la carte services? That's interesting. Just like regular college students go off to Europe for a semester, online students could attend the college for a semester. But will these colleges want to validate their online students that much? And what happens if the name colleges find that employers don't make too much distinction between resident and online students?
I don't think it will happen. While online courses may make a world of difference for many students, those who get accepted as resident students will always be different. The difference will show up in areas unrelated to the actual courses.
At MIT, we didn't hear about how strong the math classes were compared to Harvard. (They did, however, refer to it as the local trade school up the road.) We did hear a lot about "hacking". (I really don't like the evolving use of that word, but nobody asked me.) Actually, turning the Green building into a large Tetris game was seriously cool - better than the police car on top of the dome, which is now memorialized in their new Stata Center. On a personal handout for our son, one of the questions they asked (not to be handed in) was whether he could think of any good "hacks". Clearly, MIT is about being there. Clearly, MIT is defining the type of student they want, and it's much more than doing well on tests. Perhaps that's why they don't seem to worry about where an MITx degree might lead.
However, in the big world far away from those lofty heights, a lot of good can be done with online courses by putting students more in control at low or no cost. In that case, many colleges will have to better justify their existence, products, and pricing.
Also, online courses won't change employment supply and demand. They may, however, take a lot of money away from many middle level colleges. Imagine a community college system that offers many of the same college life opportunities along with direct professor support of individual online course curricula. Some of the professors at our community colleges have big name college degrees, and the colleges offer adequate lab facilities for undergraduate work.