kitchen table math, the sequel: In the world of MOOCs, 2 + 2 is never 4

Wednesday, December 11, 2013


The statistical model found that measures of student effort trump all other variables tested for their relationships to student success, including demographic descriptions of the students, course subject matter, and student use of support services. The clearest predictor of passing a course is the number of problem sets a student submitted. The relationship between completion of problem sets and success is not linear; rather, the positive effect increases dramatically after a certain baseline of effort has been made. Video Time, another measure of effort, was also found to have a strong positive relationship with passing, particularly for Stat 95 students. The report graphs these and other relationships between variables examined by the logistic-regression models and pass/fail.

While the regression analysis did not find a positive relationship between use of online support and positive outcomes, this should not be interpreted to mean that online support cannot increase student engagement and success. As students, Udacity service providers, and faculty members explained, several factors complicated students' ability to fully use the support services, including their limited online experience, their lack of awareness that these services were available, and the difficulties they experienced interacting with some aspects of the online platform. It is thus the advice of the research team that additional investigations be conducted into the role that online and other support can play in the delivery of AOLE courses once the initial technical and other complications have been addressed.

Conclusion: The low pass rates in all courses should be considered in light of the fact that the project specifically targeted at-risk populations, including students who had failed Math 6L before Spring 2013 and groups demonstrated by other research to be less likely to succeed in an online environment. Previous studies (see Section 1) have found that these students do less well in online than in face-to-face courses. Further, student groups in at least one major study (Jaggars and Xu, 2013) who were found to experience the greatest negative effect from taking courses online share many of the characteristics found among the AOLE partner high school students in particular, a group with very low pass rates in Spring 2013.

Overall, much was learned during and from the first iteration of AOLE and improvements are already in progress in the second AOLE iteration. Perhaps most importantly, the faculty members who taught these courses, although they had to contend with major difficulties along the way, believe that the content that has been developed has tremendous potential to advance students’ critical thinking and problem solving abilities. One faculty member summed it up this way: "Udacity has brought to the table ways to make the courses more inquiry-based and added real life context."
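As an aside, the threshold effect the report describes, where the positive effect of problem sets "increases dramatically after a certain baseline of effort," can be sketched as a logistic model with a kink. This is a hypothetical illustration only: the baseline of 10 problem sets and every coefficient below are invented, since the report publishes graphs of its logistic-regression fits rather than numbers.

```python
import math

# All numbers here are invented for illustration; the report shows
# graphs of its logistic-regression fits, not coefficients.
BASELINE = 10        # assumed "baseline of effort" (problem sets submitted)
INTERCEPT = -2.0
SLOPE_BELOW = 0.05   # weak effect of each problem set below the baseline
SLOPE_ABOVE = 0.40   # much stronger effect of each one above it

def pass_probability(problem_sets: int) -> float:
    """Logistic model with a kink at the baseline: effort below the
    baseline barely moves the odds of passing; effort above it does."""
    below = min(problem_sets, BASELINE)
    above = max(problem_sets - BASELINE, 0)
    logit = INTERCEPT + SLOPE_BELOW * below + SLOPE_ABOVE * above
    return 1.0 / (1.0 + math.exp(-logit))

for n in (0, 5, 10, 15, 20):
    print(f"{n:2d} problem sets -> P(pass) = {pass_probability(n):.2f}")
```

Graphed, this produces the shape the report describes: nearly flat below the baseline, then a steep rise once the baseline of effort has been made.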
Let's reprise.
  • The clearest predictor of success in the course was the number of problem sets students completed. In other words, practice. 
  • The online mentors, aka teachers-slash-tutors, didn't help. But they might have helped if students had a) had lots of internet experience (practice) & thus could figure out how to get to the mentors; b) known the online mentors existed; and c) been able to get the MOOC site to work.
  • "Previous research" had found that weak students do better in face-to-face courses, so … SJSU opted to run a MOOC and fill it with weak students.
  • "Most importantly," the teachers who taught the MOOCs think the "content" has "tremendous potential to advance students’ critical thinking and problem solving abilities."
Practice is what matters, so the instructors are focused on inquiry; weak students do badly in online courses, so the MOOC people put weak students in online courses; educational technology never works.

A person who lives in the world where two plus two equals four would be doing something else.



Anonymous said...

You're being a bit harsh on San Jose State. They were pushed *very hard* by the state government to do this MOOC experiment. They did it, and immediately put a hold on continuing it when it failed (to no one's surprise except the state legislature, who are suckers for a rich guy with a silver bullet).

The state legislature was specifically touting MOOCs as the cure-all for remedial courses (where they were least likely to succeed), so San Jose State demonstrated the problem on precisely the set of students the legislature was telling them to target.

Four relatively small classes of students were hurt by this MOOC experiment, compared to the hundreds of thousands the legislature wanted to inflict it on.

Anonymous said...

The governor of Florida, Rick Scott, seems to think that MOOCs will be great for K12. Hmmmm...

LynnG said...

Do you know what is meant by "younger" students? i.e., younger students did worse in MOOCs. Are we talking college freshmen v. sophomores? or were they looking at kids in k12 too?

I'm mostly interested in finding any info on middle-school-aged kids in online learning environments.

Anonymous said...

The SJSU experiment was on remedial algebra and the classes consisted of a range of students from high school age through re-entry adults. The students from a charter high school in Oakland had the least success with the MOOCs.