What About MOOCs Is Disruptive: The Idea or the Technology?

There has been a shift in how people are talking about MOOCs. At first, the idea was that MOOCs would open up access to education for anyone who wanted or needed it—people in developing countries, people in developed countries who couldn’t afford the time or money for a traditional education, people who wanted to boost their job skills but didn’t necessarily need degrees, and so on.

Recently, however, higher education in particular seems to be talking much less about these larger goals and instead focusing on the technologies that MOOCs use.

For example, MOOCs have been making a big splash in the area of business education, which, with its high tuition, is something of a cash cow for traditional colleges and universities. With elite institutions like the University of Pennsylvania’s Wharton School of Business putting their courses online, it looked for a while like traditional b-schools were in a whole heap of trouble.

Among the early researchers behind this idea were Wharton’s Christian Terwiesch and Karl Ulrich, who in a report earlier this year warned business professors that they had better find new ways to make themselves relevant or face being replaced by MOOCs. In an article this week in the Financial Times, however, they seem to take a slightly different tack. They have always suggested that it isn’t the MOOC itself but rather the underlying technology of chunked video content that threatens disruption; in the new article, Terwiesch and Ulrich discuss ways that technology can be used to “strengthen today’s business schools” rather than replace them.

Similarly, in a recent article for Inside Higher Ed, Kristen Eshleman of Davidson College notes that two of the most important reasons universities try MOOCs—expanding access to education and lowering costs—haven’t actually come to pass. She suggests instead that MOOCs can provide the greatest benefit on campuses through “improving educational outcomes, fostering innovation, and conducting learning research.”

So, the question arises: What aspect of MOOCs is really disruptive? Is it the idea of making education more widely available, or is it the technology behind the courses? This is something we need to think about carefully. Technology can be used in many ways, including to reinforce the status quo. But the idea that anyone with an Internet-connected device can get an education—that is truly revolutionary.

As we move toward the next phase of MOOCs, let’s not forget their greater promise: education for all.

MOOCs, Higher Ed, and Economies of Scale

One of the largest potential benefits, as well as one of the largest sticking points, of MOOCs is the ability for colleges and universities to save money, for example, by licensing rather than creating materials for standard courses, like Psych 101. This has caused a huge controversy, as faculty see MOOCs used this way as a threat to their jobs.

Reducing costs and increasing efficiency is something we not only accept but expect from companies in most other industries, yet it is seen as antithetical to the mission of higher education. As Keith Hampson wrote,

“Higher education has not put a great deal of effort into finding ways to reduce costs through scale. While other sectors seized the new economics of digital content storage and distribution, higher education has continued to produce and use digital instructional content locally; a digital cottage industry model, of sorts. For elite institutions or those aspiring to be elite, scale runs counter to one of the trappings of elite status: exclusivity. For others, the highly decentralized organization of institutions makes institution-wide scale improbable. Another factor is concern amongst decision-makers about the impact of sharing courses on labor market value and autonomy.”

But the main goal of colleges and universities shouldn’t be exclusivity—it should be education. And improbable doesn’t mean impossible—it just means some changes need to happen.

Scale will be required to meet the education demands of current and future students, especially as employers continue to demand more (and higher) credentials for even entry-level jobs. In 2009, University of Central Oklahoma economics professor Joseph T. Johnson proposed four ways higher ed could become more efficient:

1. Separate the research function from the teaching function.
2. Hire master’s degree-holders to teach most courses, and doctorate degree-holders to develop curriculum and teach advanced courses.
3. Teach standardized curricula. Since MOOCs hadn’t yet gained popularity in 2009, Johnson talks mostly about outsourcing course development to textbook publishers, but his points apply equally well to MOOCs.
4. Avoid giving faculty administrative responsibilities. Ed tech companies can play a role here.

With more non-traditional learners entering the ranks of students, and learning becoming something one needs to do continually throughout one’s career, economies of scale will soon no longer be optional. Embracing MOOCs, alternative credentials, and other innovations will likely be the only way higher ed can provide the education everyone needs.

More Awareness Needed to Bring About Change

The cost of college is one of the most pressing problems we are facing today. The value of college degrees in the job market has become inflated beyond all reasonable expectations, while at the same time the cost of those degrees has gotten so high that many students can no longer justify the debt load.

One major contributor to that cost is textbooks, whose prices continue to rise so quickly that many students now choose not to buy them, even knowing that the decision will hurt their grades. Why, in an era when we can find the answer to almost any question simply by typing it into Google, are textbooks still so expensive? This is a problem that organizations such as OpenStax College, Flat World Knowledge, and Boundless have been trying to solve by creating free or low-cost textbooks based on open educational resources (OER). These resources are available and their quality can match that of traditional textbooks (OpenStax textbooks even go through a peer-review process), so why aren’t they being used more widely?

A new study by the Babson Survey Research Group suggests that the reason these resources aren’t being used is that faculty are not aware of them. In fact, as many as three-quarters of faculty respondents described themselves as unaware of OER. Those who are aware, however, describe the materials as being of high quality — as good as or superior to traditional textbooks.

Although the context is different, these results echo findings about MOOCs and online education. For example, in a recent study of employers, fewer than one-third had heard of MOOCs at all. Once they learned about the courses, however, they responded positively to the idea of using them in hiring, recruiting, and training. Research also shows that faculty who have experience with online education perceive it more positively than those who don’t.

It is unlikely that many in the higher ed community haven’t at least heard of MOOCs, but what they’ve heard is key. Most have probably heard of Coursera and edX, and of the high-profile but ultimately unsuccessful experiments by Udacity. But what about the newer things being tried, like remote labs and collaborative learning? Have they ever taught or taken a MOOC? What percentage of faculty would classify themselves as “unaware of MOOCs”?

Colleges and universities are struggling to provide the education students need at a price they can afford. OER, MOOCs, and other recent innovations can be effective tools for meeting current demands. But first, we need to get the word out.

MOOCs for the Biggest Skills Gaps

The “skills gap” isn’t just one thing. Rather, it’s an umbrella term used to encompass any area where employers are having trouble finding qualified employees to fill open positions. In a recent article for Talent Culture, Spark Hire CEO Josh Tolan identified five main areas where skills gaps exist. We decided to explore how well MOOCs could help employers—and job seekers—bridge the biggest skills gaps.

1. Digital media skills

More than three-quarters of companies report that a digital skills gap is preventing them from adapting to new digital trends. It’s pretty safe to say that MOOCs have digital skills covered in spades. Specific digital media courses include Social Media and Digital Media One, both offered via Canvas, and Creative Programming for Digital Media & Mobile Apps and Fundamentals of Digital Image and Video Processing on Coursera.

2. Soft skills

Close to one-half of employers find that employees are lacking soft skills like communication, collaboration, and critical thinking. These are often considered some of the most difficult skills to train, but MOOCs are taking a stab at it. Here are a few: Creativity: Music to My Ears and Decision Skills: Power Tools to Build Your Life from NovoEd, Critical Thinking in Global Challenges and Creativity, Innovation, and Change on Coursera, and Corporate Communication from Saylor.

3. Marketing skills

Business is another area where MOOCs are flourishing. Marketing-specific MOOCs include An Introduction to Marketing on Coursera, Digital Marketing: Challenges and Insights on FutureLearn, and Services Marketing: Selling the Invisible on OpenLearning.

4. Skilled trades

OK, this might be a tough sell for MOOCs, since trades are primarily learned hands-on. However, many trade industries already use e-learning for parts of their training programs, and those parts could translate well to MOOCs.

5. STEM skills

STEM skills are in continuously high demand. Computer science, software, and math courses are abundantly available, and both Coursera and edX also feature a collection of MOOCs in the natural and other sciences.

Looking for a particular MOOC to enhance your skills? Explore Degreed’s MOOC list.

More MOOCs Going Project-Based

Ever since Coursera-type MOOCs (as opposed to the original cMOOCs from 2008) turned two last spring, we’ve seen a rush of innovation in the courses, greatly extending what many people thought were the limits of online education.

Earlier this year, we saw science experiments moving online, as well as personalized one-on-one learning. Moocdemic combined a MOOC with a massive multiplayer online game, and Trauma! took a “Netflix approach,” splitting a MOOC into 10 one-week mini-courses.

The newest area emerging now is project-based courses. In the early days of MOOCs, most assessments were machine-graded multiple-choice tests. While programming courses used more practical assessments (i.e., machine-graded coding problems), most other courses focused on theory rather than the application of knowledge.

But that is changing. All of Coursera’s Specializations culminate in a capstone project, and for a new Educational Technology XSeries that just started this month on edX, all of the courses are project-based. The XSeries focuses on game design. According to MIT’s Education Arcade website, “each of the courses is project-based, incorporates collaborative group as well as individual work, self-assessment as well as peer review, and concludes with a culminating project.” Students in the second course (which started today) will develop paper and digital prototypes for an educational game as well as engage in design iteration and user testing.

Project-based work, which leaves students with a digital portfolio, is the next step in legitimizing MOOCs in the eyes of employers. Earning a certificate of completion from a MOOC doesn’t tell employers any more about what people know and can do than taking a face-to-face course does, but projects can clearly demonstrate learners’ knowledge and skills, and they could pave the way for MOOCs to play a role in new efforts toward competency-based learning.

If you’ve taken any project-based MOOCs, I’d love to hear what you thought about them. Leave your impressions in the comments.
