Need-to-Know-News: Micro-Credentialing Movement in Higher Ed & Active Learning Trumps Lectures

This ‘Need-to-Know’ blog post series features noteworthy stories that speak of need-to-know developments within higher education and K-12 that have the potential to influence, challenge and/or transform traditional education as we know it.

1) The Micro-credential Movement in Higher Ed
The latest trend in higher education is micro-credentialing, a non-traditional education path where students gain skill sets in a specific area and receive a credential. Case in point, Udacity announced this week a new nanodegree (Udacity defines nanodegrees as 'curriculums designed to help you become job-ready'), the Android nanodegree, in partnership with Google. Another example: Penn State's College of Business also launched this week an online bootcamp course, 'Supply Chain Leadership Academy', to educate "supply chain leaders of tomorrow in leadership and best practices in holistic supply chain management".

The micro-credentialing trend is driven by business entities that have a real (or perceived) workforce skills gap, where jobs can't be filled due to a lack of qualified applicants. Google reports it has thousands of jobs to fill given the dearth of qualified applicants, and the Linux Foundation, which offers a certificate course in partnership with edX, reports it has over 50,000 open jobs.

MOOC providers and select higher education institutions are leveraging the apparent skills gap, using their platforms to build out online program offerings with credentialing options for a fee. A good idea. The target market is not traditional higher education students, but non-traditional students who are already in the workforce and are looking to further their careers and/or switch career paths. Alison.com is a platform that offered credentialing in specific skill sets long before MOOC providers began doing so, though Alison's business model is different from that of MOOC providers such as Coursera or edX: students aren't the revenue source; advertisers featured on the platform are.

Sampling of micro-credential programs and associated fees:

  • edX’s Linux System Administration Essentials course, “This Linux course is for those just starting their career in IT as well as professionals with experience on other operating systems who want to add Linux to their portfolio”. Fee: $399
  • Stanford Online, Professional Certificates, "Our professional certificates offer short, focused courses that give you tools and techniques you can apply right away". Fee: $1295 per online course; the required number of courses varies by certificate.
  • Udacity’s Nanodegree – “All the course content is free online, but the $200 per month pays for the non-scaleable parts of the degree: project grading, feedback, instructor mentorship, assistance and a final certification”. Option to receive reimbursement of 50% of tuition upon completion.
  • Digital Literacy & IT Skills Diploma Courses, Alison.com. Free with option to pay nominal fee for paper certificate delivered via mail.
  • Coursera’s Specializations – “Master a skill with a targeted sequence of courses”. Fee: $95 per course, with a fee for the ‘capstone project’, e.g. Business Foundations Specialization = $595 for four courses and capstone project.

Screenshot of a recent email from Coursera announcing upcoming Specialization certificates. 'Specializations' consist of two or more courses on a focused area.

Insight: The non-traditional student population that micro-credentials target is an emerging market, and such options are a boon to working or unemployed adults seeking skill development. It's a positive development in higher education, and employers appear receptive to micro-credentials. However, micro-credentialing is worthwhile only if the programs deliver quality learning that results in tangible, applicable skill sets. The majority of the credentials require not only a financial investment, but a significant investment of students' time and energy. It's buyer-beware; credentials do not guarantee a job, though the courses backed by business entities likely have higher placement rates than those without a business affiliation.

2) The Case for Active Learning over Lectures*
This is not new news, but it is worthy of review: evidence that students in classes built primarily around active learning perform better than students in classes that rely primarily on lectures. A significant study on active learning was released last year; it provides compelling evidence of active learning's benefits specific to STEM subjects in higher education (Freeman, et al., 2014). Researchers conducted a meta-analysis of 225 studies in the published and unpublished literature that documented student performance in courses with at least some active learning versus traditional lecturing. Though intuitively we might know that active learning is more effective, there's now solid evidence to back it up:

The data reported here indicate that active learning increases examination performance by just under half a SD and that lecturing increases failure rates by 55%. The heterogeneity analyses indicate that (i) these increases in achievement hold across all of the STEM disciplines and occur in all class sizes, course types, and course levels; and (ii) active learning is particularly beneficial in small classes and at increasing performance on concept inventories.
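A note on reading the 55% figure: it is a relative increase, not an absolute one. In the meta-analysis, failure rates averaged roughly 21.8% under active learning versus 33.8% under traditional lecturing (Freeman et al., 2014), so the arithmetic works out as

(33.8 − 21.8) / 21.8 ≈ 0.55

that is, about a 55% higher failure rate in lecture-based courses relative to active-learning ones.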

Implications: Is the lecture dead? Absolutely not, but to increase student learning, retention and success, involving students in the active application of concepts should be the norm, not the exception. Implementing active learning is challenging for many educators, especially in online courses, yet it can be done with deliberate, thoughtful development of a course learning strategy. Below are links with suggestions and examples of active learning applications. One of my favorite examples of active learning is online literature instructor Laura Gibbs, who creates assignments using online platforms (blogging platforms, Pinterest, etc.) where students engage with the content, each other and the Internet community.

References:

Feature Image: by GotCredit on Flickr

How ‘Good’ is Your Online Course? Five Steps to Assess Course Quality

The view that online education is “just as good as” face-to-face instruction was not widely held in 2003: 42.8% of chief academic officers reported that they considered the learning outcomes for online instruction to be inferior to face-to-face instruction. The view of online quality has improved over time. However results for 2013 revealed a partial retreat in faculty perceptions of online learning providing quality learning experiences. The 2014 results indicate that the retreat continues—there’s an increase in faculty that perceive online education as inferior. — Grade Level: Tracking Online Education in the United States, 2015

One of the main criticisms of online courses is that they are of poor quality, as revealed in the annual Babson study mentioned in the opening. Positive perception of online learning by faculty declined in 2013 and 2014 (Allen & Seaman, 2015). Face-to-face courses appear to be the benchmark for quality when it comes to higher education. Yet this doesn't seem fitting considering the ongoing and often heated public dialogue about the quality of higher education programs, with little consensus on what quality is. In this blog post I suggest that online educators can and should tackle the quality issue in their own courses, and that they do so by assessing each course holistically. A holistic approach encompasses elements such as students' perspectives, results over a period of time, artifacts created during learning, and the instructor's own course experience.

I also review recent research on quality assessment specific to online courses, examine existing frameworks and rubrics for online course assessment, and explain why, even when an institution follows such standards, these are only starting points. I then outline five steps instructors can follow to assess whether a course is 'good': an assessment of quality that considers foundational elements, student perspectives, course artifacts, and the learning experiences of both students and the instructor.

What is Course Quality?
Up until a few years ago, 'quality' in higher education was measured by a course's content, pedagogy and learning outcomes (Bremer, 2012). This approach has shifted to a process-oriented system in which a combination of activities contributing to the education experience is considered: student needs, use of data and information for decision-making, department contributions, as well as improved learning outcomes (Thair, Garnett, & King, 2006). This holistic approach to evaluating education experiences is often applied to the development and assessment of online learning; examples include the Online Learning Consortium's Five Pillars of Quality Online Education (below) and the Quality Matters (QM) rubric.


“Five Pillars of Quality Online Education, the building blocks which provide the support for successful online learning”.  http://www.onlinelearningconsortium.org

Why Assessing Quality is Difficult in Online Education
Yet there are challenges associated with setting universal quality standards for online education, and though such standards are a starting point, a thorough quality assessment requires ongoing consideration of numerous elements, some of which unfold over a period of time. Key challenges with assessing quality through set standards are outlined in 'What is online course quality' and include: 1) the lack of an authoritative body (able and willing) to address a minimum level of standards across all states and their accrediting bodies, 2) the challenge of creating a comprehensive evaluative tool that addresses the complexities of online courses, and 3) the implementation process itself, given the significant resources required for an institution-wide evaluation process (Thompson, n.d.).

Limitations of Quality Assessments
There are other limitations. Some assessments are inherently limiting, with a prescriptive set of standards that may not fit all contexts. Another limitation is the tendency to establish a minimum level of quality, or 'baseline standards', which constrains innovation and creativity (Misut & Pribilova, 2015). Most course assessments are done at a point in time and cannot capture dimensions that emerge over the life of a course and afterwards, such as student perceptions collected as formative feedback (midway through the course) and end-of-course feedback surveys. Furthermore, quality assessments frequently focus on course and instructional design and fail to include the learning experiences of the instructor and students.

What's Involved in a Good Course Assessment?
A holistic assessment goes beyond course design; it acknowledges the nuances that make a course unique, including input and contributions from students, developments in the field of study, and current events. Most valuable are students' perceptions of their learning and of the course experience. A good course assessment considers the course over a period of time, including the interactions between instructor and students and among students, all of which create artifacts that can be studied and analyzed (Thompson, 2005). Artifacts might include emails or forum posts of student questions, dialogue within forums, feedback from group interaction, end-of-course student surveys, LMS reports on student interaction patterns, student assignment results, and more. Course artifacts give valuable clues to a course's quality, more so when collected from two or more course iterations and analyzed collectively.


Figure from the paper describing the Online Course Criticism model, based on the concept of educational criticism, which suggests a holistic review of a course to assess quality (Thompson, 2005)

Other elements to consider:

  • Student behaviours, including questions asked in forums, emails, interaction patterns within the LMS, interaction with resources, and participation patterns within discussion forums and social platforms designated for the course
  • Student perceptions, evaluated through questionnaires, formative course feedback, post-course questionnaires and one-on-one interactions
  • Knowledge creation/transfer by students, evaluated through assignment analysis, course artifacts and post-course surveys
  • Course design, as per a rubric/assessment tool
  • Use of current technology tools and platforms
  • Course data and artifacts from two or more sessions, analyzed and compared
  • Quantity and type of interaction between students and instructor

Five Steps to Assessing Online Course Quality

1) Assess Using a Rubric or Other Tool to Consider Basic Course Elements
Assess the course using the tool or framework employed by your institution, e.g. the Quality Matters rubric. If your institution does not have a tool in place, I recommend the rubric created by California State University, Chico, which covers six domains. The rubric (embedded below) is free to use and download under the Creative Commons Attribution 3.0 United States License.

* Thanks to a reader's comment – there is an updated version of the Chico rubric, in a checklist format with additional dimensions, that is similar to the Quality Matters rubric. I prefer the version embedded here; it's more approachable given that it's less lengthy and rigid. Link here to the updated version.

2) Analyze Course from a Student Perspective
This is perhaps the most difficult yet most useful element for improving course quality. There are a variety of ways to consider students' perspectives, several already mentioned. Another recommendation: take an online course as a student (e.g. a MOOC) on a topic you aren't familiar with. This provides an eye-opening view of how it feels to be an online student. Another method is to ask a colleague from another department to review your course and provide constructive feedback.

3) Assess Course Artifacts, Materials, & Feedback
Another useful exercise is analyzing course artifacts. Analyzing results from student feedback gathered via a questionnaire midway through the course is helpful. If a course is offered more than once, compare data from the course iterations collectively; a minimal sketch of this kind of comparison follows below. Consider: is student feedback incorporated into subsequent runs of the course? What about student-generated content? All artifacts and materials associated with a course are valuable material for assessing a course's quality.
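For instructors comfortable with a bit of scripting, this kind of cross-iteration comparison can be roughed out quickly. Below is a minimal, hypothetical Python sketch; it assumes two CSV exports per course run (a feedback survey with a 'rating' column and a forum report with a 'posts' column), and the file and column names are invented for illustration rather than drawn from any particular LMS.

# Hypothetical sketch: compare student survey ratings and forum activity
# across two iterations of the same course, using CSV exports.
# File names and column names below are assumptions made for illustration.
import csv
from statistics import mean

def column(path, name, cast=float):
    """Read a single column from a CSV export that has a header row."""
    with open(path, newline="") as f:
        return [cast(row[name]) for row in csv.DictReader(f)]

for term in ("2014_fall", "2015_spring"):
    ratings = column(f"survey_{term}.csv", "rating")        # e.g. 1-5 rating per student
    posts = column(f"forum_{term}.csv", "posts", cast=int)  # forum posts per student
    print(term,
          "| avg rating:", round(mean(ratings), 2),
          "| avg posts per student:", round(mean(posts), 1),
          "| students with no posts:", sum(1 for p in posts if p == 0))

The point is not the particular metrics but the side-by-side view: seeing whether ratings or participation shifted between iterations gives concrete material for the interaction and learning questions in steps 4 and 5 below.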

4) Consider Level and Type of Student-to-Student and Student-to-Instructor Interactions
Interaction is critical to an online course; students who feel connected and establish themselves as individuals within an online course are likely to have higher levels of motivation and learning satisfaction than those who don't. Consider the forums, the interactive assignments where students can participate, the social exchanges within course-associated platforms, and other places for interaction. An example of assignments that encourage student feedback and involvement, leading to high levels of engagement, can be found on online instructor Laura Gibbs' course site here. Also consider the Community of Inquiry model for the types of interactions in an online course that lead to positive learning experiences.

5) Results: Are Students Learning?
Evidence of learning is the most important assessment dimension, yet it is nearly impossible for a standardized quality assessment tool to evaluate. One could argue that before-and-after quizzes within a course can evaluate learning. I suggest that the instructor is able to assess at a deeper level whether or not learning occurred, and can determine the level of critical thinking reached. This can be done only when assignments demand that students demonstrate what they know and apply course concepts. Assignments that draw out students' thinking, either through dialogue or written work, allow the instructor to evaluate learning effectively. There's no formula for this fifth step; this is an example of customized course evaluation. But I suggest instructors evaluate student artifacts from one course iteration to the next and consider what students learned and how well they articulated it. There may be opportunities to revise assignments, activities or other course dimensions.

Conclusion
Assessing quality in online courses is complex, as we've seen here, yet addressing quality is critical: for one, to advance the positive perception of online education, but more importantly to provide learning and teaching experiences that are rewarding, rich and meaningful. Quality assessment can start one course at a time, and who better to do this than the course instructor?

References/Resources*

Does it Take More or Less Time to Facilitate and Develop an Online Course? Finally, Some Answers

How much time does it take to teach an online course? Does teaching online take more or less time than teaching face-to-face? How much time does it take instructors to develop an online course? — Instructor Time Requirements to Develop and Teach Online Courses (Freeman, 2015)

A study released in March of this year set out to answer these burning questions, questions the majority of online educators would like answers to. There's considerable anecdotal evidence on both sides: that it takes more time, or less, to facilitate an online course when a face-to-face course is used as the benchmark. The purpose of this study was to nail down the facts, measuring both perceptions of and actual time spent developing and teaching online courses. The findings are significant for institutions and educators involved in online education for several reasons. Professional development, for one: the report reveals areas where survey respondents struggled during the course development phase, and where the majority of time was spent when facilitating (the conclusions are surprising). Secondly, the results may be helpful for institutions when considering compensation and work-allocation models. Institutions can use the results as a benchmark; at the very least the study may act as a catalyst for constructive conversations about compensation and support for online course development and facilitation. And finally, it may help online instructors gain insight into their own teaching experiences by considering the experiences of other educators who have taught both face-to-face and online courses.

This post highlights the findings and suggests factors for educators to consider regarding 1) the time spent developing online versus face-to-face courses, and 2) how much time is invested in online facilitation and how it compares to face-to-face instruction.

Survey Details
To put the results into context: the survey gathered data from 68 instructors out of a total of 165 solicited from three universities, across eight academic disciplines. Respondents had developed an average of 2.13 online courses, had taught an average of 2.03 online courses, and had been teaching at the university level for an average of 14.2 years (Freeman, 2015).

1) Course Development Time: Pedagogical Learning Curve Steepest
Survey results confirmed that developing online courses is indeed more time-consuming than developing face-to-face courses, though the time required declines when the same instructor develops a second or third online course. Twenty-nine percent of respondents indicated they spent over 100 hours developing their first online course (the median across respondents was 70 hours). This significant number of hours is likely due to the fact that 59% of respondents developed over 90% of the course without any assistance, which included developing content, assessments and assignments, plus the time associated with course design. The other 41% received course design support from instructional designer(s) and/or used ready-made content available through textbook publishers. Also significant: the technological learning curve was found to be shorter than the pedagogical learning curve. In other words, instructors required more time to work out how to implement pedagogical methods, create learning experiences and deliver content appropriate for the online format than they did learning the features and nuances of the technology used to deliver the course. The learning curve is described as the time it takes to "get used to" the course elements [platform, tech features] and/or the method of teaching.


(Freeman, 2015)

Implications:
Developing a quality online course is complex because technology adds yet another layer to course design, one that requires a unique skill set. In addition, there is an interdependent relationship between technology and pedagogy specific to online courses; for instance, the features of an LMS platform will determine and shape the course and the teaching methods. Take the discussion forum as an example: the flexibility of the forum feature, how easily the course designer can set it up for group assignments, and whether it can support the communication and collaboration a group assignment requires will dictate how effectively the 'method' is executed in the course.

Online course design requires a breadth of skills that includes technical knowledge: not only familiarity with LMS features, but also outside tools, including social media platforms, that can enhance student learning. Knowledge of user-focused design, or web design principles, is also critical to delivering an intuitive learning experience for students (How Five Web Design Principles Can Boost Student Learning). Second are the pedagogical methods, in other words how learning is sequenced, framed and presented to students. This array of skills is far beyond the scope of most faculty, who are experts in their field of study, not necessarily in course design. Realistically, creating an online course requires two or more individuals with specific skill sets working together to develop an engaging, intuitive and quality learning experience.

The onus is on institutions to provide not only professional development for faculty in course design principles and strategies, but also ongoing support in the technical and pedagogical aspects of course development.

2) BIG Time Commitment Facilitating First Online Course — Levels Off After 2nd Time, But Grading Involves More Time Investment
Though respondents in the survey originally perceived that teaching online took more time than teaching face-to-face, by the third time they facilitated an online course, respondents reported that it took about the same amount of time as a similar face-to-face course.

There is supporting evidence to the earlier finding that teaching an online course the second and third time becomes about as time-consuming as teaching a face-to-face course the second and third time.  The factors that still remain more time-consuming for online teaching compared with face-to-face teaching, even after teaching the course three times, are Instructor-Student Interaction and Grading & Assessment, the two specific factors  that can not be prepared in advance for online courses (unlike Content Development and Pre-Semester Setup).

Implications:
Sixty-nine percent of survey respondents indicated that it took 'much more' or 'more' time to facilitate an online class for the first time. Yet by the third time, the figure dropped to 25% in these same categories (Table 4 below), which does support the learning-curve theory. These findings suggest that it is important for both the instructor and the institution to acknowledge that more of the instructor's time will be required the first, and even the second, time facilitating a course. The findings also suggest that professional development is needed for instructors, focused on facilitation skills specific to the uniqueness of online instruction. Such training can potentially reduce the learning curve for instructors, as well as reinforce the building of effective skills, best practices, and efficient use of time.


Annotated screenshot showing that, by the third time facilitating online, two-thirds of respondents indicated it took 'somewhat more', 'more' or 'much more' time to grade and assess students in an online course than face-to-face (Freeman, 2015).

A startling (and significant) finding of this study is the time dedicated to grading and assessing online students. It appears that the time dedicated to grading students' work actually increased from the first to the third time of facilitating an online course (Table 4): two-thirds of the respondents indicated that by the third time it took 'somewhat more', 'more' or 'much more' time to grade and assess students in an online course than face-to-face. I find these results encouraging, since an instructor's feedback on students' work is a critical component that can motivate students, deepen their knowledge and push them to think critically (Getzlaf et al., 2009). The implication is that skill development in this area is needed; it will benefit students and also help instructors provide feedback more efficiently. There are several technology tools and applications that can help instructors work efficiently and deliver meaningful, quality feedback to students (Morrison, 2014). Again, professional development in grading and assessment is needed to support instructors in their efforts.

Conclusion
By no means is this study the definitive answer on the time requirements for developing and facilitating online courses, but it is an excellent starting point for conversations about ‘time’ needed to create quality online learning experiences.

References

Nicholas Carr on 'Social Physics'…The Darker Side of Reality Mining

It's the article 'The Limits of Social Engineering' that piqued my interest this week: first because of the image featured in the article, which I found appealing, then because of the reference made to Marshall McLuhan, a scholar and author I admire greatly, and finally because it was by Nicholas Carr, author of the book "The Shallows", which I reviewed this week on my blog. But it's the article's unusual topic that grabbed hold of me by the collar and motivated me to share it with readers: something called 'reality mining'. Reality mining is an advanced branch of data mining and is central to the book "Social Physics: How Good Ideas Spread—The Lessons from a New Science", which Carr reviews and draws from in his article. Carr provides a good overview of not just the book but the science, and hints at the potential ills of reality mining, or as the book's author calls it, 'social physics' (or 'mislabels' it, as several reviewers of the book on Amazon claim). With reality mining, researchers and scientists create algorithmic models using 'big data' generated by human movements and behaviours tracked by mobile phones, GPS, wearable tech or tracking devices to analyze and predict social and civic behaviour. Reality mining, with the global expansion of mobile phone penetration in the past year and now wearable, internet-enabled devices, is likely the next big thing in data mining. Already many experts extol the virtues of reality mining and what it can do for institutions, society and the public good. As quoted on the book's website:

John Seely Brown, Former Chief Scientist, Xerox Corporation and director of Xerox Palo Alto Research Center (PARC):

“Read this book and you will look at tomorrow differently. Reality mining is just the first step on an exciting new journey. Social Physics opens up the imagination to what might now be measurable and modifiable. It also hints at what may lie beyond Adam Smith’s invisible hand in helping groups, organizations and societies reach new levels of meaning creation. This is not just social analytics. It also offers pragmatic ways forward.”  socialphysics.media.mit.edu/book

We can already catch a glimpse of reality mining taking shape in businesses and organizations. The WSJ featured an article this week by Deloitte describing the target market for wearable devices, which is not consumers but organizations, or the 'enterprise'. It seems there is unlimited potential for fitting employees with wearable tech devices to gather data to support better decision-making in the workplace.

Reality mining takes Big Data to a new level, and, as Carr emphasizes, Big Data can and likely will be used to manipulate our behaviour. It's the idea of manipulation in this context that is disturbing. Several questions come to mind, like this one: who makes the decisions on the actions taken to manipulate a society's behaviour? And based on what values?

Below researchers describe how behaviour can be manipulated, as excerpted from “Social Physics” within Carr’s article:


Author of “Social Physics”, Alex Pentland will be teaching “Big Data and Social Physics” via the edX platform. Start date: May 12, 2014

“They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.”

Closing Thoughts
Like Carr, I too am somewhat wary of reality mining, or 'social physics'. Yet in examining the works of Marshall McLuhan, whom Carr refers to in the opening of his article, I find wisdom in McLuhan's words, which so accurately describe what is happening now within the realm of big data, for instance. The website managed by McLuhan's estate includes snippets of interviews, quotes and links to his works that are worthy of perusing and pondering. I found the quote below applicable and insightful when considered in the context of reality mining.

"In the electric age, when our central nervous system is technologically extended to involve us in the whole of mankind and to incorporate the whole of mankind in us, we necessarily participate, in-depth, in the consequences of our every action. It is no longer possible to adopt the aloof and dissociated role of the literate Westerner." Understanding Media: The Extensions of Man (p. 4)

Worth pondering, is it not?

Further Reading

What the Internet is Doing to Our Education Culture: Book Review of “The Shallows”

Following is a book review of "The Shallows: What the Internet is Doing to our Brains", though I suggest it's more aptly titled "The Shallows: What the Internet is Doing to our Culture", and I describe why in this post.

"Culture is sustained in our synapses…It's more than what can be reduced to binary code and uploaded onto the Net. To remain vital, culture must be renewed in the minds of the members of every generation. Outsource memory, and culture withers." Nicholas Carr, "The Shallows: What the Internet is Doing to Our Brains", a Pulitzer Prize finalist for general non-fiction in 2011


by Nicholas Carr, 2010

Overview
Author Nicholas Carr made a name for himself with his 2008 Atlantic article "Is Google Making Us Stupid?". Carr is not a proponent of the Internet, as one might guess from that article and from the title of his most recent book, The Shallows: What the Internet is Doing to our Brains (2010). Though the book's title implies that the Internet is not good for our brains, that it makes us shallow and no longer capable of deep, thoughtful thinking and learning, Carr fails to provide convincing evidence that this is indeed the case. A more appropriate title might be The Shallows: What the Internet is Doing to our Culture. Carr describes his own challenges with disconnecting from Internet-enabled devices and social media, which is more a reflection of our current culture: the constant and often frenetic connectivity to the Internet via our mobile devices. Our behaviours as a society have deeply changed due to engagement with digital media, and it's this behaviour, research suggests, that is responsible for changing our brains. Granted, the Internet is the vehicle, the catalyst of the Information Age, which impacts society, culture and global economies significantly. That point aside, what I found worthy of consideration while reading Carr's book, from an education standpoint, is the concept of 'efficiency'. Frequently mentioned throughout the book, via the studies quoted, is the idea that the Internet increases efficiency: efficiency usually in the context of work, doing more with less, or finding information quickly and accurately.

"The Internet is a machine designed for the efficient, automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the "one best way"—the perfect algorithm—to carry out the mental movements of what we've come to describe as knowledge work" (p 150)

The Culture of Efficiency and Education
And is this not what we have been hearing in the last couple of years in education circles: how technological advancements can increase efficiencies in education? Efficiency is often cited in the same context as effectiveness, yet more often in terms of cost savings. I was surprised to unearth a book on this topic, Education and the Cult of Efficiency, by Raymond E. Callahan, published in 1962 no less. The author was responding to the post-industrial business model that sought efficiencies in work processes for greater cost savings and higher profits.

Raymond Callahan's lively study exposes the alarming lengths to which school administrators went, particularly in the period from 1910 to 1930, in sacrificing educational goals to the demands of business procedures. He suggests that even today the question still asked is: "How can we operate our schools?" Society has not yet learned to ask: "How can we provide an excellent education for our children?" GoodReads


By Raymond Callahan, 1962

And though Carr references efficiency in his book, it is not necessarily in the economic context Callahan outlines. Yet the theme of efficiency as it relates to the Internet extends to our education culture: institution leaders, politicians and administrators seeking efficiency in practices and methods (automated grading, online courses with large numbers of students, etc.). Efficiency is not a 'bad' outcome to strive for, yet the idea of efficiency in education is frequently referenced in terms of increasing or maintaining education outcomes with fewer resources. A recent situation in California's public higher education system illustrates this point beautifully. Governor Jerry Brown looked to online education (MOOCs, more accurately) as a solution to the problem of bottleneck courses, where students couldn't graduate from public universities due to too many students and not enough general education classes. Brown and members of his team referred to efficiencies frequently:

“Gov. Jerry Brown last year said that the state’s public colleges operate on a high-cost delivery model that the state cannot sustain. The price of a college education in California is growing steadily without “adding productivity or value,” the governor said….Online education is one practical way for the state’s public universities to improve efficiency.

But online education can expand college capacity and access to courses in affordable fashion. That approach is sensible for the UC system — and a worthwhile start on the much larger task of creating a more efficient and sustainable higher education system.”  The Press Enterprise

In fact, there are numerous articles and papers describing efficiencies in education published by many prominent organizations involved in education. This article, for example, Higher Education: Quality, Equity and Efficiency, published by the OECD, has this to say:

"All nations face the challenges of mobilising more resources and using them effectively in meeting the strategic goals of society with maximum efficiency."

Conclusion
I've referenced the theme of efficiency as it relates to education here, which is really not the thrust of Carr's "The Shallows"; however, his book raises several interesting points about how the Internet has affected our culture, a culture that extends into education. The book is a worthwhile read, specifically for the references to how the Internet affects our ability to educate and learn, viewed not from a scientific perspective but from a behavioural one. However, when it comes to efficiency and education, I rather like the question Callahan poses in Education and the Cult of Efficiency, 'how can we provide an excellent education for our children', and I would add to this, 'in a connected world'. Callahan's book is now on my to-read list.

Further Reading:

If Change is Inevitable–Is Progress Optional? Four Education Institutions Opting for Progress

“Change is inevitable. Progress is optional.” Tony Robbins

The above quote from author and motivational speaker Tony Robbins sums up Dr. Richard DeMillo's presentation The Fate of American Colleges and Universities, delivered in May of last year at Dartmouth College. Readers might be familiar with DeMillo: professor of computer science, speaker, and author of several articles and books including Abelard to Apple: The Fate of American Colleges and Universities (2011). He currently serves as Director of Georgia Tech's Center for 21st Century Universities. His talk carried a message similar to the one outlined in his book: colleges and universities in the Middle will need to change, and if they don't they'll be 'headed for irrelevance and marginalization' (MIT Press). It's been three years since the book's publication and many of his warnings about higher education appear close to reality. In the book and in his talk at Dartmouth, DeMillo doesn't candy-coat his message or wrap it up in a more digestible form; he serves it straight.

"The system of higher education…is not a sustainable system. I don't know anyone who has seriously looked at American higher education that can come to the conclusion that what we are doing is financially, socially, pedagogically and morally sustainable." Richard DeMillo, Dartmouth College, May 7, 2013

Though the message may be grim, the education sector needs individuals like DeMillo, with extensive experience and knowledge of higher education, to tell it like it is. Granted, some will say DeMillo is wrong, that he is only making predictions and value judgements. However, three years after Abelard to Apple's release, the events described are no longer predictions.

Responses to The Message
DeMillo describes leaders’ reactions to what he has to say—some are open, eager to look for ways to adapt to change and move forward, and others are unaware, dismissive, or even defensive.

"University leadership in the United States for the most part is unaware that the crossroads is ahead. […] The obvious question is how so many smart people could miss what seems to be an inevitable crisis?" Richard DeMillo, Abelard to Apple: The Fate of American Colleges and Universities (2011)

But many institutions are listening, opting for progress, embracing change and striving to remain relevant. Below I share four examples of institutions that are choosing to implement strategies for change. Some projects are complex and institution-wide, engaging the majority of stakeholders. Others are on a smaller scale, yet no less bold.

Readers may question whether all these initiatives are progressive, a way forward. Some appear to be going backward, as in the University System of Georgia, where several institutions are merging and some institutions' names are disappearing altogether. Institutional leaders of these schools might say it is progress for the long term, with short-term changes that are difficult.

Below are descriptions of the strategies of each, and related links to outside sources with further information.

Four Institutions Opting for Progress

1. Corporate Sponsored Degree Program: University of Maryland, Cybersecurity

Strategy: Universities are beginning to seek funding support for undergraduate programs by partnering with corporations and other private institutions to build infrastructure and curricula for specialized degree programs. Companies are motivated to do so, hoping to fill skill gaps within their own workforce by creating a pool of educated potential candidates. This initiative is part of the University of Maryland's overall plan to remain financially sustainable and relevant; the university has also cut costs by eliminating seven varsity sports teams and requiring faculty and staff to take furlough days.

2. Strategic Planning Initiative: Beyond Forward, Dartmouth College

Strategy: Dartmouth College provides an illustrative example of an institution seeking to embrace change and prepare for the future by implementing a comprehensive strategic planning effort. Dartmouth's end goal: 'to identify significant opportunities and challenges as we consider an ambitious and forward-looking course for Dartmouth's future.' The website describing the program is detailed, sharing many resources, including recorded talks by experts and scholars as part of the Leading Voices in Education series, in which DeMillo was one of the speakers. The two-year effort involved over 3,000 stakeholders, including faculty, administrators, staff, students and alumni, and assigned each of nine working groups a topic to research, report upon and develop recommendations for. Impressive. To learn more, you can read Dartmouth's synthesis report for 'Beyond Forward'. Other institutions that have implemented a similar strategic initiative and shared the process are Georgia Tech, Brandeis University, and Brown University.

Strategic planning is the first significant phase of opting for progress; however, putting the plan into action requires more than talking about and planning for change. It's about making it happen. Action.

3. Institutional Mergers: University System of Georgia

Strategy: The primary motivation for education institutions to merge is to realize cost savings by sharing administrative expenses common to each, i.e. finance, human resources, facilities services, IT, etc. Universities merging is not new; there have been several examples of institutions coming together over the years, though recent mergers are on a larger scale: not two institutions merging but, in the State of Georgia's case, eight in all since 2012. As you can imagine, these actions are drastic, messy, often chaotic and stressful for all involved, even more so when communication is poor, which it usually is. Though mergers may be necessary to remain viable, and may be a way forward, no doubt it can appear that the institutions are taking several steps back. Successful mergers require a tremendous amount of planning, communication and diplomacy (see Merging Into Controversy, Inside Higher Ed, 2014).

4. MOOC-Inspired Initiatives: Penn State (flex-MOOC) and Georgia Tech

Strategy: A few institutions are using the MOOC format to pursue long-term sustainability. Even though MOOCs continue to enroll and engage thousands of students, few higher education institutions have demonstrated how MOOCs will contribute to their sustainability, relevance and direction for the future (more so when there is no strategic plan for the future). Two schools that are taking a step forward are Georgia Tech, with its Online Master of Science in Computer Science, and Penn State.

Georgia Tech: "OMS CS officially launches with first cohort: Today about 375 students begin coursework as the first cohort in Georgia Tech's online Master of Science in Computer Science (OMS CS) program, offered in collaboration with Udacity and AT&T. The group was admitted from some 2,360 applications…"

Penn State: "A flex-MOOC is a MOOC that offers content in modules that the learner can assemble into a personally relevant "course" and giving learners control over content, the sequence and timeline…creating a learning path that is relevant given learners' individual contexts, strengths, and learning needs."

Closing
Change will happen. It is happening. Examining how institutions handle change and move forward is instructive. Is not changing an option, and even the right thing to do? Possibly. A decision not to change that is backed by a strategy makes sense; not changing with no strategy doesn't. How does your institution deal with change?

Related Reading:

Image credits: ‘Time for Change’, by marsmetn tallahasse, Flickr

MOOC 'Jam': Highlights from a Jam on Digital Pedagogy

This post includes takeaways from a 'MOOC Jam', an online synchronous discussion I participated in with a group of educators about digital pedagogy.


‘Jam’ by John Wardell (Flickr)

“What is a Jam?  A Jam is an asynchronous, typed, online discussion designed to work around your schedule. The goal of a Jam is to gain perspective and solicit ideas that inform the community. After the Jam is over, you can read the exchange and the posted resources, which will remain available for several weeks.” MOOC Jam II, Digital Pedagogy   (The three threads of asynchronous discussions in this jam are: 1) Competencies for teaching online, 2) Developing Faculty Competencies, and 3) Learner Analytics for Faculty)


Screen Shot of participants online during the Jam participating in the threaded discussions (only partial shot of participants)

This past Tuesday, I participated in a MOOC Pedagogy Jam via the website Momentum, a platform created for stakeholders to discuss critical issues related to education, sponsored by the Bill and Melinda Gates Foundation. The purpose of the platform is to provide a space to host online events about topics related to online education, with the ultimate goal of the Jams being 'to gain perspective and solicit ideas that inform the community'. I participated in the first MOOC Jam this past November; that topic, "Peer Review of a Framework for MOOCs", hosted by George Siemens, focused on the design of the MOOC Framework. Siemens, creator of the Framework, sought input from the community of participants.

The topic of this Jam [which turned out to be more of a synchronous discussion] was digital pedagogy, divided into the three threaded discussions mentioned above. Each discussion featured a moderator responsible for responding to participants and furthering the discussion, and another moderator summarizing key themes of the discussion each hour. I chose to participate in 'Competencies for teaching online: describing effective pedagogy' given its description: "An exchange on how information is delivered to students, how they are engaged as active learners and community is built and how learning is assessed".


Screenshot of one of the three threaded discussions of the Jam held on the Momentum platform

Digital Pedagogy: Themes and Highlights
Following are my insights from the discussion on digital pedagogy; I've also included comments from other participants (Jam II, Momentum, Digital Pedagogy).

The discussion was rich with ideas and insights, and provided a glimpse into the issues and challenges of online instruction. Though the title of the Jam featured 'MOOCs', much of the input from contributors pertained to closed online courses, which created an interesting discussion by highlighting one of the primary challenges in online education: the application of appropriate pedagogical methods, which vary depending upon the learners, the delivery method and the goals of the course.

Themes:

1) Part of the discussion was devoted to the contrast and challenges between learner-directed and instructor-directed learning. The fact that much of the discussion focused on this issue highlights one of the challenges with MOOCs: a MOOC, due to its scale and format, lends itself to being learner-directed. It's not surprising, then, that MOOCs attract learners who already know how to learn, and who are motivated and educated. Several Jam participants discussed methods to get learners involved in learning and how to encourage students to engage and participate [typically in the context of closed online classes].

“I’ve done something similar to engage students in action research with me. I was teaching Web Development and was not happy with the development framework we were using. So as a class we researched the pros and cons of various frameworks and decided as a class (with my approval) which one to use. This worked well – they had “buy in” as we used to say.  Beyond that, I set basic specifications as to what they were to include in their work product, but allowed them to choose the subject matter (content). I also had to give approval before they began coding.”

I see the above challenge highlighting two opportunities: 1) to provide support to students in learning how to be self-directed, and 2) to provide skill development for educators and course designers in being flexible and adapting instructional strategies: assessing the learners and the learning context, creating appropriate learning experiences, and implementing pedagogical methods that match the learning needs.

One Jam participant shared an initiative that his institution recently started for its students: a program designed to address much of what was discussed here.

“California State University, Monterey Bay, is creating an online training module for training in baseline skills in web technologies for collaboration and other soft skills, such as team working relationships. Selecting appropriate pedagogy again depends upon an analysis of the learners — goes back to careful and thrustful planning”

2) Considerable discussion focused on how to get students to interact, collaborate and engage with peers in online classes, and what the instructor's role is in facilitating group formation, participation and learner engagement. Though this theme is similar to the one mentioned above, interesting thoughts on group formation and collaboration emerged: should groups be encouraged, facilitated, or left to form spontaneously? And if so, how? This relates to the motivation of the learner, which is quite different when students are in for-credit classes versus 'free' and open classes [MOOCs] that are driven by interest and a desire to learn—essentially self-directed.

"But a key trade-off when you have non-static groups, as Michaelsen, Fink et al have looked at is that you lose the crucial accountability factor and or the time to form constructive group norms/roles etc. — this then leads to the 'freeloaders' issue that gives groupwork such a bad rep. There is the challenge — in a MOOC context, can you establish stable, productive learning groups with accountability, positive norms, roles etc to really activate the engagement, peer learning and other benefits of group learning?"

The comment above is interesting—is it really possible or desirable in a MOOC environment that the responsibility for group accountability and productivity rests with the instructor?

3) The session wrapped up with discussion focused on supporting learners, helping them learn in a MOOC format. The question appears to be: how can this be accomplished? Through course design, or while the course is live, via course facilitators? Or do we need to teach students how to learn in a MOOC?

"I think one of the goals for a MOOC is enabling learners to make connections, share, collaborate and learn from one another. Rather than thinking about self-directed or facilitator directed maybe we need to think about how we can create ways that encourage learners to support one another?"

"I am very interested in how learners can and do support one another's learning in MOOCs. Do you have some thoughts in mind about the answer to this question? What can we build into the design that supports and encourages peer-peer learning?"

Closing Thoughts
Discussions like those within this Jam create excellent opportunities to get the issues and challenges facing education, specifically online education, out in the open. They also help stakeholders identify what needs to be discussed and explored within their own institutions. There are commonalities across institutions when it comes to online education, and ironically the very barriers behind these issues exist within institutions at all levels. Fortunately, there is progress: many institutions are experimenting, collaborating and striving to adapt to cultural shifts and increase access, while still providing high-quality, relevant education. Are there similar discussions happening within your institution?