Nicholas Carr on ‘Social Physics’… The Darker Side of Reality Mining

It’s the article ‘The Limits of Social Engineering’ that piqued my interest this week: first because of the image featured in the article, which I found appealing; then because of the reference to Marshall McLuhan, a scholar and author I admire greatly; and finally because it was written by Nicholas Carr, author of the book “The Shallows,” which I reviewed this week on my blog. But it’s the article’s unusual topic that grabbed hold of me by the collar and motivated me to share it with readers: something called ‘reality mining’. Reality mining is an advanced branch of data mining, and it is central to the book “Social Physics: How Good Ideas Spread—The Lessons from a New Science,” which Carr reviews and draws from in his article. Carr provides a good overview not just of the book but of the science, and hints at the potential ills of reality mining, or as the book’s author calls it, ‘social physics’ (a label several reviewers of the book on Amazon claim is a mislabel). With reality mining, researchers and scientists create algorithmic models from ‘big data’ generated by human movements and behaviours, tracked by mobile phones, GPS, wearable tech or tracking devices, to analyze and predict social and civic behaviour. Given the global expansion of mobile phone penetration in the past year, and now of wearable internet-enabled devices, reality mining is likely the next big thing in data mining. Already many experts extol the virtues of reality mining and what it can do for institutions, society and the public good. As quoted on the book’s website:

John Seely Brown, Former Chief Scientist, Xerox Corporation and director of Xerox Palo Alto Research Center (PARC):

“Read this book and you will look at tomorrow differently. Reality mining is just the first step on an exciting new journey. Social Physics opens up the imagination to what might now be measurable and modifiable. It also hints at what may lie beyond Adam Smith’s invisible hand in helping groups, organizations and societies reach new levels of meaning creation. This is not just social analytics. It also offers pragmatic ways forward.”  socialphysics.media.mit.edu/book

We can already catch a glimpse of reality mining taking shape in businesses and organizations. The WSJ featured an article this week by Deloitte describing the target market for wearable devices, which is not consumers but organizations, or the ‘enterprise’. It seems there is unlimited potential for fitting employees with wearable tech devices to gather data that supports better decision-making in the workplace.

Reality mining takes Big Data to a new level, and as Carr emphasizes, Big Data can and likely will be used to manipulate our behavior. It’s the idea of manipulation in this context that is disturbing. Several questions come to mind, like this one: who decides which actions to take to manipulate a society’s behaviour? And based on what values?

Below is a description of how the researchers monitor and manipulate behaviour, as excerpted from Carr’s article about “Social Physics”:


Alex Pentland, author of “Social Physics”, will be teaching “Big Data and Social Physics” via the edX platform. Start date: May 12, 2014

“They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.”
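
To make concrete the kind of aggregation these badges enable, here is a minimal Python sketch, assuming a hypothetical log of which badges were within conversation range each minute; this is an illustration only, not Pentland’s actual pipeline:

```python
# Minimal sketch: rolling hypothetical badge "pings" into an interaction network.
# The log format is invented for illustration; real sociometric badges also
# capture tone of voice and body language, which this toy example ignores.
from collections import Counter
from itertools import combinations

# Each record: (minute_of_day, set of badge IDs detected near each other)
badge_log = [
    (540, {"ana", "raj"}),
    (541, {"ana", "raj", "mei"}),
    (545, {"mei", "raj"}),
]

# Count minutes each pair of employees spent within conversation range.
pair_minutes = Counter()
for _, badges in badge_log:
    for pair in combinations(sorted(badges), 2):
        pair_minutes[pair] += 1

for (a, b), minutes in pair_minutes.most_common():
    print(f"{a} <-> {b}: {minutes} minute(s) of proximity")
```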

Closing Thoughts
Like Carr, I too am somewhat wary of reality mining, or ‘social physics’. Though in examining the works of Marshall McLuhan, whom Carr refers to in the opening of his article, I find wisdom in McLuhan’s words, which so accurately describe what is happening now: within the realm of big data, for instance. The website managed by McLuhan’s estate includes snippets of interviews, quotes and links to his works that are worth perusing and pondering. I found the quote below applicable and insightful when considered in the context of reality mining.

“In the electric age, when our central nervous system is technologically extended to involve us in the whole of mankind and to incorporate the whole of mankind in us, we necessarily participate, in depth, in the consequences of our every action. It is no longer possible to adopt the aloof and dissociated role of the literate Westerner.” Understanding Media: The Extensions of Man (p. 4)

Worth pondering, is it not?

Further Reading

What-the-heck Happened in 2012? Review of the Top Three Events in Education

This was an extraordinary year in the education sector, providing bloggers and journalists with much content to write about: ed-tech start-ups, big data, open courses attracting thousands of students, and even some institutional drama in the mix. Several bloggers I follow have done an excellent job of summarizing the year’s events; my review is on a lighter note, a digest, if you will, of the three most significant topics [in my opinion] of 2012. Each event includes a synopsis of the topic with links to blog posts and articles summarizing and exploring what-the-heck happened in 2012.


THE MOOC! The movie, image by Giulia Forsythe

1) The xMOOC movement caused a certain level of mania, hysteria and irrational decision-making by numerous educators and their institutions in 2012. It also sparked controversy, discussion, change—just what is needed to address the challenges facing education. Audrey Watters of Hack Education wrote a stellar summary, The Year of the MOOC, chronicling developments over the entire year. From another perspective, Tony Bates wrote a thoughtful piece, ‘Why MOOCs?’ in his year-end post Online learning in 2012: a retrospective. 

With the influx of several universities partnering to create massive platforms for MOOCs [MITx, Coursera and Udemy], Stephen Downes, co-founder of the original MOOCs [with George Siemens], clarified the terminology for MOOCs [for which educators are surely grateful] before things started to get really difficult to follow: “I am now referring to the MOOCs offered by Coursera, Udemy and MITx (among others) as xMOOCs, to be compared with cMOOCs, which is what we offer in our connectivist classes.” (Downes, 2012).


2) Ed-tech start-ups were HOT this year, fueled by the significant sums of money investors were willing to part with. A few attracted millions of dollars, including Udacity, Coursera and Knewton, which collectively have taken in well over fifty million dollars to date. There are numerous other, smaller start-ups hoping to take advantage of the education sector’s current quest for improved access, better quality and lower costs. Edudemic wrote a good review, 25 Start-ups Worth Knowing. The best article I read this year about an ed-tech start-up was Simplicity and Order for All, about Jack Dorsey, founder of Twitter, and his new platform ‘Square’. Dorsey is quirky, artistic and brilliant. The article sheds light on the motivation behind an idea: what sparks and drives the innovation in the first place.

EdSurge, a newsletter about ed-tech start-ups, published a list of the Top Educational Tools of Q2 2012 based upon web traffic from readers. Another helpful resource, from Jane Hart, founder of the Centre for Learning and Performance Technologies (C4LPT), is the Top 100 Tools for Learning, which Jane has published for the past four years. Click here for the 2012 edition; it’s a worthwhile read, as Jane includes a SlideShare presentation that describes each tool.

3) Big [Educational] Data, with data, data everywhere, also sparked much discussion and a quest to find new and novel ways to use the tremendous amount of data that educational institutions are keepers of. As more discussions and reports were generated, and policies and decisions made, questions began to surface, including ‘how can we use the data effectively?’ and ‘what are the ethics behind collecting and reporting big data?’

I wrote several posts within the last few months about learning analytics, a branch of big data, including this one about instructors and students: How instructors can improve student engagement with learning analytics. George Siemens, a leading scholar in this area, has delivered several keynotes, written several papers, and in October of 2012 posted content from the Second International Conference on Learning Analytics and Knowledge, which he was involved with.

Looking Forward to 2013
2012 was tumultuous but exciting at the same time. Though I’ve covered only a fraction of the year in this post, the links to the blogs provided will direct you to more comprehensive coverage.

What will 2013 bring? What are your predictions for next year? I have a few of my own, which I’ll share in early January. Thank you, my reader, for taking the time to read this blog, and thank you to those of you who have shared your own insights in the comments. I wish you Happy Holidays and a restful season. I look forward to learning from you, and with you, in 2013.
Debbie

Dream or Nightmare? The Ethics of Learning Analytics

Just because it [academic data] is accessible doesn’t make it ethical (Boyd & Crawford, 2012).

What are the implications of higher education institutions collecting student data and compiling a multitude of reports based upon students’ online clicks, page views, time logged on, and electronic notes? Do educators have a responsibility to tell students what they are doing? These are interesting questions that education institutions should be wrestling with in this era of Big Data.

I’ve written several posts recently about learning analytics, emphasizing the need for meaningful analysis, student-centered reporting, and transparency. However, I admit to having reservations about analytics and its role in education. Not only is there potential for abuse and manipulation of data, but I am concerned about privacy and student rights. Surprisingly, little has been written about the moral implications and the potential nightmares that analytics could create. Simon Buckingham Shum, a leading scholar in learning analytics research, led a talk, Learning Analytics: Dream or Nightmare?, for EDUCAUSE’s online spring focus session (webinar recording available here). Shum discusses Big Data in education, focusing on analytics in K-12 and higher education settings. Shum’s talk promotes deep thought as he highlights the positive aspects of academic analytics but also their dark side. In this post I highlight some of the concerns put forth by Shum, the ethical considerations we as educators should be concerned about, and the questions we should be asking.

Simon Buckingham Shum, EDUCAUSE Webinar, 2012

Questions to Ask
If we look at where the idea of using Big Data to improve productivity and growth came from, we need look no further than the business sector. Businesses thrive on analyzing customer data, sales, market performance and inventory logistics. Can we apply the same principles to education? In his talk, Shum asks a rhetorical question about analytics, using the slide at the right to emphasize his point: BusinessAnalytics.edu or LearningAnalytics.com? Can we treat academic data the same way businesses treat data? IBM thinks so. I attended a webinar several months ago where IBM shared a case study about a college where its analytics platforms were implemented. I was uneasy throughout the webinar; I heard over and over again words such as strategy, performance, achievement, strategic planning. Rarely did I hear the words student, learning or development. I am not suggesting that IBM is incapable of providing valuable expertise; I am only using this sliver of insight, gained through a brief webinar, to highlight the bigger issue, which is the need to ask questions such as ‘how should we approach educational data?’, ‘who should have access?’ and ‘how does academic data differ from other types of data?’

What do students think?
Do students know the depth and breadth of data that is collected about them within the academic platform (an LMS such as Moodle or Blackboard) they use consistently? And if they do, what are their responses? Some students will not care, yet others may be vehemently opposed. However, when students are involved in the discussion of how to use the data and are part of the conversation, as Shum suggests, the concerns of privacy and ethics become clearer. Transparency is essential. The Grand Rapids Community College blog features an excellent article, Obligation of Knowing: Ethics of Data Collection and Analytics, which suggests using transparency to create trust. Letting students know how data will be used, and how they will benefit, is a good place to start.

Simon Shum, Webinar

Solutions
Simon Shum closes his webinar with two slides. The first, an image of a man holding a magnifying glass, asks ‘who gets to hold the magnifying glass?’, implying that educators should consider not only who should be analyzing and viewing student data but why. The final slide, an image of a student holding a mirror, suggests that analytics should be used as a mirror for learners to become more reflective and less dependent. Yet it is up to the institution to determine how data will be used, and that choice will determine the result: either a nightmare scenario, where analytics breed resentment and myopia, or a dream scenario. In the dream scenario, analytics create a generation of tools that support and develop learners, where students become self-directed, responsive and armed with the skills needed for the 21st century.

Resources:

Photo Credit: Personal Data, by Charlie Collis (highwaycharilie), Flickr

Putting Learners in the Driver’s Seat With Learning Analytics

I read something disturbing this week from Inside Higher Ed, Measuring Engagement with Books:

“The big buzz in higher ed is analytics,” said senior vice-president of marketing Cindy Clarke, of the e-textbook provider Course Smart. “Based on the issues there are with institutions around improving the return they’re getting on their investment in course materials, we realized we had a valuable data set that we could package up [emphasis added].” (Tilsley, 2012)

Coincidentally, last week’s topic in the course I am taking, Current/Future State of Higher Education (#CFHE12), was Learning Analytics, the same topic this article refers to. It’s a promising area of study and a ‘hot’ topic in higher education right now. Data, in the form of students’ online behaviours, obtained by measuring clicks, keystrokes, time logged on, and the number of ‘hits’ [visits] on web pages, is collected and then compiled into ‘meaningful’ information.

Yet Course Smart’s program is [in my opinion] an example of learning analytics gone awry. The ‘packaging up’ mentioned by Ms. Clarke refers to the program Course Smart developed from data on students’ reading patterns. The program looks at how students interact with the e-textbooks: the number of times a student views a page and for how long, highlights made, etc. Course Smart compiles this ‘data’ and sends a Student Engagement Report to professors. Are these metrics a true measure of a student’s level of engagement? It seems that student engagement covers a far broader scope than time spent reading a textbook. Even if the report did provide meaningful indicators, how would an instructor actually use it to teach more effectively?

Analytics in higher education is considered by some to be a panacea for its woes. Yet it’s complex, sensitive, and almost onerous given the abundance of student data that institutions collect. In this post I’ll give an overview of the three levels of analytics (micro, meso and macro) to provide clarity and context, as explained by a guest presenter in a recent #CFHE12 webinar, Simon Buckingham Shum, Associate Director of the Knowledge Media Institute, Open University, UK. I’ll also share how learning analytics is used to help students learn, as discussed by educator Erik Duval, Professor at Katholieke Universiteit Leuven, Belgium, during another #CFHE12 webinar, and finally how educators can use analytics to help students take charge and ownership of their learning: essentially, to put them in the driver’s seat.

The Big Picture of Student Data
When we speak of learning analytics, we are at the ground floor of Big [academic] Data. Data analytics in academia has the potential to support decision makers, academic researchers, policy makers, instructors and students. Simon Shum described three layers of analytics: macro (cross-institutional, regional and national), meso (the institution) and micro (the individual course and learner).

What Questions Should Data Answer?
At the macro level, institutions share data with others to compare and determine what is useful for influencing and supporting decisions on educational funding models, policies, and international comparisons. At the institutional [meso] level, analysis includes examining student progress and related data to make programming decisions, identifying at-risk students by predicting student performance, and making curricular decisions. Shum’s approach to analysis is holistic; he puts forth questions that educators should be asking when deciding how to use data effectively at the meso and micro levels:

  • What are we measuring and why?
  • What problem are we trying to solve with the data?
  • What level of results should we share with the learner?
  • What are the ethical considerations?
  • How can we create a functioning ecosystem that uses data effectively and responsibly?

Dashboard of student progress displayed throughout a course. Designed by students in a research project with Duval.

Analytics in Real Time – Helping Students
Duval is not only a professor but also chair of a research unit on human-computer interaction, and his research focuses on how data can provide valuable feedback to the learner. He gathers and analyzes patterns in students’ online behaviour as it relates to their learning, and he shares the results with his students; there is 100% transparency. Information for the learner comes in several forms, one of which is a dashboard that provides a snapshot view of how the student is doing at a given point in the class. This view is designed to trigger self-reflection: learners can view their progress and compare their performance with others in the class. When asked how students respond to viewing others’ performance, Duval says he spends considerable time at the beginning of the class explaining, discussing and reviewing the purpose of the reports, how to use them, and what they mean for students’ overall learning.

This approach also works in ‘real time’; it is actionable: students [and instructors] have access to feedback as the course progresses. Students can adjust, make decisions, and take action as needed. Instructors can also reach out to struggling students and ‘intervene’ with resources and support. Duval describes this as putting the learner in the driver’s seat, which is also the name of a conference on learning analytics held recently in Belgium.
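
As an illustration of the kind of snapshot view such a dashboard might compute, here is a minimal sketch, with invented names and numbers; it is not Duval’s actual dashboard, only the comparison-to-class-average idea expressed in code:

```python
# Minimal sketch of a dashboard snapshot: each student's activity so far,
# set against the class average to trigger self-reflection. Names and
# numbers are invented; this is not Duval's actual dashboard.
course_activity = {"ana": 42, "ben": 17, "chloe": 35, "dev": 28}  # views this week

class_avg = sum(course_activity.values()) / len(course_activity)
for student, views in sorted(course_activity.items()):
    bar = "#" * views
    note = "  <- below class average" if views < class_avg else ""
    print(f"{student:6s} {views:3d} {bar}{note}")
print(f"class average: {class_avg:.1f}")
```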

How can Instructors use Learning Analytics at the Ground Level?
There are questions and concerns about how much data students should have access to; with FERPA guidelines and privacy issues, educators must tread carefully. That being said, we can begin by asking the right questions: what do we want students to achieve, and how can data help them? What do we need to do to educate students about the data and how they can use it?

What can Educators do?

  • Identify what tools are available within your learning management system for data analysis. I’ve included in previous posts YouTube videos that provide instruction on how to use LMS tools, and other strategies for analytics. See the resources section for the links.
  • Be part of the discussion with faculty and administrators about learner analytics – ask questions, focus on ‘why’.
  • If presented with analytical tools or reports, determine how they can be used to support learners through instruction or intervention.
  • Become familiar with how analytics can help instructors be more effective in helping students learn.
  • Review programs based upon analytics in use at other institutions: Purdue’s Course Signals program, the University of Michigan’s academic analysis program, and the Community College of Rhode Island’s program, ConnectEDU.

Closing thoughts
Learning analytics has tremendous potential for education, though I am cautiously optimistic about its use in higher education. I am far from an expert, but I see the value in giving students ownership of their learning through the tools analytics provides: dashboards similar to Duval’s, for example. We need to involve the student in this conversation; it’s not the data that’s the solution to the challenges higher education is facing, it’s the students. Let’s put them in charge: give students the tools to make the decisions that make learning meaningful, and put them in the driver’s seat.

Resources:

Conference, Learners in the Driver’s Seat, Belgium
How Instructors Can Use Analytics to Support Student Engagement, Online Learning Insights
SoLAR, Society for Learning and Analytics Research
Learning and Knowledge Analytics, Resources
Engage: Test by ACT to predict student success in college
Erik Duval’s Slideshare, Presentations
Understanding Learning Analytics and Student Data, MindShift

How Course Instructors can Improve Student Engagement with Learning Analytics

Online learning is dynamic, active, and at times disorganized; yet with the effective use of tools, instructors can adapt and adjust instruction to create a rich learning experience. This is part two in a three-part series on learning analytics.

Learning analytics is a powerful tool that can help instructors adapt their course to create an engaging, robust learning environment. Analytics, the newest tool for improving student learning, is an application of Big Data, which was at one time used exclusively by big-box retailers such as Wal-Mart and Home Depot and by Fortune 500 companies, but has now become mainstream in education. Learning analytics in the education sector collects data whenever students log on to their institution’s learning management system [LMS], such as Blackboard or Moodle. Each student click, known as a ‘view’ in Moodle, is associated with a time stamp, creating a record of the time students spent with each resource. Analytics involves the mining of this data, plus analysis and reporting, which creates [potentially] useful information about student learning. The Society for Learning Analytics and Research (SoLAR), dedicated to exploring the role of analytics in teaching and learning, defines it this way:

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.  SoLAR

But notice that earlier I stated potentially; I believe the data alone does not tell a complete story. In this post I’ll share how course instructors can use analytics as a tool to improve engagement and the quality of learning outcomes by examining activity data alongside student outputs [writing assignments, in this case]. I use an example from my workplace to show how we: 1) assessed ‘data’ from LMS reporting tools, 2) identified how students interacted with course content through learning activities, and 3) made course design decisions for this course and others based upon the results.
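
To make step 1 concrete, here is a minimal sketch of tallying collective ‘views’ per resource from an exported activity log. Moodle can export activity logs, but the file name and column headings below are assumptions for illustration rather than Moodle’s actual export schema:

```python
# Minimal sketch of step 1: tally collective 'views' per resource from an
# exported activity log. The file name and the "resource" column heading are
# assumptions for illustration, not Moodle's actual export schema.
import csv
from collections import Counter

views = Counter()
with open("week5_activity_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        views[row["resource"]] += 1  # each row is one student click ('view')

for resource, count in views.most_common():
    print(f"{count:4d} views  {resource}")
```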

How to Use an LMS to Analyze Course Design
Within an LMS [Moodle in this example], reports are available that show how students engage with each resource by ‘views’ (clicks), for the entire class or even by individual student. A resource, for the purpose of this post, is defined as course content [a web page, outside link, video clip, etc.] or an activity [discussion board, wiki project, etc.]. In this post I use a course from my workplace as the example: an online history course for credit, with eight students.

We examined students’ patterns of interaction with content, and their application of it, through the class discussion boards and learning activities. Learning activities consisted of students interacting with content in the form of pre-selected web resources, which provided an interactive learning experience requiring exploration and examination of primary sources. Discussion boards consisted of student responses to instructor questions addressing textbook reading and/or content from pre-recorded lectures. Below is a screen shot of the activity report for one of the weeks in the course [week five], which includes the collective views by students for each resource.

Screen shot of a section of a Moodle report, which displays the number of views each resource received collectively for the given week.

In each instance we examined the quality of the students’ application, which required students to write two or three paragraphs describing what they learned [through interaction with the content, whether a web resource or text/video]. Instructions for one part of the content exploration are below.

Activity that focuses on the content, the Influenza Epidemic, one topic within the week’s content.

The application portion of the learning activity from above.

When comparing this activity to others within the course, including the discussion boards, the Influenza Epidemic received the highest number of views. Upon closer analysis [with another report], students made more than one visit to this activity and invested more time.
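
For readers curious how repeat visits and time invested can be derived from raw clicks, here is a hedged sketch that groups timestamped views into visits using an assumed 30-minute inactivity gap; the log structure and the threshold are illustrative, not the logic of the actual Moodle report:

```python
# Hedged sketch: derive visits and active time from raw timestamped clicks.
# A gap of 30+ minutes of inactivity starts a new visit; both the threshold
# and the log structure are illustrative assumptions.
SESSION_GAP = 30 * 60  # seconds of inactivity that ends a visit

# Hypothetical (student, unix_timestamp) clicks on the Influenza Epidemic activity
clicks = [("s1", 0), ("s1", 300), ("s1", 7200), ("s2", 100)]

visits, active_seconds, last_seen = {}, {}, {}
for student, ts in sorted(clicks):
    if student not in last_seen or ts - last_seen[student] > SESSION_GAP:
        visits[student] = visits.get(student, 0) + 1  # long gap: a new visit begins
    else:
        active_seconds[student] = active_seconds.get(student, 0) + (ts - last_seen[student])
    last_seen[student] = ts

for s in sorted(visits):
    print(s, "visits:", visits[s], "active seconds:", active_seconds.get(s, 0))
```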

Next, we examined the quality and quantity of students’ responses to the various activities. It appeared students were spending time with the content in the above activity, and it is evident from reading student responses how effective the activity was. This is important: the numbers or data alone did not tell the complete story. It was the data in conjunction with the quality of student responses that allowed us to conclude that interactive activities built on quality content, requiring students to interact with primary sources (in this instance) and produce a written product, yield the best results.

Conclusions from Analysis:

  • Students are more likely to engage with content when the activity involves interactive resources. The activity then becomes an effective pedagogical method when students are required to produce or create an output after reflection and analysis.
  • The number of views on a discussion board can be an indicator of the level of interest in a given topic. This differs from the number of postings, as some students may read the postings but not create a post.
  • Identifying which activities not only engage students but also produce quality analysis, where students use higher-order thinking skills, is critical to designing and adapting online courses.
  • Reviewing reports in conjunction with student outputs is essential to gain a holistic perspective on student performance.

The example I share here covers only a fraction of the tools and reports available for analyzing and examining student behaviours. Investing time in learning how to use the tools effectively is time well spent for the online course instructor: time that can lead to well-designed courses resulting in the application of critical thinking skills and positive learning outcomes.

Resources:

Photo Credit: Magnifying Glass, by Rafael Anderson Gonzales Mendoza

How Learning Analytics Can Make Instructors More Effective in an Online Course

This is the first in a three-part series on learning analytics: cutting-edge insight for the course instructor on how to assess student behaviours in an online course using the LMS data collection tools, in order to provide more effective course design and instruction.

Most course instructors strive to create a class where students are engaged with the content and eager to learn and participate. The indicators of student engagement in a face-to-face class are straightforward enough: attendance, participation in class discussions, and/or visits to the instructor during his or her office hours.

Measuring student engagement in an online course is more complex. However, with the learning management systems [LMS] such as Moodle and Blackboard now in use in virtually all education institutions, there is a treasure trove of data on student behaviour. This data has the potential to tell a story about a student’s engagement, and even to predict student success within a course. Each click or ‘view’ of a web page or resource on the course home page is recorded in the activity database, along with the time spent on each. The LMS thus becomes not only a resource provider and virtual space for students, but also a source of information for instructors about student behaviour and actions.

Consider the potential if course instructors could access and interpret the data collected on students’ actions in a few simple steps. The good news: this is not only possible but takes minimal time on the instructor’s side, yet reaps big rewards in terms of feedback on what is, and is not, working within a course. Online instructors who can assess patterns in student behaviours and interactions with course content and learning activities can be responsive and adjust their teaching accordingly.

Correlation between Engagement and Student Performance
Before exploring further, it is necessary to identify the purpose of measuring student engagement in terms of data analysis, to frame the discussion. Several studies have determined that a strong relationship exists between students’ LMS usage and academic performance. California State University, Chico found that more time spent on learning tasks within the LMS [‘dwell time’], along with a high number of visits to the course home page, was associated with higher student grades (Whitmer, Fernandes, & Allen, 2012). Another study, conducted by scholars at Central Queensland University with a sample population of 92,799 undergraduate online students, reported a statistically significant correlation between the number of student views of the course home page and students’ final grades (Beer, Clark and Jones, 2010). The more ‘views’ or visits to the course home page, the higher the final grade.
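
To show the computation behind such findings, here is a minimal sketch of a Pearson correlation between home page views and final grades, using made-up numbers rather than data from either study:

```python
# Minimal sketch of the computation behind such correlations, with made-up
# numbers purely to show the mechanics (statistics.correlation needs Python 3.10+).
from statistics import correlation

home_page_views = [12, 45, 30, 60, 25, 80, 15]
final_grades = [58, 75, 70, 84, 66, 90, 61]

r = correlation(home_page_views, final_grades)
print(f"Pearson r between home page views and final grade: {r:.2f}")
```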

Academic Analytics
The amount of data stored in educational institutions is gargantuan, and the new term for this scale of data collection is Big Data. According to the McKinsey Global Institute, the education sector ranks among the economy’s top ten in terms of the amount of data stored. The question becomes: what do we do with it? At the institutional level, there are numerous opportunities for data analysis where schools can identify patterns; gaps in student performance are just one example. Arizona State is an example of an institution that uses data analytics extensively, and has done so for several years with sophisticated analysis programs.

However, in this post we are looking at the micro level: how the course instructor can use the information stored within the course to improve instruction and support students. I’ve outlined below a few practical suggestions to get started, the basics of analytics.

Practical Applications for Course Instructors
Within virtually all learning management platforms there are reporting features that course instructors can access to display student data. Below are questions instructors may have about the students in their course, each of which the data can answer through reports the LMS can generate; a short sketch after the list works two of them from a raw log.

  • Which course resources/tools are being used most frequently? Video clips, posted documents, etc.
  • How often are students logging onto the course?
  • When did the student review the assignment instructions? Submit an assignment?
  • Which discussion boards generate the most traffic – have more student views? This is different from the number of discussion board postings, as many students may view [and read] the posts but not contribute.
  • When was the last time students logged onto the course? How many times per week are students logging on?
  • What are the patterns of performance in online tests? By question?
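
Here is the sketch promised above: it answers two of these questions (last login per student, most-viewed resources) from a generic activity log. The log layout is hypothetical; in practice an LMS surfaces the same answers through its built-in report screens:

```python
# Hypothetical activity log: (student, resource, ISO timestamp). A real LMS
# exposes these answers through built-in reports; this sketch only shows the idea.
from collections import Counter

log = [
    ("kim", "course_home", "2012-11-01T09:00"),
    ("kim", "lecture_video_3", "2012-11-01T09:05"),
    ("lee", "course_home", "2012-11-03T20:10"),
]

last_login = {}
resource_views = Counter()
for student, resource, ts in log:
    # ISO-formatted timestamps compare chronologically as strings
    last_login[student] = max(last_login.get(student, ts), ts)
    resource_views[resource] += 1

print("Last login per student:", last_login)
print("Most-viewed resources:", resource_views.most_common(2))
```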

Learning to Use the Reports
Learning how to use student data is not complicated once you know where to access the information. I’ve included below a selection of brief videos [each averaging three minutes], all created by course instructors from various institutions, that demonstrate how to access student reports in Moodle and Blackboard. In my next post I’ll delve into what student engagement can tell you about your course design, how to adapt instruction to be more effective, and how to troubleshoot student problems, based upon my experience analyzing the online courses at my workplace.

Click here for part two of this series, How Course Instructors can Improve Student Engagement with Learning Analytics.

Resources: How To Videos

Photo credit: Big Data, metaroll’s photostream, Flickr