What Does a Growth Mindset Have to Do with Learning?

We can learn to be smart is the premise of Mindset: The New Psychology of Success (subtitled How We Can Learn to Fulfill Our Potential) by Carol Dweck (2006). Dweck debunks the idea that intelligence is fixed, predetermined by our gene pool. Instead, she suggests individuals with a ‘growth mindset’ develop intelligence and abilities over time. In her book Dweck defines a growth mindset as one in which individuals view personal qualities such as intelligence, abilities and talents as malleable. This contrasts with a ‘fixed mindset’, in which qualities like intelligence are seen as innate or inherited. People with a growth mindset, according to Dweck, challenge themselves; they aren’t afraid of making mistakes and are known to go for it. People with fixed mindsets, on the other hand, are afraid of making mistakes and of moving out of their comfort zones. Dweck describes fixed-mindset people as preoccupied with outcomes, the final grade or a successful work project for instance, over the process and the experience.

To support her philosophy Dweck quotes Robert Sternberg, psychologist and professor of human development at Cornell University, who states that a key factor in whether people achieve expertise “is not some fixed prior ability, but purposeful engagement” (p. 5).

How Growth Mindset Applies to Learning
Dweck outlines several applications of the growth mindset to education. One she emphasizes is that educators need to act as role models. This contrasts with the typical dynamic in which people put in an expert role, educators for example, feel they need to have all of the answers, which can limit their own growth as well as student learning. Instructors need to model behaviours that show students they don’t have all of the answers, and that the pursuit of knowledge, failure and even confusion are part of the learning process.

Applications for Learning:

  • Learners can be taught a growth mindset. Dweck developed a program that teaches students growth mindset principles: intelligence is not fixed, students are in charge of their learning, and they need to stretch in order to get smarter. Students taught a growth mindset performed better academically.
  • Telling students they are smart or intelligent and giving constant praise can lead to a fear of making mistakes, a fear of failure and a fixed mindset.
  • Learning experiences need to be challenging, even difficult. The idea that learning needs to be difficult is reinforced in Make it Stick: The Science of Successful Learning. The premise of chapter 4, “Embrace Difficulties”, is that learning is deeper and more durable when it requires considerable effort (Brown, Roediger & McDaniel, 2014, p. 73). See the video featuring Carol Dweck on struggle, where she discusses the importance of challenge in learning.

Tips for Educators to Support a Growth Mindset

  • Encourage students to be comfortable with setbacks and confusion.
  • Don’t praise talent, praise process. Dweck’s research revealed that praising talent leads to fixed mindsets. Praising process includes acknowledging resilience, effort, collaboration, and the experience itself.
  • Be comfortable, for your students and yourself, with confusion and with not finding the answer right away.

Closing Thoughts
Approaching learning with a growth mindset frees learners to expand, grow and engage fully in the process without the constraints of IQ or SAT scores. Following a growth mindset as Dweck describes requires conscious effort; it is a mindset and a skill set. Yet it’s a perspective that educators can model and foster through their own actions: by making learning difficult, acknowledging and allowing for failure, and emphasizing the process of learning, not the outcome. Which mindset do you have?


References

  • Dweck, Carol S. Mindset: The New Psychology of Success. New York: Random House, 2006. Print.
  • Brown, Peter C., Henry L. Roediger III, and Mark A. McDaniel. Make It Stick: The Science of Successful Learning. Cambridge, MA: Harvard UP, 2014. Print.

Image credit: Growth Mindset, bigchange.org

Pew Research Reveals Three Barriers to Lifelong Learning

Pew Research Center’s recent report, “Lifelong Learning and Technology,” gives insight into how Americans perceive and engage in lifelong learning (Horrigan, 2016). It’s a worthy read. It contains valuable data and insights for stakeholders involved in education planning and decision-making. Yet I’ve identified three themes I consider most instructive and compelling; three significant barriers that the education sector as a whole needs to acknowledge and address in order to improve and move online education programs forward.

Three Barriers
1) Limited access: online education has not, up to this point, democratized education. Adult learners with limited education do not, for various reasons, engage in learning aided by technology.
2) Lack of familiarity: unfamiliarity with online learning options persists among all adult learning groups. For instance, only 14% are “very familiar” with even the concept of distance learning, and only 5% are “very familiar” with MOOCs.
3) Learning gap: there’s a significant gap between how some adults view learning in general and their actual lifelong learning behaviours. The majority of Americans (87%) believe learning new things is “very important”, yet only 73% of adults consider themselves lifelong learners.

1. Online Education Fails to Democratize
We’ve long heard how digital education platforms such as Coursera and edX will democratize education, overcoming barriers to higher education by lowering costs and reaching populations with limited education. Yet Pew’s findings suggest otherwise. The report reveals that these same groups, those with low levels of education and household income, are less likely to engage in any form of online learning. One finding is particularly telling: less than half of respondents with a high school education or less have used the internet for personal (43%) or job-related learning (49%) (Horrigan, 2016, p. 7). This suggests that education providers need to determine how to leverage and implement technology as a learning tool to serve the groups that need education most.

2. Limited Awareness of Digital Platforms for Learning
Quite surprising is the fact that the majority of adult learners are not familiar with digital learning options. While most readers of this blog are likely (very) familiar with MOOCs and for-credit online courses, it’s startling to consider that most adults, even those with higher education levels, are not (see screenshot below for details). This phenomenon has implications for educators and institutions; the most pressing is the need to inform the general population about digital learning options. Going further, there’s also a need to educate adults on how to learn effectively in a digital world. Accomplishing this will require a strategic and concerted effort by education institutions, involving a multi-pronged approach that uses multiple communication channels to promote learning options. Other alternatives may include forming partnerships with organizations outside education, as Khan Academy did with Bank of America for the Better Money Habits® program. There is much work to be done.

Lifelong Learning and Technology, John Horrigan. Pew Research Center (section 5).

3. The Learning Gap
According to the report, Americans value learning greatly. It indicates that 87% of adults say it’s “very important that people make an effort to learn new things about their jobs”. Yet the same survey finds that only 73% of adults say the statement “I think of myself as a lifelong learner” applies to them “very well”. The numbers suggest there’s a segment of the population who view learning as very important yet don’t engage in lifelong learning activities for their own personal or professional growth. Why? It’s worth further examination. There is an opportunity to reach a group of adults who value learning greatly but, for whatever reason, don’t engage. The report does identify factors that play a role in lifelong learning activities, e.g. household income, educational attainment (section 2). One avenue to consider is the role educators could play in closing the gap, possibly by instilling skills and modeling behaviors associated with lifelong learning in elementary and/or high school, granted the logistics of ‘how’ are a barrier in themselves.

There is no easy solution to closing the gap, and it is closely linked to barriers one and two. Yet this gap deserves special consideration and further discussion among educators involved in all levels of education. How can we as educators encourage and develop skills and behaviors in students, young and old, where learning is self-directed and lifelong, and where students forge their own learning path based upon their interests, needs, and passions?

Closing
The Pew report yields important and helpful insights that can drive meaningful dialogue about education across professional, elementary and higher education, and about the role of technology in education, its reach, and its shortcomings. The report will hopefully serve as a catalyst for action by education institutions and individuals to advance and improve the reach and impact of institutions and platforms, and to build and grow engaged communities of lifelong learners.

 

Can “Hooked: How to Build Habit-Forming Products” Help Make Learning a Habit?

Habit: noun: a usual way of behaving : something that a person does often in a regular and repeated way — Merriam-Webster

“Hooked” is about how to build habit-forming products…habit-forming digital products, that is. I included “Hooked” on my must-read list to see if any of the principles discussed might apply to education, to Learning Management Systems (LMSs) for instance or other ed-tech applications. Given our culture’s fixation with mobile devices, surely there are lessons for making digital education applications more compelling. Can educators create platforms or applications that ‘hook’ students into learning, where learning behaviors become a habit? If so, how? Hooked provides some answers; see below.

‘The Hook’ Model
Author Nir Eyal, entrepreneur and product designer, describes the book’s topic as “behavioral design”. Behavioral design, when applied to product development, incorporates concepts from user experience, behavioral economics and neuroscience. Nir describes it as the intersection of psychology, technology, and business. His recipe for creating habit-forming products begins with ‘The Hook’ model.

Nir Eyal’s “Hook” model encompasses four elements that products need in order to become habit-forming. Yet the model is no guarantee of success: “new products can’t just be better, they must be nine times better” (p. 17).

Trigger
A trigger, either internal or external, leads to a behavior; it’s the spark that prompts action. Habit-forming products start by alerting users with external triggers such as an email, a website link, or an app icon on a smartphone, with the aim of prompting repeat engagement until a habit is formed. The trigger also tells the user what to do next: to act.

Action
The action needs to be seamless; the user should be able to act with ease, without barriers. Action, according to Nir, relies not only on ease of use but on principles of human behavior. It’s based on the premise that the user seeks one of three things: 1) pleasure, and the avoidance of pain, 2) hope, and the avoidance of fear, or 3) social acceptance, and the avoidance of rejection. Jack Dorsey, founder of Twitter, builds on this premise: the platform is designed to solve a problem (communication, knowledge building) while addressing the desires and emotions of its users (social acceptance via ‘likes’, retweets, etc.) (p. 39).

Variable Reward
What distinguishes the Hook model from the traditional feedback loop (embodied by the familiar B.F. Skinner model, where rewards are used to support behavior change through positive reinforcement) is the variability of the reward, which creates a desire for feedback and motivates the user to seek it out. Traditional feedback loops are predictable; they don’t create desire, according to Nir. Yet when there’s uncertainty about the reward, or variability in the type of reward, the user’s interest is piqued. Think of the reward structure of slot machines; it’s unpredictable. In an education context, Nir describes how Codecademy uses variable rewards with symbols that benchmark students’ progress, along with variable feedback that fulfills the student’s desire for acceptance and validation (p. 89).

Investment
The investment occurs when the user puts something into the product or service, such as time, data, effort, social capital or money (p. 7). The more users invest in the product or service, the more they value it, supporting the idea that labor leads to love. This investment concept is applicable to education, for example online courses where students contribute to course content (an investment of time), complete course work (more time) and engage with peers (even more time).

Image (above): a screenshot from the YouVersion app, which follows the four elements of the Hook model. The screenshot shows how rewards are built into the app, including the ‘likes’ feature used by the community.

Case Study: The Hook Model in Action
After reading the case study of the Bible app YouVersion in “Hooked”, I could see the application of the Hook model and its relevance to learning contexts. The app provides a selection of Bible study programs users can choose from based upon their needs. The app sends reminders and encouraging messages when readings or homework are due. When a reading is missed, a red icon appears over the app, another cue. If more than two readings are missed, users receive a supportive message suggesting they consider a different (less challenging) plan. There’s also a virtual community, where encouragement from its members is another source of ‘triggers’. Rewards come in several forms. When a reading assignment is done, for instance, the user gets a “Day Complete” message with a check mark on the app’s calendar. YouVersion is a success story. It’s the #1 downloaded Bible app, with over 200,000,000 downloads.
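To make the loop concrete, here is a minimal sketch, in Python, of the kind of trigger-and-reward logic the case study describes, transplanted to a hypothetical learning app. The names (Learner, daily_check) and thresholds are my own invention for illustration; nothing here reflects YouVersion’s actual code.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Learner:
    name: str
    missed_readings: int = 0
    completed_days: list = field(default_factory=list)

def daily_check(learner: Learner, reading_done: bool, today: date) -> str:
    """Return the trigger or reward message the (hypothetical) app surfaces today."""
    if reading_done:
        learner.missed_readings = 0
        learner.completed_days.append(today)   # investment: visible progress accumulates
        return "Day Complete"                  # reward: check mark on the calendar
    learner.missed_readings += 1
    if learner.missed_readings > 2:
        # supportive nudge rather than guilt: suggest a less challenging plan
        return "Consider switching to a shorter study plan"
    return "Your reading is waiting"           # external trigger: reminder / badge icon
```

Even a loop this small shows the pattern: the trigger prompts the action, the message varies with the learner’s behavior, and each completed day deepens the investment.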

How the Hook Model Can be Applied to Learning
What if learning did become a habit, where students check into their online class daily, share relevant content with classmates or engage in group assignments willingly? The case study of YouVersion is instructive, suggesting that the model’s concepts are applicable to learning scenarios, specifically to learning platforms and applications. Learning applications created thoughtfully and purposefully can support behavior change: seamless learning with few barriers, built-in rewards that provide variety and freshness, and features that leverage the learning community. Yet creating learning that follows the Hook model requires a different mindset, and a commitment to create compelling learning with integrity and care that protects students, content and the process of learning.


MOOCs Desperately Seeking Quality

MOOCs, despite what the critics say, have transformed higher education. They have spawned new vehicles for online learning, reaching new groups of learners who want and need an alternative to traditional education. Thanks to the MOOC format, learners now have numerous pathways to furthering their education. Granted, not all MOOC programs meet the definition of open, but as programs expand, as we’ve seen with certificate-granting MOOCs, MOOCs for college credit, professional development MOOCs, and others, there is a pressing need for benchmarks of quality.

This need is more apparent after the recent publication of two reports: Babson’s thirteenth annual “Online Report Card” and “In search of quality: Using Quality Matters to analyze the quality of Massive, Open, Online Courses (MOOCs)” (Allen, Seaman, Poulin & Straut, 2016; Lowenthal & Hodges, 2015).

The Babson report devotes (only) two of its sixty-plus pages to MOOCs, yet what’s most telling is that the report is in its final year of publication. Though there are a variety of contributing factors, a compelling one is given in the report’s introduction: “distance education is clearly becoming mainstream” (p. 3). In other words, online education is growing up. ‘Online learning’ is simply becoming ‘learning’. The report outlines the number of organizations dedicated to online education that report on and address issues specific to online learning. Many do address quality standards for developing and delivering online programs, as is the case with the Quality Matters rubric, the Online Learning Consortium’s (OLC) Five Pillars, and California State University, Chico’s rubric for online instruction, yet all fall short of specifying standards for MOOCs.

This void is a concern given the significant number of students engaged in learning using the MOOC format. Estimates are in the range of millions; one source states that MOOC enrolment surpassed 35 million in 2015 (“MOOC Enrolment”, 2016). Given these numbers the question is: how will MOOC learning be advanced and improved if MOOC quality isn’t addressed by organizations involved in online education? Researchers Lowenthal and Hodges bring some of these issues forward in their paper, “In search of quality”. They apply the Quality Matters™ rubric to six MOOCs offered by three providers: Coursera, edX and Udacity:

The six identified MOOCs were analyzed using the Quality Matters Rubric Standards with Assigned Point Values, which involves a type of content analysis by three different reviewers using a standard coding scheme. [Quality Matters] QM has a rubric for Continuing and Professional Development that would be appropriate to use on MOOCs (Adair et al., 2014). However, we intentionally chose to use QM’s higher education rubric rather than the continuing and professional development focused rubric because of the increased initiatives about offering college credit for MOOC completion. In other words, a MOOC should score as well as a traditional online course if it is going to be worth college credit.  (Lowenthal & Hodges, 2015)

Not surprisingly, after the QM peer-review assessment all six MOOCs failed to meet QM’s passing grade of 85%. The QM rubric consists of a set of standards grouped into eight dimensions (below); in the study, most MOOCs failed in two dimensions, #5 and #7.

  1. Course overview and introduction
  2. Learning objectives
  3. Assessment and measurement
  4. Instructional materials
  5. Learner interaction and engagement
  6. Course technology
  7. Learner support
  8. Accessibility (Quality Matters, 2014)
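To illustrate the pass/fail arithmetic behind the study, here is a minimal sketch of a percentage check against the 85% threshold mentioned above. The dimension names come from the list; the point values and scores are invented for illustration and do not reflect the actual QM rubric weights or the study’s data.

```python
# Hypothetical per-dimension scores: (points earned, points possible).
# The 85% threshold is from the study; the numbers below are made up.
scores = {
    "Course overview and introduction":   (8, 9),
    "Learning objectives":                (10, 12),
    "Assessment and measurement":         (9, 11),
    "Instructional materials":            (7, 8),
    "Learner interaction and engagement": (4, 10),
    "Course technology":                  (6, 7),
    "Learner support":                    (2, 6),
    "Accessibility":                      (7, 8),
}

earned = sum(e for e, _ in scores.values())
possible = sum(p for _, p in scores.values())
percentage = earned / possible * 100

print(f"Overall: {percentage:.0f}% ({'meets' if percentage >= 85 else 'fails'} the 85% bar)")
for dimension, (e, p) in scores.items():
    print(f"  {dimension}: {e}/{p}")
```

In this made-up example, weak scores on learner interaction and learner support drag the total below the bar, mirroring the pattern the study reports for dimensions #5 and #7.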

The apparent failure of the MOOCs in this study may give fodder to MOOC critics, yet I suggest the failure stems not from the MOOCs themselves but from the evaluation: 1) the QM rubric was applied to MOOCs, which inherently serve a variety of learning purposes and needs, e.g. not just college credit but professional development, personal interest, etc., and 2) assessing a MOOC on dimensions such as ‘learner interaction and engagement‘ and ‘learner support‘ doesn’t make sense in the context of a MOOC, at least not at the level the QM standards articulate. Considering the massive component of MOOCs, it’s almost a given that facilitating structured, mandatory engagement and active learning is next to impossible. Furthermore, since MOOC students are able to choose their level of engagement based upon their learning needs, including this as a standard doesn’t fit with the intent of the course.

The study acknowledges many of these points and serves as a vehicle for discussion about applying quality standards to courses that follow the MOOC format. The authors also highlight a critical point: if the MOOC format is used as a vehicle for granting college credit, as it appears to be, quality benchmarks are essential.

Final Thoughts
A unique approach to quality assessment (and course design) is needed; one that heeds the needs of learners, the constraints and advantages of the delivery platform, and ensures a quality learning experience. Going further, I also suggest that before establishing quality standards, institutions would do well to first identify the primary purpose and intent of the MOOC. Categorizing a MOOC based on its purpose, then establishing quality standards is a good place to start.

References
Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States (Rep.). Babson Survey Research Group. Retrieved from http://onlinelearningsurvey.com/reports/onlinereportcard.pdf

Lowenthal, P. R., & Hodges, C. B. (2015). In search of quality: Using Quality Matters to analyze the quality of massive, open, online courses (MOOCs). The International Review of Research in Open and Distributed Learning, 15(5). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/2348/3411

MOOC enrolment surpassed 35 million in 2015. (2016, January 05). Retrieved from http://monitor.icef.com/2016/01/mooc-enrolment-surpassed-35-million-in-2015/

 

Need-to-Know MOOC News: MOOCs Find Their Niche & Business Model in 2016

This is a special issue of the ‘Need-to-Know’ blog post series featuring the latest developments in Massive Open Online Courses (MOOCs) offered by providers: Coursera, iVersity, edX, and Udacity.

1. Coursera’s Business Model Taking Shape
Coursera is finding its niche and business model. The MOOC provider is moving towards three revenue-generating strategies: 1) fee-based courses which require students to pay a fee for access to graded assignments, 2) Specializations, a sequence of courses with a capstone project, and 3) Course Certificates (formerly known as Signature Track).

Signature Track, launched in 2013, was Coursera’s first (significant) revenue-generating strategy. Students paid a fee in exchange for the opportunity to earn a verified certificate. Initially only a handful of courses featured the certificate option. Signature Track has since expanded, been renamed Course Certificates, and now carries a flat fee of $49. The Course Certificate option is available across numerous courses. Revenue estimates suggest certificates generated between $8 and $12 million in 2014 (Shah, 2014).

Specializations feature a sequence of courses (typically four to six) plus a capstone project in which students apply the skills learned in order to earn a certificate. Launched two years ago, the program appears successful given the number of Specializations offered—in the hundreds, according to Coursera. Fees range between $300 and $600. Tuition is determined by the price of each course (which ranges between $39 and $79), the number of courses in the Specialization, and the fee for the capstone project. If there is even modest student demand for Specializations, as Coursera co-founder Daphne Koller indicates, the revenue opportunity is significant (Bogen, 2015).

The Purchase Course strategy, announced last week, requires that students pay to gain access to graded assignments. There is an option to ‘audit’ the course, in which case students have access to course materials only. An excerpt from Coursera’s blog (below) outlines the strategy:

Starting today, when you enroll in certain courses, you’ll be asked to pay a fee (or apply for Coursera’s financial aid program) if you’d like to submit required graded assignments and earn a Course Certificate. You can also choose to explore the course [audit] for free, in which case you’ll have full access to videos, discussions, and practice assignments, and view-only access to graded assignments. — Coursera Blog, January 19, 2016

This format is similar to what’s offered at iVersity, a Europe-based MOOC provider. Tuition at Coursera ranges between $39 and $119 per course. Below is a screen shot showing the options presented to students enrolling for a course on Coursera’s platform.

Fee-based courses appear linked to courses that are part of Specializations. The screenshot above shows what is presented when enrolling in ‘Understanding Financial Markets’.

2) iVersity’s Pay-for-Certificate Program & Udacity’s Nanodegree Plus
iVersity, one of Europe’s MOOC platforms, launched its own version of Coursera’s Specializations: the Business Communication Programme. It’s targeted at working professionals seeking skills in business communication and marketing, and it’s iVersity’s first venture into bundled programs. Yet the Programme is more similar to Udacity’s new Nanodegree Plus program, given that it offers enhanced customer service—support and resources to help students find a job.

Udacity’s program goes further by guaranteeing that students find a job within six months, or their money back. Fees at Udacity are monthly—$299. With an estimated program length of six to eight months, that brings the cost to between $1,794 and $2,392. iVersity’s tuition model takes a different approach but the price is similar (see screenshot below): iVersity’s Programme at its regular price is $1,704 (approximate US funds), and the enhanced model is $2,611.

Screenshot above: prices for iVersity’s ‘Business Communication Programme’ as displayed at iversity.org. Sale prices still appeared on the site as of February 2, 2016.

iVersity also offers corporate learning services to companies looking for support in creating their own professional development courses. It’s promoted on their site as “a new form of professional development“.

3) Udacity for Business
Udacity also targets the corporate training market (tech companies specifically) via its business webpage, promoting “Hands-on Training. Done Online”. The courses and programs promoted are identical to Udacity’s existing ones, but they are packaged to appeal to company and human resource executives as a solution for skill gaps among employees and as a tool for succession planning.

4) edX CEO: “edX offers complete programs online, not just individual courses”
EdX, an open-source platform and one of the few non-profit MOOC providers, also has revenue-generating strategies, though not for profit. The strategies are needed to support edX’s goal of sustainability in order to achieve its mission of offering “access to high-quality education for everyone, everywhere”. Some of edX’s programs are similar to Coursera’s and Udacity’s—certificates with fees of typically $50 per course. Another is the XSeries program, a group of bundled courses. Students receive an XSeries Certificate upon completion, though unlike Coursera’s Specializations or Udacity’s Nanodegrees, there is no final or capstone project. Another revenue strategy is licensing edX courses to countries and regions such as China, India, France and the Middle East that have adopted Open edX (Young & Hobson, 2015).

EdX also offers Professional Education courses targeted at students looking for skills training and professional development. Courses are stand-alone and online; some are self-paced and others have a start and end date and span four to six weeks. Fees can be hefty, ranging between $89 and $949, as with “Yield Curve Analysis”.

Insight: Offering free, high-quality content on feature-rich digital platforms is not free for the MOOC provider or the partnering institutions, even though free appeared to be the end goal of MOOCs at the time of their launch in 2012. But free is not sustainable. The concept of MOOCs is shifting to where the demand is: fee-based certificate courses and programs in skill-specific areas, and corporate learning. In between are programs offering MOOCs for higher education credit, as with courses for ECTS credit at iVersity, edX’s Global Freshman Academy, and Malaysia’s national credit recognition policy for MOOCs; there are even degrees (Georgia Tech’s CS Master’s degree) and mini-degrees based on MOOCs, as with MIT’s MicroMasters. There are still free courses for the lifelong learner, like myself, looking for high-quality online courses not for credit. I view this as a win-win-win for everyone: the platform providers, the institutions and the students. Who says MOOCs weren’t disruptive?


Can Social Network Analysis Help Teachers Change?

Edited by Alan J. Daly. Harvard Education Press, 2010

“Recent education studies underline the value of strong social networks among teachers for the spread of reform implementation and innovative climate…and their capacity to change” — Moolenaar & Sleegers, chapter 6: Social Network Theory and Educational Change

“Social Network Theory and Educational Change” is a collection of case studies that describe the impact of change efforts in schools through analysis of social networks. Using social network theory is a unique way to analyze reform initiatives within education settings, more so given that social interactions among stakeholders are a key factor in any type of change initiative within an organization. The studies examine teachers’ and education leaders’ communication patterns and behaviors within their school or district’s social networks, with each case measuring a different aspect of a change or reform effort.

“Drawing on the work of leading scholars, the book comprises a series of studies examining networks among teachers and school leaders, contrasting formal and informal organizational structures, and exploring the mechanisms by which ideas, information, and influence flow from person to person and group to group. The case studies provided in the book reflect a rich variety of approaches and methodologies, showcasing the range and power of this dynamic new mode of analysis” — Harvard Education Press

Examples of studies in the book include one that examines a new “ambitious” district-wide math curriculum accompanied by a comprehensive professional development program for teachers. The purpose of this study was “exploratory and theory building”: researchers sought to demonstrate the value and applicability of social network analysis in education reform efforts (p. 36). Other studies delved further into teachers’ perceptions of change. Chapter five, ‘Peer Influence in High School Reform’, focused on measuring teachers’ attitudes towards reform efforts in order to “better understand the variables that impact the implementation of reform programs” (p. 82). The study’s data came from surveys administered by the Consortium for Policy Research in Education (CPRE) across nine high schools, each of which had implemented externally designed reform programs that aimed to bring about significant changes in teachers’ classroom practice.

Social Network Analysis Defined
Social network theory and analysis is the study of how people, organizations or groups interact with others within their network. Social network theory has its roots in sociology, where graph theory was used as an analysis tool in research; it’s now an established research method used in biology, anthropology, economics and management, and is gaining momentum in education (p. 4). The focus of social network analysis (SNA) is on relationships: the flow of information within social network structures, where the structure is a collection of individuals (nodes) and the connections between them.

‘Social network analysis requires an understanding of how independent people related to each other, affect each other’s views, and interact together’ – Susan Fant (2013). Slide 8, 10

Methods for Collecting and Visualizing Data
Methods of SNA include identifying the actors—the individuals within a workplace network—and administering a questionnaire to each. Questions within a survey tool might be: “to whom do you turn for work-related information?”, “with whom do you collaborate regarding instructional issues?” or “how often does your interaction with a colleague increase your energy level?”. The purpose of the survey instrument is to determine the flow of information, the mode of communication, the frequency of contact, the strength of ties and the structure of relationships within the network.
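As a rough illustration of how such survey answers become a network, here is a minimal sketch in Python using the networkx library (my own choice of tool; the book does not prescribe any particular software). The respondents and their answers are invented.

```python
import networkx as nx

# Invented responses to "To whom do you turn for work-related information?"
survey_responses = {
    "Ana":  ["Ben", "Cruz"],
    "Ben":  ["Cruz"],
    "Cruz": ["Ana"],
    "Dana": ["Cruz", "Ben"],
}

# Directed edge from the seeker to the colleague they turn to.
G = nx.DiGraph()
for seeker, sources in survey_responses.items():
    for source in sources:
        G.add_edge(seeker, source)

# A colleague with high in-degree is sought out often: a hub of expertise in the network.
for person, score in sorted(nx.in_degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
```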

Data are compiled and transposed using analytic software to create network visualizations. Visual representations of networks can be a powerful way to convey complex information. Chapter 13 outlines best practices and methods for collecting and managing high-quality data for SNA, and provides readers with instructive guidance on overcoming the main challenges of SNA, which according to the chapter author include: 1) the quality of data, where there is a concern that survey respondents don’t provide responses that accurately reflect social interactions, and 2) the quantity of data, where the target response rate from actors in a network should be close to 100%.
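Continuing the illustration, here is a sketch of how a small network like the one above might be visualized, again assuming networkx together with matplotlib. The roles and color scheme are invented to echo the central-office versus principal contrast described in the caption that follows.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Invented network: who turns to whom for work-related information.
edges = [("Ana", "Ben"), ("Ana", "Cruz"), ("Ben", "Cruz"),
         ("Cruz", "Ana"), ("Dana", "Cruz"), ("Dana", "Ben")]
G = nx.DiGraph(edges)

# Color nodes by (invented) role, mirroring the red/blue distinction in the caption below.
roles = {"Ana": "central office", "Cruz": "central office",
         "Ben": "principal", "Dana": "principal"}
node_colors = ["tab:red" if roles[n] == "central office" else "tab:blue" for n in G.nodes]

pos = nx.spring_layout(G, seed=42)  # force-directed layout clusters tightly connected actors
nx.draw_networkx(G, pos, node_color=node_colors, with_labels=True, arrows=True)
plt.axis("off")
plt.show()
```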

Diagram above: “Visualization of data from a district-wide study examining the exchange of ‘expertise’ between central office and site administrators. Findings indicate great deal of expertise sharing between the central office administrators (red nodes) and limited expertise exchange between principals (blue nodes)”. (Shanker Institute, 2014).

Conclusion
Revisiting the question—can social network analysis help teachers change? Social network analysis is a useful tool for providing insights into the complexities of change, into school-wide and organizational learning, and into how relationships influence education practices and new initiatives. Yet on its own SNA won’t help teachers change; rather, it serves as a tool for education leaders to help teachers change, by helping leaders understand the flow of information, identify how to support the relationships responsible for change, and determine the critical resources needed. SNA is not a solution but a unique tool to consider and evaluate, more so now given the increasing number of applications in our workplaces that facilitate social and informal communication and collaboration.
