At the heart of developing personalized learning systems are critical design and policy questions about what to personalize and what to standardize. This working paper describes key dimensions to consider in finding the right balance between consistency and flexibility.
The social component of learning has long been overlooked from both a regulatory and a design perspective, with community formation often assumed to happen through the traditions of brick-and-mortar institutions. But as students spend less time at physical campuses, whether due to part-time status, family and work commitments, or online classes, deliberately planning how students will connect meaningfully with each other becomes necessary.
Coursera’s partnership to create “learning hubs” offers one example of how the education, business, and government worlds are exploring solutions to strengthen the tenuous social fabric that keeps students in class. Along with the basics of internet and technology access, these hubs also offer a more fundamental reason to return: social ties. Fellow classmates can offer instrumental support by sharing knowledge and experiences, but they also offer emotional support and validation when uncertainty strikes. While the time and effort required to build social ties may initially seem costly, the investment can pay off through higher enrollment and retention, as well as improved learning and satisfaction.
As these initiatives reveal, personalizing learning effectively goes beyond mere individualization to include genuine integration of the participants as people connected in a community.
Most personalized learning systems have focused on the business-to-business (B2B) rather than the business-to-consumer (B2C) model. The B2B model holds appeal both because it secures large contracts all at once and because it can tailor solutions to unique institutional needs, a capability particularly valuable in education, which is subject to extensive rules and regulations for funding and accreditation at multiple levels.
Yet with giant consumer sites such as Amazon jumping into the online education space, the possibility of offering personalized learning direct to the consumer and at scale challenges old norms about regulating access. How much will consumers trust content and instruction offered outside of accredited institutions? Along what dimensions will consumers evaluate the educational experiences they choose? How will this affect the nature, content, and quality of instruction provided? How will the added mobility affect the formation of social networks which support learning, student persistence, and consumer “stickiness”? How will learners’ data be shared (or not) across institutional barriers and over time, as the students mature?
Watch this space; there are many changes afoot.
One of the most compelling arguments for personalized learning is the importance of providing an appropriate education to students with special needs. Such students challenge the system, with unexpected strengths and weaknesses that are out of scale with the norm. Simply slowing down (or speeding up) the pace of instruction won’t serve their needs, particularly as they may be exceptional on more than one dimension and in more than one direction. For them, personalized learning that decouples different skills is imperative, a way to serve their needs and extend their abilities at the same time.
While special-education laws are limited in scope due to their approach of simply setting minimum requirements, they do provide critical safeguards for supporting students at the K-12 level. As they graduate to adulthood, these students are expected to assume more responsibility to advocate for and seek accommodations for their needs if they pursue advanced study at institutions of higher education (IHEs). Even so, these minimum accommodations only grant access, and sometimes may not do enough to constitute effective instruction that enables success. Simply fulfilling minimum requirements may allow IHEs to avoid litigation, but failing to adequately serve their students is a failure to invest their resources wisely.
Challenging though it may be to (re)design instructional materials with different constraints, IHEs may find that special-needs students can provide a valuable test case, instantiating extremes on the spectrum of students they serve. These adjustments will also help them support English language learners, disadvantaged-but-capable students with gaps in their backgrounds, returning students who remember some lessons but forgot others, and career changers in search of very specific skills to flesh out their resume—deserving students whom the traditional system fails, all too often. Not all students fit the same mold, nor should they. Adapting instruction around their needs develops their potential and gives them the opportunity to give back.
As described on The Economist’s “Schumpeter” blog, economic growth depends on innovation and more flexible job preparation:
Entrepreneurs repeatedly complain that they cannot hire the right people because universities are failing to keep pace with a fast-changing job market. Small firms lack the resources to provide training and are consequently making do with fewer people working longer hours.
The claim that “established firms are usually in the business of preserving the old world” bears interesting parallels to universities, which tend to replicate their own status quo. Instead of producing numerous graduates of the programs of yesteryear, universities need to update their training to develop knowledge and skills in current demand. Adapting to these demands through lengthy committee review and accreditation requirements is unlikely to be fast enough for the “agile-development” expectations of today’s startup culture. Educational institutions thus need new processes for tailoring programs of study to modern demands with both integrity and efficiency.
An even stronger motivation for allowing students to tailor their own course of study to their particular needs is that employers seek teams of people with a mix of complementary skills, not multiple copies of people with the same skill set. Instead of trying to differentiate candidates on some imagined basis of unidimensional merit, employers need to differentiate them along multiple dimensions of value to their particular needs. Employers are constantly talking about “fit”; educational institutions should facilitate discovery of a good fit by using personalized assessment, to provide richer information about how a candidate’s unique strengths and experiences may match a particular profile of needs.
A fundamental challenge in implementing personalized learning is determining just how personal, or rather interpersonal, it should be. Carlo Rotella highlights the tension between the customization afforded by technology and the machine interface needed to collect the data supporting that customization. He zeroes in on the crux of the problem thus:
For data to work its magic, a student has to generate the necessary information by doing everything on the tablet.
That invites worries about overuse of technology interfering with attention management, sleep cycles, creativity, and social relationships.
One simple solution is to treat the technology as a tool that is secondary to the humans interacting around it, with expert human facilitators knowing when and how to turn the screens off and refocus attention on the people in the room. As with any tool, recognizing when it is hindering rather than helping will always remain a critical skill in using it effectively.
Yet navigating the human-to-data translation remains a tricky concern. In some cases, student data or expert observations can be coded and entered into the database manually, if worthwhile. Wearable technologies (e.g., Google Glass, Mio, e-textiles) seek to shorten the translation distance by integrating sensory input and feedback more seamlessly into the environment. Electronic paper, whiteboards, and digital pens provide alternate data-capture methods through familiar writing tools. While these tools bring the technology closer to the human experience, they require more analysis to convert the raw data into manipulable form, and they raise the question of whether the answer to too much technology is still more technology. Instructional designers will always need to weigh when intuitive human observation and reflection is superior, and when technology-enhanced aggregation and analysis is.
From the perspective that all publicity is good publicity, the continued hype-and-backlash cycle in media representations of educational technology is helping to fuel interest in its potential use. However, misleading representations, even artistic or satirical, can skew the discourse away from realistic discussions of the true capacity and constraints of the technology and its appropriate use. We need honest appraisals of strengths and weaknesses to inform our judgment of what to do, and what not to do, when incorporating teaching machines into learning environments.
Adam Bessie and Arthur King’s cartoon depiction of the Automated Teaching Machine conveys dire warnings about the evils of technology based on several common misconceptions regarding its use. One presents a false dichotomy between machine and teacher, portraying the goal of technology as replacing teachers through automation. While certain low-level tasks like marking multiple-choice questions can be automated, other aspects of teaching cannot. Even while advocating for greater use of automated assessment, I note that it is best used in conjunction with human judgment and interaction. Technology should augment what teachers can do, not replace it.
A second misconception is that educational programs are just Skinner machines that reinforce stimulus-response links. The very premise of cognitive science, and thus the foundation of modern cognitive tutors, is the need to go beyond observable behaviors to draw inferences about internal mental representations and processes. Adaptations to student performance are based on judgments about internal states, including not just knowledge but also motivation and affect.
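The distinction between tracking stimulus-response links and inferring internal states can be made concrete with Bayesian Knowledge Tracing, a standard student-modeling technique used in cognitive tutors. The sketch below is illustrative only; the parameter values are placeholders, not calibrated to any particular system:

```python
# Bayesian Knowledge Tracing: infer a hidden "skill mastered" state from
# observable right/wrong answers. Parameter values are illustrative
# placeholders, not drawn from any real tutoring system.

def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """One BKT step: Bayesian posterior after an observation,
    followed by a chance-to-learn transition."""
    if correct:
        # P(mastered | correct answer): mastery without a slip,
        # versus non-mastery with a lucky guess
        num = p_mastery * (1 - p_slip)
        den = num + (1 - p_mastery) * p_guess
    else:
        # P(mastered | incorrect answer): mastery with a slip,
        # versus non-mastery without a guess
        num = p_mastery * p_slip
        den = num + (1 - p_mastery) * (1 - p_guess)
    posterior = num / den
    # Chance the student learned the skill during this practice opportunity
    return posterior + (1 - posterior) * p_learn

# The tutor adapts based on this evolving hidden-state estimate,
# not on the raw pattern of responses itself.
p = 0.3  # prior belief that the skill is mastered
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
```

Note how a single wrong answer lowers the mastery estimate without resetting it: the model treats the error as possibly a slip, which is precisely the kind of inference about internal state that a pure stimulus-response machine cannot make.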
A third misconception is that human presence corresponds to the quality of teaching and learning taking place. What matters is the quality of the interaction, between student and teacher, between student and peer, and between student and content. Human presence is a necessary precondition for human interaction, but it is neither a guarantee nor a perfect correlate of productive human interaction for learning.
Educational technology definitely needs critique, especially in the face of its possible widespread adoption. But those critiques should be based on the realities of its actual use and potential. How should the boundaries between human-human and human-computer interaction be navigated so that the activities mutually support each other? What kinds of representations and recommendations help teachers make effective use of assessment data? These are the kinds of questions we need to tackle in service of improving education.
Phil Nichols describes his youthful adventures reappropriating the humble graphing calculator to program games:
For me, it began with “Mario” — a TI-BASIC game based loosely on its Nintendo-trademarked namesake. In the program, users guided an “M” around obstacles to collect asterisks (coins, presumably) across three levels. Though engaging, the game could be completed in a matter of minutes. I decided to remedy this by programming an extended version. I studied the game’s code, copying every line into a notebook then writing an explanation beside each command. I sought counsel from online tutorials, message boards, and chat rooms. I sketched new levels on graph paper, strategically placing asterisks in a way that would present a challenge to experienced players. Finally, after a grueling process of trial and error, I transformed my designs into code for three additional stages.
As he summarizes, his non-school-sanctioned explorations of an otherwise school-based tool led to sophisticated discoveries and creations:
[W]ith the aid of my calculator, I’d crafted narratives, drawn storyboards, visualized foreign and familiar environments and coded them into existence. I’d learned two programming languages and developed an online network of support from experienced programmers. I’d honed heuristics for research and discovered workarounds when I ran into obstacles. I’d found outlets to share my creations and used feedback from others to revise and refine my work. The TI-83 Plus had helped me cultivate many of the overt and discrete habits of mind necessary for autonomous, self-directed learning. And even more, it did this without resorting to grades, rewards, or other extrinsic motivators that schools often use to coerce student engagement.
While he positions calculator programming as a balance between the complementary educational goals of “convention” and “subversion,” this also echoes tradeoffs between routine expertise and adaptive expertise, between efficiency and creativity, or between convergent and divergent thinking. Crowding out the subversive, creative side remains an ongoing risk in overly restrictive learning environments. Standards that dictate the time and sequence of each stage of students’ progression fail to allow for the different paths which personalization accommodates. Yet even adaptive learning systems that seek to anticipate every next step a student might take must be careful not to add so many constraints that they crowd out productive paths the student might otherwise have pursued. Personalized learning needs to leave room for error and open-ended discovery, because some things just aren’t known yet.
In “The Coming Big Data Education Revolution,” Doug Guthrie argues that “big data,” rather than MOOCs, represents the true revolution in education:
MOOCs are not a transformative innovation that will forever remake academia. That honor belongs to a more disruptive and far-reaching innovation – “big data.” A catchall phrase that refers to the vast numbers of data sets that are collected daily, big data promises to revolutionize online learning and, in doing so, higher education.
I agree that there are exciting discoveries and innovations still to be made through the advent of big data in education, and I also agree that MOOCs’ current reliance on scaling up delivery of existing content isn’t particularly revolutionary. Yet I see the two movements as overlapping and complementary, rather than as competing forces.
While MOOCs may not (yet) have revolutionized instruction, they have revolutionized access for many learners. Part of their appeal for those who study them is their potential for large-scale analysis, thanks to high enrollments and the ready availability of online data. The opportunity to study such large numbers of students across such disparate contexts is rare in traditional academic settings, and it permits discoveries of learning trajectories and error patterns that might otherwise get missed as noise amidst smaller samples.
Another potential innovation which traditional MOOCs (xMOOCs) have not yet explored is new models of building cohorts and communities from within a large pool of learners, a goal at the heart of “connectivist MOOCs” (cMOOCs), which highlight peer-learning pedagogy. Combine xMOOCs and cMOOCs, and you can improve educational access even further by enabling courses to spring up whenever and wherever enough people, interest, and resources converge. Add in the analytical power of big data, and then you have the capacity to truly personalize learning, by providing both the experiences that best support students’ learning and the human interactions that will enrich those experiences.
As I have explained in a previous post on personalized learning, an important dimension along which personalized learning goes beyond merely adaptive learning is to personalize the experience on the instructional side, not just the learner side. Amidst all the excitement about adaptive learning, teachers remain an often-forgotten yet crucial part of the equation. Well-designed personalization takes advantage of the human intelligence embedded in expert instructors, including opportunities for them to exercise their professional judgment in deciding which activities will work best for their students given their particular contexts and constraints.
This EdSurge report mentions Rocketship’s upcoming changes, as “New model attempts to bring teachers closer to students’ online learning experience” by returning some classroom control back to the teacher:
Rocketship’s new model will shift focus from running purely adaptive programs, to using programs that give teachers greater control over content that gets assigned.
What this highlights is the need for designers of personalized learning programs to identify when to allocate decisions to teachers (possibly with recommendations among which to choose) and when to adapt the students’ learning experience immediately, without waiting for additional human input. While this depends in part on the professional knowledge of the instructors implementing the system, some decisions may be straightforward or simple enough to automate. Decisions best left to expert human intervention are likely to be more complex, to depend on more contingencies, to require interpersonal contact, or to have more uncertainty in their effectiveness. Where that balance lies is subject to continual readjustment, but since there are always unknowns and since social interaction is fundamental to the human experience, there will always remain a need for personalization.
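One way to picture this allocation is as a simple routing rule: adaptations that are low-stakes and high-confidence go straight to the student, while anything interpersonal or uncertain gets queued for teacher review. The sketch below is hypothetical; the field names and threshold are my own illustrative assumptions, not any vendor’s design:

```python
# Hypothetical sketch of routing proposed adaptations between automation
# and teacher judgment. The threshold and fields are illustrative
# assumptions, not taken from any real personalized learning system.

from dataclasses import dataclass

@dataclass
class Adaptation:
    description: str
    confidence: float    # system's confidence the adaptation will help
    interpersonal: bool  # requires human contact (e.g., a check-in)

def route(adaptation, confidence_threshold=0.9):
    """Return 'automate' or 'teacher' for a proposed adaptation."""
    if adaptation.interpersonal:
        return "teacher"    # human contact required by definition
    if adaptation.confidence >= confidence_threshold:
        return "automate"   # simple, well-understood case
    return "teacher"        # uncertain: defer to expert judgment
```

Even in a toy model like this, the interesting design work lies in setting the threshold and deciding which contingencies count as interpersonal, which is exactly the continual readjustment described above.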