
Learning About Learning: The Mythical Land of L&D

Tom Whelan and Amy Duvernet

In the last handful of columns, we’ve talked a lot about bridging the divide between I-O and L&D. Both sides have a lot to learn from each other. However, let’s not forget that L&D is not inherently populated by people who are well-versed in psychological science. Accordingly, some L&D professionals hold beliefs about workplace learning that don't necessarily jibe with what we know about it from research. In this column, we’d like to talk about several “myths” about learning that we see as somewhat persistent in L&D spheres.

The 70-20-10 Rule

The “70-20-10 rule” is a principle with which many I-Os may not be familiar. There has been basically no mention of it in the peer-reviewed research literature. Why is that? Are we behind the curve? Well, not really. Here’s the short origin story: in the 1980s, a research team at the Center for Creative Leadership was studying executives’ recollections of their experiences with leadership development. They concluded that leaders obtained 70% of their knowledge from job-related experiences, 20% from interactions with others, and 10% from formal educational events (Rabin, 2014). Another source sometimes cited is a Bureau of Labor Statistics paper that found that between 70% and 90% of training is informal when people are starting a new job (Loewenstein & Spletzer, 1998).

So, does 70-20-10 work as a useful heuristic? Yes. It has appealingly round numbers, it makes intuitive sense (at least at a glance), and it was originally born of research data. Is it a good foundation for determining the specific emphasis of training? Probably not. Let’s explore why.

The rule was originally based on the experiences of, and intended to apply to, leaders in organizations (Rabin, 2014). The rule’s emphasis on the value of experience in the growth of leaders is not unwarranted (e.g., McCall, 2010). But what has happened in many L&D circles is an instance of large-scale source amnesia. This “rule” has been rolled out as just that: an axiom of workplace learning that appears to deemphasize the role of formal training. The result has been a focus on knowledge and skill retention and on the role that social interactions play in sustaining what has been learned.

Most importantly, is the 70-20-10 distribution of learning really reflective of what employees think? If we’re talking about the executives the rule was originally based on, perhaps so. But we thought it might be an overreach to assume this is universally true across entire organizations. To explore this, we recently had the opportunity to ask a sample of 958 working adults across a variety of industries and job levels to reflect on what contributed to their learning at work. To mimic the 70-20-10 categorization, respondents were asked to allocate their learning among doing (e.g., on-the-job experiences); interactions with peers, coworkers, and managers; and, finally, formal training provided by the company. Perhaps unsurprisingly, our findings didn’t fit the clear-cut 70-20-10 division; the averages for the responses were closer to 56-25-19.

This doesn’t necessarily change the point of the story. On-the-job experiences are still viewed as the most important, but, despite the lower percentage, formal training is not a small, unimportant sliver of the employee experience. Instead, formal training is a critical piece of any learning initiative, especially when consistency and compliance are important; in fact, it’s often the crucible for learning. Although on-the-job experiences are critical for learning, these experiences, in our humble opinion, actually represent “training transfer,” the objective of any L&D initiative.

Thus, 70-20-10 is better treated as a heuristic for thinking about the context of organizational learning than as a rule. The numbers themselves aren’t terribly useful, and for as much as they serve as a calling card for talking about social and experiential learning, treating them as a “rule” shortchanges the many, many ways that L&D can function in companies of all shapes and sizes across an array of industries.

Deliberate Practice and the “Golden Rule” of 10,000 Hours

With his 2008 book Outliers, Malcolm Gladwell popularized the “10,000-hour rule”: the idea that you must accumulate 10,000 hours of practice to become an expert at anything. Based on the work of Anders Ericsson and his colleagues (Ericsson, Krampe, & Tesch-Romer, 1993), the rule extrapolates from an observation made in the original research that professional musicians had, on average, accumulated 10,000 hours of practice by the time they reached age 20. As evidence of its impact, this original research has been cited 1,561 times in the PsycINFO database and 7,849 times in Google Scholar (as of 11/21/2017).

Unfortunately, the main points of both Ericsson’s original work and Gladwell’s reference to it have tended to get lost in translation when recounted in the popular press (e.g., Szalavitz, 2013). First, Ericsson’s work did not point to a specific number of accumulated practice hours as the key factor impacting expertise. “There’s really nothing magical about 10,000 hours,” explained Ericsson in an interview with Freakonomics Radio (Dubner, 2016). Ten thousand hours simply represents the average number of hours observed for the professionals in that research, with considerable variability on either side of that number. More importantly, Ericsson’s work highlights a large difference in accumulated practice hours across groups of varying expertise, implying that practice is critical for success. Ericsson also highlights that, in order to be impactful, practice must be deliberate, meaning that it is focused on specific aspects of performance in order to accomplish well-defined goals. Deliberate practice should provide frequent feedback and should be mentally demanding. It is the combination of these elements that contributes to success.

In sparking this misconception around the necessity of 10,000 hours, Gladwell set himself up for a similar misinterpretation. Whereas Gladwell used the data point to make the case that becoming an expert requires a large time commitment, and thus external support, his writing has been interpreted as suggesting that in order to be an expert, you need only accumulate 10,000 hours of practice (Gladwell, 2013). In fact, in that very same chapter, he also emphasized that you must possess talent in addition to accumulating practice; his point was that in order to be successful, you must be the recipient of many fortunate circumstances beyond simply possessing talent, including the kind of support that would allow you time to focus on developing your talent.

So now that we have clarified each author's original point, what does the research say? A 2014 meta-analysis provides evidence that, although important in some domains, deliberate practice is only one part of the equation for success (Macnamara, Hambrick, & Oswald, 2014). This work showed that deliberate practice explained the largest amount of variance in performance (24%) when activities were highly predictable (e.g., running; r = .49) and far less variance (4%) when activities were less predictable (e.g., handling an aviation emergency; r = .21). Across domains, deliberate practice was most strongly related to gaming performance and least strongly related to professional performance. Further, methodological factors, such as the operationalization of deliberate practice and performance, acted as additional moderators of the relationship.
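To connect the reported correlations to the variance-explained figures, note that squaring a correlation gives the proportion of variance accounted for (assuming the standard bivariate interpretation of $r^2$), which lines up with the percentages above:

$r^2 = (.49)^2 \approx .24$, or about 24% of the variance, for highly predictable activities

$r^2 = (.21)^2 \approx .04$, or about 4%, for less predictable activities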

A caveat to this work is the study’s operationalization of deliberate practice as accumulated hours of practice, which may or may not represent deliberate practice as Ericsson defines it. Ericsson and colleagues have limited the definition of deliberate practice to activities that meet specific conditions: supervised or guided practice on a clearly defined task, aimed at a clear performance goal, that provides immediate feedback. Unfortunately, meta-analyses are limited to the way variables have been measured in individual studies; to shed light on this issue, an interesting follow-up would be an investigation of the moderating role of practice strategies and characteristics.

Indeed, research has shown that experts employ practice strategies that differ from those employed by individuals with less expertise; experts tend to focus on areas where they need the most improvement and exert more effort as part of their practice (Coughlan, Williams, McRobert, & Ford, 2014). Additional work has pointed to an optimal age at which practice begins and to the importance of genetic factors, such as intelligence and specific abilities. Taken together, it appears that the jury is still out on quantifying the importance of practice and determining exactly when it exerts its impact; however, it’s probably safe to say that more practice is better when it comes to increasing the likelihood of success. As Steve Levitt put it, “if you don’t try hard, no matter how much talent you have, there’s always going to be someone else who has a similar amount of talent who outworks you, and therefore outperforms you” (Dubner, 2016).

Ebbinghaus’s Forgetting Curve

Across many domains in L&D, there is a lot of attention paid to sustaining the impact of training. Couched within many of these discussions is the notion of information decay in memory. Many of us have likely encountered the research of Ebbinghaus at some point, but if you’re fuzzy on the details, here’s a shorthand review: back in 1885, Ebbinghaus published work reporting findings from memorizing lists of nonsense syllables and then testing his own recall, charting how long it took for him to forget items on those lists.

Given that the lists contained syllables devoid of context and meaning, it wasn’t a bad design for testing memory decay. However, the somewhat rapid decay of bits of information with no inherent meaning has been touted by some as proof that the impact of training is short lived at best and that, without retraining, one will be left with employees running around like helpless clods. The resulting chart of memory decay from Ebbinghaus’s experiments is usually presented in the absence of information about his data collection methods. From this, many professionals in corporate training conclude that much of the information presented in training will probably be lost in a matter of days or weeks.
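For readers who want the shape of that chart in symbols, the forgetting curve is commonly approximated with a simple exponential decay; this is a modern simplification rather than Ebbinghaus’s own savings formula:

$R = e^{-t/S}$

where $R$ is the proportion of material retained, $t$ is the time elapsed since learning, and $S$ reflects the stability (strength) of the memory. The larger $S$ is, that is, the more deeply and meaningfully the material was learned, the flatter the curve becomes.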

This ubiquitous rapid decay of any learned material, as we’re sure you already guessed, isn’t quite how things work. Although suggesting that most people have memory spans similar to the proverbial goldfish makes for a nice sales angle for some L&D products, job-relevant information simply doesn’t disappear from long-term memory in the absence of retraining. Many studies have explored retention in long-term memory (e.g., Custers, 2010). For instance, Custers and ten Cate (2011) found that doctors who had been out of school for years could still score up to 75% of what currently enrolled medical students scored. To give an informal example, Schmidt, Peeck, Paas, and van Breukelen (2000) found that people who had moved away could remember up to 60% of the street names from their childhood neighborhood compared to people still living there. Granted, most of this type of long-term retention in the context of education follows a pattern of some forgetting in the space of 6 years, followed by a relatively stable permanent memory that lasts for decades (Conway, Cohen, & Stanhope, 1992).

The counter to this myth is to highlight why employees forget material. What does seem to play a significant role in forgetting, and isn’t far afield from Ebbinghaus’s original conclusions about memory for meaningful material, is the depth of initial learning. In the context of workplace learning, the quality of training and the engagement of learners (or students) appear to govern long-term recall. Armed with a little bit of information about depth of encoding, permastore, and very-long-term memory (though not necessarily using those words, lest it come off as jargon), we can help put fears of forgetting to rest. As I-Os, we need to communicate that the reason to be concerned about the sustainment of learning isn’t that people forget; it’s that people forget when they haven’t been given a good reason to remember.

Summary

We hope you’ve enjoyed this exploration of common myths in L&D. We’d be remiss not to point out that what underlies these “myths” isn’t usually a fundamental warping of the research but a lack of exposure to the nuances of its conclusions. So, rather than knocking others over the head with the mighty hammer of empiricism, our intent here was to let I-Os know that you may encounter these ideas when dealing with some (but certainly not all) L&D professionals. However we view these myths, we have to acknowledge that some in L&D may have made decisions based on belief in them, perhaps not with the best information but certainly with good intentions. That’s worth remembering; our backgrounds may differ, but our goals are still the same.

References

Conway, M. A., Cohen, G., & Stanhope, N. (1992). Very long-term memory for knowledge acquired at school and university. Applied Cognitive Psychology, 6, 467-482.

Coughlan, E. K., Williams, A. M., McRobert, A. P., & Ford, P. R. (2014). How experts practice: A novel test of deliberate practice theory. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(2), 440-458.

Custers, E. J. F. M. (2010). Long-term retention of basic science knowledge: A review study. Advances in Health Sciences Education, 15, 109-128.

Custers, E. J. F. M., & ten Cate, O. T. J. (2011). Very long-term retention of basic science knowledge in doctors after graduation. Medical Education, 45, 422-430.

Dubner, S. J. (April 27, 2016). How to become great at just about anything. Freakonomics Radio. Retrieved from http://freakonomics.com/podcast/peak/

Ebbinghaus, H. (1885). Memory: A contribution to experimental psychology. New York, NY: Dover.

Ericsson, K. A., Krampe, R. T., & Tesch-Romer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363-406.

Gladwell, M. (2008). Outliers: The story of success. New York, NY: Little, Brown and Co.

Gladwell, M. (August 21, 2013). Complexity and the ten-thousand-hour rule. The New Yorker. Retrieved from https://www.newyorker.com/news/sporting-scene/complexity-and-the-ten-thousand-hour-rule

Loewenstein, M. A., & Spletzer, J. R. (1998). Informal training: A review of existing data and some new evidence. US Department of Labor: Bureau of Labor Statistics, Washington, DC. Available at: https://www.bls.gov/osmr/pdf/ec940090.pdf

Macnamara, B. N., Hambrick, D. Z., & Oswald, F. L. (2014). Deliberate practice and performance in music, games, sports, education, and professions: A meta-analysis. Psychological Science, 25(8), 1-11. http://dx.doi.org/10.1177/0956797614535810

McCall, M. W. (2010). Recasting leadership development. Industrial and Organizational Psychology, 3, 3-19.

Rabin, R. (2014). Blended learning for leadership: The CCL approach. Available at: http://www.ccl.org/wp-content/uploads/2015/04/BlendedLearningLeadership.pdf

Schmidt, H. G., Peeck, V. H., Paas, F., & van Breukelen, G. J. P. (2000). Remembering the street names of one’s childhood neighborhood: A study of very long-term retention. Memory, 8(1), 37-49.

Szalavitz, M. (May 20, 2013). 10,000 hours may not make a master after all. Time. Retrieved from http://healthland.time.com/2013/05/20/10000-hours-may-not-make-a-master-after-all/
