Learning About Learning: The Mythical Land of I-O
Tom Whelan and Amy DuVernet, Training Industry, Inc.
In our last column, we talked about some of the misconceptions that I-Os may face when interacting with L&D professionals. As we’ve said repeatedly in this column, both sides have a lot to learn from each other. On the flip side of the coin, there are some I-Os who hold beliefs about workplace learning that don't necessarily jibe with what’s going on in the practitioner space. In this column, we’d like to talk about several “myths” about training that some I-Os may have bought into over time.
L&D Is All About Assessment and Evaluation
This is far from the truth. Although we I-Os tend to focus heavily on using data to inform decisions, that’s not the primary focus of the bulk of L&D activities. There are more pieces of the puzzle to contend with than the narrow avenue of evaluation. Training organizations must be capable of enacting at least eight critical processes: content development, training delivery, technology integration, administrative services, portfolio management, strategic alignment, diagnostics, and reporting and analysis, with the last two encompassing needs assessment and evaluation (Harward & Taylor, 2014). At a minimum, a learning and development organization must handle administration (assigning and tracking training) and delivery before it can even begin to focus on evaluation, and realistically, most of these processes must be executed well before evaluation can come into play. Good content development and strategic alignment between training processes and business goals are critical to ensuring that training is effective. In large organizations, managing learning technologies and the entire portfolio of training courses can require the attention of entire workgroups. In fact, a recent work analysis of training managers revealed that managing technology was the second most frequently performed job responsibility in the role (Training Industry, Inc., 2017).
A similar myth encompasses the idea that all training programs should be evaluated using each of Kirkpatrick’s (1959) four levels of evaluation. The fact is, training organizations typically lack the resources (e.g., funding, time, personnel) needed to evaluate every program at more than one level, let alone all four. Perhaps this is why so many programs default to “smiley sheets” (i.e., Kirkpatrick’s level 1 reactions). The reality is (and dare we utter this sacrilege), it’s probably not necessary to evaluate all programs. Before you send us off to be reeducated on the importance of data-based decision making, hear us out. A training organization operating with a limited budget may not have the resources both to offer all of the critical programs needed and to evaluate each of them every time. Faced with the dilemma of offering multiple learning programs designed to meet a critical need or offering only one but evaluating it well, how should an L&D professional proceed?
The answer, as with all things we encounter, is “it depends.” We think the focus in our field on evaluation and assessment is a natural and important extension of our strengths, but we want to challenge our readers to consider providing more guidance on when to evaluate and at what level. Are there instances where level 1 reactions can answer the most critical evaluative questions? When is it worth the investment to capture data at multiple levels? Finally, consider how the data are used and offer guidance that extends beyond how to measure. For example, Kurt Kraiger and Eric Surface are adding to this conversation with their work on how to communicate value to stakeholders once evaluative data are collected (Kraiger & Surface, 2017, 2018).
L&D Professionals Have Access to Data
Similar to the idea that L&D is all about evaluation, this myth rests on the assumption that L&D professionals have various kinds of data at their disposal and just aren’t using them. Whether it’s data from work analyses, employee performance records, attrition figures, satisfaction surveys, or other sources, one would be hard-pressed to find an L&D professional able to make decisions with the benefit of all of the information a company has on hand.
For example, as we’ve drilled into our graduate students time and time again, work analyses are the foundation of most HR systems; how often do you think those data are shared with L&D teams when collected? The answer, in our experience, is not often enough. Further, we know that L&D solutions can impact more than just performance (e.g., turnover and absenteeism, work-related injuries, employee engagement, team cohesiveness), but are training personnel able to access those data without gathering them themselves? We hear time and time again from those practicing in the L&D field that they struggle to identify sources of such data and, when they do, they face challenges convincing others to share them.
So, what’s with the data hoarding? We think a number of concerns drive it, including the need to protect employee privacy and the possibility that data may be misused, misinterpreted, or put to purposes beyond those for which they were originally collected. There are also hurdles related to the labor involved in gathering, merging, and cleaning such data, a task that typically falls to the group holding the data rather than the requesting L&D team. Given these complex issues, what can we as I-Os do to help?
We’re in a unique position to advocate for data sharing and to facilitate training professionals’ requests because we understand that data-driven decisions are far more likely to result in positive organizational outcomes. By working closely with L&D teams, we can help them identify and better understand the kinds of data they may need, the format in which they will likely want to receive them, and the processes they would need to put in place to use them. Those of us working in talent analytics functions can also partner with L&D to provide those kinds of insights and advocate for greater use of data in L&D processes.
Training and Education Are the Same Thing
What’s the difference between “education” and “training”? Although your answer may go in many directions, we feel the key distinction lies in the overarching objective of the learning. Loosely defined, education is concerned with teaching the theory and background of a concept; the goal isn’t necessarily for learners to gain strategies for practical application but to be exposed to information that increases their understanding. Training, on the other hand, seeks to impart specific skills or job-relevant knowledge to a learner, with the aim that the new knowledge or skills acquired in training can be applied back on the job. The former is typically wider in scope than the latter, but both are important to building productive competencies.
If you’re thinking, “huh?” at the preceding paragraph, let’s draw out this distinction another way using a stats example. (Because we all love those, right?) Let’s say you’re very well versed in how a correlation works: what the formula is, the assumptions, matters of statistical power, how to interpret the correlation coefficient, and so on. You’re well educated about correlations but might be poorly trained in how to apply this knowledge to a real-world problem. On the other hand, you might be quite adept at running a correlation in any software package, generating a scatterplot, working with outliers in the data, and so on, but when explaining the results to company stakeholders run afoul of the golden rule of “correlation does not imply causation.” You’re well trained in how to use a correlation but poorly educated about its underpinnings.
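To make the “well trained” half of that analogy concrete, here is a minimal sketch of what that hands-on workflow might look like; the simulated data, variable names, and the |z| > 3 outlier cutoff are our own illustrative assumptions, not anything prescribed above.

```python
# Illustrative only: compute a correlation, plot it, and flag outliers.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(42)
hours_trained = rng.normal(20, 5, 100)                        # hypothetical predictor
performance = 2.0 * hours_trained + rng.normal(0, 10, 100)    # hypothetical outcome

# Pearson correlation coefficient and its p-value
r, p = stats.pearsonr(hours_trained, performance)
print(f"r = {r:.2f}, p = {p:.3f}")

# Scatterplot to eyeball the relationship
plt.scatter(hours_trained, performance)
plt.xlabel("Hours trained")
plt.ylabel("Performance score")
plt.title(f"r = {r:.2f}")
plt.show()

# Flag potential outliers (|z| > 3 on either variable) before interpreting r
z = np.abs(stats.zscore(np.column_stack([hours_trained, performance])))
outliers = np.where((z > 3).any(axis=1))[0]
print("Potential outliers at indices:", outliers)
```

Being able to run something like this end to end is the “training” side of the analogy; knowing when r is a defensible summary of the relationship, and remembering that it says nothing about causation, is the “education” side.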
Why bother to draw this line in the sand? Some people may think of training as “learning things directly related to work” and education as “learning things that may or may not have to do with work,” with little else to distinguish the two. Training, obviously, is much more consequential to an individual employee than education. Why is that? Here’s a partial list of the impacts that training can have:
- Training outcomes can be used to inform promotions, layoffs, or other job actions;
- Training can be a critical piece of meeting regulatory and compliance requirements;
- Training upskills employees on new pieces of software and hardware and familiarizes them with new processes;
- Training is the best vehicle for onboarding new employees to the culture and policies of an organization;
- Training can be used to provide knowledge and skills at the point of need (sometimes called “just in time” training) when a job demands it;
- In many organizations, training is a key component of strategy, not just for HR, but at the enterprise level, where it can even serve as a competitive differentiator in a market.
Does education have those impacts? We’d argue no, or at least not remotely to the same extent. We hire people based on their education and experience, whereas training fills in the gaps in knowledge and skills, and those gaps can and will differ across jobs and industries. Another way of thinking about this is whether your graduate education in I-O prepared you with everything you needed to know for a job in academia or practice. Did you already have the people skills to behave like a professional across all the situations you’d face? At hire, did you understand how to navigate the processes and systems you’d interact with in your job? Were you prepared to handle the shifting social dynamics and demands on your time? To varying degrees, some of us did, but it’s certainly not true of everyone in the I-O community. We had the education, but we needed training: we had to learn more, and what we needed to learn was patently job relevant. This is precisely the challenge with which all L&D functions grapple. So, although training and education both involve learning, the stakes for training (and for bad training) have consequences at the employee and organizational levels in a way that education usually does not. Conflating the two may seem harmless (and might be in some situations), but if their repercussions were the same, there wouldn’t be two distinct industries that focus on learning and on both personal and professional development.
Conclusion
The point is that although L&D professionals can learn a lot from our field, we also have a lot to learn about learning. We’ve discussed the relatively limited focus on training in the I-O field in previous columns (e.g., Whelan & DuVernet, 2017). Whereas I-O practitioners are well represented in selection, change management, and other areas, L&D is a critical part of organizational functioning that deserves more attention from the I-O community. You can hire the best people, based on complex statistical models and validated preemployment tests, but nobody comes into a job knowing everything they have to learn.
We originally conceived of this column as a stimulus for “more thought and focus on the ways in which I-Os can contribute to the L&D space, with the ultimate goal of observing [an increased] number of I-O articles, research studies, and collaborative applied projects related to training” (DuVernet & Whelan, 2016). Unfortunately, if the focal content areas of our annual conference sessions are any indication, there’s still much work to be done. With 11 sessions listing training as a primary focus area, this year’s conference features five fewer training-related sessions than last year and continues a downward trend over time (see Figure 1 and the SIOP Program Explorer 2008-2016 for more information).
Figure 1. SIOP Conference Sessions Listing Training as Primary Content Area Over Time
Thus, as we close our tenure as columnists for TIP and reflect on the L&D journey we’ve traveled, we know that we’ve just scratched the surface of this need. We’ve provided information about L&D roles and the ways they are organized within companies; we’ve highlighted trends in training from both the I-O and L&D perspectives; we’ve hopefully busted a few myths without annoying too many people. Most importantly, we hope we’ve inspired you to learn more about learning (and development) to continue down the road toward more cohesiveness across our two complementary fields. Although we’re hanging up our hats, we want to stress how important it is for you to consider how you fit in this puzzle and how you might apply your skill set to contribute to L&D. As stated by Gary Latham (2009):
It is we academics/scientists who provide the theory and empirical data that enables we practitioners to differentiate ourselves in the marketplace from, and make ourselves invaluable to, decision makers in the public and private sectors.
References
DuVernet, A. M., & Whelan, T. J. (2016, July). Learning about learning: Defining the role of I-Os in L&D. The Industrial-Organizational Psychologist. Retrieved from http://www.siop.org/tip/july16/learn.aspx
Harward, D., & Taylor, K. (2014). What makes a great training organization? Upper Saddle River, NJ: Pearson FT Press.
Kirkpatrick, D. L. (1959). Techniques for evaluating training programs. Journal of the American Society of Training Directors, 13, 21-26.
Kraiger, K., & Surface, E. A. (2017, November/December). Beyond levels: Building value using learning and development data. Training Industry Magazine, 21-23.
Kraiger, K., & Surface, E. A. (2018, April). Beyond evaluation levels: Building value using learning and development data. Preconference workshop to be held at the 33rd annual conference of the Society for Industrial and Organizational Psychology, Chicago.
Latham, G. (2009, January). A message from your president: Bridging the scientist–practitioner gap. The Industrial-Organizational Psychologist. Retrieved from http://www.siop.org/tip/jan09/01latham.aspx
SIOP Program Explorer 2008-2016. Retrieved from http://keshif.me/gist/?e762f88916a1c371513706d12aada92b
Training Industry, Inc. (2017). Training Manager Competency Model™. Retrieved from https://trainingindustry.com/continuing-professional-development/training-manager-competency-model/
Whelan, T. J., & DuVernet, A. M. (2017). Learning about learning: Trends in workplace training 2. The Industrial-Organizational Psychologist. Retrieved from http://www.siop.org/tip/july17/learn.aspx