

Volume 54, Number 3, January 2017. Editor: Tara Behrend


Lost in Translation: Overcoming Critics and Barriers in Applied Organizational Research

Michael Litano and Andrew Collmus

The scientific study of people is complicated. Although human behavior is astoundingly predictable, scientific disciplines are often classified into a “hard” and “soft” dichotomy based on perceptions of a field’s methodological rigor and exactitude. Unfortunately, this artificial categorization rarely takes into account the complexity of what we study. Individuals vary greatly on a number of factors, and those differences influence their behaviors. However, there are also countless environmental factors that influence human behavior, and any number of these factors can interact and modify the behaviors we might otherwise expect. These individual differences and their interactions with the social environment are what make the study of people so interesting but also extremely difficult. This complexity affects our ability to accurately and reliably measure unobservable constructs and influences the extent to which we are able to unobtrusively conduct research in naturalistic (work) settings.

In our experience, communicating the value of studying people can be as complex as conducting the research itself. Psychology is unique in that our “subjects” are, as Dr. Scott Tannenbaum quips, “day-to-day psychologists,” which leads them to dismiss research findings that align with their experiences as “common sense” and to be skeptical of personally counterintuitive findings. This phenomenon extends to the workplace and impacts our ability to effectively communicate the value of our research.

I-O psychologists use a variety of data collection methods to gather as much information as possible on the unobservable phenomena influencing employee behavior, including surveys, interviews, focus groups, and direct observation. However, each of these research methods comes with its own set of critics. Having interacted with skeptics of survey research and managers cynical of qualitative studies in our own applied work, we turned to experienced I-O psychologists to better understand common obstacles in I-O research methods and to gather advice on how to overcome critics through effective translation.

Given the extensive effort involved in arranging interviews and synthesizing data, we combined the data collection efforts for our second and third columns. As a result, 20 of the 25 I-O psychologists (11 men, 9 women) interviewed for Column 2 also served as respondents for the current column. Our sample consisted of 11 practitioners, 3 applied researchers, 2 university professors, 1 postdoctoral researcher, and 3 graduate students who were employed as interns with external consulting companies.

Of the 20 respondents interviewed for this column, only three reported never having faced a critic or skeptic of I-O research methods. Interestingly, two of these three respondents were graduate students who were also employed as interns. The third respondent was an early-career professional who reported working primarily with a team of I-O psychologists. It seems possible that at these early stages of their careers, these participants lacked the level of exposure to unfamiliar or hostile audiences that more tenured I-O professionals have experienced. The majority of the remaining 17 respondents reported experiencing pushback or concerns regarding I-O research methods from colleagues and/or coworkers with different disciplinary backgrounds (76.5%), clients (58.8%), and upper management or senior leadership (58.8%). Unlike the authors’ own experiences, only three respondents reported encountering any resistance from the participants of a research study. Unsurprisingly, respondents generally reported that individuals with greater exposure to I-O psychology (immediate supervisors, project managers, team members, subordinates) tended to be less critical of I-O research methods.

The respondents then identified specific concerns they have faced regarding their use of I-O research methods and provided advice on how to best overcome concerns with the use of survey-based research methods, interviews and focus groups, direct observation, and quasi-experiments (including training interventions). Given that much of the advice the I-O professionals provided was generalizable across research methods, we first present the common concerns associated with each method and follow with advice on how to overcome resistance from unfamiliar audiences.

Surveys and questionnaires are commonly used in I-O psychology because of the method’s administration flexibility, quantitative output, generalizability to the larger population, and cost-effectiveness. However, respondents overwhelmingly reported that survey-based research faces the most resistance and criticism (n = 14). The most common concerns I-O psychologists reported with respect to survey-based research related to (a) the quality of information that surveys or questionnaires provide, (b) management’s concerns with faking and/or socially desirable responding, (c) concerns about the measurement instrument, and (d) survey fatigue.

Some employees and managers are skeptical that meaningful information can be gathered from surveys. I have heard concerns that people generally lie on surveys or that because people don't respond to surveys we aren’t capturing the full picture, or that some other piece of information is probably more reliable than self-reports.

A [senior leader] was concerned that survey data is unreliable because employees are likely only motivated to respond in ways that make themselves look better.

What’s wrong with a one-item scale?

People do not exist on the response scales that you create.

Occasionally, I do run into an organization that has the reaction of, “another survey, oh my goodness, our people are surveyed out.”

We often run into situations where [organizational leadership] says, “well, we’d love to get that data, but we also got our big engagement survey going out at the same time,” or “people have survey fatigue,” or “we’re just surveyed out.” So, in some ways I think too much data collection has been challenging.

Whereas surveys provide researchers with quantitative data that can generalize to the population when the sample is representative, interviews, focus groups, and subject matter expert (SME) panels tend to provide more contextualized information and can be used to develop a deeper understanding of job requirements or underlying motivations and opinions. Although respondents reported that these methods face less criticism than surveys, eight interviewees described experiences in which they faced some form of resistance from management, employees, or clients. Most concerns related to (a) interruptions to business, (b) the meaningfulness and quality of the data, or (c) overreliance on intuition or “gut feelings.”

Management is generally concerned that interviews take too much time, cause business interruption, and are difficult to schedule.

I have encountered people who believe that “gut feelings” and creative interview questions are better tools for employee selection than structured interviews. For focus groups, some have said that no one will provide honest feedback.

Depending on what type of audience you are presenting to, qualitative data can sometimes be viewed as “soft” data. Qualitative data can be perceived as less rigorous than quantitative data.

Participants sometimes believe that their opinions do not matter very much.

An important part of understanding both the task requirements and human attributes of a job is directly observing incumbents. Only three participants from our sample specifically described encountering stakeholder concerns with this research method, so no general themes emerged.

For many jobs I am tasked to directly observe, I often run into time constraints. An hour does not provide adequate insight into what the job actually requires. There also tend to be concerns over data privacy.

Hawthorne effect—people act differently when they know they are being observed.

Limited sample sizes and observations do not provide enough information.

When well-designed and carefully controlled, training interventions or quasi-experimental methods are powerful processes through which I-O psychologists can facilitate employees’ acquisition of knowledge, skills, or attitudes. Only five I-O professionals described situations in which stakeholders (a) did not understand the research design or questioned inferences of causality, (b) attempted to influence the content of the intervention, or (c) were resistant to a quantitative evaluation of the training.

If there’s no random assignment, how can we demonstrate causality?

The big issue I hear about interventions is that if we think it's effective, we should either roll it out to everyone or let everyone who wants to participate in the pilot join in, even if they join at different stages and put their own spin on the intervention. Senior leaders believe they know what is best for their groups and are used to making a lot of the big decisions.

The hardest part about communicating the value of training is sometimes emphasizing the importance of its evaluation. I consulted for a company whose HR team had conducted a “creativity” training class for 10+ years, and when I asked what metrics they used to evaluate the intervention’s effectiveness, I was told, “people come up to me in the halls and tell me how much they enjoy the class and how much more creative they are afterwards.” In some ways, I felt that the HR team might be resistant to a quantitative evaluation of training outcomes because they were scared to find that it might not be as effective as they thought.

One non-I-O colleague said, “There are too many other factors at play to say with confidence that this particular training was the root cause of the intended effect.”

Finally, some of our interviewees identified general concerns and criticisms they have faced that were not directly related to one of the previously described I-O research methods. Rather, these responses all link back to the complexities of what we study and to whom we report our findings: people.

I think the criticism is typically less about the methods. If I’m talking to people who don’t have an I-O background, sometimes the thing we have to overcome is that everyone feels like they are a psychologist. It’s not dissimilar to marketing. Everyone kind of feels like, “Oh, I know a little bit about marketing.” But there’s people who have deep expertise in marketing. So at one level, humans are all day-to-day psychologists, we’re all trying to figure out how people think. So the question is, “What new [information] do you bring?”

I have experienced various stakeholders express concerns such as, “You really can’t predict people's behavior,” “People don't actually change,” “We've been doing this for 20 years and it works fine,” and, “I see you're the flavor of the month.”

The most common criticism that I have had to deal with in my career is when I encounter an employee who does not believe a psychologist is the appropriate person to make decisions related to another job or occupational field. They say something like, “How can a psychologist tell me more about my job than I can?”

When we report findings, people are sometimes concerned with our ability to generalize to the larger population of the organization—especially when our sample size is small. In these cases, they are concerned that only employees at the extremes are responding.

Although these concerns may intimidate the early-career professional or the I-O psychologist who has yet to master the art of effectively communicating the value of I-O research methods, we were fortunate enough to acquire a wealth of practical advice from our interviewees that we hope you will use as a resource to help increase your effectiveness as a “translator.” We have grouped the practical recommendations into five general themes:

1. Educate the stakeholders

Just trying to help them understand that there is a deep science, there is some knowledge about human dynamics [at work]... so we’ve got to be able to provide them with examples of how an understanding of human dynamics in a work setting is more than just common sense, and you know, in our field we at times uncover research that’s contrary to what one might think is common sense.

Oftentimes sharing information and past research about why the I-O approach is better seals the deal. The key is to present our compelling data in a friendly format so that it is easy to understand.

Explain to them that certain constructs cannot be assessed using objective data, and survey data is (a lot of times) the best option we have. Also, there are strategies to assess socially desirable responding, careless responding, and other issues that are presented by self-report measures.

The optimal way is to include all the stakeholders from the beginning and educate them while listening to their perspective. Also, you can often learn how to communicate what we do in terms of their existing mental model or literature.
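One respondent above noted that there are strategies to assess careless and socially desirable responding. As one concrete illustration, here is a minimal Python sketch (our own, not a procedure any respondent described) of the “longstring” index, a common careless-responding screen that flags people who give the same answer many items in a row; the data frame and the cutoff of 15 items are hypothetical.

```python
import pandas as pd

def longstring(responses: pd.DataFrame) -> pd.Series:
    """Longest run of identical consecutive item responses per respondent (row)."""
    def max_run(row: pd.Series) -> int:
        values = row.tolist()
        best = run = 1
        for previous, current in zip(values, values[1:]):
            run = run + 1 if current == previous else 1
            best = max(best, run)
        return best
    return responses.apply(max_run, axis=1)

# Hypothetical usage: on a 20-item survey, treat a run of 15+ identical
# answers as a signal worth a closer look, not proof of careless responding.
survey = pd.DataFrame([[3] * 20, [1, 2, 4, 2, 3] * 4])  # two fake respondents
suspects = longstring(survey) >= 15  # flags respondent 0 only
```

In practice, indices like this are screening aids; flagged cases deserve review rather than automatic deletion.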

2. Build trust with all parties involved

One of the biggest things is just building trust. Oftentimes, companies are worried about bringing somebody in and having the employees trust that the data are going to be used in the ways they should be used.

Sometimes clients [are resistant] because they have seen I-O psychologists in their organization before and feel like they did little to solve whatever they were there to solve. You have to essentially try to rewin the trust of the client. They aren't necessarily mad at you or critical of the science. They thought their time and voice were going to be used somehow to improve the organization. When this didn't happen, their trust was violated. Allow these stakeholders to express their concerns. Giving them a voice and empathizing with them is sometimes all you need to do. It's also a chance to learn more about the organization.

One of the things that I really try to emphasize when I’m talking with clients and with employees of these clients is that we really see ourselves as both top down and bottom up. So working with management to implement solutions that are going to help get the work done, but our own personal ethical calling is that we’re there to watch out for the employees, too.

3. Understand the stakeholder’s perspective

It takes some creativity to overcome the concerns of clients who resist because they don't value our science. Ask them questions about what makes an organization great. Then inject how I-O accomplishes those things. Provide hypothetical examples of a world without our field. Become a pro with analogies and use real-world examples whenever possible.

Always ask questions to understand where they're coming from. Is it a general world view, a specific experience, or a negative reaction for some other reason? Then, ask about their experiences and show how they have used I-O techniques already in what they do.

I totally get that [concern about survey fatigue]. What I like to say is, from a diagnostic perspective: I don’t want to have to ask one more question, whether it’s an interview or survey question, than I absolutely have to. I don’t want to impose one more minute than I have to. But there are times when we need to do some diagnosis, and we’ve gotta be able to sell them to say, it’s targeted, it’s focused, there’s a reason for it, and it’s going to add value.

I find the most compelling arguments come from a place where I show that I'm on their side and want to help them meet their goals and make their lives easier…show them how you're helping them out. Showing that I understand where they are coming from and that I understand their skepticism can be a really helpful tool in these discussions.

I try to convey whatever related research supports my argument and try to show that I understand their skepticism. I may say that I understand a structured interview may feel stiff, and you may feel that you aren't getting the information that you want out of it, but that suggests we should revisit the items to ensure they ask about KSAOs needed for the job rather than do away with interview guides.

It starts by putting yourself into the employees’ shoes. People resist for different reasons…and until we understand the source of resistance it’s very difficult to be able to come up with a good change management plan.

4. Illustrate business value and use business terms

Illustrate business value using metrics or outcomes business leaders care about. Identify these up front so you can come back and show how you’ve impacted them. Always show your ROI for a project and provide expected outcomes at the outset.

Demonstrate the potential value of your solution through the use of a utility analysis.

Be sure you’re presenting in terms business leaders understand. You want your audience to feel engaged and comfortable with the material you are presenting. If they don’t understand it, they likely are not going to be supportive of or receptive to it.

For business audiences, it’s really about framing the question we are trying to answer, and if you come at it from the business question angle, it can help inoculate against some of the challenges with getting the data. I think if you make a logical case for what the question is you’re trying to answer, and what are the types of data sources that are needed to answer it, you can work backwards into it.
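For readers unfamiliar with the utility analysis one respondent mentioned, the most widely cited version in the selection literature is the Brogden-Cronbach-Gleser model, which translates the validity of a selection procedure into an expected dollar gain. As a textbook formula (not one a respondent supplied), it can be written as

\[
\Delta U = N \, T \, r_{xy} \, SD_y \, \bar{z}_x - C,
\]

where N is the number of people selected, T is their average tenure in years, r_xy is the validity of the predictor, SD_y is the standard deviation of job performance in dollars, z̄_x is the mean standardized predictor score of those selected, and C is the total cost of the selection program. With hypothetical but conservative inputs (10 hires staying 5 years, a validity of .30, a performance SD of $20,000, and a mean selected score of .80), the expected gain is roughly $240,000 before costs, which is exactly the kind of number business leaders care about.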

5. Be creative and resourceful

For self-report, I try to find at least one variable that only makes sense as a self-report and use that as an example. I usually go with perceived support. Who else is going to know better than you whether you feel supported at work by your coworker or supervisor?

In instances where clients don't believe we can help them because we aren't experts in their field, I try to emphasize collaboration. Remind them that they are also an expert. For instance, with training development, I would remind clients that they are the expert in the content area and I am the expert in training. We need each other to create a great product.

In some cases we might be able to integrate in with other data collections that are being gathered, sometimes that’s a way to add a couple questions to an existing type of survey…and obviously if there are ways to capture that data aside from surveys, I think that’s beneficial…for us to make use of data that organizations already have and make sure we’re getting as much from that before initiating a new research effort…can reduce the strain placed on employees and managers and those participating in the research.

Summary

In this column, I-O professionals identified common concerns with applied organizational research and provided advice for overcoming these concerns through effective translation. The experts’ advice generally fell under five categories: (a) educate the stakeholders, (b) build trust with all parties, (c) understand the stakeholders’ perspective, (d) illustrate business value and use business terms, and (e) be creative and resourceful.

Thank you to our contributors and readers, and remember to refer to our handy guide when preparing to present to an audience that may not be familiar with I-O research methods!
