Mobilizing Knowledge: Our Personal Outreach Approach and Computer Application

Stephen D. Risavy, Wilfrid Laurier University; Chet Robie, Wilfrid Laurier University; Peter A. Fisher, Toronto Metropolitan University; & Sabah Rasheed, Wilfrid Laurier University

The purpose of this article is to provide the industrial and organizational (I-O) psychology community with a report on a novel, noncommercial (open access) knowledge mobilization initiative that has the potential to help address a fundamental issue in I-O psychology: the ongoing (and increasing!) research–practice gap. Our initial knowledge mobilization approach used a personal, high-quality method for communicating the science of employee selection and hiring to practitioners who can implement that science for the benefit of their organizations. Subsequently, we created an evidence-based computer application (i.e., app) that provides customized selection and hiring process feedback. The webpage for our app is available here: https://lazaridisinstitute.wlu.ca/resources/selection-tool.html

Consistent with the vision for The Industrial-Organizational Psychologist (TIP; Sanders, 2022), our knowledge mobilization efforts also relate to important diversity, equity, and inclusion (DEI) issues. For example, the organizations that have benefited from our efforts to date have often been smaller tech organizations, but our approach can be extended to other small organizations, as well as to organizations that are marginalized-owned and/or nonprofit. Our initial outreach efforts led us to create an evidence-based consulting app that provides a high-value service to organizations that might not otherwise have the budget for procuring management consulting advice. Thus, it is possible that more widely disseminating our app, and encouraging others to engage in knowledge mobilization initiatives similar to ours, can help foster a more diverse and inclusive culture in the practice and science of I-O psychology. It is also worth noting that this was not initially a DEI-forward article; however, after reflecting on the current vision for TIP (Sanders, 2022), we can certainly see how our knowledge mobilization efforts relate to important DEI considerations in disseminating evidence-based practices.

In this article, we present our inspiration for our knowledge mobilization efforts, introduce our personal outreach approach, discuss the evolution of our initial approach into an interactive, computer-based Shiny app, and provide our reflections on these knowledge mobilization initiatives. Overall, our personal outreach approach and app provide I-O researchers with additional mechanisms that can be used to mobilize and communicate our science and hopefully make the world of work a better place for everyone.

Inspiration

One area where I-O science can clearly do great things for organizations is personnel selection, an area that has reached the “gold standard” of having many strong and consistent research findings (e.g., as summarized in Schmidt and Hunter [1998] and more recently in Sackett et al. [in press]). Curiously, this area of I-O research has the widest research–practice gap (Rynes et al., 2002), and the gap appears to be growing (Fisher et al., 2021). This is alarming, as small and large organizations alike would benefit from using selection tools with demonstrated validity for predicting job success. Furthermore, using standardized, evidence-based selection practices may help organizations avoid claims of discriminatory hiring. It is also concerning that some of the organizations that would benefit most from the cost savings realized through reliable and valid selection methods (e.g., marginalized-owned, small, and/or nonprofit organizations) are likely to have the least access to advice regarding best-practice selection methods and, thus, may experience even wider research–practice gaps than large, resource-rich organizations.

Many best practices in selection are accessible for organizations to adopt. For example, it has long been understood that fully structured interviews are more valid than unstructured interviews (e.g., McDaniel et al., 1994). However, many interviews remain informal, unstructured, or semistructured. How common do you think it is for organizations to ask typical, unstructured interview questions, such as “Tell me about yourself,” “What are your strengths and weaknesses?” and “Where do you see yourself 5 years from now?” Even organizations as renowned as Google at one time infamously asked applicants off-the-wall interview questions, such as “A man pushed his car to a hotel and lost his fortune. What happened?” and “Why are manhole covers round?” (Carlson, 2010). Clearly, there is room for improvement in organizations’ interview protocols. Following a few easily implementable guidelines would be highly beneficial: ask all candidates for a given job the same questions, ensure that all questions are directly related to the job (bonus points for basing them on a job analysis!), and use a standardized rubric for evaluating each interview question response, as sketched below.
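
As an illustration, consider the following minimal R sketch (with hypothetical questions and anchors, not taken from our study materials) of what a standardized structured-interview rubric can look like: every candidate receives the same job-related questions, and every response is rated against the same behavioral anchors.

# Hypothetical structured-interview rubric: the same job-related
# questions and the same anchored rating scale for every candidate.
rubric <- data.frame(
  question = c(
    "Describe a time you resolved a disagreement with a coworker.",
    "Tell me about a project you delivered under a tight deadline."
  ),
  competency = c("Teamwork", "Time management"),
  stringsAsFactors = FALSE
)

# Common 1-5 anchored rating scale applied identically to each response
anchors <- c(
  "1" = "No relevant example; vague or off-topic response",
  "3" = "Relevant example, but actions or outcomes lack detail",
  "5" = "Specific example with clear actions and a measurable outcome"
)

# Averaging per-question ratings yields one comparable score per candidate
score_candidate <- function(ratings) mean(ratings)
score_candidate(c(4, 5))  # returns 4.5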

As a second example, many organizations continue to rely on the collection and evaluation of resumes and cover letters even though decisions based on these documents can lead to discrimination (e.g., He & Kang, 2021) and have low criterion-related (predictive) validity. One of our recent open access publications contains practical advice for a more valid, more cost-effective, and less biased initial applicant screening method that organizations can use: the general application form (Risavy et al., 2022).

Our Personal Outreach Approach

The TIP readership does not need to be convinced about the impact that the science of I-O can have on workplaces; however, we feel that more I-O researchers should seek to connect with practitioners who can use their findings to improve workplaces. After conducting our own research on the research–practice gap (e.g., Fisher et al., 2021), we were disheartened by the ongoing gap, but after some reflection and discussion we began to feel motivated to try to do something about this fundamental issue in I-O psychology. However, in order to begin moving away from simply documenting (and lamenting!) the research–practice gap, we first had several questions: How do we reach practitioners who can use our science? What do we say? Will they respond to us or care about our science? We decided to answer these questions by designing an approach for bringing our science to a practitioner audience in which we actively sought a group who we believed could greatly benefit (cf. Lewis & Wai, 2021).

Humans are social animals, innately wired for personal connections, and all models of organizational change implicitly situate the personal connection of the change agent to the organization as a central feature (cf. Burke, 2018). Thus, we believed that making personal connections would allow us to earn trust and increase practitioner willingness to hear and implement our employee hiring and personnel selection advice. Our first step was to compile a list of potential contacts. Our search criteria included tech organizations with a local headquarters that were smaller in size (i.e., 2–99 employees). We conjectured that selection best practices would be highly beneficial to smaller organizations, which likely lack formal HR training and support and perhaps the resources to learn and implement evidence-based practices. The tech industry was especially interesting for us to focus on because our home academic institutions have an emphasis on, and strong connections to, the local tech industry, which is one of North America’s most prominent tech hubs. We also chose this focus because there seems to be a preponderance of questionable selection advice shared in this industry; a prime example is the aforementioned Google interview questions, which gained widespread notoriety and likely influenced other tech organizations’ interview protocols. Indeed, one of our recent studies that contained a sample of tech organizations supported this assertion (Risavy et al., 2021).

We used publicly available sources (e.g., Communitech’s [a local Canadian tech hub] member list) to identify organizations that met our search criteria and then used LinkedIn to find the name of a contact person at each organization. We prioritized contacting a decision maker, which was usually a president/CEO or senior-level manager in these smaller tech organizations. We then sent a LinkedIn connection request to each of the 100 contacts that we found. Fifty-one of these 100 individuals either accepted our request or were already LinkedIn contacts of the lead author. Making this LinkedIn connection allowed us to contact individuals directly. In this direct contact, we requested a short, 5–10 minute call to discuss an opportunity for them to receive evidence-based hiring advice from us (see the Appendix, Message #1 Template).

Of the 51 contacts, 16 (31.37%) responded to our message and 9 (17.65%) agreed to have a more in-depth conversation about their hiring practices. With these nine organizations, we started with one-on-one phone or virtual meetings that lasted approximately 20 minutes. During these meetings, we asked contacts about the employee selection procedures their organization was using. We then used their responses to provide them with a summary of evidence-based feedback about their organization’s hiring practices (see the Appendix, Message #2 Template). For example, if they indicated that they were using unstructured interviews (which we defined for our sample as interviews where the interviewer asks a variety of questions of their choice, where questions may be adapted to a particular applicant, and that may consist of an informal conversation), we provided the following feedback for that selection tool:

Interviews that are structured (i.e., the interviewer examines applicants using a prepared set of questions concerning the applicant’s past behavior in a variety of situations and asks the same questions of all applicants) have been found to better predict future job performance compared with unstructured or mixed/semistructured interviews. Research has found that interviewers often overestimate their ability to predict future job performance based on unstructured interviews/informal conversations. Use of unstructured interviews can increase the possibility of discussing non-job-related and potentially illegal information (e.g., age, ethnicity, family status). Be sure to state to interviewees at the outset of the interview that you will not ask any questions related to protected grounds such as age, disability, sexual orientation, gender identity/expression, and family/marital status, and that you would appreciate it if they did not disclose any of that information during the hiring process.

Finally, we followed up with our contacts after 3 weeks to ask whether they had any questions about the feedback or how to implement our advice (see the Appendix, Message #3 Template). One limitation of our approach is that feedback like the example above does not provide specific implementation or action steps; however, some of the feedback in our app is more self-contained. For example, part of the feedback provided when we recommend application forms over resumes includes a link to the Ontario Human Rights Commission’s (OHRC; 2008) application form template, which companies can use to create an application form or revise their existing one. Regardless, we chose to write the feedback for our app in a way that allows for quick, accessible, and approachable science-based advice while maintaining an appropriate level of depth, generalizability, and time investment for this pro bono undertaking (to be clear, this initiative is not intended to drive paid consulting engagements).

Our Computer Application

After being energized by our personal outreach initiative and seeing that there was an appetite for feedback on hiring practices, we decided to automate this process to save time for us and the organizations we would like to help while expanding the number and types of organizations we could potentially reach. For example, automating our process allows us to reach a broader range of organizations beyond smaller tech organizations and helps make our evidence-based best practices accessible to other organizations that can benefit from this information (e.g., marginalized-owned, nonlocal/nontech small, and/or nonprofit organizations). Thus, we began building an interactive app with the Shiny package in R (https://shiny.rstudio.com/).
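
For readers unfamiliar with Shiny, the following minimal sketch (with illustrative question wording and feedback; our actual source code is linked in the Reflections section) shows the general structure of such an app: a user interface that collects responses and a server function that reacts to them.

# A minimal, illustrative Shiny skeleton; not our production app's code.
library(shiny)

ui <- fluidPage(
  titlePanel("Employee Selection Feedback (illustrative)"),
  checkboxGroupInput(
    "tools", "Which tools does your hiring process use?",
    choices = c("Unstructured interviews", "Structured interviews",
                "Resumes/cover letters", "Application forms")
  ),
  actionButton("submit", "Get feedback"),
  verbatimTextOutput("feedback")
)

server <- function(input, output) {
  # Feedback recomputes only when the user clicks the submit button
  advice <- eventReactive(input$submit, {
    if ("Unstructured interviews" %in% input$tools) {
      "Consider fully structured interviews: asking all applicants the same job-related questions better predicts job performance."
    } else {
      "Thank you! Customized feedback would appear here for each selected tool."
    }
  })
  output$feedback <- renderText(advice())
}

shinyApp(ui = ui, server = server)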

The app (available directly through this link: https://employee-selection-assistant.shinyapps.io/app-1/) asks organizations to report the selection tools used in their hiring process and then, once their responses have been submitted, instantaneously provides customized feedback along with a score reflecting the efficacy of the tools they are using. Essentially, our app asks the same questions and provides the same feedback as our personal outreach approach. Furthermore, we maintain the personal aspect of our initial outreach approach, as respondents are invited to follow up with the research team for answers to their questions, to receive further information, or for help with implementing the recommendations (again, free of charge). We have received ongoing ethics approval from the lead author’s institution to collect data via this app.
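
The app’s internal scoring rules are not reproduced here, but the sketch below illustrates, using assumed efficacy weights and abbreviated feedback text, how reported tools can be mapped to customized per-tool feedback plus a single summary score.

# Assumed weights and abbreviated feedback for illustration only;
# these are not the app's actual scoring rules.
tool_info <- data.frame(
  tool = c("Unstructured interviews", "Structured interviews",
           "Resumes/cover letters", "Application forms"),
  efficacy = c(1, 5, 2, 4),  # hypothetical 1 (weak) to 5 (strong) weights
  feedback = c(
    "Structured interviews better predict future job performance.",
    "Fully structured interviews are an evidence-based best practice.",
    "Resume screening can introduce bias; consider an application form.",
    "Standardized application forms support fairer, more valid screening."
  ),
  stringsAsFactors = FALSE
)

# Map an organization's reported tools to per-tool feedback and an
# overall score (here, the mean efficacy of the tools in use)
give_feedback <- function(tools_used) {
  rows <- tool_info[tool_info$tool %in% tools_used, ]
  list(score = round(mean(rows$efficacy), 1), feedback = rows$feedback)
}

give_feedback(c("Unstructured interviews", "Resumes/cover letters"))
# Returns a score of 1.5 and advice to adopt structured interviews
# and application forms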

Prior to writing this article, we also launched the app by speaking with a group of tech entrepreneurs at an event coordinated by a specialized institute affiliated with our university. From this event, we received 38 responses through our app, engaged in some excellent dialogue with event attendees, and received appreciative follow-up messages as well as interesting questions. It is our hope that this TIP article will continue to expand the reach of this knowledge mobilization effort.

Reflections

Although I-O psychologists may think that communicating their science to the public is challenging and time consuming, we found our personal outreach approach to be effective and efficient, and our app has further enhanced the effectiveness and efficiency of our knowledge mobilization efforts. We are also hopeful that this TIP article, along with the templates in the Appendix, will be a helpful starting point for other researchers interested in a novel way to do their part to help bridge the research–practice gap. Regarding the time required to undertake this initiative, our efforts in personally connecting with the nine initial organizations were not onerous, and the automation achieved through our app has made our subsequent outreach efforts even less time consuming. We used our external funding¹ to have our graduate research assistants find the organizations and contacts and to develop our app. It took the lead author approximately a week or two of work to contact the organizations, survey the interested ones, provide feedback, and follow up. If others are interested in using our process, they should feel confident in using and expanding upon our communication templates. In addition to the links to our app provided earlier, our source code, along with annotations (indicated using the # symbol), has been made available online (https://doi.org/10.17605/OSF.IO/SDT9K) for any parties interested in modifying it for their own purposes.

Some may wonder why we have become involved in these efforts without a specific, tangible reward for engaging in this type of work. However, this knowledge mobilization effort has become one of the more interesting projects that we have pursued as a result of our funding and was certainly more invigorating to us than further bemoaning the research–practice gap in selection. Furthermore, it is possible that we can leverage the personal contacts we have developed in this project for future, more traditionally publishable research—so, there may yet be a specific, tangible reward as a result of these outreach efforts, but if so, it will be a matter of delayed gratification. Regardless, as I-Os, don’t we all know that it isn’t just about extrinsic motivation? Our research team was highly engaged in trying to leverage our science to help the people that it is intended to help, and it was also exciting to speak with interested professionals outside our cloisters of academia.

Conclusion

Although our personal outreach approach and app focused on selection, these methods can be adapted to any area of applied I-O psychology. Hopefully, others will consider our approach as well as our idea of using an interactive computer-based app for mobilizing their science and communicating with relevant nonacademic audiences.

Note

¹ Our funding source for this project was a Social Sciences and Humanities Research Council (SSHRC) Insight Development Grant (IDG), number 430-2020-01011.

References

Burke, W. W. (2018). Organization change: Theory and practice (5th ed.). Sage Publications, Inc.

Carlson, N. (2010, November 8). 15 Google interview questions that will make you feel stupid. Insider. https://www.businessinsider.com/15-google-interview-questions-that-will-make-you-feel-stupid-2010-11

Fisher, P. A., Risavy, S. D., Robie, C., König, C. J., Christiansen, N. D., Tett, R. P., & Simonet, D. V. (2021). Selection myths: A conceptual replication of HR professionals’ beliefs about effective human resource practices in the US and Canada. Journal of Personnel Psychology, 20(2), 51–60. https://doi.org/10.1027/1866-5888/a000263

He, J. C., & Kang, S. K. (2021). Covering in cover letters: Gender and self-presentation in job applications. Academy of Management Journal, 64(4), 1097–1126. https://doi.org/10.5465/amj.2018.1280

Lewis, N. A., Jr., & Wai, J. (2021). Communicating what we know and what isn’t so: Science communication in psychology. Perspectives on Psychological Science, 16(6), 1242–1254. https://doi.org/10.1177/1745691620964062

McDaniel, M. A., Whetzel, D. L., Schmidt, F. L., & Maurer, S. D. (1994). The validity of employment interviews: A comprehensive review and meta-analysis. Journal of Applied Psychology, 79(4), 599–616. https://doi.org/10.1037/0021-9010.79.4.599

Ontario Human Rights Commission (OHRC). (2008). Appendix D: Sample application for employment. Human Rights at Work (3rd ed.). https://www.ohrc.on.ca/en/human-rights-work-2008-third-edition/appendix-d-%E2%80%93-sample-application-employment

Risavy, S. D., Robie, C., Fisher, P. A., Komar, J., & Perossa, A. (2021). Selection tool use in Canadian tech companies: Assessing and explaining the research–practice gap. Canadian Journal of Behavioural Science/Revue canadienne des sciences du comportement, 53(4), 445–455. https://doi.org/10.1037/cbs0000263

Risavy, S. D., Robie, C., Fisher, P. A., & Rasheed, S. (2022). Resumes vs. application forms: Why the stubborn reliance on resumes? Frontiers in Psychology, 13, Article 884205. https://doi.org/10.3389/fpsyg.2022.884205

Rynes, S. L., Colbert, A. E., & Brown, K. G. (2002). HR professionals’ beliefs about effective human resource practices: Correspondence between research and practice. Human Resource Management, 41(2), 149–174. https://doi.org/10.1002/hrm.10029

Sackett, P. R., Zhang, C., Berry, C. M., & Lievens, F. (in press). Revisiting meta-analytic estimates of validity in personnel selection: Addressing systematic overcorrection for restriction of range. Journal of Applied Psychology. Advance online publication. https://doi.org/10.1037/apl0000994

Sanders, A. M. F. (2022). Editor’s column: Introductions, gratitude, & looking ahead. The Industrial-Organizational Psychologist, 60(1), 2–3. https://www.siop.org/Research-Publications/Items-of-Interest/ArtMID/19366/ArticleID/5856/preview/true

Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274. https://doi.org/10.1037/0033-2909.124.2.262

Appendix: Communication Templates

Message #1 Template: Initial Request to Discuss the Opportunity

Subject: University Consulting/Research Opportunity

Hi NAME,

Thanks for accepting my invitation to connect on LinkedIn. My name is NAME and I am a Professor at INSTITUTION NAME. I am reaching out to you because my research team and I are looking for a few tech organizations to work with to provide consulting on their hiring practices. The ultimate goal of our research is to improve communication, information sharing, and collaboration between researchers and tech organizations. Essentially, we are offering evidence-based hiring advice to interested organizations free of charge.

Would we be able to arrange a short, 5–10 minute call so that we can discuss this opportunity in further detail? Or, if it is easier, please feel free to give me a call at your convenience at: (XXX) XXX-XXXX

Thanks so much,

NAME


Message #2 Template: Summary of Feedback

Subject: Consulting Feedback Summary – ORGANIZATION NAME

Hi NAME,

Thanks again for your time during our meeting yesterday!

Attached is a summary of our feedback based on the information provided regarding your organization’s hiring process. Hopefully this can be the start of a larger and ongoing conversation, and I would of course be very pleased to arrange a meeting to discuss our feedback in further detail.

Best,

NAME


Message #3 Template: 3-Week Follow-Up Message

Subject: Consulting Feedback Summary – ORGANIZATION NAME – Follow-Up

Hi NAME,

Just wanted to follow up on the summary of our feedback that was provided regarding your organization’s hiring process.

Do you have any questions about the feedback? Also, I am happy to help with implementing any of the suggestions that were provided, so please feel free to let me know if you would like to arrange for a meeting to discuss any of this.

Best,

NAME
