Society for Industrial and Organizational Psychology > Research & Publications > TIP > TIP Back Issues > 2018 > October

Recognizing Shonna Waters: Distinguished Early Contributions–Practice Award Winner

Liberty J. Munson and Garett Howardson

As part of our ongoing series recognizing the achievements of SIOP’s award winners, the last edition of TIP’s Award Winner article focused on several of our Distinguished Contributions Award winners: Eden King, one of the winners of our Distinguished Service Contributions Award; Scott Tannenbaum, Distinguished Professional Contributions Award; and Dov Eden, Distinguished Scientific Contributions Award. In my haste to send it to press, I overlooked one other Distinguished Contributions Award winner: Shonna Waters. So, by way of apology for my oversight, this article focuses solely on her accomplishments, the work she did to earn this award, and her insights into how you, too, could become one of next year’s Distinguished Contributions Award winners!

Share a little bit about who you are and what you do.

My name is Shonna Waters. I’m originally from Fort Washington, Maryland, but now live in Arlington, Virginia, with my husband, two daughters, dog, and cat. I’m a traditionally trained scientist–practitioner. I’m a University of Minnesota grad and spent the first 10+ years of my career in organizations that truly straddle both spaces, like HumRRO and PDRI. I’m currently a regional vice president of Behavioral Science at a late-stage start-up, BetterUp. BetterUp has let me continue to blend science and practice, splitting my time across research, consulting, and leadership. At the time of the nomination, I was the vice president of Research at SHRM.

Describe the research/work that you did that resulted in this award. What led to your idea?

This award is designed to recognize the span of a career rather than a project, so I largely credit the award to the privilege of working with amazing people, in terms of both their skill and generosity, throughout my career. My earliest days were spent at HumRRO, working alongside some of the best scientist–practitioners around: Bill Strickland, Deirdre Knapp, Suzanne Tsacoumis, Teresa Russell, Cheryl Paullin, Rod McCloy, Chris Sager, Dan Putka (the list goes on). My HumRRO colleagues modeled that I didn’t have to choose science or practice and taught me to love both rigor and practicality. I continued to work there even after I went to Minnesota to get my PhD. I had a similar experience at PDRI where I worked with Janis Houston, Wally Borman, Jeff Johnson, Rob Schneider, and others who both modeled the skills and gave me opportunities to grow my own.

Working at HumRRO and PDRI was like going to grad school. I got to work on applied research projects across a wide array of topics: job analysis, competency modeling, organizational assessments, educational research, leadership development, and, of course, a lot of statistics, research methods, and test development and validation projects. I went from HumRRO to the National Security Agency (NSA) to work in a start-up within their HR organization founded by Dave Dorsey and Wayne Baughman. At NSA, I did the same kinds of cool I-O projects I had done in the consulting world, but as an internal consultant. In that role, I gained a thirst for the thrill of building something from the ground up and for implementing solutions. I learned that every project, even a measurement project, is really a change management effort. A few projects that solidified those lessons for me were designing, developing, and validating the selection system for language and intelligence analysts and leading the agency’s performance management transformation. The scope of the projects and the critical nature of the agency’s mission made my time at NSA particularly impactful to me.

I left NSA to focus on getting science to more HR professionals. I joined other I-Os, like Alex Alonso, Mark Schmit, Scott Oppler, and Jeff Pon, as SHRM’s VP of Research to develop and implement the vision for more rigorous research translated into customer-centric formats. 

What did you learn that surprised you? 

I have learned so many things already in my relatively short career that it’s hard to choose just one! The meta-lesson in it all may be to hold things loosely and embrace complexity. The world is full of gray: many, many shades. I joke that my grad school self wouldn’t recognize my present-day self because I’ve gone from one end of our field (“I”) to the other (“O”). In the end, I’ve found that although it’s tempting to reduce the world to black and white, right and wrong, “I” or “O,” the power usually lies somewhere in the middle, and we can learn the most from things that make the least intuitive sense to us. I credit Dave Dorsey and Wayne Baughman with imparting those life lessons, and they’ve made me far more curious and willing to be wrong.

What do you see as the lasting contribution of the work that led to this award on your thinking about application of I-O principles in organizations?

I believe that if we begin with the end in mind and apply the foundational concepts of our field, we will be able to help organizations meet changing demands from the market and their employees. Innovation is often born of necessity, and many of our traditional tools and processes are ripe for innovation.

Who would you say was the biggest advocate of your research/work that resulted in the award? How did that person become aware of your work?

Alex Alonso nominated me for the award, and Dave Dorsey originally encouraged me to apply. Teresa Russell and Paul Sackett also wrote nomination letters from the I-O perspective. Jim Seacord and Pat Byrd wrote nominations from the perspective of client impact, and Ron Morgan wrote a nomination from the perspective of my impact as an adjunct faculty member at GW and Georgetown. I worked with each of them as a colleague or consultant at various times throughout my career.

To what extent would you say this work/research was interdisciplinary? 

I really enjoy interdisciplinary work. I don’t typically start a literature search through I-O sources; I search for solid evidence on the topic regardless of discipline. Throughout my work in the testing space, I frequently pulled in literature from education, cognitive psychology, and neuroscience. When my work shifted to change management, organizational behavior, and performance management, work from anthropology, sociology, and counseling psychology became relevant to me.

In my current role, BetterUp has an explicitly interdisciplinary approach. My boss is a psychiatrist; I work with other I-Os, data scientists, a behavioral economist, positive organizational psychologists, and people with backgrounds in adult development, clinical psychology, counseling, and social work every day. We’re all working together to create sustainable behavior change.    

Are you still doing work/research in the same area where you won the award? If so, what are you currently working on in this space? If not, what are you working on now and how did you move into this different work/research area? 

I am still working as a scientist–practitioner, but in a different company and a different application. I wanted to be more closely involved in what seems to me like a shift in the world of work that includes a push for more meaning, purpose, and autonomy. BetterUp offered an intersection of my interests and the chance to help build something. I now work at the intersection of science, technology, and design to further our understanding of behavior change, understand our clients’ business goals and context, and design programs that integrate the two.

I still do some leadership coaching in my free time and continue to teach off and on (although I’ve moved from research methods and stats to coaching skills). I also continue to enjoy writing and presenting on a range of topics.

What’s a fun fact about yourself (something that people may not know)?

Janis Houston once described herself as an adrenaline junkie, and that probably fits me too. I didn’t really fall into adventure until I was in grad school, but now I love trying new things, so I’m always pushing my own limits. I’ve skydived, taken trapeze lessons, run a marathon, done Ragnar and a Spartan race, and I participate in strength competitions.

What piece of advice would you give to someone new to I-O psychology? (If you knew then what you know now…)

Everyone comments on how small our field is, and it’s really true! Someone who is your client today may be a coworker tomorrow, so try to leave the other person in every interaction feeling better than when they entered it. Creating goodwill is a good general principle, but it may be even more important in our field, where six degrees of separation may really be two. 

Project Highlights From the Application:

I began my career in selection with a focus on fairness and bias.

  • In graduate school, I studied topics such as linearity in the upper end of the ability–performance distribution, modeling stereotype threat in SAT data, meta-analytically investigating the role of socioeconomic status in the ability–performance relationship, and determining the effect of test coaching (both formal and informal). My dissertation focused on evaluating the tradeoffs, in terms of validity and subgroup differences, between g and specific abilities.

Job/Occupational Analysis

  • I conducted job and practice analyses for high-impact jobs and occupations, such as system operators/power dispatchers for the Edison Electric Institute, and to support the licensure exam for physical therapists and physical therapist assistants at FSBPT.
  • I developed a new methodology for using military job analytic information to inform test specifications for the Armed Services Vocational Aptitude Battery (ASVAB), the military’s selection and classification battery.

Testing/Selection

  • I supported a wide range of test development and validation studies, including
    • Designed, developed, validated, and supported the implementation of the Cognitive Ability Test Battery (CATB), a suite of new prehire assessments, including a cognitive test battery comprising four nonverbal reasoning tests, a test of English proficiency, a constructed-response writing test, a logical reasoning test, and a personality-oriented biodata tool designed to assess motivation and personality. These assessments are now used to select intelligence and language analysts at the National Security Agency.
    • Designed and developed fully customized, client-owned, high-fidelity simulations (“virtual role plays”) for civilian managers at the Department of Defense (DoD) for use in (a) development and (b) promotion assessment. The online simulations used multilevel animations and branching technology to place the test taker in realistic situations. The branching technology allowed the tool to track and score responses at several points during the assessment process. The development followed a content-validation approach to ensure job relatedness and realism.
    • Developed and administered assessment centers for the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF) and the Social Security Administration. The ATF assessment center is used to promote supervisors and identify executive potential. The SSA assessment centers are used for selection and leadership development.
    • I managed and led technical aspects of large credentialing programs, including:
      • A multiyear project providing technical support and maintenance on the Certified Professional in Learning and Performance (CPLP) certification program for the American Society for Training and Development (ASTD). This certification included a multiple-choice knowledge test and a work sample portion.
      • The development of two continuing competence exams for the Federation of State Boards of Physical Therapy.
    • I conducted data analyses related to the validation of entry-level selection tests for three job series within the Bureau’s Intelligence Career Services: intelligence analyst, language analyst, and surveillance specialist. I was responsible for creating the operational version of the job knowledge test (a criterion measure) based on pilot data, as well as the operational version of a measure of critical thinking skills, which was part of the predictor battery, and I participated in data collection activities during the pilot study.
    • I designed a validation strategy framework for the ongoing validation and rollout of tests to other parts of the agency, balancing scientific rigor and professional and legal requirements with the need to minimize the resources required to obtain validation evidence.
    • I collaborated with researchers from Language Testing to develop an operational-length short form of an English proficiency and writing test. I provided psychometric consultation and assistance in developing the data analysis plan, conducting and interpreting analyses, developing a test maintenance plan, and documenting the study.
    • I supported a range of other test development efforts for public and private sector clients such as USPS, DoD, Verizon, the U.S. Army and Navy, the Tennessee Highway Patrol, Sprint, and American Express, spanning a variety of formats (e.g., situational judgment tests, structured interviews, clerical tests, biodata, personality tests, cognitive ability tests, declarative knowledge tests).
    • I conducted a meta-analysis of clerical ability tests.

Performance Management

  • I led the design and transformation of NSA’s promotion, performance management, and awards and recognition systems to reduce administrative burden, increase transparency, promote fairness, and strengthen the performance culture. This large-scale change initiative was a multiyear project that involved collaborating with partners across the agency and working with a large cross-functional human resources working group to implement system changes. I designed and delivered briefings to garner support for the program at all levels (up to Deputy Director) within and outside the agency.

Leadership Development

  • I provided feedback to members of the Senior Executive Service (SES) at the Department of Treasury who participated in a 360-degree assessment using the OPM 360. The purpose of this work was to provide independent, third-party facilitation to help executives interpret their feedback and apply it to a developmental plan.
  • I received a certificate in Leadership Coaching from Georgetown University’s Institute of Transformational Leadership and received my ACC credential from the ICF. I now work as an independent coach with leaders across organizational levels, sectors, and industries, such as nonprofits, IT, contracting, and real estate.

Organizational Development

  • I conducted climate studies within the intelligence community (IC). These studies included structured interviews regarding the psychological climate and its various subcomponents, along with content analysis of the qualitative interview results. I documented the results in written reports and delivered debriefings to agency leadership for a small number of IC components.
  • I consulted on leadership and business challenges facing a directorate of an acquisition organization, including facilitating a leadership offsite and conducting strategic planning activities. I performed action planning to carry out additional business changes and human capital initiatives.

Program Evaluation

  • I conducted a large-scale evaluation of the Department of Defense’s (DoD) Defense Senior Leadership Development Program (DSLDP), the premier civilian leader development program for the DoD. The purpose of DSLDP is to develop senior civilian leaders with the enterprise-wide perspective needed to lead organizations and programs and achieve results in joint, interagency, and multinational environments. As part of the program evaluation, I also evaluated the process used to assess leadership development and suggested best practices based on leadership development research.
  • I developed the evaluation plan for the recently established Special Salary Rate (SSR) for science, technology, engineering, and math (STEM) work roles. The evaluation plan was briefed to a congressional staffer from the House Permanent Select Committee on Intelligence (HPSCI). I collaborated with colleagues from across HR to collect baseline and recurring metrics to assess the impact of the SSR on recruitment (internal and external), retention, and engagement, and I continued to serve as senior project advisor.
  • I commissioned a joint working group to develop a return-on-investment (ROI) strategy for prehire testing at the agency. As part of that effort, I drafted the initial metrics, collected information about archival data sources, met with stakeholders, and enlisted participation.
  • I served as a technical advisor to a senior leadership team developing an evaluation strategy for an agency-wide transformation effort.
