
On the Legal Front: Helpful Hints

Rich Tonowski, U.S. Equal Employment Opportunity Commission

The opinions expressed in this article are those of the author and not necessarily those of any government agency. The article should not be construed as legal advice.

This edition of the Front offers some words of advice on statistical analyses, government regulation, and court cases for your consideration, with commentary and Internet links.

Mind Those Interactions

Cohen et al. (2019)1 demonstrated how combining two supposedly related jobs, each with internal gender pay equity, can produce a finding of pay disparity in the combined group.2

The underlying issue is which jobs to combine. Particularly for federal contractors subject to the Office of Federal Contract Compliance Programs (OFCCP), this can get interesting. Presumably the jobs are similarly situated, but there are at least two ways to consider “similar.” OFCCP uses two terms: Similarly Situated Employee Groups (SSEGs) and Pay Analysis Groups (PAGs). SSEGs are defined in the U.S. Equal Employment Opportunity Commission (EEOC) Compliance Manual (2000b): those who would be expected to be paid the same based on (a) job similarity (e.g., tasks performed, skills required, effort, responsibility, working conditions, and complexity) and (b) other objective factors such as minimum qualifications or certifications. This definition is cited in OFCCP’s Directive 2018-05 (2018).

That directive adds PAGs: groups of employees (potentially from multiple job titles, units, categories, and/or job groups) who are comparable for purposes of analyzing a contractor’s pay practices. PAGs are intended to mirror the employer’s compensation system—if reasonable. A PAG can comprise somewhat disparate elements, with the disparities handled by statistical controls in a multiple regression analysis.

Consider a job title where, on average, employees of one sex make less money than the other sex. The employer starts all employees at the same salary and rewards tenure with a constant dollar increase per year. Thus, those with longer tenure make more money, and because in this case tenure is associated with sex, there is an explanation for the disparity that can be demonstrated with regression. Now consider a second job title where pay policy is the same, but both starting salary and annual increment are lower than in the first job title. Here also there is a pay disparity by sex accounted for by tenure.

Combine the two into one grouping. We might expect that if we again control for tenure in a regression analysis, and also control for job, the disparity is accounted for. This is not necessarily so. The second job is on a different pay scale, where a year of tenure is worth less than in the first job. Unless the tenure-by-job interaction is also controlled for, there can be a statistically significant coefficient for sex in the regression. In the example used in the presentation, the interaction accounted for almost all of the pay variation. Although the interaction of the demographic variable with other explanatory variables has gotten some notice,3 structural differences in the pay system may pose new challenges when enforcement agencies or plaintiffs combine different jobs into one regression analysis.
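
To make the mechanism concrete, here is a minimal simulation sketch in Python. All numbers (pay scales, tenure distributions, sex–tenure associations) are hypothetical illustrations, not figures from the Cohen et al. demonstration. Each job is internally equitable once tenure is controlled; pooling the jobs and controlling only for the main effects of tenure and job can still leave a significant sex coefficient, which disappears once the tenure-by-job interaction is added.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

def make_job(n, start, raise_per_year, sex_tenure_slope):
    # Pay is set by tenure alone: same start and same raise for everyone.
    tenure = rng.integers(0, 20, n)
    # Women skew toward shorter tenure; the slope sets how strongly.
    p_female = np.clip(0.75 - sex_tenure_slope * tenure, 0.05, 0.95)
    female = (rng.random(n) < p_female).astype(int)
    pay = start + raise_per_year * tenure + rng.normal(0, 3000, n)
    return pd.DataFrame({"tenure": tenure, "female": female, "pay": pay})

job_a = make_job(300, 60_000, 2_000, sex_tenure_slope=0.030).assign(job="A")
job_b = make_job(300, 45_000, 1_000, sex_tenure_slope=0.005).assign(job="B")
pooled = pd.concat([job_a, job_b], ignore_index=True)

# Within each job, tenure fully accounts for the raw sex gap.
for name, grp in pooled.groupby("job"):
    m = smf.ols("pay ~ female + tenure", data=grp).fit()
    print(name, round(m.params["female"]), round(m.pvalues["female"], 3))

# Pooled, main effects only: a single tenure slope cannot represent two
# pay scales, and sex can pick up the mismatch.
main = smf.ols("pay ~ female + tenure + C(job)", data=pooled).fit()
# The tenure-by-job interaction lets each job keep its own slope.
full = smf.ols("pay ~ female + tenure * C(job)", data=pooled).fit()
print("main effects only:", round(main.params["female"]), round(main.pvalues["female"], 3))
print("with interaction: ", round(full.params["female"]), round(full.pvalues["female"], 3))

In this sketch, the pooled distortion arises because the sex–tenure association differs in strength across the two pay scales; with perfectly symmetric jobs, the main-effects model would tend to do no harm.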

 

Mind the OFCCP

OFCCP has continued to make good on its promise of transparency with various issuances for federal contractors. Nason, Kelly, and Arsen (2019) have a summary of guidance on record-keeping requirements with links to the OFCCP documents. This includes a relatively clear explanation of what’s involved with Internet versus traditional applicants.

In July, OFCCP also released three frequently asked questions (FAQ) documents. One affirmed that project-based or freelance workers are likely to be contractors rather than employees and so should not be included in affirmative action plans. The other two documents are discussed here.

First is the FAQ paper on practical significance (OFCCP, 2019a). References to the statistical literature are current (including Oswald, Dunleavy, and Shaw [2017]) and cover multiple approaches to determining practical significance. Essentially, OFCCP’s position is that it will include practical significance considerations in a “holistic review.” This leaves open the basis for any given analysis: “OFCCP reviews will typically employ a combination of those tests and principles to ensure that the agency is efficiently deploying its resources.” However, the document indicates that practical significance might sometimes help the contractor and sometimes OFCCP.
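
As a rough illustration of indicators commonly discussed in that literature (the numbers are hypothetical, and OFCCP does not prescribe a single formula), the following Python sketch computes an impact ratio, the difference in selection rates, the shortfall, and Cohen’s h for a 2 x 2 selection table:

import math

def practical_significance(sel_f, n_f, sel_r, n_r):
    # Focal group (e.g., women) vs. reference group (e.g., men).
    rate_f, rate_r = sel_f / n_f, sel_r / n_r
    impact_ratio = rate_f / rate_r          # Four-Fifths Rule flags < 0.80
    rate_diff = rate_r - rate_f             # absolute gap in selection rates
    # Additional focal-group selections needed to match the reference rate.
    shortfall = rate_r * n_f - sel_f
    # Cohen's h: effect size for the difference between two proportions.
    h = 2 * (math.asin(math.sqrt(rate_r)) - math.asin(math.sqrt(rate_f)))
    return impact_ratio, rate_diff, shortfall, h

print(practical_significance(sel_f=24, n_f=80, sel_r=48, n_r=120))
# -> impact ratio 0.75, rate gap 0.10, shortfall of 8 selections, h ~ 0.21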

The second FAQ paper has less to recommend it.

The UGESP [Uniform Guidelines on Employee Selection Procedures (EEOC, 1978)] require local validation at the organization’s facilities, with the exception that criterion-related validity evidence can be “borrowed” (validity transportability) from other organizations provided that job similarity is demonstrated, and the validation studies conducted elsewhere are provided for OFCCP review and are found to meet UGESP requirements. Aside from this exception, contractors that use off-the-shelf tests that have adverse impact will not be able to defend their use of the tests unless they validate them at their own facilities. (OFCCP, 2019b)

The FAQs indicate that this view of validity applies to Big Data applications. Generally, the paper affirms that all selection procedures are covered; an extensive list of procedures is also provided. On the more positive side, the paper describes the OFCCP review process for adverse impact and test validity, and the statistical methods in use: Four-Fifths Rule, independent sample binomial Z test, and Fisher’s Exact Test. See below for how to compare results for different tests regarding adverse impact.

OFCCP is known to have this position, but this is the first time this writer has seen it in print. Of course, the Principles (SIOP, 2018) discuss more than transportability, and UGESP itself is clear that the intent was not to freeze methodology to what it was in 1978. UGESP is not binding regulation (except for record keeping), but OFCCP has incorporated it into binding regulation for federal contractors.

For Adverse Impact Analysis Methods, Dare to Compare With FAIR

Federal law does not specify how to calculate an index of adverse impact for EEO statutes that provide for that form of liability. UGESP has the Four-Fifths Rule, but that has largely been supplanted by tests of statistical significance. Particularly in borderline situations, it can be useful to see how different statistical procedures, or changes in the number selected from a given demographic group, could affect the conclusions. Leo Alexander and Fred Oswald (2019) offer the Free Adverse Impact Resource (FAIR—and it really is free to use), online software that compares results using the Four-Fifths Rule; Z tests for differences in selection rates, odds ratios, and ratios of rates; the Chi-Square Test; and Fisher’s Exact Test.4 Rates, ratios, and shortfalls are reported for each method as appropriate. The number of selections by group can be varied, which is useful for seeing how much change would affect the conclusion. Subgroups defined by an additional demographic variable can be analyzed; for example, if the primary focus is on gender but both gender and race are in the data file, analyses can be run for each racial group. A sample data file is provided. User data can be entered manually, or a delimited text file can be loaded.
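
As a rough homegrown sketch of the kind of side-by-side comparison FAIR automates (this is not FAIR’s code, and FAIR covers more tests and reporting than shown here), the following computes the Four-Fifths Rule ratio, the independent-sample Z test on selection rates, the Chi-Square Test, and Fisher’s Exact Test for the same hypothetical 2 x 2 table used earlier:

import numpy as np
from scipy import stats

sel_f, n_f = 24, 80     # focal group: selected, total applicants
sel_r, n_r = 48, 120    # reference group

rate_f, rate_r = sel_f / n_f, sel_r / n_r
print("impact ratio:", rate_f / rate_r)  # Four-Fifths Rule flags < 0.80

# Independent-sample Z test on the difference in selection rates,
# with the pooled selection rate used for the standard error.
pooled = (sel_f + sel_r) / (n_f + n_r)
se = np.sqrt(pooled * (1 - pooled) * (1 / n_f + 1 / n_r))
z = (rate_f - rate_r) / se
print("Z:", z, "two-sided p:", 2 * stats.norm.sf(abs(z)))

table = [[sel_f, n_f - sel_f], [sel_r, n_r - sel_r]]
chi2, p_chi2, _, _ = stats.chi2_contingency(table, correction=False)
print("chi-square p:", p_chi2)
print("Fisher exact p:", stats.fisher_exact(table)[1])

With these particular numbers, the Four-Fifths Rule flags a disparity (impact ratio 0.75) while none of the significance tests reaches p < .05, exactly the kind of borderline divergence that makes side-by-side comparison worthwhile.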

Don’t Mess With Texas

The U.S. Court of Appeals for the Fifth Circuit backed the State of Texas (State of Texas v. EEOC et al., 2019) in a long-running dispute with the EEOC regarding the agency’s guidance on the use of criminal background checks in hiring (the “Felon Hiring Rule,” according to Texas; EEOC, 2012). The state had argued that EEOC had issued an improper substantive rule that increased the likelihood of suits against hiring bans for criminal convictions for some state and local government jobs. EEOC, represented in court by the U.S. Department of Justice (DOJ),5 argued that its guidance was just that—guidance pursuant to existing Title VII requirements that did not impose new obligations. EEOC also pointed out that it had no authority to sue Texas or any state agency; it could only recommend that DOJ do that. In response, Texas added the U.S. Attorney General as a defendant.

The effect of the ruling is to leave in place an injunction against enforcing the guidance against the State of Texas; there is no injunction against enforcing Title VII. Insofar as EEOC maintains that it enforces the law, not the guidance, the agency’s enforcement policy is not impacted. However, the court found that this guidance commits the agency in ways that simple guidance on what the law requires does not. Specifically, EEOC employees are instructed that national statistics support adverse impact of criminal convictions by race; automatic across-the-board exclusions are suspect (unless required by federal law); an employer having a demographically representative workforce is not a defense; neither is it a defense that the employer is acting in conformity with state law regarding exclusions; and (only) two defenses available to employers, validation of the use of criminal history according to UGESP (EEOC, 1978) or “individualized assessment” of the risk of hiring certain employees, are recognized as “consistently” adequate means of meeting Title VII requirements. The logical consequence is that the guidance dictates when agency personnel will determine that the employer has likely unlawfully discriminated. This is substantive rule making.

The impact of this rule making was to create new obligations for Texas, which further confirms it as substantive. There is an increased regulatory burden to avoid enforcement actions and pressure on the state to change its laws.

Finally, apart from not following the notice-and-comment procedure prescribed in the Administrative Procedure Act for substantive rules, EEOC had no authority to promulgate any substantive rules regarding Title VII. The court deleted the wording in the injunction that implied that the guidance might be reissued as substantive if the proper procedure were followed.

Texas did not get from the court a declaration it wanted—that it had the right to use criminal history exclusions. The district court judge who initially granted the injunction had also expressed skepticism about the legality of blanket bans based on felony convictions. Having upheld the injunction, the appeals court found it unnecessary to go further.

The court clarified “that EEOC and the Attorney General may not treat the Guidance as binding in any respect” as applied to Texas. The court also clarified that the injunction does not bar EEOC and DOJ from enforcing an interpretation of Title VII embodied in the guidance. For an analysis of the case from employer-side attorneys, see Maatman and Karasik (2019). The authors think the decision “can further be considered a game changer in the criminal background check litigation landscape.” On that, this writer has doubts. Times have changed. Ex-offenders may not be able to rely solely on EEOC’s say-so, but they do not need to do so. Various state and local governments have “banned the box” on employment applications, and the list keeps growing; most of these efforts are aimed at government jobs within the particular jurisdiction, but a few include the private sector. The U.S. House of Representatives passed a bill that would ban the box for federal agencies and federal contractors. The Society for Human Resource Management has its Getting Talent Back to Work6 initiative to employ those with criminal records; advice to employers follows the EEOC guidance.

Don’t Be Afraid, Very Afraid (Yet); A Personality Test (Still) Isn’t (Necessarily) a Medical Test

June saw publication of articles on personality tests in Industrial and Organizational Psychology (IOP).7 Melson-Silimon, Harris, Shoenfelt, Miller, and Carter (2019) noted the blurring of the line between a preemployment test for job-related individual characteristics and tests that might constitute a medical inquiry under the Americans with Disabilities Act (ADA). This is a legally important distinction; medical inquiries generally cannot be made until there is a conditional offer of employment. Not selecting an applicant because of a medical inquiry can put in play the ADA’s protection for disabled applicants and the need to consider reasonable accommodation. The commentary articles were, for the most part, opposed to any alarmist conclusion that there was a new and stringent need for caution. EEOC guidance recognizes the distinction between a selection test and a diagnostic test.8 Court challenges have been few and provide few lessons beyond indicating that using an instrument designed for clinical assessment as an employment selection instrument may not be a good idea.

But that does not mean that Melson-Silimon et al. (2019) are “crying wolf.” The authors may have missed the best time to raise their alarm, but there have been howls and growls heard in the recent past.

The situation has not been helped by employers’ use of questionable instruments for selection. Martin (2014) noted that employers tend to pass over more predictive cognitive ability tests in favor of personality tests, and that the tests they use are often unstable “state” measures that classify the applicant into one of a limited number of types.

An article by Weber and Dwoskin (2014) made some of those criticisms more accessible to a general audience. They featured an applicant with a disability, turned down multiple times ostensibly due to personality testing. The article mentioned EEOC’s interest in pursuing another disability case. The applicant’s situation was also described by O’Neil (2016) in her critique of Big Data applications.9

The issue is still alive with some advocacy groups.

Mental health advocates oppose these tests because they can be used to identify psychiatric disabilities resulting in the screening out of people with certain diagnoses. Accordingly, some employers are using personality tests to obtain illegal disability-related information in a more indirect way (Equip for Equality, 2018).

Typically these tests are used during the pre-offer stage as a way to more quickly screen applicants on the basis of an individual’s interpersonal skills, emotional literacy and social insight. For some individuals with disabilities, these personality tests will be barriers to initial employment interviews based on symptoms of their disabilities—such as Autism Spectrum Disorder (ASD) or Post-Traumatic Stress Disorder (PTSD) [footnotes omitted] (Brown, 2018).

Brown (2018) thought that tests meeting the EEOC guidance

are probably acceptable under the ADA. However, it should be noted that if the reason for giving personality tests is to identify individuals who will be successful in a job, there is evidence that personality tests “are not valid predictors of employee success” and according to some researchers, “close to zero in doing so” [footnotes omitted].

The citations are both about a dozen years old, but both come from reputable journals, one in management and the other in psychology.

Other speculation includes the ideas that personality tests can be challenged because, not being “ability tests,” they fall outside Title VII’s allowance for professionally developed ability tests and so have diminished standing under that and other EEO statutes; that a question that might indicate a symptom of a disorder triggers ADA concerns, even if the question might also tap a job-relevant consideration not linked to a disorder; and that the ADA’s “regarded as disabled” provisions may be invoked, depending on how the employer views personality test results.

The wolf seemed to be on the prowl a few years ago, less so now. But it is good to have this set of articles that have explored the distinction between general preemployment and medical testing.

Don’t Mess With Illinois

Illinois has been making news with its limits on applying biometric identification to employees. To this has been added regulation of one Big Data application: video interviews. Original language in the draft bill specified “facial expressions.” The final version of the Artificial Intelligence Video Interview Act10 requires notification to, and consent from, applicants regarding the procedure. The law goes into effect on January 1, 2020. It apparently does not specify who enforces it, what the penalties for noncompliance are, or what is meant by artificial intelligence.

Now Be Afraid; Pay Data Reporting Is Here (Maybe)

Reporting pay data as Component 2 of the EEO-1 annual reports, starting with 2017 and 2018, was due on September 30, 2019. This follows a court decision that found that the U.S. Office of Management and Budget’s hold on implementation was improper; the same office, under the Obama administration, had approved the data collection. Two advocacy groups sued to get the process back on track.

But on August 19, DOJ filed an appeal.11 The appeal rests on two arguments. First, the advocacy groups had no standing to sue. The federal district court for the District of Columbia had ruled that they could sue under an “organizational” theory, on which groups can be harmed by the government’s withholding of information, while acknowledging that the groups had no statutory right to the data. The appeal argues that there can be no harm in the denial of something to which the parties had no right in the first place. Second, the court specified 2 years of data and other particulars, which, the appeal argues, went beyond the court’s authority. So far, there is no indication of a stay on the September 30 deadline.

Readers may be wondering why the appeal has been filed only now; the district court’s initial ruling was made last March. A possible reason is that only recently does EEOC have a chair (Janet Dhillon) and a general counsel (Sharon Fast Gustafson); Senate confirmation for both took over a year.

The pay-data-collection idea was always controversial, with people arguing vigorously on both sides regarding whether the effort was worthwhile. Employers have expressed fear of burdensome reporting requirements and the fueling of class action suits based on apparent discrimination that would not hold up with better data and analyses. One consideration now under fire is that EEO-1 reporting would be an incentive for employers to go beyond reporting requirements to serious internal analysis of pay equity. However, the #MeToo movement may have provided that incentive already. The meaningfulness of the data is also getting renewed questioning. Colosimo, Aamodt, Bayless, and Duncan (2019) pointed out, among other things, a problem with reporting pay from Box 1 of the W-2 form. That box gives income after deductions for retirement accounts, health flexible spending accounts, and other income exclusions. Not every employee has the same exclusions. Thus, employees with the same job getting the same pay can have substantially different Box 1 incomes.
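
As a toy computation of the Colosimo et al. point (the election amounts below are hypothetical), two employees in the same job with identical gross pay can report quite different Box 1 wages:

def box1_wages(gross, pretax_401k=0.0, pretax_fsa=0.0, other_pretax=0.0):
    # Box 1 reports gross pay minus pre-tax elections, so identical
    # earners can report different amounts.
    return gross - pretax_401k - pretax_fsa - other_pretax

employee_1 = box1_wages(70_000)                # no pre-tax elections
employee_2 = box1_wages(70_000, pretax_401k=19_000, pretax_fsa=2_650)
print(employee_1, employee_2)  # 70000 vs. 48350 for the same $70,000 job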

Implications for I-Os

Hopefully, the information above speaks for itself. Here is commentary on broader related matters.

OFCCP’s stand on local validation seems a bit troglodytic, despite commendable recent efforts at rational and transparent procedures. This may be an area that needs more good practice to drive out bad, or at least limited, practice. There seems to be no standard operating procedure that provides test users with enough information to satisfy the reasonable (however defined) expectations of enforcement agencies when validity evidence comes from someplace other than the user. There are objections to local validation, both scientific and practical. Transportability has its limits, and its reliance on job similarity may be misplaced; recall Murphy, Dzieweczynski, and Zhang (2009) on putting too much stock in content matching with test batteries. Synthetic validity keeps popping up, but there does not seem to be widespread usage that satisfies legal and professional standards. Meta-analytic validity generalization has been around for decades, but the current Principles (SIOP, 2018) gives caveats against overextension. Perhaps we, the profession, need to do more to offer a reliable technology. The issue has come up within SIOP before, but getting traction is difficult. Possibly an airing of current issues in IOP, as was done with personality tests and the ADA, would be helpful.

What people do not know about our profession can hurt us. Well-intentioned folks with a limited understanding of psychological assessment practices, such as those involving personality tests or Big Data applications, may perceive only a threat, something to be restricted or banned. This is hardly a new issue. We must reach those folks.

What happens with EEO-1 pay data remains to be seen. There is a difference between what is sufficient for a general overview of pay by industry and demographics and what constitutes good analysis of pay equity for a single employer. Previous experience with the EEO-1 demographic data indicates that this source by itself generally does not make a determinative case, although some scholars disagree. The professional challenge will be to optimize the data’s potential contribution to eliminating pay discrimination, to differentiate what can and cannot be supported by the data, and to evaluate the data-collection system itself for efficacy and efficiency.

 

Notes

 

1 This writer was a member of the panel that discussed this and other pay issues but claims no credit for the demonstration described here.

2 A more technical discussion can be found in Sady and Aamodt (2017).

3 Paetzold and Bent (2018) discuss these interactions as a reconciliation between separate models by sex and a single model with sex as a dummy (0 or 1) variable but within a single job or homogeneous group of jobs.

4 See Morris (2017) for an overview of statistical significance with adverse impact.

5 Under the Trump administration, DOJ does not endorse the content of the guidance but has opposed Texas’s challenge about the authority to issue the guidance. DOJ is also at odds with EEOC regarding sexual orientation/gender identity and disability discrimination. See Mulvaney and Smith (2019).

6 See https://www.gettingtalentbacktowork.org

7 For anyone unfamiliar with IOP, this is a SIOP journal that features a focal article on a topic and associated refereed commentary.

8 Different from medical tests are “psychological tests that measure personality traits such as honesty, preferences, and habits” (EEOC, 2000a).

9 Weapons of Math Destruction was a New York Times best seller and won the 2019 Euler Book Prize from the Mathematical Association of America. It is (intentionally) scary regarding nefarious things that can be done with personality testing and statistical forecasting.

10 The text and legislative history of the law can be found at http://www.ilga.gov/legislation/billstatus.asp?DocNum=2557&GAID=15&GA=101&DocTypeID=HB&LegID=118664&SessionID=108

11 At the time this article was drafted, this was late-breaking news. This writer relies primarily on an item by Campbell (2019).

 

 

References

 

Alexander, L. III, & Oswald, F. L. (2019). Free Adverse Impact Resource (FAIR). Retrieved from https://orgtools.shinyapps.io/FAIR/

Brown, S. (2018). Revisiting disability related inquiries and medical examinations under Title I of the ADA. ADA National Network. Retrieved from https://adata.org/publication/revisiting-disability-related-inquiries

Campbell, B. (2019, August 22). EEOC roundup: Pay data brief filed, blood bank pays $175K. Law360. Retrieved from https://www.law360.com/employment/articles/1191489

Cohen, D., Dunleavy, E., Liakos, C., Morris, S., Simpson, M., Tonowski, R., & Wilkinson, C. (2019, April 4). Contemporary issues in pay equity analysis: A cross-disciplinary discussion. Panel presented at the 34th Annual Conference of the Society for Industrial and Organizational Psychology, National Harbor, MD.

Colosimo, J., Aamodt, M., Bayless, J., & Duncan, M. (2019, April 4). 2019 EEOC/OFCCP practitioner update: Will 2019 bring stormy seas or smooth sailing? Panel presented at the 34th Annual Conference of the Society for Industrial and Organizational Psychology, National Harbor, MD.

Equal Employment Opportunity Commission. (2000a, July 27). EEOC enforcement guidance on disability-related inquiries and medical examinations of employees under the Americans with Disabilities Act (ADA). No. 915.002. Retrieved from https://www.eeoc.gov/policy/docs/guidance-inquiries.html

Equal Employment Opportunity Commission. (2000b, December 5). Section 10: Compensation discrimination. No. 915.003. EEOC Compliance Manual. Retrieved from https://www.eeoc.gov/policy/docs/compensation.html

Equal Employment Opportunity Commission. (2012, April 25). Enforcement guidance on consideration of arrest and conviction records in employment decisions under Title VII. Retrieved from https://www.eeoc.gov/laws/guidance/arrest_conviction.cfm

Equip for Equality. (2018, April). Invisible disabilities and the ADA. Employment legal briefings No. 37. Great Lakes ADA Center. Retrieved from https://www.adagreatlakes.org

Maatman, G. L., & Karasik, A. W. (2019, August 9). Fifth Circuit rules that the EEOC can’t mess with Texas over criminal background checks. Workplace Class Action Blog, Seyfarth Shaw LLP. Retrieved from https://www.workplaceclassaction.com/2019/08/fifth-circuit-rules-that-the-eeoc-cant-mess-with-texas-over-criminal-background-checks/#page=1

Martin, W. (2014, August 27). The problem with using personality tests for hiring. Harvard Business Review. Retrieved from https://hbr.org/2014/08/the-problem-with-using-personality-tests-for-hiring

Melson-Silimon, A., Harris, A. M., Shoenfelt, E. L., Miller, J. D., & Carter, N. T. (2019). Personality testing and the Americans with Disabilities Act: Cause for concern as normal and abnormal personality models are integrated. Industrial and Organizational Psychology, 12, 119–132.

Morris, S. B. (2017). Statistical significance testing in adverse impact analysis. In S. B. Morris & E. M. Dunleavy (Eds.), Adverse impact analysis (pp. 71–91). New York, NY: Routledge.

Mulvaney, R., & Smith, P. (2019, August 20). Civil rights rift at Trump agencies forces employer reckoning. Bloomberg Law. Retrieved from https://news.bloomberglaw.com/daily-labor-report/civil-rights-rift-at-trump-agencies-forces-employer-reckoning

Murphy, K. R., Dzieweczynski, J. L., & Zhang, Y. (2009). Positive manifold limits the relevance of content-matching strategies for validating selection test batteries. Journal of Applied Psychology, 94(4), 1018–1031.

Nason, L. M., Kelly, T. S., & Arsen, H. A. (2019, August 2). OFCCP releases its new compliance assistance guides. Ogletree Deakins. Retrieved from https://ogletree.com/insights/2019-08-02/ofccp-drops-its-new-compliance-assistance-guides/

O’Neil, C. (2016). Weapons of math destruction: How Big Data increases inequality and threatens democracy. New York, NY: Crown.

Oswald, F. L., Dunleavy, E. M., & Shaw, A. (2017). Measuring practical significance in adverse impact. In S. B. Morris & E. M. Dunleavy (Eds.), Adverse impact analysis (pp. 92–112). New York, NY: Routledge.

Paetzold, R. L., & Bent, J. R. (2018). The statistics of discrimination. Eagan, MN: Thomson Reuters.

Sady, K., & Aamodt, M. G. (2017). Analyzing EEO disparities in pay. In S. B. Morris & E. M. Dunleavy (Eds.), Adverse impact analysis (pp. 216–238). New York, NY: Routledge.

Society for Industrial and Organizational Psychology. (2018). Principles for the validation and use of personnel selection procedures. Bowling Green, OH: Author. Retrieved from https://www.siop.org

State of Texas v. EEOC et al., No. 18-10638 (5th Cir. August 6, 2019).

Uniform Guidelines on Employee Selection Procedures (1978). Federal Register, 43 (August 25, 1978), pp. 38290–38315; Code of Federal Regulations, 41 CFR 60-3.

U.S. Department of Labor, Office of Federal Contract Compliance Programs. (2018, August 24). Directive 2018-05: Analysis of contractor compensation practices during a compliance evaluation. Retrieved from https://www.dol.gov/ofccp/regs/compliance/directives/Dir2018-05-ESQA508c.pdf

U.S. Department of Labor, Office of Federal Contract Compliance Programs. (2019a, July 23). Practical significance in EEO analysis: Frequently asked questions. Retrieved from https://www.dol.gov/ofccp/regs/compliance/faqs/PracticalSignificanceEEOFAQs.htm

U.S. Department of Labor, Office of Federal Contract Compliance Programs. (2019b, July 23). Validation of employee selection procedures: Frequently asked questions. Retrieved from https://www.dol.gov/ofccp/regs/compliance/faqs/ValidationEmployeeSelectionFAQs.htm

Weber, L., & Dwoskin, E. (2014, September 29). Are workplace personality tests fair? The Wall Street Journal. Retrieved from https://www.wsj.com/articles/are-workplace-personality-tests-fair-1412044257

 
