Every job opening now draws a slew of applicants. As application volumes climb, more human resources departments are turning to technology to reduce the burden of reviewing resumes and to predict a candidate's success within their workforce. Often these technological solutions—such as resume screening and personality testing—are presented to in-house counsel as a fait accompli, leaving counsel scrambling to determine whether the new screening tool passes the legal sniff test. This article aims to tee up the baseline issues that should be resolved before these tools are implemented on a wide scale.

Recent research indicates that the rise of predictive analytics can create significant risk for employers. A December 2018 report by Upturn, a D.C.-based nonprofit organization, found that without active measures to mitigate bias, predictive tools are prone to some form of bias by default. These biases, and the problems they create, can appear at any point in the recruitment and hiring process, including the advertising, sourcing, screening, interviewing and selection stages. Furthermore, lawmaking and regulatory bodies may lack the authority, resources and expertise to provide meaningful guidance and oversight of employers' use of predictive analytics in recruiting and hiring.