Welcome to Labor of Law. Advocates, lawmakers and regulators are spotlighting the legal risks of artificial intelligence tools in the workplace. The latest target is facial recognition technology that critics say could lead to bias against women and people of color. Plus: New York City became the first major U.S. city to institute a minimum wage for app-based drivers. Will other jurisdictions follow? Scroll down for Who Got the Work, and much more.
Lawmakers, Advocates Raise Automation’s Discrimination Potential
Automation has become increasingly common in the workplace as employers adopt artificial intelligence tools. Now, regulators, advocates and lawmakers are asking us to pay attention. Democratic lawmakers and workers' rights advocates are pointing to ways that automation tools could lead to discriminatory practices.
A recent survey by the management-side firm Littler Mendelson found hiring and recruiting account for the most common uses of data analytics and artificial intelligence. Nearly half of the employers surveyed said they use some kind of advanced data techniques to grow their workforce.
Beyond hiring, firms expect automation to be a disruptive force. “The unspoken assumption underlying these concerns about automation is that the workers at companies that automate are more vulnerable to economic dislocation than workers at companies that do not automate. But in reality, the outlook for workers at companies that do not automate may be far more bleak,” according to a recent Littler report.
What’s happening now? Eight Democratic U.S. lawmakers—including some members of the House—sent Amazon a letter recently seeking answers about the company’s use of facial recognition software. The letter was a follow-up to an earlier demand for information—apparently lawmakers were not satisfied with what they heard earlier this summer.
“We have serious concerns that this type of product has significant accuracy issues, places disproportionate burdens on communities of color, and could stifle Americans’ willingness to exercise their First Amendment rights in public,” the lawmakers wrote. This follows an American Civil Liberties Union claim that the software is biased against people of color.
The concerns are not new—but they could get a higher profile now. Three U.S. Senate Democrats—Kamala Harris, Patty Murray and Elizabeth Warren—previously sent a letter to the U.S. Equal Employment Opportunity Commission that asked the agency to address “a growing body of evidence” indicating that such technologies can amplify prejudice. The senators cited reports that these systems can “encode” biases already apparent in society and do not as readily identify women or people of color. Democrats’ control of the U.S. House in January could force a more public-facing debate over artificial intelligence and bias. So could the fact that Amazon is setting up an East Coast headquarters a short drive from the Capitol.
Amazon isn’t unaware of the issues. The company recently scrapped a data analytics tool that sifted through applications to rate candidates after the tool turned out to be biased against women. Amazon told Reuters the technology “was never used by Amazon recruiters to evaluate candidates.”
“There are significant legal risks,” Mark Girouard, an employment attorney at Minneapolis-based Nilan Johnson Lewis, told me in October. “These tools find patterns in the data and look for correlations in whatever measure of success you are looking at. They can find correlations that are statistically significant. Just because something has a statistical correlation, doesn’t mean it’s a good or lawful way to select talent.”
How regulators play a role. The EEOC has not been silent on the question of automation and the potential for discrimination. In 2016, the commission looked at the implications of the rise of big data in the workplace. Kelly Trindel, then the EEOC’s chief analyst in the Office of Research, Information and Planning, predicted some of the potential pitfalls for protected classes as companies increasingly use these programs to recruit and hire. It’s unclear how the agency’s leaders will continue to address these issues moving forward, particularly as more technology surfaces—such as the facial recognition software and other tools—and the EEOC awaits the arrival of Trump-appointed commissioners.
“The primary concern is that employers may not be thinking about big data algorithms in the same way that they’ve thought about more traditional selection devices and employment decision strategies in the past,” Trindel said at the EEOC meeting. “Many well-meaning employers wish to minimize the effect of individual decision-maker bias, and as such might feel better served by an algorithm that seems to maintain no such human imperfections. Employers must bear in mind that these algorithms are built on previous worker characteristics and outcomes.”
I’m Erin Mulvaney in Washington, covering labor and employment from the Swamp to Silicon Valley. Follow this weekly newsletter for the latest analysis and happenings. If you have a story idea, feedback or just want to say hi, I’m email@example.com and on Twitter @erinmulvaney.
A First Minimum Wage for App-Based Drivers
New York City became the first major U.S. city to establish a minimum pay standard for app-based drivers, which advocates say could “set the bar for contractor workers’ rights in America.” My colleague Dan Clark has more here at The New York Law Journal.
What happened? New York City officials voted this week to set the minimum pay rate for app-based drivers at $17.22 per hour after expenses, which is roughly comparable to the city’s $15 minimum wage rate. The pay rules go into effect Dec. 31 and will affect 70,000 families who are struggling to get by on the current rate of $11.90 per hour after expenses.
What’s behind this move? It follows a two-year campaign by the Independent Drivers Guild, which gathered 16,000 signatures. The union provides a timeline here. To force action, the guild filed a formal rulemaking petition, and the city responded that it planned to act on rules. In August, the city council passed legislation that required the Taxi and Limousine Commission to set minimum payments for drivers.
What’s the context here? Workers in the gig economy, including those for Uber and Lyft, are often classified as independent contractors, not employees. And that means fewer protections and benefits in many instances. Some cities and states have passed laws extending to these workers protections typically reserved for employees, such as a minimum wage and portable benefits.
In New York, the guild pushed Uber to add a tipping option and to provide benefits for app-based drivers, including vision and telemedicine health benefits, flu shots and death benefits. Some of these moves are the most aggressive in the country, but other jurisdictions have followed similar paths. The City of Seattle pushed first-in-the-nation legislation that would have allowed gig workers to unionize.
Who Got the Work
>> KPMG won the denial of class status in a gender discrimination suit in Manhattan federal district court. KPMG was represented by a team from Sidley Austin, led by Chicago-based partner Colleen Kenney. Sanford Heisler Sharp partner Kate Mueting, who co-chairs the firm’s national Title VII practice, represented the plaintiffs. My colleague Colby Hamilton in New York reports: “A proposed class action suit brought against accounting giant KPMG alleging systemic job discrimination against female employees was unable to clear the hurdles established by the U.S. Supreme Court in its landmark 2011 decision, Wal-Mart Stores v. Dukes.” Read the decision here.
>> The EEOC has returned to Washington federal district court to sue Walmart over discrimination claims that were earlier dismissed because the agency failed to properly lay out certain allegations. The agency filed a new lawsuit last week on behalf of two deaf employees in the D.C. area who claim their disability was not accommodated. Littler Mendelson represents Walmart.
Around the Water Cooler
>> The strategies male executives on Wall Street are adopting in response to the #MeToo movement—including avoiding social and mentorship opportunities with women to sidestep “unknown risks”—are creating new roadblocks and problems for women. [Time]
>> State and federal courts are grappling with the fallout from the California Supreme Court’s Dynamex ruling on worker classification. [Law.com]
>> Challenger, Gray & Christmas estimated the flu costs employers $21 billion in lost productivity. Employers are warned to take precautionary measures to reduce the chances of the flu spreading. [Newsday]
>> A San Francisco trial judge declined to address whether a California Supreme Court ruling should apply retroactively to the closely watched GrubHub case, which turns on whether the food delivery service misclassified its workers. But she acknowledged the outcome of the trial would be different under the current set of rules. [Law.com]
>> Courts and companies are still grappling with religious accommodations in the wake of a 2015 U.S. Supreme Court ruling, with mixed results across the circuits. The latest example is a case in which a Minnesota hospital beat an EEOC lawsuit filed on behalf of a nurse whose job offer was rescinded after she requested an accommodation to observe the Sabbath. [The National Law Journal]
>> In three lawsuits, former Tesla employees and contractors allege the company’s California factories are discriminatory toward African American workers. [The New York Times]
>> About a dozen student groups at Yale, Stanford and other schools will stop promoting jobs at firms that use mandatory arbitration. The move fits into a broader public push against mandatory arbitration, even as the courts have consistently bolstered the practice. [Vox]