The head of the US agency charged with enforcing civil rights in the workplace says AI-based software tools that closely track workers’ whereabouts, keystrokes, and productivity could also run afoul of discrimination laws.
Charlotte Burrows, chair of the EEOC, told The Associated Press that the agency is trying to educate employers and technology providers about their use of these monitoring tools, as well as of artificial intelligence tools that automate parts of evaluating job candidates.
And if employers aren’t careful (say, with strict schedule-monitoring algorithms that penalize breaks taken by pregnant women or by Muslims who take time to pray, or with flawed software that screens out graduates of women’s colleges or historically Black colleges), they can’t blame AI when the Equal Employment Opportunity Commission (EEOC) comes calling.
“I am not shy about using our enforcement power when it is necessary,” said Burrows. “We want to work with employers, but there’s certainly no exemption from civil rights laws just because you discriminate in some high-tech way.”
The federal agency released its latest set of guidelines Thursday on using automated systems in employment decisions such as whom to hire or promote. It explains how to interpret a key provision of the Civil Rights Act of 1964, known as Title VII, that prohibits employment discrimination based on race, color, national origin, religion, or sex, which includes bias against gay, lesbian, and transgender people.
An important example, Burrows said, involves widely used resume screeners and whether they can produce a biased result if they are built on biased data.
“What will happen is that the algorithm looks for patterns that reflect patterns it is already familiar with,” she said. “It will be trained on data that comes from your current employees. And if you have a non-diverse group of employees now, you will likely end up inadvertently screening out people who are nothing like your current employees.”
For example, Amazon abandoned its own resume-scanning tool for recruiting top talent after finding that it favored men for technical roles, in part because it was comparing job candidates against the company’s male-dominated technology workforce.
Other agencies, including the Justice Department, have sent out similar warnings over the past year, with earlier sets of guidelines about how some AI tools discriminate against people with disabilities and violate the Americans with Disabilities Act.
In some cases, the EEOC has taken action. In March, the operator of the technology job search website Dice.com settled with the agency to end an investigation into allegations that it allowed job posters to exclude Native American workers in favor of immigrants seeking work visas. To settle the case, its parent company, DHI Group, agreed to rewrite its software to “scrape” for discriminatory language such as “H-1Bs Only,” a reference to the type of work visa.
A large part of the EEOC’s work involves investigating complaints made by employees who believe they have been discriminated against. And while it is difficult for job applicants to know whether a biased hiring tool has led to them being denied a job, Burrows said there is “generally greater awareness” among workers of the tools that are increasingly being used to monitor their productivity.
These tools range from radio-frequency devices that track nurses, to minute-by-minute monitoring of the tightly controlled schedules of warehouse workers and delivery drivers, to logging of keystrokes or mouse clicks as many office workers began working from home during the pandemic. Some may violate civil rights laws, depending on how they are used.
Burrows noted that the National Labor Relations Board is also looking into these AI tools. The NLRB sent out a memo last year warning that overly intrusive monitoring and management tools could impair workers’ rights to communicate with each other about union activity or unsafe conditions.
“I think the best approach (and I’m not saying don’t use these tools; they’re not illegal per se) is to really think about what employers are trying to measure, and maybe measure that directly,” said Burrows. “If you’re trying to see whether the work is getting done, maybe just check whether the work is getting done.”