Artificial intelligence (AI) is everywhere, including the employment sector. Tools like ChatGPT have made headlines, but countless other AI tools are used daily by companies and employers. As this technology grows in use, so do the regulations that govern it. Understanding these guidelines and the ever-shifting AI regulatory landscape is essential for businesses that want to protect themselves from potential liability.

For employers, recent guidance from the Equal Employment Opportunity Commission (EEOC) may affect how artificial intelligence is used. Using AI in the human resources context may require notice to employees, impose limits on use, and affect disability accommodations. When companies understand employment law, they can make informed decisions that protect them from liability and safeguard the rights of their employees.

What Is Artificial Intelligence Software?

The use of artificial intelligence continues to expand throughout every business sector. AI is a computer application capable of intelligent behavior such as learning, problem-solving, and reasoning. It performs tasks that ordinarily require human input or intelligence, from data entry and analysis to simple tasks like timekeeping.

AI software is helpful because it can streamline a business’s operations and reduce the labor required for specific tasks. Businesses of every type are beginning to use this software, from manufacturing to information technology to human resources. With this new use come new laws and regulations that govern how artificial intelligence software is used.

How Employers Are Using Artificial Intelligence in Hiring

Employers are also jumping on the AI bandwagon for help with hiring. Companies increasingly use machine learning and AI software to make employment decisions, such as:

  • Who to hire for a particular job
  • Which employee deserves a promotion
  • Which employee or employees to terminate for cause or during layoffs
  • How to restructure employees to optimize efficiency and reduce costs

These hiring steps, which typically required significant labor hours from other employees or supervisors, can now be streamlined, with AI performing much of the analysis itself.

AI Tools Used by Employers

Companies use countless artificial intelligence tools, and the number grows daily. Some of the most common AI tools used in hiring include:

  • Resume Scanners, which automatically analyze resumes to help identify promising candidates for a particular job. Scanners parse the text of the resume to search for key terms, experience, and other desired attributes that make one candidate more qualified than another.
  • Employee Monitoring Software, which continually tracks employee performance metrics. These metrics may include data entry speed and error rates, and the software often adds timekeeping and remote monitoring capabilities.
  • Chatbots and Assistants. Nearly every website now has a virtual chat to help employees or potential customers. Many of these chat features are now handled by AI rather than a live person. In the human resources context, a chatbot may help recruit potential employees, handle initial employee complaints, and much more.
  • Video Interviewing Software. Some AI is even used during remote video interviews. This technology can evaluate speech patterns and facial expressions to analyze a potential employee, providing additional information about that candidate while the interview is conducted.
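The keyword-matching step a resume scanner performs can be sketched roughly as follows. This is a minimal illustration, not any vendor’s actual algorithm; the keyword list, weights, and threshold are hypothetical assumptions.

```python
# Simplified sketch of keyword-based resume screening.
# The keywords, weights, and threshold below are illustrative
# assumptions, not any real vendor's scoring model.

KEYWORDS = {"python": 3, "sql": 2, "project management": 2, "communication": 1}
THRESHOLD = 4  # minimum score to flag a resume as "promising"

def score_resume(text: str) -> int:
    """Sum the weights of desired terms found in the resume text."""
    text = text.lower()
    return sum(weight for term, weight in KEYWORDS.items() if term in text)

def screen(resumes: dict[str, str]) -> list[str]:
    """Return candidate names whose resumes meet the score threshold."""
    return [name for name, text in resumes.items()
            if score_resume(text) >= THRESHOLD]
```

Even a simple filter like this can produce a disparate impact if the chosen keywords correlate with a protected characteristic, which is one reason the audits discussed in this article matter.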

These tools generally rely on computerized data analysis to help a company make an employment decision. While these tools can be helpful, they may also have implications for compliance with employee protection laws such as those from the EEOC or New York City’s new law, discussed below. Employers may gain many benefits from artificial intelligence, but they must do so in compliance with applicable state and federal employment law.

New York City’s New Ordinance

New York City recently enacted Local Law 144, which regulates employers’ use of “automated employment decision tools.” The law clarifies which types of AI technology it covers and requires an employer to:

  • Ensure the tool has undergone a bias audit within one year before its use
  • Publish a summary of the audit results
  • Provide notice to employees and applicants of the employer’s AI use
  • Provide a procedure to request an alternative selection process

As AI use increases, more local governments will likely implement protections that an employer must consider when using AI.

Equal Employment Opportunity Commission Guidance on AI

The U.S. Equal Employment Opportunity Commission (EEOC) enforces federal laws that protect employees from discrimination. These laws prohibit discrimination based on a person’s color, race, gender, religion, sexual orientation, national origin, age, and disability. Most employers with 15 or more employees are subject to EEOC enforcement.

The EEOC’s First Guidance Document

Artificial intelligence technology may implicate many of the federal protections the EEOC enforces. To help employers navigate this shifting landscape, the EEOC issued its first guidance document on AI in the employment context. The guidance helps ensure that AI selection procedures do not adversely affect classes protected under Title VII of the Civil Rights Act of 1964.

An artificial intelligence or machine learning tool may be discriminatory if it has a “substantial” disproportionate effect on a protected class and that effect is not job-related and consistent with business necessity. For example, an AI tool that awards higher ratings to applicants of a certain gender is likely discriminatory unless gender is somehow relevant to the position.
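The EEOC measures disproportionate effect by comparing selection rates across groups, and its longstanding rule of thumb is the “four-fifths rule”: a group whose selection rate falls below 80% of the highest group’s rate may indicate adverse impact. A self-audit calculation might look roughly like this sketch, where the group labels and counts are hypothetical:

```python
# Simplified adverse-impact check using the EEOC's four-fifths
# rule of thumb. Group labels and counts are hypothetical
# illustrations, not real audit data.

def impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Compare each group's selection rate to the highest group's rate.

    groups maps a label to (selected, applicants). A ratio below 0.8
    may indicate adverse impact under the four-fifths rule.
    """
    rates = {g: s / a for g, (s, a) in groups.items()}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical audit data: 48 of 80 applicants from group X were
# selected (rate 0.60), versus 12 of 40 from group Y (rate 0.30).
ratios = impact_ratios({"group_x": (48, 80), "group_y": (12, 40)})
flagged = [g for g, r in ratios.items() if r < 0.8]
```

Here group Y’s rate is only half of group X’s, so the four-fifths threshold would flag it for closer, individualized review; the rule is a screening heuristic, not a legal conclusion.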

As part of its guidance, the EEOC noted three important points:

  • Third-Party Software. The EEOC’s first guidance states that employers could be liable for third-party software’s impact on potential employees, even if the employer did not develop that software. Legal liability may arise from the administration of the software or even the use of software that discriminates in violation of federal law.
  • Vendor Promises. The EEOC encourages every employer to ask an artificial intelligence or machine learning software vendor about what steps they took to avoid adverse impacts that violate federal law. While the EEOC encourages companies to ask, it also made clear that a vendor’s assurance does not necessarily shield the employer from liability. An employer could be liable even if the vendor was mistaken in its technology assessment.
  • Self-Audits. A self-audit helps determine what impact, if any, the AI or machine learning software has on certain groups. It requires access to the software’s underlying data and should be performed when the software is first adopted and periodically during its use. A self-audit demonstrates responsibility and helps avoid potential liability for continuing to discriminate, even accidentally, against protected classes.

The EEOC’s Second Guidance Document

The EEOC issued a second guidance document addressing the impact artificial intelligence and machine learning software have on individuals with disabilities, who are protected by the Americans with Disabilities Act (ADA). The ADA calls for a different impact analysis than Title VII of the Civil Rights Act. Because disabilities and the ADA’s reasonable accommodation requirements are highly individualized, the guidance notes, a statistical finding about group impacts is insufficient to determine whether the technology adversely affects those with a disability.

The EEOC determined that employers must implement an individualized process to determine whether the AI software improperly “screened out” a potential candidate with a disability and if it did so unfairly. The EEOC’s guidance also recommends that employers provide an in-depth notice of what AI or machine learning tools they use during the hiring process. It also recommends that employers offer alternative hiring or employment monitoring processes that better handle the accommodation of disabilities during hiring and employment.

If a company utilizes a test or other decision-making tool to analyze potential applicants, it must offer reasonable accommodations for those with a disability. Under the ADA, a reasonable accommodation is a change in how the applicant may interview or apply for a job to adapt to the applicant’s disability. This may include modifications to the testing format, exceptions to certain policies, or a longer time in which to take a test. Nearly any change may be an accommodation, so long as the accommodation is reasonable.

The EEOC guidance also warns employers, just like in the first guidance, that reliance on third-party vendor assurances does not protect the employer from liability. If an AI vendor assures a company that their technology is “bias-free,” and the technology later discriminates against those with a protected disability, the employer may still be liable for that discrimination. Reliance on third-party warranties or representations is not an absolute protection against ADA violations. Employers should conduct self-audits to help prevent discrimination from occurring in the first place.

How To Comply With EEOC Guidelines When Using Artificial Intelligence

Artificial intelligence offers numerous advantages to companies when hiring or managing their employees. While risks and limitations exist, this does not mean a company should avoid AI altogether. Instead, an employer should follow EEOC guidance and consult a knowledgeable attorney who can help with compliance. Employees are protected not only under federal law but also under many state laws. Navigating this complex sea of regulations can seem daunting without the right legal assistance.

The employment law attorneys at Morris & Dewett Injury Lawyers understand compliance and how to protect employees whose rights were violated. Our team stands ready to help employers and employees with AI software discrimination litigation or compliance. Contact us today for a free consultation about your case.

Morris & Dewett provides this information to the public for general education and interest. The firm does not represent clients in every topic discussed in legal & injury news. The information is curated and produced based on trends in law, governance, and society to present relevant issues to the general public. Every effort is made to provide accurate information. Do not make any decision solely based on the information provided; please seek relevant counsel for each topic area. Consult an attorney before making any legal decision, consult a doctor before making any medical decision, and consult a financial advisor before making any fiscal decision. If you have any legal needs that we can assist you with, please do not hesitate to contact us.
