
Citing its ability to “turbocharge fraud” and “automate discrimination,” four federal agencies pledged Tuesday to more strictly monitor and regulate the use of artificial intelligence and automated systems. The implications for the healthcare sector, including nursing home providers and tech vendors, are wide-ranging.

Leaders from the U.S. Equal Employment Opportunity Commission, Department of Justice, Consumer Financial Protection Bureau and Federal Trade Commission issued a joint statement. In it, the agencies pledged to focus on fairness, equality, and justice as emerging automated systems and AI become more commonplace, impacting civil rights, fair competition, consumer protection and equal opportunity. 

The agencies promised to enforce their respective laws and regulations to promote responsible innovation in automated systems and to continue monitoring AI’s development and deployment to curb potentially harmful uses.

Employment discrimination, EHR accuracy in focus

For nursing home and senior living operators, the potential dangers of AI include both employment discrimination and inaccuracies in electronic health records. The federal crackdown could have implications for both.

Tuesday’s statement continues federal efforts to curb employment discrimination. Last year, the U.S. Equal Employment Opportunity Commission and the U.S. Department of Justice released documents regarding disability discrimination when employers use AI and other software tools to make employment decisions.


Such AI-based tests and software increasingly are used by senior living and nursing home employers to hire new employees, monitor performance and determine wages or promotions. This use of algorithms or AI could result in unlawful discrimination against people with disabilities in violation of the Americans with Disabilities Act.

Some experts also believe the use of these new technologies, such as Microsoft and Epic’s recently announced partnership to bring the Azure OpenAI Service into Epic’s EHR software, could lead to inaccuracies in electronic health records. Transcription errors and alert fatigue, which can result from the use of such tools, make it more difficult for nursing homes and senior living facilities to offer their residents the best possible care and services.

Federal agencies are clearly keeping watch on these technology-driven dangers.

“We already see how AI tools can turbocharge fraud and automate discrimination, and we won’t hesitate to use the full scope of our legal authorities to protect Americans from these threats,” said FTC Chair Lina M. Khan in a statement. “Technological advances can deliver critical innovation — but claims of innovation must not be cover for lawbreaking.”

Read more technology stories here.