Unless you’ve been living under a rock, you’ve heard of OpenAI, ChatGPT, and Google Bard by now. While the debate has largely centered on the impact of generative AI on jobs and whether AI will replace humans, one dimension has received almost no attention: employee mental health.
The global mental health crisis is constantly in the news. A recent survey of 1,600 HR leaders and employees found that 64% of workers have struggled with their mental health. Of those who have struggled, 91% reported being less productive because of their mental health issues, and 45% lose more than 5 hours of productivity per week. If you lead a team, ask yourself: can you afford to have anyone on your team lose 5+ hours of productivity each week? The good news is that generative AI offers tremendous opportunities to improve employee mental health at scale.
1. Reduce the Barrier to Treatment for Employees. I’ve had many conversations with employees about mental health over the years as a corporate executive, and it was often the first time they had heard of the free sessions covered by the Employee Assistance Program (EAP) or the free meditation app subscriptions the company offers. Not knowing how to get started, combined with worries about cost, is a significant barrier to treatment. Incorporating generative AI into workplace tools, so that employees can ask a question and instantly receive EAP contact information, can significantly improve awareness of the mental healthcare already available to them. Employees may also feel “safer” asking a bot that answers all work-related questions instead of going to their manager or someone in HR. A 2020 study by Oracle and Workplace Intelligence found that 68% of those surveyed would prefer to talk to a robot over their manager about stress and anxiety at work.
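To make this concrete, here is a minimal sketch of what such an assistant could look like, assuming the OpenAI Python client; the model name, EAP details, and phone number are hypothetical placeholders that an HR team would replace with its own reviewed content.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical, HR-maintained facts the assistant is allowed to share.
EAP_FACTS = """
Our Employee Assistance Program (EAP) offers free, confidential counseling sessions.
Call 1-800-555-0100 or visit the internal benefits portal to get started.
A free premium subscription to a meditation app is included in our benefits package.
"""

def answer_benefits_question(question: str) -> str:
    """Answer a benefits question using only the HR-provided facts above."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "You are an internal HR benefits assistant. Answer only from "
                        "the facts below; if the answer is not there, direct the "
                        "employee to HR.\n" + EAP_FACTS},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer_benefits_question("Does the company cover therapy, and how do I get started?"))
```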
2. Diagnose and Reduce Severe Mental Health Issues. I realize this point will raise eyebrows. I am not suggesting that any of the current generative AI tools can or should replace mental healthcare professionals. However, these tools do know more about mental health than the average person, including typical symptoms and treatment options for various conditions. Given how overwhelmed the mental healthcare system is and how little mental health knowledge most people have, that alone can help someone realize they need to seek care before a condition becomes severe.
Not convinced? I tried a simple prompt with a couple of symptoms and asked ChatGPT what was wrong with me. Here’s the response I received.
I took it a step further and made the question less obvious.
Guardrails will need to be put in place for employee mental health use of this technology, including a way to immediately alert the proper authorities if someone shares concrete plans to harm themselves or others.
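As one illustration of what such a guardrail could look like, here is a minimal sketch that screens messages with the OpenAI moderation endpoint before any response is generated; the escalation hook is a hypothetical placeholder for whatever crisis-response process an employer actually has.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def escalate_to_crisis_team(message: str) -> None:
    # Hypothetical hook: a real deployment would page a trained responder
    # or crisis-line integration, not simply print a message.
    print("Escalating flagged message to the crisis response team.")

def screen_message(employee_message: str) -> bool:
    """Return True and escalate if the message is flagged as potentially harmful."""
    result = client.moderations.create(input=employee_message).results[0]
    if result.flagged:
        escalate_to_crisis_team(employee_message)
        return True
    return False
```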
The implications for mental healthcare professionals could be massive as well. Imagine having generative AI transcribe and summarize notes from each session. This could boost their productivity and potentially allow them to see more patients, since their time would no longer be spent on tedious manual work.
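A rough sketch of that workflow, assuming the OpenAI Python client with its Whisper transcription endpoint; the file name is a hypothetical placeholder, and any real use would require patient consent, secure storage, and clinician review of the output.

```python
from openai import OpenAI

client = OpenAI()

# 1. Transcribe the (consented, securely stored) session recording.
with open("session_recording.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio_file)

# 2. Turn the transcript into draft session notes for the clinician to review and edit.
summary = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "Summarize this therapy session transcript into concise draft notes: "
                    "presenting concerns, interventions discussed, and follow-up items."},
        {"role": "user", "content": transcript.text},
    ],
)
print(summary.choices[0].message.content)
```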
3. Improve Employee Mental Health through Task Replacement. The Surgeon General’s Framework for Workplace Mental Health and Well-Being emphasizes the connection between the well-being of workers and the health of organizations (figure below). There are two dimensions that generative AI can improve by completing manual tasks that used to be performed by employees: summarizing meeting notes, creating presentation slides, drafting emails, and scheduling events, to name a few. As employees gain more time for creative work and potentially reduce their overall work hours, their mental health can improve significantly.
Caveats and Considerations: One of the largest concerns about generative AI is the tendency of Large Language Models (LLMs) to hallucinate, that is, to make up seemingly factual information while sounding authoritative. This is a downside compared to a rules-based conversational agent in a healthcare setting. How are rules-based conversational agents different from generative AI tools? Everything a rules-based agent says is written and reviewed by human writers and clinicians, whereas generative AI can form completely new sentences. It will be critical for clinical practitioners to work closely with regulators and AI experts before a wide rollout in a work setting, and there must be a governing body overseeing the ethical use of AI in such applications.
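One practical middle ground between a fully rules-based agent and an unconstrained LLM is to restrict the model to clinician-reviewed content and have it decline anything outside it. The sketch below assumes the OpenAI Python client; the reviewed snippets and wording are hypothetical placeholders, not clinical guidance.

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical snippets that a clinical team has written and approved.
REVIEWED_CONTENT = """
- If you are in crisis in the US, call or text 988 to reach the Suicide & Crisis Lifeline.
- Persistent low mood, sleep changes, and loss of interest lasting more than two weeks
  are good reasons to talk to a licensed professional.
"""

def grounded_reply(question: str) -> str:
    """Answer only from reviewed content; otherwise decline and point to the EAP."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Use only the reviewed content below. If it does not cover the "
                        "question, say you cannot answer and suggest contacting the EAP "
                        "or a licensed professional.\n" + REVIEWED_CONTENT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```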
What creative solutions have you tried to improve the health and well-being of your employees? Do you see other ways generative AI can help with employee health?