It’s official: use of AI in the workplace is now the norm. In fact, research says that 77% of companies are either using or exploring the use of AI in their businesses.
But what about using AI for tasks that require emotional intelligence, an arguably human instinct? According to experts, AI can be trained to recognise and respond to emotions in people, using sentiment analysis and other technologies. It has been found to change how customer-facing businesses operate, as well as how HR and wellbeing in the workplace are handled.
More than that, by 2026, the combined market size of conversational AI and emotion detection is predicted to grow to more than $55 billion.
Workplaces around the world are using AI for emotional intelligence purposes. For instance, data analysis firm Gong uses various AI tools to index customer emails and video calls in order to ensure its output contains the most empathetic and persuasive language. Additionally, biomedical startup BenchSci works with a conversational intelligence platform that uses AI to analyse emails and optimise customer service responses by assessing the emotional state of the customer.
Jon Morgan, CEO of consulting firm Venture Smarter, tells AllBright that he uses an AI tool “specifically for its emotional intelligence capabilities”.
“When an employee posts something, whether it's a complaint or celebrating an achievement, [the AI tool] analyses the language and tone. It can then identify if someone might be feeling stressed, overwhelmed, or even excited,” he explains.
“Based on that analysis, [the tool] can take a few different actions. If it picks up on negative emotions, it might send the employee some resources, like links to our wellness programme, or articles on managing stress. It can also nudge them to connect with a colleague or HR for further support.
“On the other hand, if [the tool] detects positive emotions, like celebrating a win, it can send a public or private message congratulating the employee and highlighting their accomplishment to the team.”
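The workflow Morgan describes — score the tone of a post, then route it to a supportive action — can be sketched in a few lines. This is a purely illustrative example: the keyword lists, thresholds and response messages are hypothetical stand-ins, not the logic of any real product, which would typically use a trained sentiment model rather than a word list.

```python
# Hypothetical sketch of a sentiment-triggered wellbeing workflow.
# The cue lists and responses are illustrative, not a real product's logic.

NEGATIVE_CUES = {"stressed", "overwhelmed", "exhausted", "worried", "frustrated"}
POSITIVE_CUES = {"excited", "proud", "won", "achieved", "celebrating"}

def score_sentiment(post: str) -> int:
    """Crude lexicon-based score: +1 per positive cue, -1 per negative cue."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & POSITIVE_CUES) - len(words & NEGATIVE_CUES)

def respond(post: str) -> str:
    """Route the post to a supportive action based on its sentiment score."""
    score = score_sentiment(post)
    if score < 0:
        return "Send wellness-programme links and suggest a chat with HR."
    if score > 0:
        return "Send congratulations and highlight the win to the team."
    return "No action."

print(respond("Feeling really overwhelmed by this deadline"))
# -> Send wellness-programme links and suggest a chat with HR.
```

Even in this toy form, the design point is visible: the system only ever *suggests* resources or a conversation, leaving judgement and follow-up to humans.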
Morgan is adamant that the use of this tool “fosters a more positive and supportive work environment”.
Additionally, Ivan Harding, CEO of HR technology firm Applaud, tells AllBright he has used AI to implement two wellbeing assistants into everyday business practice: MindMate, which aims to support employees with mental health issues at work, and BiasBot, which navigates gender and other ethics biases.
“Our MindMate bot can take confidential worries from staff and provide advice on how to cope with stress at work, difficulties with team members, work life balance and other factors that can negatively impact mental health,” he says. “Remote working has increased stress for many employees who are often reticent to speak to a human being about their worries; introducing a confidential AI assistant gives people a private outlet.” He describes the BiasBot as “a great little tool to help everyone in the business navigate bias in the workplace”.
“A common example we use is to take job vacancies from companies' websites and parse them through the bias checker,” he says. “The BiasBot can also be useful when composing internal company communications, performance reviews or any important email that needs to be fair and impartial.”
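The job-advert example Harding gives — parsing a vacancy through a bias checker — can be illustrated with a simple rule-based flagger. The word lists below are a small sample of commonly cited gender-coded terms, chosen for illustration; a real checker such as BiasBot would use a far larger vocabulary and more sophisticated analysis.

```python
# Illustrative sketch of a rule-based bias checker for job adverts.
# The word lists are a small hypothetical sample, not any real product's vocabulary.

MASCULINE_CODED = {"aggressive", "competitive", "dominant", "ninja", "rockstar"}
FEMININE_CODED = {"supportive", "nurturing", "collaborative", "sensitive"}

def flag_coded_language(text: str) -> dict:
    """Return any gender-coded words found in the text, grouped by category."""
    words = {w.strip(".,;:!?").lower() for w in text.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

advert = "We want an aggressive, competitive self-starter to join a supportive team."
print(flag_coded_language(advert))
# -> {'masculine': ['aggressive', 'competitive'], 'feminine': ['supportive']}
```

The same function works on any text, which mirrors Harding’s point that the tool is also useful for internal communications, performance reviews and important emails.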
So implementing AI into workplace processes that may require emotional intelligence can certainly have positive and innovative results. “AI can indeed help individuals become more self-aware and manage their relationships better at work,” says positive psychology coach Elle Mace. “Through data analysis and feedback mechanisms, AI systems can provide individuals with insights into their communication styles, emotional triggers, and interpersonal dynamics.
“This information can empower employees to make more informed decisions and navigate workplace relationships more effectively.”
Life coach and psychology consultant Bayu Prihandito adds that the opportunity to feed AI with large amounts of workplace data will enable it to “identify patterns in employee behaviour that might have gone unnoticed”. “This can provide insights to managers to make better decisions about their team dynamics and how they handle conflicts,” he says.
That said, research has also found that using AI to assess and optimise emotional intelligence in the workplace could lead to issues and inconsistencies that can’t be ignored.
For instance, one study found that AI systems responsible for emotional analysis assign more negative emotions to people of some ethnicities than others. This is problematic on many levels and could affect individuals’ career progression and treatment in the workplace.
“AI may struggle to interpret non-verbal cues, subtle emotional expressions, or contextual nuances, leading to misinterpretations or incomplete understandings of employees' emotional states,” Elle explains.
Bayu adds that the more subtle human emotions such as apprehension, frustration or contentment might be missed by AI. “It also lacks the ability to appreciate the context fully, which is often key for understanding the full scope of our emotions,” he says.
Human interaction, understanding and analysis must be used in conjunction with AI if it is to aid fairly with emotional intelligence issues in the workplace. AI is not infallible, and is just as capable of bias as humans are.
According to Harvard Business Review, AI technology isn’t yet advanced enough to perceive cultural differences when it analyses emotions, making conclusions murky and mired in more problematic biases.
After all, the AI industry itself is predominantly white and male. One report found that 80% of AI professors are men. This inequality can lead to trickle-down bias when the technology is implemented in workplaces.
“The lack of diversity in the AI development community can indeed pose challenges in understanding different people's emotions in the workplace,” Elle says. “Biases in data collection, algorithm design, and decision-making processes may inadvertently perpetuate existing inequalities or overlook the needs and experiences of underrepresented groups.”
Bayu adds that these problems also call into question the effectiveness of using AI in the workplace. “Without supervision, this [lack of diversity] could potentially result in misunderstandings in cultural nuances making AI less effective and possibly more conflicting in managing workplace relationships.”
Use of AI in the workplace isn’t going anywhere. The technology is being used in exciting ways to look at wellbeing, emotional needs and bias at work. But we must bear in mind its own biases, and keep human oversight in place so that employees don’t suffer from AI’s shortcomings.
“AI should be used as a tool to support, not replace, human judgement in emotional intelligence needs,” Bayu says, suggesting that the technology can be a good starting point for projects requiring emotional intelligence.
“One practical approach would be to use AI for preliminary data analysis to identify potential issues or areas for improvement, while final decisions and actions should be supervised by professionals who can interpret AI outputs with a nuanced understanding of human emotions in a workplace environment.”
Elle agrees that AI should be used to “supplement rather than replace human judgement and intuition”.
“By striking a balance between AI-driven insights and human judgement, organisations can maximise the benefits of technology while preserving the human touch in workplace interactions,” she says.