Genpact is using A.I. to flag employee dissatisfaction and tying leaders’ bonuses to the results

Genpact launched an employee engagement bot.

Employers are scrambling to infuse new generative A.I. tools into their people practices, but professional services firm Genpact has been building and iterating on A.I. tools for years.

For Genpact, A.I. isn’t just a cyclical fad. The technology has proved an essential tool in understanding and engaging its workforce, specifically in the use of three platforms: Amber, its employee engagement bot; Genome, an A.I.-enhanced internal learning and development portal; and Watercooler, an A.I. integration that helps employees forge informal connections companywide.  

Genpact’s people team sees a huge upside for generative A.I. in retention efforts. In 2018, it launched Amber as a test to help measure employee engagement and sentiment. 

It’s an approach that has replaced the traditional, and at times ineffective, employee engagement survey. Amber interacts with new hires eight to 10 times in their first six months and at least four times a year with all other employees. The chatbot records employee responses and tailors its questions based on past and present answers, becoming more “intelligent” as it accumulates data. At the end of each conversation, Amber generates a mood score reflecting the employee’s sentiments.

Piyush Mehta, Genpact’s chief human resources officer, says regular employee surveys were unproductive and laborious because they required human orchestration throughout the lengthy process, from outreach to analysis to action. The company also found it challenging to continuously measure mood and organizational sentiment with traditional surveys, putting it at risk of losing valued talent.

The chatbot provides real-time analytics, allowing Genpact to create policies that reflect the workforce’s immediate wants and needs, the firm says. The tech-first approach appears to be working: Amber now receives an average response rate of 82%, up from 77% in 2022.

Employee perception is a top priority for the company, which in 2020 tied 10% of the bonuses of its CEO and top 150 leaders to Amber’s mood scores. The mood score is determined by how employees rate their experience at Genpact on a five-point Likert scale. The company says it has consistently maintained an average mood score of 4.2 over the past year, just above the service industry average of 4.1.
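
Genpact hasn’t disclosed exactly how Amber turns conversations into a number, but the mood score described here, an average of five-point Likert ratings, can be illustrated with a minimal sketch. The data shape, names, and rounding below are assumptions for the example, not Genpact’s implementation.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class MoodResponse:
    employee_id: str
    rating: int  # 1-5 Likert rating of the employee's experience

def mood_score(responses: list[MoodResponse]) -> float:
    """Average the Likert ratings into a single mood score (e.g., the 4.2 cited above)."""
    if not responses:
        raise ValueError("No responses to score")
    return round(mean(r.rating for r in responses), 1)

# Example: three check-ins averaging to a 4.3 mood score
print(mood_score([MoodResponse("e1", 5), MoodResponse("e2", 4), MoodResponse("e3", 4)]))
```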

A.I. meets learning and development

Drawing on lessons from Amber, Genpact launched Genome, its A.I.-enhanced learning platform, in 2019 to upskill its workforce.

Shalini Modi, Genpact’s senior vice president of learning and employee experience, says her goal is to create a venture that’s almost “infinitely scalable and, at the same time, harvests the knowledge of the organization.”

Her team initially pinpointed 75 digital, professional, industry, and service-related skills necessary to compete in the future workforce. Leadership also used A.I. to identify around 700 experts internally to contribute their knowledge to courses now covering over 600 skills. The A.I. tool used organizational network analysis to find knowledge nodes within the company, detecting the people employees frequently turned to for insight on identified topics, says Modi. Together, the cohort of selected “gurus” drafted content on their areas of expertise.
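
The article doesn’t name the tool behind this analysis, but the core idea of organizational network analysis, ranking people by how often colleagues turn to them, can be sketched as a directed graph problem. Below is a rough illustration using the open-source networkx library; the edge list, names, and the choice of in-degree centrality are assumptions for the example.

```python
import networkx as nx

# Directed edge (a, b) means employee a turned to employee b for help on a topic.
consultations = [
    ("asha", "priya"), ("ben", "priya"), ("carla", "priya"),
    ("asha", "dev"), ("ben", "dev"),
    ("carla", "ben"),
]

graph = nx.DiGraph(consultations)

# In-degree centrality surfaces the "knowledge nodes": people many colleagues consult.
centrality = nx.in_degree_centrality(graph)
experts = sorted(centrality, key=centrality.get, reverse=True)[:2]
print(experts)  # e.g., ['priya', 'dev'] -- candidates to draft course content
```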

Courses, or “waves” as they’re called at Genpact, are voluntary, with some modules taking only 15 minutes to complete. Employees who Amber determines have low mood scores are flagged to management and encouraged to take the courses in hopes that the training will boost their engagement. “In 2021, we had about 32,000 people who did 70,000 waves. In 2022, we had 45,000 people who did almost 112,000 waves,” says Modi.
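
Genpact hasn’t described how that flagging works under the hood, but a minimal sketch of the routing step might look like the following, where the 3.0 cutoff and course titles are purely illustrative.

```python
LOW_MOOD_THRESHOLD = 3.0  # illustrative cutoff; Genpact's actual threshold isn't public

def flag_and_recommend(scores: dict[str, float], waves: list[str]) -> dict[str, list[str]]:
    """Map low-mood employees to suggested short 'waves' for a manager to follow up on."""
    return {emp: waves for emp, score in scores.items() if score < LOW_MOOD_THRESHOLD}

flagged = flag_and_recommend(
    {"e1": 4.4, "e2": 2.6, "e3": 3.8},
    ["Communicating with clients", "Data literacy basics"],
)
print(flagged)  # {'e2': ['Communicating with clients', 'Data literacy basics']}
```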

The company also employs generative A.I. to help handpicked experts create content for the learning platform. What once might have taken a few days to pull together now takes minutes. 

Other A.I. use cases include creating post-learning module quizzes and summarizing large bodies of research for workers to access. The A.I.-enhanced platform also highlights the top questions from employees and harvests the answers for easy reference.

The courses’ impact is clear. “People who learned even as little as 15 minutes a day—versus people who learned less than 15 minutes a month—experienced a three times difference in engagement and retention,” says Modi. 

Her team is experimenting with new ways to use A.I. for learning, such as A.I.-enabled voice clones. “We have a lot of senior leaders who are experts but don’t have time to create audio and video [content],” she tells Fortune. “Now we have A.I. tools where you can create their voice clones with small voice samples. We’ve tried it with some members on my team, and the clones have come out so well that you can pretty much write anything and generate audio or a podcast in no time.”

Gathering around the virtual watercooler

One of the rallying cries behind company calls for a return to the office is the watercooler moment, which builds relationships, boosts engagement, and fosters employer loyalty. Admittedly, these moments don’t lend themselves as easily to a remote model.

Many companies have experimented with virtual watercoolers, but most require leaders to connect employees. Genpact instead uses A.I. to foster these connections, assessing where weak ties exist within the organization and who could benefit from connecting with a colleague outside their immediate network. Its A.I.-assisted virtual watercooler also avoids matching team members who already work closely together.
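
The matching logic isn’t public, but the behavior described, suggesting pairs with weak or nonexistent ties while skipping close collaborators, maps naturally onto a weighted graph. Here is a rough sketch, again using networkx, with the interaction counts and the tie-strength threshold invented for the example.

```python
import itertools
import networkx as nx

# Weighted edges: how often two employees already interact (e.g., shared meetings, chats).
interactions = [
    ("asha", "ben", 12), ("asha", "carla", 1),
    ("ben", "dev", 0), ("carla", "dev", 9),
]

graph = nx.Graph()
graph.add_weighted_edges_from(interactions)

STRONG_TIE = 5  # illustrative: above this, people already work closely together

def watercooler_pairs(g: nx.Graph) -> list[tuple[str, str]]:
    """Suggest pairs with weak or no existing tie; skip close collaborators."""
    pairs = []
    for a, b in itertools.combinations(g.nodes, 2):
        weight = g.edges[a, b]["weight"] if g.has_edge(a, b) else 0
        if weight < STRONG_TIE:
            pairs.append((a, b))
    return pairs

print(watercooler_pairs(graph))  # e.g., [('asha', 'carla'), ('ben', 'dev'), ...]
```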

The ethical challenge

The use of A.I. in any capacity comes with ethical challenges. Modi says that when Genpact’s learning and development gurus initially used generative A.I. to pull information from the web for course content, some copied material directly from online sources, effectively plagiarizing it. Modi’s team responded by implementing a manual review process, encouraging people to flag content that looked plagiarized, but she quickly found the method confused employees, who struggled to tell original writing from copied text.

“Many people were marking somebody else’s work as plagiarized even though it was not,” says Modi. Her team landed on a new method that seems to be more effective: random sampling and using A.I. to detect plagiarism (perhaps the most meta solution yet). 

There are also legal considerations when using A.I. in talent strategy. Some markets where Genpact has a presence, such as the U.K. and Germany, don’t allow certain A.I.-enhanced people analysis tools owing to regulation. Some regions consider individual network structure data sensitive, even if users consent. Because of this, the company says it takes a more conservative approach in those areas and has chosen not to process data points for its Watercooler platform in the EU.

“Twenty [percent] to 25% of our efforts don’t go into project execution, but they go into speaking with legal and [information security] teams,” says Praful Tickoo, Genpact’s vice president of data, insights, and A.I. “And I don’t necessarily think it’s a bad thing, because we don’t want to be on the wrong side of ethics, law, or data privacy.”

Employees are apprehensive about how their data is used, too. In response, Genpact leaders go through a visibility process wherein they create a problem statement explaining what issue they are trying to solve with a given A.I. tool and disseminate it to internal stakeholders. 

“You have to explain to them what is the legitimate interest of what we are trying to do; what are the data points we will be using; where will that data be stored; who will be accessing and processing the data; how will output look and who will have access to the output,” he says. “Forget about doing analytics. We don’t even have access to data before we close all these cycles.”

This article is part of Fortune @ Work: Generative A.I.