Conversations about artificial intelligence (AI) tend to take two routes: Either we end up in a dystopian future where we’re ruled by robots or an idyllic scene where technology does all the work and we’re free to sit around eating bonbons. But rather than going to either extreme, we see the value in having more nuanced discussions about this technology. And we know we’re not alone. Our recent survey revealed significant interest in AI, with 62% of hiring professionals believing AI can help them hire the best candidate and 43% of candidates considering or already using generative AI in their job applications.
In a follow-up conversation after Cannes Lions 2023, Greenhouse hosted a LinkedIn Live to explore the role of AI in hiring and diversity, equity and inclusion (DE&I) efforts. Greenhouse CMO Carin van Vuuren moderated a discussion with Mona Khalil, Data Science Manager at Greenhouse; Tony Hobley, Chief DEI Officer at Omnicom Precision Marketing Group; and Dr. Stacie CC Graham, Global Director, Racial Equity Programme at WPP.
We’ll share a few highlights from the discussion in this post, or you can watch the recording of the entire conversation here.
Remember: AI is much more than ChatGPT
Generative AI tools like ChatGPT have recently gotten a lot of time in the spotlight, but the panelists reminded us that AI technology has been around for quite a while. Both Dr. Graham and Tony said their companies have been using other forms of AI — like machine learning and neural networks — for years. Mona explained that the Greenhouse approach to AI within the Greenhouse product has been to create assistive tools that surface insights, make predictions on aggregate data and offer suggestions. “In general, at Greenhouse we want to carefully and responsibly approach the use of machine learning and AI — we’re not going to make decisions for you,” said Mona.
The downsides of AI for hiring and job search
While we’re seeing wide interest and adoption of AI among both HR professionals and job seekers, it’s important to consider the downsides of this technology. “These models are only as good as the data they’re based on,” said Tony. He flagged the risk of bias getting baked into AI if it’s based on data sets that already contain bias. Drilling down further into this example, Dr. Graham said, “If we look at what made someone successful 20 years ago, that will have excluded a lot of people who may have been caretakers, who may have had physical conditions that kept them from being able to be in the office for 10–12 hours a day. That’s already in the data, so if machine learning is based on what made someone successful in the past, it’s likely going to be based on things that many of us would advocate against today.”
Mona called out the fact that as more people turn to generative AI like ChatGPT, the variability and individual expression of written content is reduced: “If you just take the recommendations of a generative AI system like ChatGPT at face value, the content out there is going to look the same.” This can make it harder for both job seekers and companies to differentiate themselves. Mona recommended using content from generative AI as a template or starting point, but scrutinizing each piece of information to make sure it’s exactly what you want to communicate, whether as a candidate or an employer.
The promise of AI for companies and candidates
While there are clearly downsides to using AI, the technology also holds a lot of promise. Our recent survey revealed this tension: Close to half of candidates are currently using or considering using generative AI in their job search, yet almost one-third (31%) worry that a company might reject their application if they use it.
The panelists also shared cautious curiosity about the promise of AI. Dr. Graham mentioned research from the London School of Economics and Political Science that shows AI was able to surface a more diverse pool of candidates than human recruiters. While Dr. Graham is still skeptical, she said, “What we know is if there’s only one woman in a pool of four, then she essentially has zero chance of being hired, so if we’re seeing that with AI the initial pool is more diverse, we have greater opportunity for people of different backgrounds to make it to the end and eventually be hired.”
From the candidate’s perspective, Tony sees the potential value in using generative AI to better understand what a specific company is looking for and research the people you’re interviewing with. Mona agreed: “Leveraging generative AI, you’re in essence synthesizing huge amounts of information you’d otherwise have to spend hours searching for on Google, like common interview questions, common take-home tests or any keywords you might have missed in your resume.”
“I wouldn’t have generative AI write your resume, but you can have it help you find little things you might have missed, little areas of potential optimization.” – Mona Khalil, Data Science Manager at Greenhouse
A few key takeaways
To close out the discussion, our panelists shared a few key points for you to keep in mind. “Any type of AI tool you use should include comprehensive bias monitoring as part of the solution, and you should actively be monitoring the impact of your use of that tool,” said Mona. Dr. Graham encouraged some cautious experimentation: “Don’t be afraid to get it wrong, but have processes in place that catch it quickly so you can correct it.” And Tony emphasized the importance of welcoming diverse voices into your AI decision-making process: “Make sure you have the right people around the table in the process with you — people that are not normally involved in this process to be a sounding board.”
Explore the full conversation to hear more from the panelists, including their key insights and takeaways from Cannes Lions 2023.