Wondering about the potential impact of artificial intelligence (AI) on your hiring process? You’re not alone. We’re seeing curiosity from both job seekers and employers – our recent survey revealed close to half of candidates are already using AI for job applications or considering using it, and 48% of HR managers are considering it.
At the same time, if your organization is prioritizing diversity, equity and inclusion (DE&I), you can’t ignore the potential pitfalls of relying on this technology. Because it’s trained on past decisions, AI can perpetuate existing biases. This is likely why over half of candidates from historically underrepresented groups said they’d hesitate to apply to a company that uses AI to help make hiring decisions.
Can you take advantage of AI without negatively impacting your inclusive hiring efforts? To explore this question, Greenhouse hosted the “Inclusive hiring in an AI world” webinar. Greenhouse’s Head of Data Science Mona Khalil moderated a discussion between Jackye Clayton, VP of Talent Acquisition and DE&I at Textio, and Samantha Lawrence, Senior Vice President of People Strategy at Hired.
Read on for a few highlights from their discussion, or you can watch the recording of their conversation any time, available on demand here.
What exactly is AI?
To kick off the conversation, Mona started with a definition:
AI is an automated system that performs a task you’d typically expect some human intelligence to perform.
While there’s been a lot of buzz around generative AI like ChatGPT, other forms of AI like machine learning have been around for quite some time. You’ve probably encountered machine learning in the form of recommendations from news sites, streaming services or even your email provider when it automatically fills out an address or suggests how to finish a sentence.
Should you be worried about AI in the workplace?
There’s no consensus when it comes to HR leaders’ stance on AI in the workplace. Mona shared results from the Greenhouse survey revealing that HR leaders are almost evenly split on whether AI will help them achieve their DEIB goals – 33% strongly or somewhat agree that AI will help, while 31% strongly or somewhat disagree.
What are some of the main causes for concern? Samantha said,
We need to make sure AI is not the sole determining factor in hiring or firing decisions or the only tool being used when evaluating promotion criteria.
She mentioned the recent legislation in New York City, which set guidelines and requires employers to disclose their use of AI in the hiring process.
Jackye described how, soon after the launch of ChatGPT, her inbox was flooded with copycat emails that were very obviously AI-generated. “The output is really dependent on the questions you are asking, so if you’re not asking the right question, you’re not getting the right output,” she said. This can be problematic if people are using AI to represent your company in any capacity. Pointing out another shortcoming of the technology, Jackye said, “AI is looking backwards rather than to the future, so it may not be up to date with what you’re doing inside your org.”
How can you use AI today… and what might you be able to do in the future?
While AI has potential drawbacks, this technology holds plenty of promise, too – after all, that’s why it’s such a compelling topic! Jackye said that Textio – a tool that flags problematic language in job descriptions and other written content – has been using a large language model internally to identify biased language. If you’re using a generative AI tool, you can run the text through Textio to identify biased language and get ideas on how to reword it. Looking ahead, Jackye said she sees the product expanding into additional forms of communication to ensure they’re anti-racist and more inclusive, while also adding micro-learning elements to the experience.
Similarly, Samantha said Hired has been using AI for the past ten years. As a platform that connects active job seekers with hiring teams that are looking to fill tech and sales roles, Hired is powered by an AI matching process. Samantha also mentioned the prevalence of AI tools that are available on the market to assist with tasks like candidate outreach and sourcing.
What are some ways to navigate both AI and inclusive hiring?
One of the reasons HR professionals are intrigued by AI is that it’s so much faster than performing a task manually. But Jackye cautions that faster is not always better – especially if you’re trying to prioritize inclusive hiring:
When we’re trying to be inclusive, it means we need to pull back from the idea of speed and the fastest thing to do.
For example, AI will try to match a resume to what’s written in a job description. But job descriptions inherently contain bias, so this bias will be reflected in the candidates an AI tool matches to them. If you use a tool that surfaces candidates similar to the ones you’ve viewed or engaged with, Jackye recommended matching on similar interests or projects instead to diversify your pool of candidates.
Samantha mentioned a few of Hired’s AI features that can help reduce bias, including a diversity goals filter that surfaces underrepresented talent and salary bias alerts that ensure companies are offering consistent salaries for the same roles.
And of course, there are plenty of ways to make hiring more inclusive without AI. Samantha said she’s a big fan of using structured interviews and rubrics, making sure interview panels are diverse and creating inclusive workplace policies like robust leave, flexible hours and remote work.
Whether you’re embracing AI wholeheartedly, avoiding it completely or somewhere in between, the message from the panelists is clear: While certain tools can boost your speed and efficiency, there’s no shortcut to inclusive hiring.
Hear more from the panelists about navigating AI and inclusive hiring – including their recommendations for vetting and evaluating AI tools. Watch the webinar here.