AI interviews in hiring: What candidates actually want – and how to get it right


Filed under:
April 30, 2026

For a growing number of candidates, the first “interviewer” they meet isn’t a person. Six months ago, AI interviews were still being tested. Now, nearly two-thirds of US job seekers (63%) say they’ve already experienced one.

The Greenhouse 2026 Candidate AI Interview Report, based on a survey of 2,950 job seekers across the US, UK, Ireland, Germany and Australia, shows just how quickly that shift has happened.

But while adoption surged, something else didn’t keep pace: trust.

AI is now a visible part of the hiring experience, and candidates are reacting in real time to how it’s being used. And right now, there are clear signs that something isn’t working.

Here’s what the data shows across US and global candidates – and where hiring teams need to adjust.

Transparency is where things start to break down

Most candidates aren’t told AI is part of the process until they’re already in it.

Among US candidates who experienced AI evaluation, 70% say AI wasn’t clearly disclosed before their most recent AI interview. One in five only found out once the interview had started.

And that lack of clarity doesn’t stop there. Among candidates with relevant experience, 80% of US candidates say employer AI policies are vague, rare or completely absent. Only 18% say most employers have explicit, clear policies.

At the same time, transparency expectations are rising fast. 57% of US candidates believe companies should be legally required to disclose when AI is used to evaluate them, and another 18% support mandatory disclosure when AI heavily influences the final decision – meaning 75% want some form of legal requirement.

The message from candidates is pretty straightforward: if AI is part of the process, they want to know – and they want to understand how it’s being used.

Candidates are opting out of how AI is being used

It might seem like candidates just don’t want AI involved in hiring. But that’s not what the data shows.

Only 19% of US candidates want less AI involvement than there is today. The most common US preferences are the same level of AI with more transparency (21%) and more AI with stronger human oversight at key decision points (22%).

So what’s actually driving people to drop out?

38% of US candidates have already withdrawn from a hiring process because it included an AI interview, and another 12% say they would if required. The top reasons US candidates abandon a process are pre-recorded video interviews scored by AI with no human present (33%), companies failing to disclose how AI would be used (27%), AI monitoring during the process (26%) and a required AI-led interview (26%).

And then there’s what happens after the interview.

Of US candidates who completed an AI interview, 51% never received an outcome: 38% never heard back at all, and another 13% are still waiting.

That kind of silence sticks – especially after an experience that already feels impersonal.

The process AI is being built on top of was already broken. A 15-minute conversation with an AI where a candidate can show who they are is a better front door than a keyword-stuffed resume. But that’s not going to come from layering AI on top of a broken process. It’s going to come from building a better one.

– Daniel Chait, CEO and Co-Founder, Greenhouse

AI is carrying forward the same bias already in the system

One of the more difficult findings: US candidates report nearly identical rates of perceived bias from AI and human interviewers.

Age bias was flagged by 36% of US candidates for both AI and human interviewers. Race or ethnicity bias was flagged by 27% for both. Candidates also reported similar rates across gender, employment gaps, accent or way of speaking and other factors.

Only 21% of US candidates believe most employers are using AI responsibly and transparently.

This lines up with how AI is being introduced in many hiring processes. When it’s added on top of unclear or inconsistent evaluation criteria, the outcome doesn’t improve. The same issues show up, just faster and with less visibility. Structure has to come first. AI can only work as well as the system behind it.

AI isn’t fixing bias, it’s scaling it. Candidates can feel that, and when they walk away, it’s not just a missed hire, it’s a reputation problem that compounds.

– Sharawn Tipton, Chief People Officer, Greenhouse

This is shaping how candidates see your company

One piece of data that’s easy to overlook: when AI interviews go well, 38% of US candidates come away with a more positive impression of the employer. When they go poorly, 34% leave with a more negative one.

That’s a narrow margin – which means how you implement AI really matters.

41% of US candidates say AI has made job searching more stressful. Across global markets, candidates also report that AI has made it harder to stand out, made the process feel like gaming a system and reduced confidence in the job search experience.

That feeling doesn’t magically disappear once someone is hired. Candidates who feel informed and respected during your process tend to show up differently than those who don’t. You see it in offer acceptance, early engagement and whether people recommend your company to others.

The data makes it clear that many employers aren’t meeting expectations right now – and every hiring experience is shaping your reputation, whether you realize it or not.

What you can actually do about it

The good news is that US and global candidates are aligned on what would improve the experience. Most of the changes are straightforward – they come down to clearer communication and more intentional process design.

Start with transparency. 70% of US candidates say they weren’t clearly told AI was involved before their interview. Setting expectations early – in the job description, on your careers page or prep materials – removes uncertainty and helps candidates feel prepared. It also signals that your process is designed to be understood, not navigated blindly.

Clarity matters just as much. 39% of US candidates want a clear explanation of what AI is measuring. A simple overview of the skills or criteria being evaluated is enough. Without that context, candidates are left guessing how decisions are made.

Human involvement needs to be visible. 38% of US candidates want confirmation that a person reviews AI output before decisions are made. AI can support evaluation, but final decisions should be clearly owned by people. Making that explicit builds confidence in the process.

Flexibility also changes perception. 46% of US candidates want the option to request a human interview instead. Most won’t use it, but offering the option signals respect and gives candidates a sense of control.

And communication can’t fall through. 51% of US candidates who completed an AI interview never received an outcome. Automated updates and timely follow-ups are expected. When they’re missing, it becomes one of the most memorable parts of the experience.

Underneath all of this is structure. Without clear criteria and consistent evaluation, AI doesn’t improve hiring – it exposes where the process breaks. That foundation is what makes AI usable at scale.

The bottom line

AI adoption is moving quickly, and candidate expectations are keeping pace. Across the US and global markets, candidates are looking for clear disclosure, consistent evaluation, transparent communication and human accountability.

For teams already running structured hiring, this is an extension of what’s already in place. The gap between where many employers are today and what candidates expect isn’t huge – but it’s visible.

And candidates are paying attention.

FAQs

1. Do candidates actually want AI in the hiring process?
Yes – globally, candidates are open to AI as long as it’s transparent, clearly explained and supported by human decision-making at key points.

2. What’s causing candidates to drop out of AI-driven hiring processes?
The biggest drivers are lack of transparency, unclear evaluation criteria and limited human involvement – especially when AI feels like the sole decision-maker.

3. What can hiring teams do to build trust when using AI?
Be upfront about where AI is used, explain what’s being evaluated, keep humans visibly involved in decisions and communicate outcomes clearly and consistently.


See how candidates are experiencing AI interviews – and where trust is breaking down. Download the full Greenhouse 2026 Candidate AI Interview Report directly from this page.

Micah Gebreyes  

is a Senior Manager of Content Marketing at Greenhouse, where she develops and leads the content marketing strategy for Greenhouse blogs, social media and the thought leadership newsletter, Modern Recruiter. When she’s not working to bring the brand story to life, she enjoys spending time with her Pomeranian, Cashew. Keep the conversation growing with Micah on LinkedIn or through the Greenhouse LinkedIn, Twitter, Facebook and Instagram.