Applicant tracking systems (ATS) have been a game-changer in the recruiting and human resources world, but how and why do they work? Basically, once a candidate submits their resume, an AI system reads it and matches it against the job description. If there’s a high match, the candidate is shown to the hiring authority as a good fit. In theory, this saves a lot of time; in practice, many good candidates fall through the cracks because they aren’t using the right keywords for the AI.
So, where does that leave you if you’re not in the AI talent wars and can’t get anywhere with recruiters because of the barriers AI puts between your resume and a human reader?
So, what’s being done about it? Are there any more problems people should be aware of? How can candidates beat applicant tracking systems? Keep reading to find out.
Employers and employees are using AI, and that’s a problem, as Sarah E. Needleman writes in a conversation with Rod Samra. This Business Insider article examines the problems that have surfaced as more employers have begun using AI to scan resumes.
So, you might be wondering, “How does an applicant tracking system work?”
Applicant tracking systems scan resumes for keywords and match them against the job description. For recruiters and hiring authorities, this can yield better results by saving time: only the CVs with the required skills are surfaced.
This means resumes that don’t match the AI’s language won’t get past the ATS, no matter how qualified the candidate is. A cited example is that the ATS might be looking for “leadership,” while the resume says “I’ve led a team of 50 people.” The candidate might’ve been a good fit for the position, but they would have been rejected.
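The mismatch described above is easier to see in code. Real ATS products are proprietary, so the following is only a hypothetical sketch of naive exact-match keyword screening, illustrating why a resume that says "led a team" can fail a screen for "leadership":

```python
# Hypothetical sketch of naive ATS-style keyword screening.
# Assumes exact substring matching, which is the failure mode described above.

def keyword_score(resume_text: str, required_keywords: list[str]) -> float:
    """Return the fraction of required keywords found verbatim in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in required_keywords if kw.lower() in text)
    return hits / len(required_keywords)

job_keywords = ["leadership", "project management", "python"]
resume = "I've led a team of 50 people and managed several Python projects."

score = keyword_score(resume, job_keywords)
# "leadership" and "project management" never appear verbatim,
# so only "python" matches: the score is 1/3 despite relevant experience.
```

A system like this has no notion that "led a team of 50 people" implies leadership, so a qualified candidate scores poorly and may be auto-rejected before any human sees the resume.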
That initial rejection is part of the problem, since nobody is actually reviewing the rejected CVs. Sometimes people get rejection emails mere minutes or hours after submitting their resumes, which shouldn’t be possible if someone were reading them. This has led to a lawsuit currently under review.
“AI is a double-edged sword. It can reduce biases by standardizing the résumé-review process, but it can also amplify biases if algorithms are poorly designed or tested.”
Another big problem is bias in ATS. Even when resumes don’t explicitly indicate gender, ATSs often favor male candidates over female candidates, and similar bias can show up across other protected classes.
In a way, it could be argued that all of this is AI slop, except it’s the contrary: the tools are doing too good a job, narrowing the pool to only a select few to fill the job vacancy.
So, what can candidates do? According to Monster’s 2026 State of Resumes Report, candidates are fighting back with longer resumes. Many are also tailoring their CVs to each job application, since what works for one posting may be a disadvantage for another.
77% of candidates report anxiety because companies are no longer actually looking at their resumes, and applying now takes more time. Resumes are getting longer, but not every old convention has been phased out: some people still list their street address, while only 18% include their LinkedIn profile.
“Nearly half of job seekers (49%) use resumes longer than one page, including 30% whose resumes are two pages or more.”
Originally, resumes had to be short to ensure that hiring managers would read them; now only 43% of candidates believe their CVs are even being skimmed, and just 6% believe they’re being read thoroughly.
To combat anxiety and help get past the ATS, resume builders from different companies have cropped up. These aim to support candidates by providing ATS-friendly templates that can be optimized for different keywords, depending on the position they’re applying for.
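What such a resume builder does under the hood isn't public, but one plausible core feature is comparing a job posting's recurring terms against a resume and flagging gaps. The sketch below is a hypothetical illustration under that assumption, using deliberately naive term extraction (word frequency minus stopwords):

```python
# Hypothetical sketch of an "ATS-friendly" resume checker: flag terms that
# recur in the job posting but never appear in the resume. The extraction
# method (frequency minus stopwords) is an illustrative assumption, not how
# any specific commercial tool works.
import re
from collections import Counter

STOPWORDS = {"the", "and", "for", "with", "a", "an", "of", "to", "in", "we", "you"}

def top_terms(job_posting: str, n: int = 5) -> list[str]:
    """Most frequent non-stopword terms in the posting."""
    words = re.findall(r"[a-z]+", job_posting.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(n)]

def missing_terms(resume: str, job_posting: str) -> list[str]:
    """Posting terms the resume never mentions verbatim."""
    resume_words = set(re.findall(r"[a-z]+", resume.lower()))
    return [t for t in top_terms(job_posting) if t not in resume_words]

posting = ("Seeking a project manager with leadership experience. "
           "Leadership of agile teams required.")
resume = "Led agile teams of 50 people across several projects."

print(missing_terms(resume, posting))  # e.g. suggests adding "leadership"
```

Because the suggestions depend entirely on each posting's wording, this also shows why a resume tailored for one application can be a poor match for another.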
Okay, but if these tools are faulty and biased, why are they being used in a professional setting? Shouldn’t there be laws regulating them? There should, but corporate lawsuits take years, and even though a big one is currently brewing, big tech is fighting back, reports Stacy Cowley for The New York Times.
Currently, a lawsuit seeks to compel AI companies to disclose more information about the data they gather from participants. According to job seekers, the screening tool can become an algorithmic gatekeeper that ranks applicants without telling them how to improve.
“If the tool makes mistakes, candidates have no way to correct them.”
The main regulatory argument is that these tools resemble credit-scoring tools, which are subject to the Fair Credit Reporting Act. AI companies counter that while the tools are designed to find a certain type of person, they are never improperly biased.
On the other hand, the lawsuit argues that gathering someone’s “personal characteristic” without letting them contest whatever conclusion the AI formed about them creates a “black box” of employment decisions. This echoes arguments in other lawsuits against AI software: a 2023 case against Workday, alleging that its AI discriminates against older applicants, applicants with disabilities, and Black applicants, has been granted preliminary approval to proceed.
Applicant tracking systems are unreliable, yet many companies and recruiters depend on them. Part of the problem is that many companies don’t train their people to use these systems, so staff rely on them entirely instead of checking their output as they should.
For AI companies, the challenge should be to avoid bias and to make sure the companies they provide SaaS to know exactly how to verify that their systems are working as intended.
In the meantime, job seekers continue to battle anxiety over AI gatekeeping. Longer resumes may become the standard, and some candidates are turning to AI of their own to try to get past it.