How Job Search Technology Trains Smart People to Fail
- Susan Morrow
- Dec 28, 2025
- 5 min read
- Updated: Jan 18
Why does job search feel harder now than 10 years ago?
The common answer is "more competition." But that's incomplete.
Hiring is more automated than ever. Job search feels worse than ever. The connection is direct. Automation removed feedback loops without replacing them with usable signals.
Why do ATS systems fail to scan resumes correctly?
Applicant Tracking Systems (ATS) have been around for decades, but they still make surprisingly stupid mistakes. The ATS isn't reading your resume like a human would. It's trying to extract fields: name, contact information, job titles, dates. And despite the relative maturity of this technology, even major systems warn users that columns, tables, headers, footers and text boxes can cause partial or failed scans.
This isn't a bug nobody has noticed; it's a bug nobody is paid to fix. Extracting text from columns is a solved problem. But the market structure doesn't reward solving it here: individual employers have no strong incentive to fix parsing because the cost of failure falls entirely on job seekers, who vanish silently. The ATS works "good enough" for the employer. Meanwhile, you're stuck reformatting your resume based on conflicting advice from the internet, never knowing whether you've actually fixed the problem.
Take the skills section. Job seekers are told to list skills on a single line, separated by commas, because that's what ATS systems parse reliably. A human-readable format might group skills by category, in columns, so a hiring manager could assess them at a glance. But that structure might break the parser.
You're optimizing for machine readability at the expense of human readability. And you still won't know if you did it right. There's no way to test this. No preview of how your resume parsed. No error message when something breaks.
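The failure mode is easy to sketch. The toy parser below is not any real ATS, and all resume text is invented; it just mimics what many naive extractors do, which is read a document's text layer line by line:

```python
# Toy sketch of the parsing failure described above -- NOT a real ATS.
# All resume text here is invented. A line-oriented reader turns a
# two-column layout into interleaved rows rather than two clean lists.

two_column_text = """Languages        Tools
Python           Docker
SQL              Git"""

for line in two_column_text.splitlines():
    print(line.split())
# Each row mixes entries from both columns: "Python" lands next to
# "Docker", and the category headers blur into the data.

# A single comma-separated line is unambiguous to the same naive reader:
flat_text = "Skills: Python, SQL, Docker, Git"
skills = [s.strip() for s in flat_text.split(":", 1)[1].split(",")]
print(skills)  # ['Python', 'SQL', 'Docker', 'Git']
```

The columnar version isn't wrong, it's just ambiguous to a line-oriented reader. The flat version trades scanability for parseability, which is exactly the trade-off the standard advice asks you to make.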
What happens when everyone uses AI to apply?
According to Indeed's 2023 Global AI Survey, roughly 70% of job seekers reported using generative AI for activities like drafting cover letters and preparing applications. Two years later, using GenAI to customize a resume for each role has become table stakes.
This makes applying faster and easier. Which means people apply to more jobs. LinkedIn started processing an average of 11,000 applications per minute in 2024, a 45% increase from the previous year. Job postings didn't increase at that rate. Applying just got cheaper.
The volume lands on recruiters. Greenhouse reported that recruiter workload climbed to 588 applications per role in Q3 2024, up 26% from the same period in 2023. At hundreds of applications per job, even the resumes that make it through ATS filters receive minimal human attention.
Employers respond by adding more automated screening. Stricter keyword matching, more knockout questions, tighter filters. Which makes it harder for applicants to get through. Which pushes more job seekers to use AI tools to optimize their applications. Which increases volume. Which forces more automation.
Remember: the ATS already makes mistakes parsing perfectly good resumes. Now it's trying to process 26% more applications with the same limitations. And the humans behind the ATS who might catch those errors or spot good candidates the system missed? They have less time per application than they did a year ago.
The individual logic is sound. If AI can help you tailor your resume to match job description keywords, that seems smart. If it can generate a cover letter in 30 seconds instead of 30 minutes, why wouldn't you use it?
But the collective outcome is signal collapse. When everyone tailors using similar models, similar prompts and similar templates, the differentiation that employers used to rely on (writing voice, specificity, evidence of genuine interest) gets flattened. Resumes start to sound interchangeable. The signals that used to mean "this person actually cares about this specific role" now mean nothing.
This is a classic market failure: individual rationality creates collective dysfunction.
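The flattening can be shown with a toy calculation. All keyword data below is invented, and real ATS scoring is more elaborate than simple overlap; the point is only that a metric stops carrying information once everyone optimizes for it:

```python
# Toy illustration of "signal collapse": when every applicant tailors a
# resume against the same posting with similar tooling, keyword overlap
# no longer distinguishes candidates. All data here is invented.

posting_keywords = {"python", "sql", "stakeholders", "agile", "dashboards"}

def keyword_overlap(resume_keywords, posting=posting_keywords):
    """Fraction of the posting's keywords the resume matches."""
    return len(resume_keywords & posting) / len(posting)

# Before mass tailoring: resumes differ, so scores spread out.
organic = [{"python", "sql"}, {"stakeholders"}, {"python", "agile", "sql"}]
# After: every resume is tailored to echo the posting's keywords.
tailored = [set(posting_keywords) for _ in range(3)]

organic_scores = [keyword_overlap(r) for r in organic]
tailored_scores = [keyword_overlap(r) for r in tailored]
print(organic_scores)   # [0.4, 0.2, 0.6] -> the metric ranks candidates
print(tailored_scores)  # [1.0, 1.0, 1.0] -> the metric says nothing
```

When every score is identical, the screen has to fall back on something else, or on nothing at all.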
Why don't you hear back after applying?
Job boards and internal ATS systems could provide feedback. They could tell you whether your resume parsed correctly, which filters eliminated you, what criteria the role actually prioritized.
But that would expose platform owners and employers to legal risk with no upside. Stating a reason why an application didn't move forward opens the door to litigation. No one who has ever talked to an employment lawyer would recommend explaining rejections to candidates. The safest legal position is silence.
So you get silence. Not because the information doesn't exist, but because sharing it creates liability while the status quo has none.
Employers bear some cost when signal quality degrades (they have to work harder to find good candidates in the noise), but they bear it collectively and diffusely. Individual employers can't fix the systemic problem. Job seekers bear the cost individually and acutely, but they can't coordinate a response. So the dysfunction persists.
Why do smart people optimize the wrong things?
When systems provide no feedback, people don't stop trying. They optimize for what they can measure, guided by cognitive biases that make broken systems especially toxic.
“Fundamental attribution error” is the tendency to attribute outcomes to personal characteristics rather than situational factors. When you send out 50 applications and hear nothing, you don't think "the ATS probably failed to parse half of these and recruiters had 30 seconds to review the rest." You think "my resume must be terrible" or "I'm not qualified enough."
The system gives you silence. Your brain interprets that silence as personal failure. This is exactly backward. The failure is structural. But you can't see the structure, so you blame yourself.
This triggers another bias, the “illusion of control.” You can't control whether the ATS parses your resume correctly or whether 587 other people applied to the same role. But you CAN control your formatting choices. Whether your skills are in a table or a list. Whether you use "managed" or "led." Whether your margins are 0.5 inches or 0.75 inches.
So that's where your energy goes. Obsessive formatting tweaks. Keyword density optimization. Template experiments. You're rearranging deck chairs while the ship is flooding.
The “availability heuristic” (a tendency to treat visible information as complete information) compounds this. The advice that's most visible is "tailor your resume to the job description," "use keywords from the posting," "apply to more positions." You see this everywhere: career coaches, LinkedIn posts, job search guides.
The actual problems (ATS scanning failures, signal collapse, unlisted or unreal postings) aren't part of mainstream job search advice. So you optimize based on visible information, even when it points you in the wrong direction.
The result: smart people end up applying too broadly, optimizing for keyword density over clarity, treating job search as a volume problem when it's actually a signal problem.
In some ways, the technology “trained” you to behave irrationally through the absence of feedback. Sometimes the answer really is simpler than “you're not trying hard enough”: it's “the system is badly designed.” It prioritizes throughput over match quality, employer efficiency over candidate experience, and automation over information. None of that is your fault. But all of it becomes your problem to solve.
Which raises the obvious question: how do you actually use job boards when you know they're flawed?
Sources
Application Volume:
Greenhouse recruiting platform data, cited in the Greenhouse blog post “Why is job hunting so soul-crushing – and what can be done about it?”
LinkedIn job application processing: New York Times, “Employers are Buried in AI-Generated Resumes”
AI Usage in Job Search:
Indeed, 2023 Global AI Survey (generative AI use among job seekers)