You spend an hour tailoring your CV. You write a cover letter that actually sounds like you. You hit submit. And then — nothing.
Not even a rejection. Just silence.
There's a decent chance a human never saw your application. According to the most recent Responsible AI Index, 62 per cent of Australian organisations now use AI in recruitment either moderately or extensively. More than 250 commercial AI hiring tools are available in Australia, handling everything from CV screening to video interview analysis.
That means for a growing share of Australian job seekers, the first gatekeeper isn't a recruiter. It's an algorithm.
What These Tools Actually Do
AI recruitment tools scan applications for keywords, qualifications, and experience patterns. Some analyse video interviews for speech patterns, word choice, and facial expressions. Others rank candidates based on how closely they match profiles of previously successful hires.
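The keyword-scanning step can be illustrated with a deliberately naive sketch. The keywords, weights, and cutoff below are hypothetical — commercial tools are more sophisticated — but the core failure mode is the same: the same experience described in different words scores differently.

```python
# A minimal sketch of keyword-based CV screening. Every keyword and the
# cutoff are invented for illustration, not taken from any real product.

REQUIRED_KEYWORDS = {"stakeholder engagement", "ms office", "data entry"}
CUTOFF = 2  # hypothetical: CVs matching fewer keywords never reach a human

def keyword_score(cv_text: str) -> int:
    """Count how many required keywords appear verbatim in the CV."""
    text = cv_text.lower()
    return sum(1 for kw in REQUIRED_KEYWORDS if kw in text)

def passes_screen(cv_text: str) -> bool:
    return keyword_score(cv_text) >= CUTOFF

# The same experience, described two ways:
cv_a = "Led stakeholder engagement and data entry for a council project."
cv_b = "Worked closely with community groups and kept records up to date."

print(passes_screen(cv_a))  # True  — the exact phrases match
print(passes_screen(cv_b))  # False — same skills, different words
```

A tool built this way never evaluates the skill itself, only the phrasing — which is why the exact wording of a job ad matters so much to applicants.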
The pitch to employers is speed and consistency. When a single job ad attracts hundreds of applications — common for general clerk or receptionist roles, which employ 286,600 and 182,600 Australians respectively — AI can whittle the pile down in minutes.
The problem is what gets lost in that whittling.
Who Gets Screened Out
Jobs and Skills Australia published an analysis warning that AI recruitment "risks leaving real talent behind." The agency found that many AI platforms only look at job titles and formal qualifications, missing what they call "invisible skills" — the capabilities people use every day but that don't always appear on a CV.
That screening gap hits some groups harder than others. Professor Andreas Leibbrandt at Monash University found that AI systems trained on overseas datasets "won't perform as well for particular demographic groups that we have in Australia — for example, refugees, migrant women, First Nations" people.
Research from the University of Melbourne found that recruitment algorithms are a "real problem" for discrimination, noting that these tools may penalise employment gaps — a pattern that disproportionately affects women who've taken time out for caring responsibilities. The University of South Australia added that relying on AI alone "isn't going to lead to more diverse and inclusive outcomes" without conscious organisational effort.
Among the best-known cases globally, an AI system developed by Amazon learned to downgrade CVs that included the word "women's" — as in "women's chess club captain" — having been trained on hiring data from the male-dominated tech industry.
Across organisations using AI in hiring, 19 per cent report that their tools have overlooked or screened out qualified applicants. That's roughly one in five employers admitting their AI is rejecting good candidates.
The Roles Most Exposed to AI Screening
AI screening tools aren't used equally across all jobs. They're most common in high-volume hiring — roles that attract large numbers of applicants and involve standardised skill requirements.
Based on our occupation data, the roles most likely to be filtered through AI before a human ever reads the application include:
- General Clerks — AI exposure score: 7.0, 286,600 employed
- Accounting Clerks — AI exposure score: 7.2, 143,500 employed
- Receptionists — AI exposure score: 6.6, 182,600 employed
- Human Resource Clerks — AI exposure score: 7.5, 23,200 employed
- Advertising and Marketing Professionals — AI exposure score: 6.2, 102,500 employed
These are occupations where AI isn't just changing the work itself — it's changing who gets the chance to do the work in the first place.
Ironically, Human Resource Professionals — the people traditionally responsible for fair hiring — sit at an AI exposure score of 5.6. The tools designed to help them hire faster may be undermining the fairness they're supposed to uphold.
What the Law Says (and Doesn't)
Australia has no specific legislation regulating the use of AI in recruitment. The AI Ethics Principles published by the Department of Industry are voluntary, not legally binding.
The Australian Human Rights Commission has published an AI and Recruitment Compliance Checklist to help businesses assess whether their tools meet existing legal obligations. But the Commission itself acknowledges gaps. The Disability Discrimination Act and adverse action provisions under the Fair Work Act apply to hiring decisions — but they may not cover the pre-screening stage where AI filters out candidates before any formal decision is made.
That leaves a grey zone. If you're rejected by an algorithm before a recruiter even sees your name, it's not clear what legal protections you have.
Melbourne-based researchers have suggested that Australian law should default to an assumption of discrimination unless companies actively prove their recruitment systems don't discriminate — flipping the burden of proof from the applicant to the employer.
What's Coming
The federal government is rolling out a National Skills Taxonomy from mid-2026. The idea is to create a shared language for skills across the economy, making it easier for AI tools — and human recruiters — to see what people can actually do rather than just what's on their CV.
If it works as intended, it could help fix one of the core problems: that AI hiring tools reward credentials and keywords over real capability. For the first time, Australians and employers would share a common framework for describing skills, reducing the chance that good candidates get filtered out because their experience doesn't match a narrow set of keywords.
But the taxonomy won't fix bias baked into training data, and it won't address the lack of transparency around how these tools make decisions. Until there's meaningful regulation — or at minimum, mandatory bias audits — Australian job seekers are largely in the dark about what's happening to their applications.
What This Means for You
If you've been applying for jobs and hearing nothing back, it's worth knowing that the silence might not reflect your qualifications. It might reflect the limitations of a system that was never designed with your background in mind.
A few things worth considering:
- Keywords matter more than they should. AI tools scan for specific terms. If a job ad mentions "stakeholder engagement," use that exact phrase — not "working with people."
- Gaps get penalised. Career breaks for caring, illness, or study can trigger automatic downgrades. Some tools let you explain gaps; use that option when it's available.
- Video interviews may be scored by AI. If you're asked to record answers, know that your words, tone, and expressions may all be analysed before a person watches the recording.
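The gap-penalty behaviour described above can be sketched in a few lines. The six-month threshold and the flag itself are illustrative assumptions about how such a rule might work, not any vendor's actual logic.

```python
# A hypothetical sketch of how a screening tool might flag employment
# gaps from the dates on a CV. The 6-month threshold is an assumption
# made for illustration only.
from datetime import date

def gap_months(end: date, start: date) -> int:
    """Whole months between the end of one role and the start of the next."""
    return (start.year - end.year) * 12 + (start.month - end.month)

def has_flagged_gap(roles: list[tuple[date, date]],
                    threshold_months: int = 6) -> bool:
    """roles: (start, end) pairs in chronological order."""
    for (_, prev_end), (next_start, _) in zip(roles, roles[1:]):
        if gap_months(prev_end, next_start) > threshold_months:
            return True
    return False

# Two years out for caring responsibilities trips the flag,
# with no field for the reason:
roles = [(date(2015, 1, 1), date(2019, 6, 30)),
         (date(2021, 7, 1), date(2024, 1, 1))]
print(has_flagged_gap(roles))  # True
```

Note what the rule cannot see: whether the gap was parental leave, illness, or study. That blindness is exactly why using an "explain your gap" field, where one exists, can matter.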
None of this is ideal. But knowing how the system works is the first step to working within it — and pushing for something better.
Want to see how your occupation scores for AI exposure? Check your role on our occupation pages or take the How Safe Am I? quiz to see where you stand.