Signaling Theory in Hiring: What Candidates Really Tell You (And What You Miss)
Your hiring managers believe they're evaluating competence. They're not. They're reading signals (credentials, interview performance, career trajectory) and translating these into judgments about capability. The problem isn't that they use signals. In a world where you can't directly observe a candidate's future performance, signals are all you have. The problem is that most hiring managers are reading the wrong signals, misinterpreting the right ones, and systematically filtering out precisely the candidates they claim to want.
This isn't a failure of diligence. It's a structural problem. In 1973, economist Michael Spence published "Job Market Signaling," describing hiring as "purchasing a lottery" - you're placing a bet on unknown future performance based on observable present characteristics. Fifty years later, companies have added layers of process, technology, and interview rounds. The fundamental information problem remains unchanged. And the economic consequences of getting it wrong are substantial: for a $75,000 position, a bad hire costs your organization $225,000 to $300,000 when you account for productivity loss, team disruption, and the opportunity cost of the vacancy.
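The cost figures above follow directly from treating a bad hire as a multiple of salary. A minimal sketch of that arithmetic (the 3x-4x multiplier range is the article's estimate, not a universal constant):

```python
# Hedged sketch: estimate the cost of a bad hire as a multiple of base salary.
# The 3x-4x multiplier range follows the article's estimate; calibrate it to
# your own data on productivity loss, team disruption, and vacancy cost.

def bad_hire_cost(salary: float,
                  low_mult: float = 3.0,
                  high_mult: float = 4.0) -> tuple[float, float]:
    """Return the (low, high) estimated total cost of a bad hire."""
    return salary * low_mult, salary * high_mult

low, high = bad_hire_cost(75_000)
print(f"${low:,.0f} to ${high:,.0f}")  # prints "$225,000 to $300,000"
```

The point of making the multiplier explicit is that it forces a conversation about which costs (severance, re-recruiting, lost deals) your organization actually counts.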
Understanding signaling theory isn't about becoming a better judge of character. It's about recognizing which signals actually correlate with performance, which ones merely indicate a candidate's skill at signaling, and why your current process likely optimizes for the latter.
The Economics of Information Asymmetry
Spence's insight was simple but profound: candidates know more about their capabilities than you do, and they have every incentive to present themselves favorably. You're operating under information asymmetry; the candidate holds the information, you're trying to extract it. This creates what economists call a "signaling game." Candidates send signals (education, experience, interview performance), you interpret those signals, and both parties adjust their behavior based on the feedback loop.
The critical distinction Spence made was between indices and signals. An index is a characteristic the candidate cannot manipulate: age, years since graduation, previous employers. A signal is something the candidate can alter, often at significant cost: education level, certifications, portfolio quality, interview preparation. Effective hiring requires understanding which signals are costly enough to be credible and which are cheap enough to be gamed.
An MBA from a top-tier institution costs $100,000 to $250,000 and requires two years of foregone income. This is a costly signal. It's expensive in both money and time, which theoretically means only candidates who expect a sufficient return (higher capability leading to higher earnings) will invest. But here's where it gets interesting: the signaling value of the MBA may exceed its educational value. Research on credential inflation suggests that for many roles, the MBA signals "this person can complete a rigorous program and has access to an elite network" more than "this person now possesses specific managerial skills that predict job performance."
Contrast this with cheap signals: "strong leadership skills" on a resume, "results-oriented" in a cover letter, or "excellent communicator" in an interview. These cost almost nothing to produce and therefore tell you virtually nothing about actual capability. Yet some hiring managers weight them heavily in screening and evaluation.
What Actually Predicts Performance
Here's where the research gets uncomfortable. A landmark meta-analysis by Schmidt and Hunter (1998), updated by Sackett and colleagues (2022), examined 85 years of hiring research to determine which selection methods actually predict job performance. The results challenge nearly every conventional hiring practice.
General mental ability (cognitive ability tests) has a validity coefficient of 0.31-0.51 for predicting job performance. This makes it the single best predictor available. Most companies don't use it, citing concerns about legal challenges or "culture fit." The irony: you're avoiding the most predictive signal because you prefer less predictive ones.
Structured interviews (carefully designed questions scored against clear rubrics) have validity of 0.37-0.51. Unstructured interviews (casual conversations assessing "fit") fare far worse: Schmidt and Hunter estimated their validity at 0.38, and Sackett's 2022 re-analysis revised it down to roughly 0.19. Translation: unstructured interviews are barely better than a coin flip at predicting performance; one analysis found they lead to the right hire only about 57% of the time. Yet unstructured interviews remain the dominant evaluation method in most organizations because they feel more natural and allow hiring managers to trust their "gut."
Work sample tests (actual simulations of job tasks) have high validity. If you're hiring a financial analyst, give candidates a dataset and ask them to build a model. If you're hiring a marketing manager, ask them to develop a campaign strategy. These tests are predictive because they measure the thing you care about: performance on job-related tasks. Most companies don't use them because they take time to design and administer.
Now the signals that hiring managers overweight:
Years of experience: Ranked 23rd out of 24 selection criteria in Sackett's analysis. It's one of the worst predictors of performance. Why? Because years of experience measures time, not skill development. Ten years of mediocre performance doesn't transform into excellence through duration.
Age of applicants: No predictive validity for job performance.
Interest/enthusiasm: Validity coefficient of 0.10, meaning essentially no relationship to actual performance.
Unstructured interview performance: As noted, barely better than chance.
Your hiring process likely screens heavily on years of experience, relies on unstructured interviews to assess fit and enthusiasm, and makes limited use of cognitive ability testing or work samples. In other words, you've built a system that optimizes for the signals that don't predict performance and filters out measurements that do.
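A validity coefficient is a correlation, so squaring it gives the share of performance variance a predictor explains, which makes the gap between strong and weak signals vivid. A sketch using the figures quoted in this section (the 0.38 figure is Schmidt and Hunter's unstructured-interview estimate):

```python
# Hedged sketch: translate the validity coefficients (r) cited above into the
# share of job-performance variance each predictor explains (r squared).
# Values are the point estimates quoted in this section, not new research.

validities = {
    "cognitive ability (Schmidt & Hunter, 1998)": 0.51,
    "cognitive ability (Sackett et al., 2022)":   0.31,
    "unstructured interview (Schmidt & Hunter)":  0.38,
    "interest/enthusiasm":                        0.10,
}

# Rank predictors from strongest to weakest and show variance explained.
for name, r in sorted(validities.items(), key=lambda kv: -kv[1]):
    print(f"{name:>44}: r = {r:.2f}, variance explained = {r ** 2:.0%}")
```

Even the best single predictor explains only about a quarter of performance variance, which is why combining several high-validity methods beats relying on any one signal.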
Why Interviews Optimize for Signaling, Not Substance
The standard corporate interview is designed to advantage candidates who are skilled at interviewing and penalize candidates who are skilled at the actual job. This isn't a bug; it's a feature of the information asymmetry problem.
Candidates prepare for interviews by researching the company, practicing common questions, and crafting narratives that signal desirable characteristics. They know to emphasize "leadership," "collaboration," and "results." They know to prepare the STAR method (Situation, Task, Action, Result) stories that demonstrate competencies. They know to ask thoughtful questions about company strategy and culture. None of this predicts job performance. All of it predicts interview performance.
Meanwhile, the hiring manager is trying to assess: Can this person do the job? Will they fit our culture? Are they someone I want to work with? But the interview setting systematically distorts these judgments. The candidate controls the narrative, presents only favorable information, and has practiced their performance. The hiring manager sees 3-5 hours of curated behavior and extrapolates to thousands of hours of actual job performance.
Research on personality assessment in hiring shows similar patterns. Conscientiousness is the personality trait most correlated with job performance across roles. But self-reported personality tests (the kind used in most hiring) have validity coefficients so low that recent meta-analyses question whether they're worth using at all. Why? Because candidates can, and do, answer in ways they believe employers want to hear. The signal is cheap, so it's unreliable.
Observer-rated personality assessments (where references or colleagues rate the candidate) have significantly higher validity. But companies rarely conduct thorough reference checks because they're seen as time-consuming and because previous employers have incentives to provide inflated assessments (get the person off their payroll or maintain relationships).
The Culture Fit Trap
"Culture fit" is one of the most dangerous signals in hiring. Not because cultural alignment doesn't matter, it does, but because hiring managers systematically conflate actual cultural alignment with surface-level similarity signaling.
Candidates signal culture fit by mirroring the interviewer's communication style, expressing enthusiasm for company values, and demonstrating knowledge of company culture. These are cheap signals. They require research and social awareness, but they don't require actual value alignment or ability to thrive in the environment.
Research on hiring bias shows that "culture fit" judgments often translate to "people like me." Hiring managers rate candidates more favorably when they share backgrounds, interests, or communication styles. This creates homogeneity that masquerades as culture preservation. The economic cost: reduced cognitive diversity leads to worse decision-making, slower innovation, and increased groupthink.
Real culture fit is about value alignment and work style compatibility. A candidate who challenges assumptions constructively, brings different perspectives, and prefers written communication over meetings might be a terrible "gut feel" fit and an excellent actual fit for a culture that values critical thinking, innovation, and asynchronous work. Your interview process likely filters them out before you discover this.
Distinguishing Costly from Cheap Signals
Not all signals are equally reliable. Costly signals (those requiring significant investment of time, money, or effort) are more credible than cheap signals because only candidates expecting a return on investment will make them. But even costly signals can mislead if you don't understand what they actually signal.
Graduate degrees (MBA, MS, PhD):
These are costly signals. They require years of time and tens of thousands of dollars. They signal cognitive ability, perseverance, and access to networks. They do not necessarily signal job-specific competence. An MBA signals "this person can complete a rigorous academic program." It does not signal "this person can manage a P&L effectively" or "this person can lead a product launch."
Career progression at prestigious firms:
Costly in terms of the screening required to enter those firms initially and the performance required to advance. But progression at Firm A doesn't automatically predict performance at Firm B if the cultures, resources, or problem types differ substantially.
Portfolio/GitHub repositories/publications:
These are moderately costly signals (time-intensive) that are also substance signals - they demonstrate actual work product, not just credentials. They're underutilized in most hiring processes.
Interview performance:
Cheap signal masquerading as substance. Interview preparation is time-intensive but requires no job-specific capability, only interview-specific capability.
Years of experience:
Cheap signal. Time passes regardless of learning. Ten years of repetitive experience equals one year repeated ten times.
The strategic insight: costly signals are more reliable than cheap ones, but you must correctly interpret what the cost signals. An MBA signals perseverance and cognitive ability; it doesn't signal management competence. Publications signal research capability and perseverance; they don't signal leadership or communication skills. Career progression signals screening and retention at previous firms; it doesn't guarantee fit or performance at yours.
The Path Forward: Signal-Aware Hiring
The solution isn't to eliminate signaling; that's impossible in a world of information asymmetry. The solution is to become more sophisticated in which signals you weight and how you interpret them.
Prioritize substance signals over credential signals.
Work samples, structured interviews with job-specific scenarios, and problem-solving exercises measure the thing you actually care about: capability to perform the job. Degrees, years of experience, and prestigious employers measure something correlated with capability, but the correlation is weaker than hiring managers believe.
Use structured interviews designed around costly-to-fake signals.
Ask candidates to solve problems in real-time. Ask them to critique their own work or identify what they'd do differently. These are harder to prepare for and more revealing than "Tell me about a time you demonstrated leadership."
Implement cognitive ability testing or work sample tests early in your process.
Yes, they require design effort. That's precisely why they work; they're costly enough (for you and the candidate) that they filter effectively.
Recognize that "culture fit" assessments are particularly prone to cheap signaling.
Unless you're measuring specific, defined cultural dimensions with validated instruments, your culture fit assessment is probably measuring "do I like this person" rather than "will this person thrive here."
Reweight your decision criteria based on actual validity.
If years of experience has almost no predictive validity and cognitive ability has the highest, why are you screening out candidates with 3 years of experience who score highly on work samples while interviewing candidates with 10 years who perform poorly on cognitive tasks?
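One way to operationalize this reweighting is a composite score in which each assessment is weighted by something proportional to its validity rather than by gut feel. A minimal sketch; the weights and candidate scores below are illustrative assumptions, not values prescribed by the research:

```python
# Hedged sketch: combine normalized (0-1) assessment scores into a composite,
# weighting each by an assumed proxy for its predictive validity.
# All weights and candidate scores here are illustrative assumptions.

VALIDITY_WEIGHTS = {
    "work_sample": 0.50,          # assumed weight reflecting high validity
    "cognitive_test": 0.45,
    "structured_interview": 0.42,
    "years_experience": 0.05,     # near-zero predictive validity
}

def composite_score(scores: dict[str, float]) -> float:
    """Validity-weighted average of 0-1 normalized assessment scores."""
    total_weight = sum(VALIDITY_WEIGHTS[k] for k in scores)
    return sum(VALIDITY_WEIGHTS[k] * v for k, v in scores.items()) / total_weight

# Hypothetical candidate with 3 years' experience but strong assessments...
candidate_a = {"work_sample": 0.9, "cognitive_test": 0.8,
               "structured_interview": 0.7, "years_experience": 0.3}
# ...versus one with 10 years' experience but weak assessment results.
candidate_b = {"work_sample": 0.4, "cognitive_test": 0.5,
               "structured_interview": 0.6, "years_experience": 1.0}

print(composite_score(candidate_a) > composite_score(candidate_b))  # prints "True"
```

Under validity-proportional weights, the shorter-tenured candidate with strong work samples wins, which is exactly the reversal the question above points at. The specific weights matter less than the discipline of choosing them before you see candidates.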
The Bottom Line
Hiring will always involve reading signals because you can never fully observe a candidate's future performance. But you can get dramatically better at distinguishing credible signals from cheap ones, substance from performance, and actual predictors of success from comfortable proxies.
The hiring managers who win aren't the ones with the best "gut instincts." They're the ones who understand signaling theory, design processes that surface costly-to-fake signals, and systematically use the selection methods that actually predict performance, even when those methods feel less natural than unstructured conversations about passion and culture fit.
The lottery that Spence described, the fundamental uncertainty in hiring, never goes away. But you can significantly improve your odds by betting on the right signals.