A Statistical Market-Design Framework for Academic Job Markets
Preference signaling for the academic job market.
A centralized questionnaire converts scarce candidate signals into structured information, improving interview selection, matching efficiency, and welfare across the market.
Ali Kaazempur-Mofrad1, Xiaowu Dai1, Xuming He2
1University of California, Los Angeles • 2Washington University in St. Louis
Market pulse
Noisy, congested, hard to coordinate.
Departments cannot identify genuine interest and candidates cannot credibly signal fit. Interviews are allocated with too little information.
Offers concentrate on a few candidates
Interviews wasted on low-probability matches
Ideal pairs fail to connect
2023–24 hiring cycle
641 Statistics PhDs awarded
82 Accepted permanent academic positions
185 Statistics PhDs with no position secured
130 Assistant professor openings posted
42 Searches left unfilled
Market frictions
Why the system fails
Congestion, information asymmetry, and weak coordination create lost opportunities for both candidates and departments.
01
Congestion
Offers cluster; some positions remain unfilled.
02
Noisy interest
Departments cannot tell sincere interest from blanket applications.
03
Coordination failure
Departments may miss out on ideal candidates.
04
Resource waste
Interview slots go to candidates unlikely to accept.
Proposed mechanism
Three-stage design
A preference-signaling layer for academic hiring: candidates complete a single questionnaire and departments use the resulting signals to inform interview decisions.
1
Signaling stage
Candidates complete one questionnaire
A standardized questionnaire captures candidate preferences, and each candidate selects which departments receive their responses.
2
Interview stage
Departments rank with calibrated confidence
Questionnaire signals are combined with traditional materials to estimate acceptance probabilities and inform interview selection.
3
Matching stage
Standard offers, sharper information
Departments extend offers as usual, but more informed decisions move the market closer to stable outcomes.
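To make the interview stage concrete, here is a minimal sketch of how a department might rank candidates once signals are in hand: each interview slot is valued as fit times estimated acceptance probability, with a signal raising that probability. The function names, weights, and probabilities are illustrative assumptions, not the paper's model.

```python
# Hypothetical sketch: rank candidates by the expected value of an
# interview slot, combining a traditional fit score with a
# signal-informed acceptance probability. All parameters below
# (base_accept, signal_boost) are illustrative assumptions.

def expected_value(fit_score: float, signaled: bool,
                   base_accept: float = 0.2, signal_boost: float = 0.4) -> float:
    """Expected value of an interview slot: fit times acceptance probability."""
    p_accept = min(1.0, base_accept + (signal_boost if signaled else 0.0))
    return fit_score * p_accept

# (name, fit score, signaled?) -- toy data
candidates = [
    ("A", 0.9, False),  # strong fit on paper, no signal of interest
    ("B", 0.7, True),   # good fit, signaled genuine interest
    ("C", 0.5, True),
]

ranked = sorted(candidates,
                key=lambda c: expected_value(c[1], c[2]),
                reverse=True)
print([name for name, _, _ in ranked])
```

Under these toy weights, the signaled candidates leapfrog the unsignaled one, which is the mechanism's intended effect: interview slots shift toward candidates likely to accept.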
Process map
Results
Incentive guarantees and simulation evidence
Theoretical guarantees
Truthful participation is a dominant strategy
Universal disclosure dominates for candidates
Misreporting incentives vanish under market competition
Aggregate welfare is nondecreasing in participation
Better information pushes toward stable outcomes
Simulation design
Empirical scaffold from real departments
Simulations merge 2026 U.S. News rankings with College Scorecard data across 103 departments in four prestige tiers.
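As a sketch of how 103 ranked departments might be bucketed into four prestige tiers, the snippet below assigns tiers by rank quartile. The equal-width cutoffs are an assumption for illustration; the paper's exact tier boundaries may differ.

```python
# Illustrative sketch of the simulation scaffold: bucket 103 ranked
# departments into four prestige tiers by rank quartile. The equal-width
# cutoffs here are assumptions, not the paper's exact boundaries.

def tier_of(rank: int, n: int = 103, n_tiers: int = 4) -> int:
    """Map a 1-based ranking position to a tier from 1 (top) to n_tiers."""
    size = -(-n // n_tiers)          # ceiling division: departments per tier
    return (rank - 1) // size + 1

tiers = [tier_of(r) for r in range(1, 104)]
print({t: tiers.count(t) for t in sorted(set(tiers))})
```

With 103 departments, the top three tiers each hold 26 departments and the last holds 25.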
Participation effects
Candidate outcomes by participation status. Left: Mean welfare. Right: Matching rate.
Candidate-side gains
Mean candidate welfare by ρ. Left: total candidate pool. Right: by quality tier (gains persist across all tiers).
Distributional effects
Outcomes by tier and ρ. Left: Mean candidate utility among matched, by quality tier. Right: Department welfare per position, by prestige tier.
Allocation shift. Left: interview allocations; right: hiring allocations. Both rebalance under structured signaling.
Outcome composition. Left: department position outcomes (initial-offer fills, scramble fills, and unfilled positions). Right: candidate hire outcomes (matched vs. unmatched). Both panels compare the baseline (ρ = 0%) with full participation (ρ = 100%).
Stability
Blocking pair rate by ρ.
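The stability metric can be sketched as follows: a candidate-department pair is "blocking" if both sides prefer each other to their assigned partners, and the blocking pair rate is the fraction of all pairs that block. The preference lists and matchings below are toy examples, not the paper's data.

```python
# Minimal sketch of the stability metric: a pair (c, d) is "blocking" if
# candidate c and department d each prefer one another to their current
# match. Preference lists and matchings below are toy examples.

def inverse(match):
    """Invert a candidate->department matching to department->candidate."""
    return {d: c for c, d in match.items() if d is not None}

def blocking_pair_rate(cand_pref, dept_pref, match):
    """Fraction of candidate-department pairs that block the matching."""
    def prefers(prefs, new, current):
        # Being unmatched (None) is worse than any listed partner.
        if current is None:
            return new in prefs
        return new in prefs and prefs.index(new) < prefs.index(current)

    dept_match = inverse(match)
    pairs = [(c, d) for c in cand_pref for d in dept_pref]
    blocking = sum(
        1 for c, d in pairs
        if prefers(cand_pref[c], d, match.get(c))
        and prefers(dept_pref[d], c, dept_match.get(d))
    )
    return blocking / len(pairs)

cand_pref = {"c1": ["d1", "d2"], "c2": ["d1", "d2"]}
dept_pref = {"d1": ["c1", "c2"], "d2": ["c1", "c2"]}

# Unstable matching: c1-d2 and c2-d1, so (c1, d1) is a blocking pair.
print(blocking_pair_rate(cand_pref, dept_pref, {"c1": "d2", "c2": "d1"}))
```

Better-informed interview and offer decisions lower this rate, which is how the simulations quantify the claim that structured signaling pushes the market toward stable outcomes.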
Questionnaire
A market-wide comparable signal
The questionnaire covers dimensions central to academic job decisions.
@article{academicmarket2026,
  title={A Statistical Market-Design Framework for Academic Job Markets},
  author={Kaazempur-Mofrad, Ali and Dai, Xiaowu and He, Xuming},
  year={2026}
}