MCP server that lets Claude Code scan, score, and apply to jobs autonomously. Connects directly to Ashby, Greenhouse, and Lever job board APIs. No browser. No resume upload portals. Claude does the tailoring.
JobHound MCP exposes a set of tools to Claude Code that form a complete autonomous job application pipeline. The flow is linear: scan job boards for matching roles, score each one against your profile, queue the best candidates, let Claude tailor a cover letter and resume for each, then submit directly via the ATS API. No human in the loop unless you want one.
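Each stage in that flow corresponds to a status transition in the local database. A minimal sketch of what that lifecycle could look like, assuming an in-memory SQLite store; the table name, columns, and status strings are illustrative, not JobHound's actual schema:

```python
import sqlite3

# Hypothetical statuses mirroring the pipeline stages described above.
STATUSES = ["scanned", "scored", "queued", "tailored", "submitted"]

def init_db(path=":memory:"):
    """Create a minimal jobs table (schema is a guess, for illustration)."""
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS jobs (
               id     TEXT PRIMARY KEY,
               board  TEXT,   -- ashby | greenhouse | lever
               title  TEXT,
               score  REAL,
               status TEXT DEFAULT 'scanned'
           )"""
    )
    return db

def advance(db, job_id):
    """Move a job to the next pipeline stage and return the new status."""
    (current,) = db.execute(
        "SELECT status FROM jobs WHERE id = ?", (job_id,)
    ).fetchone()
    nxt = STATUSES[min(STATUSES.index(current) + 1, len(STATUSES) - 1)]
    db.execute("UPDATE jobs SET status = ? WHERE id = ?", (nxt, job_id))
    return nxt

db = init_db()
db.execute(
    "INSERT INTO jobs (id, board, title) VALUES ('gh-1', 'greenhouse', 'Backend Engineer')"
)
print(advance(db, "gh-1"))  # scanned -> scored
```

Because state lives in SQLite rather than memory, the TUI (or any other process) can watch progress without talking to the MCP server itself.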
The pipeline: `scan` pulls fresh listings from Ashby, Greenhouse, and Lever. `score` runs each listing through a relevance model against your stored profile. `queue` holds anything above the threshold. Claude reads the full job description, rewrites your materials for that specific role, then `submit` fires the POST directly to the ATS endpoint. Status is written back to a local SQLite database that the TUI reads in real time.
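To make the shape of the scoring step concrete, here is a toy keyword-overlap scorer. This is not JobHound's relevance model; the function name and the profile's `skills` field are assumptions, and only the input/output shape (listing text in, 0–1 score out) is meant to match:

```python
import re

def score_listing(description: str, profile: dict) -> float:
    """Toy relevance score: fraction of profile skills mentioned in the listing.

    The real `score` tool runs a relevance model; this keyword-overlap
    version only illustrates the interface, not the actual scoring logic.
    """
    words = set(re.findall(r"[a-z0-9+#.]+", description.lower()))
    skills = [s.lower() for s in profile.get("skills", [])]
    if not skills:
        return 0.0
    hits = sum(1 for skill in skills if skill in words)
    return hits / len(skills)

profile = {"skills": ["Python", "SQLite", "FastAPI"]}
jd = "We need a Python engineer comfortable with SQLite and httpx."
print(round(score_listing(jd, profile), 2))  # 0.67
```

Anything scoring above the configured threshold would land in the queue for Claude to tailor.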
```json
{
  "mcpServers": {
    "jobhound": {
      "command": "python",
      "args": ["-m", "jobhound.server"],
      "env": {
        "ASHBY_API_KEY": "your_key_here",
        "GREENHOUSE_API_KEY": "your_key_here",
        "LEVER_API_KEY": "your_key_here",
        "JOBHOUND_DB": "~/.jobhound/jobs.db",
        "JOBHOUND_PROFILE": "~/.jobhound/profile.json"
      }
    }
  }
}
```
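`JOBHOUND_PROFILE` points at a JSON file describing you. The exact schema isn't shown here, so the fields below are illustrative guesses at what a profile might contain, not the documented format:

```json
{
  "name": "Jane Doe",
  "skills": ["Python", "SQLite", "FastAPI"],
  "years_experience": 6,
  "locations": ["Remote", "Berlin"],
  "resume_path": "~/.jobhound/resume.md"
}
```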
JobHound pulls from the following job board APIs directly, with no scraping and no third-party aggregators:

- Ashby
- Greenhouse
- Lever
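Each of these ATSes exposes a public job-board endpoint. The URL patterns below reflect my understanding of those public APIs, not anything taken from this repo, and may be out of date; check each vendor's docs before relying on them:

```python
# Public job-board listing endpoints per ATS. These URL templates are an
# assumption based on each vendor's publicly documented job-board API.
ENDPOINTS = {
    "greenhouse": "https://boards-api.greenhouse.io/v1/boards/{org}/jobs",
    "lever": "https://api.lever.co/v0/postings/{org}?mode=json",
    "ashby": "https://api.ashbyhq.com/posting-api/job-board/{org}",
}

def listings_url(board: str, org: str) -> str:
    """Build the listings URL for a board and an organization slug."""
    return ENDPOINTS[board].format(org=org)

# With httpx (the stack's HTTP client) the fetch would be roughly:
#   listings = httpx.get(listings_url("greenhouse", "examplecorp")).json()
print(listings_url("greenhouse", "examplecorp"))
```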
Built with: Python 3.11+ · FastMCP · httpx · sqlite3 · Pydantic · Claude API
Open source at github.com/Null-Phnix/jobhound-mcp. Issues and PRs welcome.