# I built a Lighthouse for MCP tools — it scores your tool definitions on every PR


via Dev.to, by Hiroki Honda

## The problem

AI agents choose between tools based on one thing: the quality of their descriptions. Research shows that 97% of MCP tool descriptions have quality defects (arXiv 2602.14878), and that optimized tools get selected 3.6x more often (arXiv 2602.18914). Most MCP developers don't know their tool definitions are broken until an agent silently ignores them.

## What I built

ToolRank scores MCP tool definitions across four dimensions:

- **Findability (25 pts)**: Can agents discover your tool?
- **Clarity (35 pts)**: Can agents understand what it does?
- **Precision (25 pts)**: Is the input schema complete?
- **Efficiency (15 pts)**: Is it token-efficient?

It's like Lighthouse, but for MCP tools.

## GitHub Action: score on every PR

Today I published ToolRank Score on GitHub Marketplace. Add this to your repo:

```yaml
name: ToolRank Score
on:
  pull_request:
    paths: ['**/*.json']
permissions:
  pull-requests: write
jobs:
  score:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: imhiroki/toolrank-action@v1
```
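To make the four dimensions concrete, here is a toy scorer over an MCP-style tool definition (`name`, `description`, `inputSchema`). This is a hypothetical sketch: the thresholds, the rough 4-characters-per-token estimate, and the `score_tool` function are my illustrative assumptions, not ToolRank's actual rules.

```python
def score_tool(tool: dict) -> int:
    """Score a tool definition out of 100 using illustrative heuristics."""
    score = 0
    name = tool.get("name", "")
    desc = tool.get("description", "")
    props = tool.get("inputSchema", {}).get("properties", {})

    # Findability (25 pts): a present, specific, lowercase tool name.
    if name:
        score += 10
        if len(name) >= 4 and name == name.lower():
            score += 15

    # Clarity (35 pts): a description long enough to disambiguate the tool.
    if desc:
        score += 15
        if len(desc.split()) >= 10:
            score += 20

    # Precision (25 pts): every schema property documents itself.
    if props and all("description" in p for p in props.values()):
        score += 25

    # Efficiency (15 pts): rough token budget (~4 chars per token).
    if desc and len(desc) / 4 <= 150:
        score += 15

    return score
```

A well-specified definition, for example one with a lowercase verb-phrase name, a full-sentence description, and a documented schema, scores in the top band, while an empty definition scores zero.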

Continue reading on Dev.to
