
Cypress AI Skills: Teaching Your AI Assistant to Write Better Tests
I’ve been using AI tools like Cursor and Claude Code to help write Cypress tests. It’s fast, and for simple cases it works well enough. But as soon as you try to use it in a real project, the cracks start to show. You ask the AI to write a test, and you get something like this:

```javascript
cy.get('.btn-primary').click()
cy.wait(3000)
cy.get('.modal-content .success').should('be.visible')
```

It works, technically. But it doesn’t look like something your team would ever commit. In most real-world projects, we avoid CSS selectors, we don’t rely on arbitrary waits, and we lean heavily on custom commands to keep tests maintainable. So instead of saving time, you end up rewriting most of what the AI generated.

That’s the gap Cypress AI Skills is trying to close.

The Real Shift: From Code Generation to Code Alignment

Most AI tools are good at generating code, but they don’t understand your codebase. They don’t know your selector strategy. They don’t know your custom commands.
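To make the contrast concrete, here is a sketch of the kind of test a team with those conventions would actually commit: a custom command that encodes a `data-testid` selector strategy, and retry-able assertions instead of `cy.wait(3000)`. Since this snippet needs to run outside a browser, it includes a tiny in-file stub standing in for the real `cy` object and `Cypress.Commands.add`; the command name `getByTestId` and the selector values (`save-button`, `success-toast`) are my own illustrative assumptions, not anything from the article.

```javascript
// Records the calls the "test" makes, so we can see the resulting chain.
const log = [];

// Minimal chainable stub standing in for Cypress's real `cy` object.
// Real Cypress commands are queued and retried; this stub only logs.
const cy = {
  get(selector) {
    log.push(`get ${selector}`);
    return {
      click() { log.push("click"); return this; },
      should(assertion) { log.push(`should ${assertion}`); return this; },
    };
  },
};

// Stand-in for Cypress.Commands.add: registers a custom command that
// encodes the team's selector strategy (data-testid attributes, not
// brittle CSS classes like `.btn-primary`).
const commands = {};
function addCommand(name, fn) { commands[name] = fn; }

addCommand("getByTestId", (id) => cy.get(`[data-testid="${id}"]`));

// The test body itself: no raw CSS selectors, no arbitrary wait.
// In real Cypress, `should('be.visible')` retries until the element
// appears, which replaces the hard-coded cy.wait(3000).
commands.getByTestId("save-button").click();
commands.getByTestId("success-toast").should("be.visible");

console.log(log.join("\n"));
```

In a real project the custom command would live in `cypress/support/commands.js`, and this is exactly the kind of project-specific convention a code generator cannot know unless you teach it.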
Continue reading on Dev.to
