
"There is no checkmate" — How a prompt instruction made Claude find a 4th option when 3 had failed
TL;DR

I gave Claude two things: a premise that dead ends are always a framing problem, and explicit reasoning steps for what to do instead. When all three options in a practical problem were blocked, Claude didn't stop. It questioned a premise I hadn't considered and found a fourth option, better than the original three. Claude found it; I provided the frame and the steps.

Who I am

I'm a solo developer in Japan with 40 years in IT. I build specialized AI agents using Claude on AWS Bedrock: one for small-business advisory, one for enterprise screening. I'm not a researcher; I'm a practitioner who bumped into something interesting and wanted to share it.

Where "there is no checkmate" comes from

This isn't a prompt-engineering trick. It's something my family learned the hard way. Across generations, my family has hit bottom and come back, and every time the way out was the same: stop believing the situation is what it appears to be. Question the frame. The path shows up. After
Continue reading on Dev.to


