Bi-directional Voice-Controlled Recipe Assistant with Nova Sonic v2
How-To, DevOps


via Dev.to, by Darryl Ruggles

What if your recipe assistant could talk back? I have been building a serverless Family Recipe Assistant that searches my family's recipe collection, calculates nutrition from USDA data, and handles multi-turn conversations through a web UI. It works well. But every time I am in the kitchen with flour on my hands, reaching for my phone to type "how long do I bake the banana bread?" feels wrong. I wanted to just ask.

The text-based assistant already had a "cooking mode" that read recipes aloud using Amazon Polly. But listening to a long recipe read start-to-finish by a TTS voice is surprisingly tedious: you cannot ask it to slow down, skip ahead, or clarify a step without going back to the screen and typing. What I really wanted was a conversation: "What is the next step?" or "How much butter was that again?" while my hands are covered in dough.

Amazon Nova Sonic v2 launched recently with sub-700ms speech-to-speech latency and a 1M-token context window. The Strands Agents SDK added ex…
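To make the "conversation over a recipe" idea concrete, here is a minimal sketch of the kind of hands-free cooking-mode loop described above: step-through navigation plus simple ingredient lookups over a transcribed utterance. Everything here (the `CookingSession` class and its command set) is a hypothetical illustration under my own assumptions, not the article's actual implementation; in the real assistant the transcription and spoken reply would come from the speech-to-speech model.

```python
class CookingSession:
    """Tracks position in a recipe so a voice command can move around it.

    Hypothetical sketch: command matching is deliberately naive keyword
    spotting, standing in for whatever intent handling the agent does.
    """

    def __init__(self, steps, ingredients):
        self.steps = steps                # ordered list of instruction strings
        self.ingredients = ingredients    # ingredient name -> quantity string
        self.index = -1                   # -1 means "before the first step"

    def handle(self, utterance):
        """Map a transcribed voice command to a spoken-text reply."""
        text = utterance.lower()
        # Ingredient lookup first, since "how much butter was that again"
        # also contains "again" and must not trigger the repeat branch.
        if "how much" in text:
            for name, amount in self.ingredients.items():
                if name in text:
                    return f"You need {amount} of {name}."
            return "I couldn't find that ingredient in this recipe."
        if "next" in text:
            if self.index + 1 >= len(self.steps):
                return "That was the last step. You're done!"
            self.index += 1
            return f"Step {self.index + 1}: {self.steps[self.index]}"
        if "repeat" in text or "again" in text:
            if self.index < 0:
                return "We haven't started yet. Say 'next step' to begin."
            return f"Step {self.index + 1}: {self.steps[self.index]}"
        return "Sorry, try 'next step', 'repeat that', or 'how much ...'."


session = CookingSession(
    steps=["Cream the butter and sugar.", "Fold in the mashed bananas."],
    ingredients={"butter": "1/2 cup", "banana": "3 ripe"},
)
print(session.handle("what is the next step"))
# → Step 1: Cream the butter and sugar.
print(session.handle("how much butter was that again"))
# → You need 1/2 cup of butter.
```

The point of a loop like this is that the recipe state lives on the server side of the conversation, so the user never has to look at a screen to re-orient.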

Continue reading on Dev.to


