
Google Just Made Every Android App an AI Agent Tool — Here's What's Missing
Google just announced AppFunctions — a framework that lets Android apps expose their capabilities directly to AI agents. Instead of opening Uber and tapping through screens, you tell Gemini "get me a ride to the airport" and it calls the function directly.

Google's own blog post says it: AppFunctions mirrors how "backend capabilities are declared via MCP cloud servers." This isn't a coincidence. It's the same pattern — tools exposed to AI agents via structured function calls — applied to mobile. And it has the same security gap.

What AppFunctions actually does

Two things are happening here:

1. Structured function exposure. App developers annotate their code with @AppFunction, declaring what their app can do — search photos, book rides, create reminders. AI agents discover these functions and call them directly. The app never opens. The user never sees a UI.

2. UI automation. For apps that haven't adopted AppFunctions, Google is building a framework where Gemini can operate the app's UI.
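To make pattern 1 concrete, here is a minimal Kotlin sketch of what exposing a capability could look like. Only the @AppFunction annotation comes from the post; the import paths, class name, parameters, and booking logic are illustrative assumptions, not a definitive implementation.

```kotlin
// Hypothetical sketch: a ride-hailing app exposing one capability to
// AI agents. Assumes the androidx.appfunctions Jetpack artifact; the
// surrounding names are invented for illustration.
import androidx.appfunctions.AppFunction
import androidx.appfunctions.AppFunctionContext

class RideFunctions {
    // An agent such as Gemini could discover this function from its
    // declaration and invoke it directly — no screen ever opens.
    @AppFunction
    fun bookRide(
        appFunctionContext: AppFunctionContext,
        destination: String, // e.g. "the airport"
    ): String {
        // ...delegate to the app's existing booking logic here...
        return "Ride requested to $destination"
    }
}
```

The annotation is doing the same job an MCP tool declaration does on a server: it turns an internal capability into a typed, discoverable function an agent can call on the user's behalf.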
Continue reading on Dev.to




