Siri is Tired of Watching You Scroll: The Shift to Declarative Design is Happening.

Edi Bianco
CDO - The Experience Designer
DESIGN
The recent buzz around projects like OpenClaw, which automates a desktop by "watching" the screen and interacting with it, marks a peak in AI development. But look closer and it actually signals the end of what I call the Brute Force Era.

We’re moving away from “Imperative UI,” where an agent mimics a human’s clumsy clicks, toward Declarative Functionality, where the OS talks directly to an app’s internal logic.

I’ve spent 30 years as a design strategist obsessing over human-machine interaction. I’ve watched this trajectory from the early days of IRC and BBS, through the birth of the browser and the first protoconcepts of “cyberspace,” all the way to today’s LLMs. But what’s happening now is a different beast entirely. We are looking at an AI Agent that isn’t just an operator, but the actual user of apps. This is the final death of the “Tax on Intelligence”, that friction between what we want to do and the menus we have to navigate to get there.

While the “2030” crowd is still waiting for a revolution, the big players are making moves right now.

Your phone is becoming “Agent-Ready” faster than most people realize.

1. Ditching the Screen-Scraper for “MCP”

Research from earlier this year (see Beyond BeautifulSoup, 2026) shows that navigating a GUI is a dead end for performance: LLM agents suffer 15–20x more latency when they have to "see" a screen than when they can use a direct tool.

Android 16’s AppFunctions and Apple’s App Intents treat your apps as local instances of the Model Context Protocol (MCP).

  • The Shift: Instead of an LLM guessing where a “Send” button is, the OS uses an indexed schema of “Verbs.”
  • The Numbers: Benchmarks show that bypassing the UI improves success rates by 67% and cuts interaction steps by nearly half.
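
To make the "schema of Verbs" idea concrete, here is a minimal sketch of a declarative function registry: the OS exposes typed, indexed functions, and an agent resolves an intent to a single validated call instead of a chain of simulated taps. All names here (`messages.send`, the `Verb` class) are illustrative, not the actual AppFunctions or App Intents API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Verb:
    name: str                 # indexed verb, e.g. "messages.send"
    params: dict[str, type]   # typed schema the agent can inspect up front
    handler: Callable

REGISTRY: dict[str, Verb] = {}

def register(name: str, params: dict[str, type]):
    """Decorator that indexes a function as a discoverable verb."""
    def wrap(fn):
        REGISTRY[name] = Verb(name, params, fn)
        return fn
    return wrap

@register("messages.send", {"recipient": str, "body": str})
def send_message(recipient: str, body: str) -> str:
    return f"sent to {recipient}: {body}"

def call(name: str, **kwargs):
    """Validate arguments against the schema, then invoke the verb."""
    verb = REGISTRY[name]
    for key, typ in verb.params.items():
        if not isinstance(kwargs[key], typ):
            raise TypeError(f"{key} must be {typ.__name__}")
    return verb.handler(**kwargs)

print(call("messages.send", recipient="Ana", body="On my way"))
```

The point of the typed schema is that the agent never has to guess: it can read the parameter types before calling, and a malformed call fails loudly instead of clicking the wrong button.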

We are already seeing this play out in our daily workflows. Take Figma, for example. Before the Model Context Protocol, "AI in design" usually meant a plugin that tried to guess what you were drawing by scanning your layers, a process that was often more miss than hit. Now, with MCP, my agent doesn't "look" at my canvas the way a human does; it communicates directly with Figma's underlying spatial logic.

I can ask for a responsive navigation component, and the agent executes the build through a direct function call. It's no longer about "generating" an image of a UI… it's about programmatic orchestration. This has shifted my role to that of a systems architect: I'm managing the intent rather than the tool. And honestly, let's all agree on this once and for all: design was never about tools!
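
Under the hood, an MCP tool invocation is just a JSON-RPC 2.0 request with the method `tools/call`. The tool name and arguments below are hypothetical (Figma's actual MCP server defines its own tool names and schemas), but the envelope is what any MCP client sends:

```python
import json

# A JSON-RPC 2.0 request as used by MCP's "tools/call" method.
# "create_component" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_component",          # hypothetical tool name
        "arguments": {
            "type": "responsive_nav",
            "breakpoints": [375, 768, 1440],  # illustrative values
        },
    },
}

print(json.dumps(request, indent=2))
```

No pixels, no cursor: the "build a nav component" intent travels as one structured message, and the server replies with a structured result.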

2. The Psychology of “Guard-railed Autonomy”

Watching an autonomous cursor move on your screen is nerve-wracking. It creates a massive cognitive load because you feel like you have to “babysit” the AI so it doesn’t mess up. I’ve been there, and I’m sure you have too.

AppFunctions introduces what I call Contractual UX:

  • Permissions: The agent needs the specific EXECUTE_APP_FUNCTIONS permission.
  • Predictability: The agent calls a typed function, like Calendar(title, time), instead of clicking around randomly. It's deterministic.
  • The Result: We shift from Supervised Autonomy (watching the AI work) to Delegated Autonomy (trusting the system call).
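
The contract can be sketched in a few lines: without the permission the call is refused outright, and with it the typed function behaves deterministically. The permission string mirrors Android's EXECUTE_APP_FUNCTIONS; everything else here is illustrative, not the real framework.

```python
GRANTED: set[str] = set()  # permissions the user has granted

def require(permission: str):
    """Refuse the call unless the permission was explicitly granted."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if permission not in GRANTED:
                raise PermissionError(f"{permission} not granted")
            return fn(*args, **kwargs)
        return inner
    return wrap

@require("EXECUTE_APP_FUNCTIONS")
def create_event(title: str, time: str) -> dict:
    # Typed and deterministic: same input, same result.
    # No cursor, no guessing where the "Save" button is.
    return {"title": title, "time": time, "status": "created"}

try:
    create_event("Design review", "2026-03-01T10:00")
except PermissionError as e:
    print("refused:", e)

GRANTED.add("EXECUTE_APP_FUNCTIONS")
print(create_event("Design review", "2026-03-01T10:00"))
```

This is the psychological shift in code form: you audit the contract once, grant it once, and stop babysitting the cursor.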

3. The “Agent-Ready” Stack

For the engineers and CEOs out there: your “moat” is moving. If your app is a “Black Box” that only humans can navigate via a GUI, it’s going to become a digital ghost town.

The new stack requires:

  1. Semantic Indexing: Your data must be discoverable via Apple’s App Entities or Android’s AppSearch without the app even being open.
  2. Tool-Centric Design: Every core feature needs to be a discrete, stateless function.
  3. On-Device SLMs: Small models like Apple's OpenELM family handle the "reasoning" locally, so private data stays on the device, not in the cloud.
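
The three requirements above converge on one artifact: a machine-readable manifest of stateless functions that an agent can discover without ever launching the app. The field names below are illustrative, not the actual AppSearch or App Entities schema.

```python
# An "agent-ready" manifest: each core feature is a discrete, stateless
# function described well enough to be discovered by keyword matching.
MANIFEST = {
    "app": "notes",  # hypothetical app
    "functions": [
        {
            "name": "notes.create",
            "description": "Create a note with a title and body.",
            "params": {"title": "string", "body": "string"},
            "stateless": True,
        },
        {
            "name": "notes.search",
            "description": "Full-text search over saved notes.",
            "params": {"query": "string"},
            "stateless": True,
        },
    ],
}

def discover(intent_keywords: set[str]) -> list[str]:
    """Return function names whose descriptions match the intent keywords."""
    hits = []
    for fn in MANIFEST["functions"]:
        words = set(fn["description"].lower().replace(".", "").split())
        if intent_keywords & words:
            hits.append(fn["name"])
    return hits

print(discover({"search", "notes"}))  # matches "notes.search" only
```

A real semantic index would use embeddings rather than keyword overlap, but the design constraint is the same: if a feature isn't described in the manifest, no agent will ever route an intent to it.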

4. Why 2026 is the Tipping Point

This isn’t a roadmap slide; it’s already in the code.

  • Android 16 has already rolled out the AppFunctions Jetpack library.
  • Apple Intelligence is already making App Intents mandatory for anyone who wants to be part of the “Siri Orchestration” layer.

The Death of the “Destination App”

The era of the “Destination App”, where you pray for “eyeball time” and infinite scrolls, is over. Goodbye, TikTok and Instagram as we know them. In the Agentic Era, the most successful apps will be the ones that are the most usable by other machines.

If you’re still designing for “clicks,” you’re building for a world that has already moved on. This isn’t the future … it’s starting today!

Edi Bianco
CDO @Amplifi labs

Core References:

  • Android 16 Developer Docs: AppFunctions Overview (March 2026)
  • Apple Research: OpenELM Model Family
  • arXiv 2510.04607: Declarative LLM-friendly OS Interfaces
  • arXiv 2601.06301: Beyond BeautifulSoup: LLM-Powered Web Scraping


Architect Your Success.

You have the vision. We have the architecture to make it scale. As your partner, we’ll get straight to an engineering & design strategy that secures your Series A or drives your enterprise growth.

Discuss Your Vision