Motia: Skip the Protocols, Ship the Product (and Still Speak Them When You Must)

TL;DR Motia collapses APIs, AI agents, and cron/background jobs into a single event‑driven step model. Most of the time, formal layers like MCP (Model Context Protocol), A2A (Agent‑to‑Agent), and runtime Tool‑Use are unnecessary—plain JSON events and SDK calls get the job done with much less ceremony. If you do need MCP/A2A to talk to a partner, Motia can emit or consume those packets through a thin adapter step, so you remain interoperable without letting protocol complexity infect your codebase.


1 · The Landscape in 60 Seconds

| Acronym | Purpose | Hidden Cost |
| --- | --- | --- |
| MCP | Shuttle structured context to LLMs | Gateway to translate every request & continuous schema‑version drift |
| A2A | Standardize agent‑to‑agent chat | Extra channels, retry logic, new auth surface |
| Tool‑Use | Allow models to invoke functions dynamically | Reflection layer that’s hard to log, test, or secure |

Those standards solve coordination pain only if you view an agent as something alien to normal backend code. Motia flips the premise: treat every task—LLM call or not—as a step triggered by an event. Proven primitives (HTTP, queues, cron) can then cover almost every workflow and let you ship faster.

Key takeaway: Protocols add value at ecosystem boundaries. Inside your own codebase, they’re usually overhead.


2 · Motia’s Event‑Driven Steps

  • Triggers – HTTP/webhooks, queue messages, or cron schedules
  • Steps – TypeScript/Python handlers packaged as self‑contained cloud functions
  • Emits – Steps publish new events that wake the next steps in your workflow

Context rides inside the event payload, and scaling is handled by whichever compute target you choose (AWS Lambda today; 10 GB bundles & containers coming soon).
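
For example, the order.placed event consumed in section 2.1 below could come from an HTTP-triggered step. A minimal sketch, assuming an API step config that mirrors the event config used throughout this post (the ApiRouteConfig type name and its path/method fields are assumptions and may differ between Motia versions):

import { ApiRouteConfig, StepHandler } from "motia"; // ApiRouteConfig is assumed; check your Motia version

export const config: ApiRouteConfig = {
  type: "api",
  name: "create-order",
  path: "/orders",
  method: "POST",
  emits: ["order.placed"],
};

export const handler: StepHandler<typeof config> = async (req, { emit }) => {
  // Hand the raw order off to whatever steps subscribe to order.placed.
  await emit({ topic: "order.placed", data: req.body });
  return { status: 201, body: { accepted: true } };
};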

2.1 From Deterministic to Agentic in One Edit

import { EventConfig, StepHandler } from "motia";
// Illustrative helpers (not part of Motia): substitute your own PDF and email modules.
import { renderPDF } from "./lib/pdf";
import { email } from "./lib/email";

export const config: EventConfig = {
  type: "event",
  name: "generate-invoice",
  subscribes: ["order.placed"],
  emits: ["invoice.generated"],
};

// deterministic today
export const handler: StepHandler<typeof config> = async (event, { emit }) => {
  const pdf = await renderPDF(event.order);
  await email.send({ to: event.user, pdf });
  await emit({ topic: "invoice.generated", data: { orderId: event.order.id } });
};

// tomorrow: same step, now AI‑powered
import { EventConfig, StepHandler } from "motia";
// Same illustrative helpers as above.
import { renderPDF } from "./lib/pdf";
import { email } from "./lib/email";

export const config: EventConfig = {
  type: "event",
  name: "generate-invoice",
  subscribes: ["order.placed"],
  emits: ["invoice.generated"],
};

export const handler: StepHandler<typeof config> = async (event, { ai, emit }) => {
  const summary = await ai.openai.chatCompletion({
    messages: [
      { role: "system", content: "You are an accounting assistant." },
      { role: "user", content: JSON.stringify(event.order) }
    ]
  });

  const pdf = await renderPDF(event.order);
  await email.send({ to: event.user, text: summary, pdf });
  await emit({ topic: "invoice.generated", data: { orderId: event.order.id, summary } });
};

No gateways, no new schemas—just import an SDK and call it.


3 · When Protocols Do Matter—and How Motia Handles Them

Sometimes you must interact with a partner who only accepts MCP payloads or A2A chat. Motia keeps that complexity at the edge with two features:

  1. Adapter Steps – A thin handler converts plain Motia events ↔ MCP/A2A packets.
  2. Schema Hooks – motia generate mcp auto‑creates validated MCP schemas or A2A stubs from your event contracts—single source of truth, zero drift.

The outbound side of an adapter step looks like this:
import { EventConfig, StepHandler } from "motia";
import { callA2A } from "@partner/a2a-client";

// Partner endpoint; pulling it from configuration here is illustrative.
const partnerUrl = process.env.PARTNER_A2A_URL!;

export const config: EventConfig = {
  type: "event",
  name: "send-plan-to-partner",
  subscribes: ["plan.created"],
  emits: ["plan.processed"],
};

export const handler: StepHandler<typeof config> = async (plan, { emit }) => {
  const res = await callA2A(partnerUrl, plan); // speaks formal A2A
  await emit({ topic: "plan.processed", data: res }); // back to plain Motia events
};
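
The inbound direction is symmetric: an adapter step can accept a partner's A2A packet and re-emit it as a plain Motia event, so nothing downstream ever sees the wire format. A sketch, assuming the raw packet is forwarded onto an internal topic and that the partner client exposes a parser (parseA2A and the topic names here are hypothetical):

import { EventConfig, StepHandler } from "motia";
import { parseA2A } from "@partner/a2a-client"; // hypothetical helper; use your partner's parser

export const config: EventConfig = {
  type: "event",
  name: "receive-plan-from-partner",
  subscribes: ["partner.packet.received"], // raw A2A packet forwarded by a webhook or queue step
  emits: ["plan.received"],
};

export const handler: StepHandler<typeof config> = async (packet, { emit }) => {
  const plan = parseA2A(packet); // unwrap the formal A2A envelope
  await emit({ topic: "plan.received", data: plan }); // downstream steps see plain JSON again
};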

Result: Your internal code stays elegant; only the boundary speaks spec‑ese.


4 · Why Extra Protocol Layers Rarely Pull Their Weight

| Concern | Full Protocol Stack | Motia Core | Motia + Adapter |
| --- | --- | --- | --- |
| Context format | MCP schemas plus gateway upkeep | Plain JSON payload | Auto‑generated MCP docs at boundary |
| Peer comms | A2A channels, retries, auth surface | Use existing queue/topic | Thin wrapper step |
| Function calls | Runtime reflection & bespoke schemas | Direct SDK import | Same |
| Governance | Inspect traffic in gateways | IAM‑scoped creds & event logs | Same, optional edge gateway |
| Developer DX | Learn spec and run extra infra | Edit a file, redeploy | Same, plus motia generate |

If you already rely on NPM/PyPI SDKs + OpenAPI, extra wire layers add risk (schema drift, latency) without adding real capability.


5 · One Backend to Rule Them All

| Use Case | Trigger Source | Step Responsibility | Emits |
| --- | --- | --- | --- |
| API | HTTP / Webhook | Deterministic logic | HTTP response / event |
| Automation | Cron schedule | Batch or workflow kick‑off | Queue message |
| Agent | Event from another step | LLM‑driven decision‑making | Follow‑up events |

Because the runtime is identical, you can:

  • Transform a nightly cron into a chat‑based assistant in minutes—no infra change (see the sketch after this list).
  • Apply retries, tracing, or metrics once in the platform instead of re‑implementing for each pattern.
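
To make the first point concrete, the nightly half of that migration could be a cron-triggered step that does nothing but emit an event for an AI step like the one in section 2.1 to pick up. A sketch, assuming a cron step config in the same style (the CronConfig type name and cron field are assumptions and may differ between Motia versions):

import { CronConfig, StepHandler } from "motia"; // CronConfig is assumed; check your Motia version

export const config: CronConfig = {
  type: "cron",
  name: "nightly-digest",
  cron: "0 2 * * *", // every night at 02:00
  emits: ["digest.requested"],
};

export const handler: StepHandler<typeof config> = async (_input, { emit }) => {
  // The same event could just as easily come from a chat webhook; downstream steps don't care.
  await emit({ topic: "digest.requested", data: { requestedAt: new Date().toISOString() } });
};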

6 · Payoffs of the Step Model

| Benefit | What It Means Day‑to‑Day |
| --- | --- |
| Less surface area | No bespoke connectors or DSLs to learn |
| Composable modules | Steps versioned & published like any package—mix & match freely |
| Faster iteration | Edit one file → redeploy one function, not an entire graph |
| Elastic scaling | Managed Lambdas scale to zero; 10 GB bundles & containers on the roadmap |

These gains are multiplicative: less surface area speeds onboarding and lowers on‑call load, while composability + scaling let you ship small, safe changes continuously.


7 · Generative Coding Tools Fit Naturally—and Remain Testable

Modern AI coding agents (Cursor, Copilot, etc.) thrive when the feedback loop is tight and deterministic. Motia delivers exactly that:

  1. Prompt → Code – Describe a step; the agent drafts deployable TS/Py with typed configs.
  2. Hot‑reload – motia dev bundles and reloads in seconds for instant local runs.
  3. Automated evals – Unit tests, recorded‑event replays, and LLM budget checks run on every push.
  4. Gatekeeper CI – Only green builds march from alpha → staging → prod via Motia Cloud.

| Aspect | Runtime Tool‑Use Layers | Motia + Coding Agent |
| --- | --- | --- |
| Control | Function chosen at inference time—hard to test offline | Full source generated up front—unit‑testable |
| Observability | Limited to high‑level schema; inner logic opaque | Standard logs, metrics, step‑level traces |
| Iteration speed | Update spec → regenerate → redeploy | Edit code → hot‑reload locally → CI auto‑runs |
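
As a concrete example of that testability, the deterministic generate-invoice step from section 2.1 can be unit-tested with nothing but a stubbed emit and mocked helpers. A sketch using Vitest (the test framework, file paths, and as-any casts are illustrative choices, not part of Motia):

import { describe, it, expect, vi } from "vitest";

// Stub the illustrative PDF/email helpers so the test exercises only the step's own logic.
vi.mock("./lib/pdf", () => ({ renderPDF: vi.fn(async () => Buffer.from("pdf")) }));
vi.mock("./lib/email", () => ({ email: { send: vi.fn(async () => undefined) } }));

import { handler } from "./generate-invoice.step"; // path is illustrative

describe("generate-invoice", () => {
  it("emits invoice.generated with the order id", async () => {
    const emitted: unknown[] = [];
    const event = { order: { id: "ord_123" }, user: "jane@example.com" };

    await handler(event as any, { emit: async (e: unknown) => { emitted.push(e); } } as any);

    expect(emitted).toEqual([{ topic: "invoice.generated", data: { orderId: "ord_123" } }]);
  });
});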

Bottom line: You keep the lightning‑fast “prompt → deploy” loop and the deterministic builds that enterprises demand.


Ship Fast, Speak Protocols Only When Required

motia deploy --stage=alpha

One command, one backend, optional adapters. Skip the protocols—until you need them. Then let Motia translate and keep moving.
