Here's @stevekaliski, @BackseatVC, and their agent (with a wallet) throwing together an actual birthday party.
To accomplish this fun task, the agent autonomously researches, requests, and pays multiple businesses on Stripe via MPP.
Note: No hands on keyboard
Why MPP?
- Rail agnostic. Stablecoins and fiat payment methods, like cards and BNPLs via @stripe Shared Payment Tokens (SPTs).
- Built for AI. Stream tokens and pay once per session.
- Payments-pilled. Idempotency, request binding, receipts, auth/capture, performance, and cost.
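To make "idempotency" and "request binding" concrete, here's a minimal sketch of one way those two properties can work together: derive the idempotency key from a hash of the exact request payload, so retries are safe and a key can't be replayed against a different request. This is an illustrative scheme, not MPP's actual key format.

```python
import hashlib
import json

def idempotency_key(agent_id: str, request_body: dict) -> str:
    """Bind an idempotency key to the exact request payload:
    a retried call can never double-charge, and a modified
    payload can never reuse an old key.
    (Illustrative scheme, not MPP's actual format.)"""
    canonical = json.dumps(request_body, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(f"{agent_id}:{canonical}".encode()).hexdigest()
    return f"idem_{digest[:32]}"

# The same agent retrying the same request gets the same key...
body = {"amount": 50, "currency": "usd", "tool": "web_search"}
assert idempotency_key("agent_123", body) == idempotency_key("agent_123", body)

# ...while any change to the bound request yields a fresh key.
assert idempotency_key("agent_123", {**body, "amount": 51}) != idempotency_key("agent_123", body)
```

Canonicalizing the JSON (sorted keys, fixed separators) is what makes the key stable across retries.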
Autonomous agents are an entirely new category of users to build for, and, increasingly, to sell to.
Today, we’re launching a preview of machine payments on @stripe—a way for developers to directly charge agents with a few lines of code. 🤖💸
Let’s start tinkering… ⤵️
You may be wondering: “Why can’t agents use virtual cards and transact as people do?”
Agents need:
- microtransactions
- 24/7 global rails
- controls (for human-out-of-the-loop operation)
- HTTP native
- low latency
- finality guarantees
The current financial system is tuned for humans.
Now, businesses can accept machine payments via our regular ol' @stripe PaymentIntents API.
Charge agents for their API usage, MCP calls, or HTTP requests with agent-specific pricing plans, alongside your conventional subscriptions and invoices. Here's a demo from @stevekaliski:
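As a rough sketch of what agent-specific pricing could look like: meter an agent's usage per call, total it up, and build the payload you'd send to Stripe's PaymentIntents API (e.g. via the stripe-python SDK). The per-call rates and metadata keys here are illustrative assumptions, not an official MPP schema.

```python
# Hypothetical per-call rates, in cents (illustrative only).
PRICE_PER_CALL_CENTS = {"search": 2, "mcp_call": 5, "checkout": 25}

def usage_to_amount(usage: dict) -> int:
    """Total amount in cents for one agent session."""
    return sum(PRICE_PER_CALL_CENTS[kind] * n for kind, n in usage.items())

def payment_intent_payload(usage: dict, agent_id: str) -> dict:
    """Build the request body for a PaymentIntent-style charge.
    Metadata keys are illustrative; in a real integration you'd
    also attach the agent's payment credential (e.g. a Shared
    Payment Token) per Stripe's docs."""
    return {
        "amount": usage_to_amount(usage),
        "currency": "usd",
        "metadata": {"agent_id": agent_id,
                     **{k: str(v) for k, v in usage.items()}},
    }

payload = payment_intent_payload({"search": 10, "mcp_call": 4}, "agent_123")
assert payload["amount"] == 40  # 10*2 + 4*5 cents
```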
You can now send, sign, and track SAFEs in a few clicks, right from the @atlas dashboard.
Sincerely,
The Stripe team
Business money can only come from three places:
- cash from operations (revenue - costs)
- debt financed from loans
- equity fundraised from investors
In addition to our revenue (Stripe Payments) and loan (Stripe Capital) products, we want to help founders raise (via SAFEs).
We have major plans for making fundraising easier, faster, and hopefully more fun—globally—and would love your feedback and feature suggestions. Big and small ones, please.
Hi! We have three big agentic commerce announcements to share. @stripe is:
1/ powering Instant Checkout in ChatGPT
2/ releasing an open standard: Agentic Commerce Protocol, codeveloped with @OpenAI
3/ shipping an API for agentic payments: Shared Payment Tokens
Let's dive in ⤵️
Purchase where you prompt. Millions of people can now shop through ChatGPT, powered by @stripe.
- It’s fun to discover products via LLMs
- You can buy directly in chat from @etsy, soon @shopify, then more
- Use a saved payment method (I like @link!) or new one
- Buy in one tap
It's built on the Agentic Commerce Protocol (ACP) by @stripe and @OpenAI
- An open standard for programmatic flows between buyers, AI agents, and businesses
- Make your checkout agent-ready
- Works with any agent, commerce platform, payment processor
Today, @OpenAI launched their Agents SDK developer framework.
Until now, agents have mostly been chat-oriented (chat in, chat out), but agents will increasingly be action-oriented (data in, action out).
We, @stripe, would love to show you a few financial agents we’ve built! ⤵️
But, first, what is OpenAI Agents SDK?
It is now easy to build *multi*-agent workflows, with some nice features:
- Handoffs: Agents call each other and pass along context
- Guardrails: Validate inputs/outputs
- Tools: Files, Web Search, and Computer Use (same tech as Operator)
Last November, eons ago in AI-time, we launched our agent toolkit to help LLMs use @stripe. It is now being downloaded thousands of times per week.
Today, we've updated our toolkit to work with OpenAI’s framework.
📺 Here are a few Stripe agents we've built (with sample code):
1/ Agents and LLMs can now understand, query, and act on @stripe, with our new MCP server.
Boot it up with a single, copy-and-pasteable line in any tool, like your AI IDE or client. Let's hop in! ⤵️
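One common shape for that copy-and-paste setup is an entry in your MCP client's config file (e.g. in an AI IDE). The exact package name, flags, and config shape vary by client and may differ from Stripe's official instructions, so treat this as a sketch and check the docs:

```json
{
  "mcpServers": {
    "stripe": {
      "command": "npx",
      "args": ["-y", "@stripe/mcp", "--tools=all"],
      "env": { "STRIPE_SECRET_KEY": "sk_test_..." }
    }
  }
}
```

Once the client restarts, the LLM can discover and call the server's Stripe tools directly.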
2/ Okay, MCP, what's the big deal?
LLMs know general, though sometimes out-of-date, stuff (thanks, pre-training!), but you often want to:
- understand the latest, official info (docs.stripe.com)
- query your own data (your customers)
- execute API calls (create products)
3/ Along comes MCP (Model Context Protocol), open source from @AnthropicAI.
MCP is a standard way for any application to connect to an LLM — "a USB-C port for AI apps". LLMs can now *find* and *use* third-party APIs and services.