Kasey Zhang
@_WEEXIAO
🇺🇸 | buildin @gulp_ai; yc w25 | prev: @audacioushq @challengers_up (acq) | aspiring hedgehog; chopping wood and carrying water
May 8 • 5 tweets • 2 min read
We used RL to train a model for MCP!
Connect any MCP client to any MCP server: you can run MCP workflows fully with local models (and tune the model further).
It works with Ollama or any MCP client that supports Qwen3 models. Download it below 👇 1/
HF link:
huggingface.co/osmosis-ai/osm…
Quickstart example link:
github.com/Gulp-AI/Osmosi…
You can download it and run the model locally via Ollama or LM Studio, or host it on platforms such as Fireworks AI or Groq that support Qwen3 models. 2/
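As a rough sketch, pulling and running the model through Ollama might look like the commands below. The model's repo path is truncated in the links above, so `<model-name>` is a placeholder: check the Hugging Face page for the exact name.

```shell
# Placeholder repo path -- substitute the real model name from the
# Hugging Face link above (it is truncated in this thread).
# Ollama can pull GGUF models directly from Hugging Face via hf.co/ paths.
ollama pull hf.co/osmosis-ai/<model-name>

# Start an interactive session; an MCP client that supports Qwen3
# models can then point at the local Ollama endpoint instead.
ollama run hf.co/osmosis-ai/<model-name>
```

From there, any Ollama-compatible MCP client configuration should be able to use the local model in place of a hosted one.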