Let's build a multi-agent content creation system (100% local):
Before we dive in, here's a quick demo of what we're building!
Tech stack:
- @motiadev as the unified backend framework
- @firecrawl_dev to scrape web content
- @ollama to serve the DeepSeek-R1 LLM locally
The only AI framework you'll ever need to learn! 🚀
Here's the workflow:
- User submits a URL to scrape
- Firecrawl scrapes content and converts it to markdown
- Twitter and LinkedIn agents run in parallel to generate content
- Generated content gets scheduled via Typefully
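The four steps above can be sketched as plain Python. This is a toy outline, not the Motia implementation: the helper functions are hypothetical stand-ins for the Firecrawl scrape call, the two Ollama-backed agents, and the Typefully draft API, with only the parallel fan-out shown for real.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the real integrations.
def scrape_to_markdown(url: str) -> str:
    # In the real flow, Firecrawl fetches the page and returns markdown.
    return f"# Scraped content from {url}"

def twitter_agent(markdown: str) -> str:
    # A DeepSeek-R1 prompt served via Ollama would go here.
    return f"Tweet thread drafted from: {markdown[:40]}"

def linkedin_agent(markdown: str) -> str:
    return f"LinkedIn post drafted from: {markdown[:40]}"

def schedule_via_typefully(draft: str) -> dict:
    # Typefully's draft-creation endpoint would be called here.
    return {"status": "scheduled", "draft": draft}

def run_workflow(url: str) -> list:
    markdown = scrape_to_markdown(url)
    # Run both platform agents in parallel, as in the workflow above.
    with ThreadPoolExecutor(max_workers=2) as pool:
        drafts = list(pool.map(lambda agent: agent(markdown),
                               [twitter_agent, linkedin_agent]))
    return [schedule_via_typefully(d) for d in drafts]

results = run_workflow("https://example.com/post")
```

The key design point is that the Twitter and LinkedIn agents are independent once the markdown exists, which is why they can fan out in parallel before the scheduling step collects both drafts.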
ML researchers just built a new ensemble technique.
It even outperforms XGBoost, CatBoost, and LightGBM.
Here's a complete breakdown (explained visually):
For years, gradient boosting has been the go-to for tabular learning.
TabM is a parameter-efficient ensemble that provides:
- The speed of an MLP.
- The accuracy of GBDT.
The visual below explains how it works.
Let's dive in!
In tabular ML:
- MLPs are simple and fast, but usually underperform on tabular data.
- Deep ensembles are accurate but bloated and slow.
- Transformers are powerful but rarely practical on tables.
The image below depicts an MLP ensemble, and it looks heavily parameterized 👇
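Here's the core trick in miniature (a sketch, assuming the BatchEnsemble-style parameterization that TabM builds on): all k members share one weight matrix, and each member only owns two small elementwise vectors that modulate its input and output. The dimensions and initialization below are made up for illustration.

```python
import random

random.seed(0)

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

d_in, d_out, k = 4, 3, 8  # feature dim, output dim, ensemble size

# One weight matrix shared by ALL k members (the expensive part).
W = [[random.gauss(0, 1) for _ in range(d_in)] for _ in range(d_out)]

# Per-member adapters: just two small vectors each (the cheap part).
R = [[random.gauss(1, 0.1) for _ in range(d_in)] for _ in range(k)]
S = [[random.gauss(1, 0.1) for _ in range(d_out)] for _ in range(k)]

def member_forward(x, m):
    # Member m scales the input by r_m, applies the shared W,
    # then scales the output by s_m: (x * r_m) @ W * s_m.
    scaled_in = [xi * ri for xi, ri in zip(x, R[m])]
    h = matvec(W, scaled_in)
    return [hi * si for hi, si in zip(h, S[m])]

x = [0.5, -1.0, 2.0, 0.3]
# The ensemble prediction averages the k members' outputs.
outputs = [member_forward(x, m) for m in range(k)]
prediction = [sum(col) / k for col in zip(*outputs)]

shared = d_in * d_out          # 12 params, paid once
per_member = d_in + d_out      # 7 params per extra member
```

Compare that with a naive deep ensemble, where every extra member duplicates the full d_in × d_out matrix: here each member adds only d_in + d_out parameters, which is why TabM stays close to a single MLP in size and speed.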
- What is an AI agent
- Connecting agents to tools
- Overview of MCP
- Replacing tools with MCP servers
- Setting up observability and tracing
All with 100% open-source tools!
This course builds agents based on the following definition:
An AI agent uses an LLM as its brain, retains context through memory, and takes real-world actions through tools, like browsing the web or running code.
In short, it thinks, remembers, and acts.
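In code, that definition boils down to a loop. Here's a toy sketch: the llm() stub and the TOOL:/FINAL: reply protocol are invented for illustration (a real framework like CrewAI handles this plumbing), but the think-remember-act cycle is the same.

```python
def llm(prompt: str) -> str:
    # Stand-in for a real model call: it requests the calculator
    # tool once, then finishes when it sees the tool's result.
    if "TOOL RESULT" in prompt:
        return "FINAL: the answer is in the tool result"
    return "TOOL: calculator 6*7"

# Tool registry: names the agent can invoke to act on the world.
TOOLS = {"calculator": lambda expr: str(eval(expr))}

def run_agent(task: str, max_steps: int = 5) -> str:
    memory = [f"TASK: {task}"]          # retained context
    for _ in range(max_steps):
        reply = llm("\n".join(memory))  # think
        memory.append(reply)            # remember
        if reply.startswith("FINAL:"):
            return reply
        if reply.startswith("TOOL:"):   # act
            _, name, arg = reply.split(" ", 2)
            memory.append(f"TOOL RESULT: {TOOLS[name](arg)}")
    return "gave up"

answer = run_agent("what is 6*7?")
```

Notice that memory is what closes the loop: the tool result only influences the next decision because it was written back into the context the LLM sees.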
100% open-source tech stack:
- @crewAIInc for building MCP-ready agents
- @zep_ai Graphiti to add human-like memory
- @Cometml Opik for observability and tracing