Santiago
Sep 14, 2020 · 10 tweets · 2 min read
Here are 12 skills you'll want to add to your Data Science / Machine Learning resume.

The first 6 are foundational and important. The other 6 are in crazy high demand, harder to build, and will set you apart.

🧵👇
The industry is relatively young, so we're still figuring out titles and requirements, but certain skills are already surfacing as fundamental.

I've compiled twelve of them here.

They aren't all required. They're simply a good blueprint for where to focus.

👇
1⃣ Notions of Probabilities and Statistics — You need at least enough to understand how some algorithms work and how to interpret their results.

2⃣ Data Management — Capturing, querying, storing, and transferring data. SQL is a very important skill here.

👇
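SQL shows up in nearly every data job posting. A minimal sketch of the querying side, using Python's built-in sqlite3 module (the `experiments` table and its rows are made up for illustration):

```python
import sqlite3

# In-memory database with a hypothetical `experiments` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE experiments (model TEXT, accuracy REAL)")
conn.executemany(
    "INSERT INTO experiments VALUES (?, ?)",
    [("baseline", 0.81), ("tuned", 0.89), ("ensemble", 0.93)],
)

# The kind of query you'll write constantly: filter, order, inspect.
rows = conn.execute(
    "SELECT model, accuracy FROM experiments "
    "WHERE accuracy > 0.85 ORDER BY accuracy DESC"
).fetchall()

print(rows)  # [('ensemble', 0.93), ('tuned', 0.89)]
```

The same SELECT / WHERE / ORDER BY pattern carries over directly to Postgres, BigQuery, or any other SQL engine.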
3⃣ Data Wrangling — Preparing, cleaning, transforming the data for further analysis. This is one of the most important skills to build.

4⃣ Data Visualization — An often underrated skill. Your data is telling a story, and it's your job to present it to the world.

👇
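In practice you'd reach for pandas, but the heart of wrangling is simple: normalize messy fields, convert types, and decide what to drop. A dependency-free sketch with made-up records:

```python
# Hypothetical raw records: inconsistent casing, stray whitespace,
# missing values, and numbers stored as strings.
raw = [
    {"city": " new york ", "temp": "21.5"},
    {"city": "BOSTON", "temp": ""},        # missing reading
    {"city": "Chicago", "temp": "18.0"},
]

def clean(record):
    """Normalize text fields and convert numerics; return None to drop the row."""
    if not record["temp"]:
        return None
    return {
        "city": record["city"].strip().title(),
        "temp": float(record["temp"]),
    }

cleaned = [r for r in (clean(rec) for rec in raw) if r is not None]
print(cleaned)
# [{'city': 'New York', 'temp': 21.5}, {'city': 'Chicago', 'temp': 18.0}]
```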
5⃣ Programming — It's imperative that you know enough to draw insights from data using your language of choice.

6⃣ Machine Learning Algorithms — Understanding existing algorithms, and having the capability to apply them and interpret their results is key.

👇
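"Understanding existing algorithms" is easiest to prove by implementing a simple one yourself. A toy k-nearest-neighbors classifier in plain Python (the 2-D points and labels are invented for the example):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k closest training points.
    `train` is a list of (features, label) pairs."""
    by_distance = sorted(train, key=lambda pair: math.dist(pair[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D data: two well-separated clusters.
train = [
    ((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
    ((5.0, 5.0), "b"), ((5.2, 4.9), "b"), ((4.8, 5.1), "b"),
]

print(knn_predict(train, (1.1, 1.0)))  # a
print(knn_predict(train, (5.1, 5.0)))  # b
```

In a real project you'd use scikit-learn's `KNeighborsClassifier`, but being able to write the ten-line version is what demonstrates understanding.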
Most people can check off these six skills.

But you aren't most people, so here are the other six.

These are sexier but harder to build, and they will set your resume apart.

👇
1⃣ Deep Learning — A subset of Machine Learning methods based on Neural Networks.

2⃣ Computer Vision and Natural Language Processing — These are probably the two hottest areas in the industry right now. They are about extracting meaning from images, videos, and text.

👇
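Frameworks hide the details, but the core unit of a neural network, a neuron that weights its inputs and learns from its errors, fits in a few lines. A sketch training a single perceptron on the AND function (the learning rate, epoch count, and data are arbitrary illustrative choices):

```python
# Truth table for AND: the target is 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):  # a few passes is plenty for linearly separable data
    for (x1, x2), target in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        error = target - pred          # the learning signal
        w[0] += lr * error * x1        # nudge each weight toward the target
        w[1] += lr * error * x2
        b += lr * error

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Deep learning stacks thousands of these units into layers and replaces the hand-written update rule with backpropagation, but the weight-and-error loop above is the same idea.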
3⃣ TensorFlow, Keras, PyTorch — These are the most popular libraries to build Deep Learning applications.

4⃣ Cloud Computing — Today, there's no Machine Learning without access to the resources and services the Cloud provides.

👇
5⃣ Big Data — The ability to deal with large and complex data sets. Tools like Hadoop and BigQuery are examples here.

6⃣ DevOps / MLOps — These skills are centered around the ability to build and manage machine learning pipelines and workflows.

👇
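At its core, a machine learning pipeline is a sequence of stages where each one consumes the previous stage's output. Real tools like Airflow or Kubeflow add scheduling, retries, and tracking; this toy sketch (stage names and data invented for illustration) shows only the shape:

```python
def ingest():
    # Stand-in for pulling raw data from a source.
    return [3, 1, 4, 1, 5, 9, 2, 6]

def validate(data):
    # Fail fast on bad input instead of training on garbage.
    assert all(isinstance(x, int) for x in data), "bad input"
    return data

def transform(data):
    # Stand-in for feature engineering: deduplicate and sort.
    return sorted(set(data))

def train(data):
    # Stand-in for model training: produce a trivial "model" (the mean).
    return sum(data) / len(data)

def run_pipeline(stages):
    data = stages[0]()            # first stage produces the data
    for stage in stages[1:]:      # each later stage consumes the previous result
        data = stage(data)
    return data

model = run_pipeline([ingest, validate, transform, train])
print(model)
```

MLOps is largely about making each of these stages reproducible, monitored, and automatic.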
It's really difficult to acquire all of these skills and be good at every single one of them.

But you don't need that.

Instead, focus on the basics and expand your capabilities into areas that will increase your value.

