Akshay πŸš€
Simplifying LLMs, MLOps, Python & Machine Learning for you! β€’ AI Engineering @ LightningAI β€’ Lead Data Scientist β€’ BITS Pilani β€’ 3 Patents
Jul 18 β€’ 9 tweets β€’ 3 min read
Let's implement & train this neural network step-by-step, from scratch using PyTorch!

1/n First of all, let's define our model in PyTorch:

2/n
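As a hedged sketch of what such a model definition and a single training step can look like (the layer sizes and data here are illustrative assumptions, not the ones from the thread's image):

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """A small feed-forward network; dimensions are illustrative."""
    def __init__(self, in_dim=4, hidden=8, out_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()
out = model(torch.randn(2, 4))          # batch of 2 samples, 4 features each

# One illustrative training step: loss -> backward -> optimizer update
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss = nn.functional.cross_entropy(out, torch.tensor([0, 2]))
loss.backward()
opt.step()
```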
Jul 17 β€’ 11 tweets β€’ 4 min read
Let's build an advanced "Chat with your code" RAG application (100% local): Before we begin, take a look at what we're about to create!

Here's what you'll learn:

- @Llama_Index for orchestration
- @Qdrant_Engine to self-host a vector DB
- LlamaIndex's advanced code parsers
- @Ollama for serving LLMs locally

Let's go! πŸš€
Jul 16 β€’ 10 tweets β€’ 3 min read
Multiprocessing in Python clearly explained: Ever felt like your Python code could run faster❓

Multiprocessing might be the solution you're looking for!

Today, I'll simplify it for you in this step-by-step guide.

Let's go! πŸš€
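The core idea in a minimal sketch (the worker function and inputs are made up for illustration): hand CPU-bound work to separate processes with `multiprocessing.Pool`, sidestepping the GIL.

```python
import math
from multiprocessing import Pool

def cpu_heavy(n):
    # CPU-bound task: sum of square roots up to n
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]
    # Each input runs in its own worker process, in parallel
    with Pool(processes=4) as pool:
        results = pool.map(cpu_heavy, inputs)
    print(len(results))
```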
Jul 15 β€’ 10 tweets β€’ 4 min read
Autoencoders are one of my favourite neural networks!

Today, I'll clearly explain:

- What they are❓
- And how they work❓

Let's go! πŸš€

1/n Autoencoders have two main parts:

1️⃣ Encoder: Compresses the input into a dense representation (latent space)

2️⃣ Decoder: Reconstructs the input from this dense representation.

The idea is to make the reconstructed output as close to the original input as possible πŸ‘‡

2/n
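A minimal PyTorch sketch of this encoder–decoder structure (the dimensions are illustrative assumptions; training would minimise reconstruction error, e.g. MSE between input and output):

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        # Encoder: compress the input into the latent space
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim)
        )
        # Decoder: reconstruct the input from the latent code
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, in_dim)
        )

    def forward(self, x):
        z = self.encoder(x)        # dense representation
        return self.decoder(z)     # reconstruction

x = torch.randn(8, 784)
x_hat = Autoencoder()(x)
# Training would minimise nn.MSELoss()(x_hat, x)
```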
Jul 12 β€’ 7 tweets β€’ 3 min read
ML models can only be as good as the data they're trained on!

Introducing Datalab, which can automatically detect:

- outliers
- label errors
- (near) duplicates
- low-quality/non-IID sampling

Developed at MIT, Datalab works with all types of data & any trained model!

1/n How to use Datalab❓

Datalab works with any ML model you have already trained!

It's like a magic wand! πŸͺ„

Inspecting your dataset with Datalab merely requires the code below! πŸ‘‡

2/n
Jul 11 β€’ 11 tweets β€’ 4 min read
Let's build a "Chat with your docs" RAG application, step-by-step: Before we begin, take a look at what we're about to create!

We'll be using:

- @Cohere's highly capable ⌘R+ as the LLM
- @Llama_Index for orchestration
- @Streamlit for the UI

Everything in just ~170 lines of Python code, which I've shared at the end! πŸ”₯

Let's go! πŸš€
Jul 9 β€’ 7 tweets β€’ 3 min read
5 GitHub repositories that will give you superpowers as an AI/ML Engineer: 1️⃣ Awesome Artificial Intelligence

A curated list of Artificial Intelligence:

- courses
- books
- video lectures
- and papers with code

Check this out πŸ‘‡
github.com/owainlewis/awe…
Jul 8 β€’ 9 tweets β€’ 3 min read
This is the future of building RAGs. Let me introduce you to DSPy today: As an analogy, DSPy is to RAG what PyTorch is to DNNs.

DSPy : RAG :: PyTorch : DNNs

To understand DSPy, we need to grasp three key concepts:

- Signatures
- Modules
- Optimizers

Let's delve into each one! πŸš€
Jul 4 β€’ 9 tweets β€’ 3 min read
Let's make RAG 40x faster and 32x more memory efficient: To achieve this, we'll be leveraging binary quantization! πŸš€

And here's what you'll learn, today:

- Intuitive explanation of Binary Quantization (BQ)
- Self-host @qdrant_engine with BQ enabled
- Search over 36M+ vectors in <50ms πŸ”₯

Let's go! πŸš€
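A toy NumPy sketch of the intuition (not Qdrant's actual implementation): each float32 dimension (32 bits) becomes 1 bit, hence the ~32x memory saving, and search reduces to cheap Hamming-distance comparisons.

```python
import numpy as np

np.random.seed(0)

def binarize(vectors):
    # Keep only the sign of each dimension: 32 bits -> 1 bit
    return (vectors > 0).astype(np.uint8)

def hamming_top_k(query, database_bits, k=3):
    # Hamming distance: count of differing bits per stored vector
    dists = np.count_nonzero(database_bits != binarize(query), axis=1)
    return np.argsort(dists)[:k]

db = np.random.randn(1000, 64).astype(np.float32)
db_bits = binarize(db)            # 1000 x 64 bits instead of 64 float32s each
idx = hamming_top_k(db[0], db_bits, k=3)
```

Searching with the first database vector as the query returns that vector itself first, since its Hamming distance to its own code is zero.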
Jul 3 β€’ 7 tweets β€’ 3 min read
Tensors in PyTorch, clearly explained: Tensors are the fundamental building blocks for performing mathematical operations in deep learning models.

Today, I will provide a comprehensive explanation with illustrative code examples.

Let's go! πŸš€
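A few illustrative tensor operations in PyTorch (the values are toy examples chosen for clarity):

```python
import torch

x = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0]])    # 2x2 tensor built from a Python list
y = torch.ones(2, 2)              # 2x2 tensor of ones

z = x + y                         # elementwise addition
m = x @ y                         # matrix multiplication
s = x.sum()                       # reduce to a scalar tensor
r = x.reshape(4)                  # same data, new shape
```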
Jul 2 β€’ 10 tweets β€’ 3 min read
7 projects that every AI engineer must explore: 1️⃣ Chat with your code using RAG

A step by step guide to building a RAG application using @llama_index!

Check this outπŸ‘‡
lightning.ai/lightning-ai/s…
Jun 27 β€’ 7 tweets β€’ 3 min read
Eigenvalues & Eigenvectors clearly explained: The concept of eigenvalues & eigenvectors is widely known yet poorly understood!

Today, I'll clearly explain their meaning & significance.

Let's go! πŸš€
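The defining property can be checked in a few lines of NumPy: an eigenvector is only *scaled* (by its eigenvalue) when the matrix acts on it, i.e. A v = Ξ» v. The matrix values here are an arbitrary illustration.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

# A only stretches each eigenvector; it never rotates it
for i in range(len(eigvals)):
    v = eigvecs[:, i]
    assert np.allclose(A @ v, eigvals[i] * v)
```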
Jun 24 β€’ 10 tweets β€’ 3 min read
f-strings in Python clearly explained: f-strings were introduced in Python 3.6 and have since become a favorite among developers for their simplicity and readability.

Today, we'll start with the basics and dive into all the ninja tricks of using f-strings.

Let's go! πŸš€
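A quick taste of the tricks the thread covers (the variable values are illustrative):

```python
name = "Grace"
pi = 3.14159

greeting = f"Hello, {name}!"    # simple interpolation
rounded  = f"{pi:.2f}"          # format spec: 2 decimal places
debug    = f"{name=}"           # 3.8+: prints the expression and its value
padded   = f"{name:>10}"        # right-align in a 10-character field
nested   = f"{pi:.{3}f}"        # format specs can themselves be expressions
```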
Jun 20 β€’ 10 tweets β€’ 3 min read
Multithreading in Python clearly explained: Ever felt like your Python code could run faster❓

Multithreading might be the solution you're looking for!

Today, I'll simplify it for you in this step-by-step guide.

Let's go! πŸš€
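A hedged sketch of the core idea: threads shine for I/O-bound work (simulated here with `time.sleep` and made-up URLs), because a thread waiting on I/O releases the GIL and lets the others run.

```python
import threading
import time

results = []
lock = threading.Lock()

def fetch(url):
    time.sleep(0.1)             # simulate a slow I/O call (e.g. a download)
    with lock:                  # guard shared state across threads
        results.append(url)

urls = [f"https://example.com/{i}" for i in range(5)]
threads = [threading.Thread(target=fetch, args=(u,)) for u in urls]

start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
# All five "downloads" overlap, so elapsed is ~0.1s rather than ~0.5s
```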
Jun 17 β€’ 11 tweets β€’ 3 min read
Object-oriented programming in Python, clearly explained: We break it down into 6 important concepts:

- Object 🚘
- Class πŸ—οΈ
- Inheritance 🧬
- Encapsulation πŸ”
- Abstraction 🎭
- Polymorphism πŸŒ€

Let's take them one by one... πŸš€
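All six concepts fit in a tiny sketch (the class names are illustrative):

```python
class Vehicle:                        # Class πŸ—οΈ: a blueprint
    def __init__(self, speed):
        self._speed = speed           # Encapsulation πŸ”: internal state

    def describe(self):               # Abstraction 🎭: simple public interface
        return f"moves at {self._speed} km/h"

class Car(Vehicle):                   # Inheritance 🧬: Car reuses Vehicle
    def describe(self):               # Polymorphism πŸŒ€: same method, new behavior
        return "a car that " + super().describe()

tesla = Car(120)                      # Object 🚘: an instance of a class
print(tesla.describe())
```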
Jun 15 β€’ 12 tweets β€’ 4 min read
I started my career in Data Science back in 2016 ⏳

Here's a detailed roadmap for those starting out today!

What's covered:
- Python
- Machine Learning
- Maths for ML
- ML Books
- MLOps
- LLMs/AI Engineering

Read more... πŸ‘‡

1️⃣ Python

If you're new to programming and just getting started, there isn't a better place to learn Python than David J. Malan's CS50P.

Beautiful explanations and great projects.
It's a complete package ⚑️

Check this out πŸ‘‡
edx.org/course/cs50s-i…
Jun 13 β€’ 9 tweets β€’ 3 min read
Let's learn how to evaluate a RAG application: Here's what we'll do today:

- Build a RAG pipeline using @llama_index
- Evaluate it with @ragas_io
- Implement observability using @ArizePhoenix

Before we dive in, check out this demo:
Jun 12 β€’ 9 tweets β€’ 4 min read
Self-attention as a directed graph!

Self-attention is at the heart of transformers, the architecture that led to the LLM revolution that we see today.

In this post, I'll clearly explain self-attention & how it can be thought of as a directed graph.

Read more... πŸ‘‡

Before we start, a quick primer on tokenization!

Raw text β†’ Tokenization β†’ Embedding β†’ Model

Embedding is a meaningful representation of each token (roughly a word) using a bunch of numbers.

This embedding is what we provide as an input to our language models.

Check this πŸ‘‡
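A NumPy sketch of self-attention: the softmax-normalised score matrix reads as a directed graph whose entry (i, j) is the edge weight from token i to token j, and each row of outgoing edges sums to 1. Random matrices stand in here for the learned projections.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # pairwise token affinities
    # Row-wise softmax: outgoing edge weights of each token sum to 1
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V, w                           # new embeddings, edge weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                   # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, graph = self_attention(X, Wq, Wk, Wv)
```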
Jun 8 β€’ 11 tweets β€’ 4 min read
Let's compare Llama-3 & Qwen using RAG: The recently launched Qwen sits at the top of the open LLM leaderboard.

Today, we'll build a Streamlit app to compare it against Llama-3 for RAG.

Here's the stack used:

- @Ollama to serve LLMs locally
- @Llama_Index for orchestration
- @LightningAI for development & hosting

Let's go! πŸš€
Jun 7 β€’ 8 tweets β€’ 3 min read
Key concepts to understand if you're working with LLMs: 1️⃣ The Transformer

Transformers brought the AI revolution we see today with their attention mechanism and their ability to process data in parallel.

Here's an illustrated guide to understanding self-attention in transformers:
mlspring.beehiiv.com/p/attention-ne…
Jun 6 β€’ 10 tweets β€’ 3 min read
Let's build a "Document Summarisation & Chat" RAG application, step-by-step: Before we begin, take a look at what we're about to create!

We'll be using:

- @Streamlit for the UI
- @Llama_Index for orchestration
- @LightningAI Studio for development & hosting

Everything in just ~180 lines of Python code, which I've shared at the end! πŸ”₯

Let's go! πŸš€