Simplifying LLMs, MLOps, Python & Machine Learning for you! • AI Engineering @LightningAI • Lead Data Scientist • BITS Pilani • 3 Patents
Jul 26 • 6 tweets • 2 min read
Backpropagation in PyTorch, clearly explained:
Backpropagation is the key to optimizing our neural network by adjusting its weights.
And it's done by calculating gradients of the loss function w.r.t. these weights!
Today, we learn how to do this using PyTorch... 👇
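The core idea can be sketched in a few lines of PyTorch: define a tiny linear model, compute a loss, and call `.backward()` to get the gradients of the loss w.r.t. the weights. A minimal sketch — the toy data and model here are illustrative, not the thread's exact code:

```python
import torch

# Toy data following y = 2x; we want the model to recover that relation
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])

w = torch.tensor(0.0, requires_grad=True)  # weight to optimize
b = torch.tensor(0.0, requires_grad=True)  # bias to optimize

y_hat = w * x + b                          # forward pass
loss = ((y_hat - y) ** 2).mean()           # MSE loss

loss.backward()                            # backpropagation: fills w.grad, b.grad

print(w.grad, b.grad)                      # gradients of the loss w.r.t. w and b
```

An optimizer (e.g. `torch.optim.SGD`) would then use these gradients to update `w` and `b`.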
Jul 24 • 11 tweets • 4 min read
Let's build a RAG app using Meta AI's Llama-3.1 (100% local):
Before we begin, take a look at what we're about to create!
Here's what you'll learn:
- @Llama_Index for orchestration
- @qdrant_engine to self-host a vector DB
- @Ollama for locally serving Llama-3.1
- @LightningAI for development & hosting
Let's go! 👇
Jul 23 • 11 tweets • 3 min read
SQL Joins clearly explained:
Let's set up 2 DataFrames to perform merge operations & name them:
β’ left
β’ right
A, B, C are the common keys ✅
Check this 👇
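As a sketch of the setup, here is how a `left`/`right` pair of DataFrames and the usual join types look with pandas `merge`. The toy data is illustrative, not the thread's exact values:

```python
import pandas as pd

# Two DataFrames sharing a "key" column, mirroring the left/right setup
left  = pd.DataFrame({"key": ["A", "B", "C"], "left_val":  [1, 2, 3]})
right = pd.DataFrame({"key": ["B", "C", "D"], "right_val": [4, 5, 6]})

inner = left.merge(right, on="key", how="inner")        # only keys in both: B, C
outer = left.merge(right, on="key", how="outer")        # all keys: A, B, C, D
left_join  = left.merge(right, on="key", how="left")    # every key from `left`
right_join = left.merge(right, on="key", how="right")   # every key from `right`

print(inner)
```

Non-matching rows in outer/left/right joins get `NaN` in the columns that have no counterpart.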
Jul 23 • 11 tweets • 3 min read
Illustrated Guide to Tensor Parallelism (supercharge your LLM training):
What is Tensor Parallelism?
Tensor Parallelism is a form of model parallelism. It splits individual tensors across GPUs for efficient computation and memory use.
Perfect for training LLMs!
Let's simplify it today! 👇
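The idea can be simulated on CPU with NumPy: shard a weight matrix column-wise across two pretend devices, compute partial outputs independently, and gather them. A conceptual sketch only — real tensor-parallel training (e.g. Megatron-style) shards across actual GPUs and adds communication ops:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))      # activations: (batch, d_in)
W = rng.standard_normal((8, 6))      # weight:      (d_in, d_out)

# Split W's output dimension across 2 simulated devices
W0, W1 = np.split(W, 2, axis=1)      # each shard: (8, 3)

# Each "device" computes its partial output independently...
Y0, Y1 = X @ W0, X @ W1

# ...and an all-gather concatenates the shards into the full result
Y = np.concatenate([Y0, Y1], axis=1)

assert np.allclose(Y, X @ W)         # identical to the unsharded computation
```

Each device only ever stores half of `W`, which is the memory saving that makes this attractive for LLM-scale layers.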
Jul 18 • 9 tweets • 3 min read
Let's implement & train this neural network step-by-step, from scratch using PyTorch!
1/n
First of all, let's define our model in PyTorch:
2/n
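The thread's architecture comes from a diagram not shown here, so as a stand-in, here is a hypothetical two-layer MLP showing what a model definition looks like in PyTorch:

```python
import torch
import torch.nn as nn

# Hypothetical 2-layer MLP: 4 inputs -> 8 hidden units -> 3 outputs
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)   # input -> hidden
        self.act = nn.ReLU()
        self.fc2 = nn.Linear(8, 3)   # hidden -> output

    def forward(self, x):
        return self.fc2(self.act(self.fc1(x)))

model = Net()
out = model(torch.randn(2, 4))       # forward pass on a batch of 2 samples
print(out.shape)
```

Training then pairs this with a loss function and an optimizer in the usual forward/backward/step loop.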
Jul 17 • 11 tweets • 4 min read
Let's build an advanced "Chat with your code" RAG application (100% local):
Before we begin, take a look at what we're about to create!
Here's what you'll learn:
- @Llama_Index for orchestration
- @qdrant_engine to self-host a vector DB
- LlamaIndex's advanced code parsers
- @Ollama for serving LLMs locally
Let's go! 👇
Jul 16 • 10 tweets • 3 min read
Multiprocessing in Python clearly explained:
Ever felt like your Python code could run faster❓
Multiprocessing might be the solution you're looking for!
Today, I'll simplify it for you in this step-by-step guide.
Let's go! 👇
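As a minimal sketch, `multiprocessing.Pool` spreads a CPU-bound function over several worker processes. The worker and pool size here are illustrative:

```python
from multiprocessing import Pool

def square(n):
    # CPU-bound work runs in a separate process, sidestepping the GIL
    return n * n

# The __main__ guard is required on platforms that spawn fresh
# processes (Windows/macOS) so workers can re-import this module safely
if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(square, range(10))
    print(results)  # squares of 0..9, in order
```

`Pool.map` preserves input order even though the work is distributed across processes.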
Jul 15 • 10 tweets • 4 min read
Autoencoders are one of my favourite neural networks!
Today, I'll clearly explain:
- What they are❓
- And how they work❓
Let's go! 👇
1/n
Autoencoders have two main parts:
1️⃣ Encoder: Compresses the input into a dense representation (latent space)
2️⃣ Decoder: Reconstructs the input from this dense representation.
The idea is to make the reconstructed output as close to the original input as possible 👇
2/n
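Those two parts can be sketched as a minimal PyTorch module. The layer sizes are illustrative (e.g. 784-dimensional inputs compressed to a 32-dimensional latent space):

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: 784 -> 128 -> 32 (the dense latent representation)
        self.encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 32))
        # Decoder: 32 -> 128 -> 784 (reconstruct the input)
        self.decoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 784))

    def forward(self, x):
        z = self.encoder(x)          # compress to latent space
        return self.decoder(z)       # reconstruct from latent space

model = Autoencoder()
x = torch.randn(16, 784)
x_hat = model(x)
loss = nn.functional.mse_loss(x_hat, x)   # train to minimize reconstruction error
print(x_hat.shape)
```

Training minimizes the reconstruction loss, forcing the 32-dimensional bottleneck to keep only the input's essential structure.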
Jul 12 • 7 tweets • 3 min read
ML models can only be as good as the data they're trained on!
Introducing Datalab, now you can automatically detect:
Eigenvalues & Eigenvectors clearly explained:
The concept of eigenvalues & eigenvectors is widely known yet poorly understood!
Today, I'll clearly explain their meaning & significance.
Let's go! 👇
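The defining property — an eigenvector's direction is unchanged by the matrix, A v = λ v — can be checked numerically. The 2×2 matrix here is an illustrative example:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

for i in range(len(eigvals)):
    v = eigvecs[:, i]                # i-th eigenvector (a column)
    lam = eigvals[i]
    # The matrix only scales v by lam; it does not rotate it
    assert np.allclose(A @ v, lam * v)

print(eigvals)  # for this matrix: 5 and 2 (in some order)
```

The eigenvalues follow from the characteristic polynomial λ² − 7λ + 10 = (λ − 5)(λ − 2).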
Jun 24 • 10 tweets • 3 min read
f-strings in Python clearly explained:
f-strings were introduced in Python 3.6 and have since become a favorite among developers for their simplicity and readability.
Today, we'll start with the basics and dive into all the ninja tricks of using f-strings.
Let's go! 👇
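A few of the basics can be sketched like this (the values are illustrative):

```python
name = "Ada"
pi = 3.14159

greeting = f"Hello, {name}!"            # interpolate a variable
rounded = f"pi ~= {pi:.2f}"             # format spec: round to 2 decimal places
debug = f"{name=}"                      # Python 3.8+: prints name AND value, handy for debugging
padded = f"{name:>10}"                  # right-align in a 10-character field

print(greeting)   # Hello, Ada!
print(rounded)    # pi ~= 3.14
print(debug)      # name='Ada'
```

Any expression works inside the braces, and the format spec after `:` reuses the full `str.format` mini-language.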
Jun 20 • 10 tweets • 3 min read
Multithreading in Python clearly explained:
Ever felt like your Python code could run faster❓
Multithreading might be the solution you're looking for!
Today, I'll simplify it for you in this step-by-step guide.
Let's go! 👇
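A minimal sketch: threads overlap I/O-bound waits, so five simulated 0.1 s "calls" finish in roughly 0.1 s instead of 0.5 s. The worker and timings here are illustrative:

```python
import threading
import time

results = []
lock = threading.Lock()

def fetch(task_id):
    # Simulate an I/O-bound task (e.g. a network call); threads shine here
    time.sleep(0.1)
    with lock:                       # guard shared state across threads
        results.append(task_id)

threads = [threading.Thread(target=fetch, args=(i,)) for i in range(5)]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()                         # wait for all threads to finish
elapsed = time.perf_counter() - start

# The five sleeps overlap, so total time is ~0.1 s rather than ~0.5 s
print(sorted(results), f"{elapsed:.2f}s")
```

For CPU-bound work the GIL prevents this speedup — that is where multiprocessing takes over.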
Jun 17 • 11 tweets • 3 min read
Object-oriented programming in Python, clearly explained:
We break it down into 6 important concepts:
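The thread's own list of six concepts isn't shown here, but a small sketch can illustrate several of the usual ones — classes and objects, encapsulation, inheritance, and polymorphism. The `Animal`/`Dog`/`Cat` classes are hypothetical examples:

```python
class Animal:
    def __init__(self, name):
        self._name = name            # encapsulation: underscore marks internal state

    def speak(self):                 # interface meant to be overridden by subclasses
        raise NotImplementedError

class Dog(Animal):                   # inheritance: a Dog is-an Animal
    def speak(self):
        return f"{self._name} says woof"

class Cat(Animal):
    def speak(self):
        return f"{self._name} says meow"

# Polymorphism: one interface, different behavior per class
pets = [Dog("Rex"), Cat("Mia")]
for pet in pets:
    print(pet.speak())
```

The loop never checks which class it holds — each object supplies its own `speak`, which is the whole point of polymorphism.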