Akshay 🚀
Jul 6 • 10 tweets • 3 min read
Broadcasting in NumPy is widely used, yet poorly understood❗️

Today, I'll clearly explain how broadcasting works! 🚀

The same rules apply to PyTorch & TensorFlow!

A Thread 🧵👇
Broadcasting describes how NumPy treats arrays with different shapes during arithmetic operations.

The smaller array is "broadcast" across the larger array so that the two end up with compatible shapes.

Check this out👇
Picture scalar "b" being stretched into an array with the same shape as "a".

But how do we generalise this?

continue reading ... 📖
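Here's a minimal code sketch of that scalar case (the array values are my own picks):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])   # shape (3,)
b = 2.0                         # scalar

# Conceptually b is "stretched" to [2.0, 2.0, 2.0] so the shapes match,
# but NumPy never materialises that stretched array in memory.
print(a * b)   # [2. 4. 6.]
```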
💫 General Rules:

1) Broadcasting starts with the trailing (i.e. rightmost) dimensions and works its way left.

2) Two dimensions are compatible when they are equal or when one of them is 1.

Check out the examples 👇
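A quick way to sanity-check both rules is np.broadcast_shapes (NumPy ≥ 1.20); the shape pairs below are my own picks, the last one being the classic example from the NumPy docs:

```python
import numpy as np

# Dimensions are compared right-to-left; they match if equal or if one is 1.
print(np.broadcast_shapes((5, 4), (4,)))             # (5, 4)  -> 4 == 4
print(np.broadcast_shapes((5, 4), (1,)))             # (5, 4)  -> the 1 stretches to 4
print(np.broadcast_shapes((8, 1, 6, 1), (7, 1, 5)))  # (8, 7, 6, 5)

# Incompatible: trailing 4 vs 3, and neither is 1.
# np.broadcast_shapes((5, 4), (3,))  # raises ValueError
```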
Whenever a one-dimensional array is involved in broadcasting, treat it as a row vector!

Array → [1, 2, 3] ; shape → (3,)
Treated as → [[1, 2, 3]] ; shape → (1, 3)

Remember, broadcasting starts from the trailing dimension!

Check this out👇
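For example (the shapes and values here are my own picks), adding a (3,) vector to a (4, 3) matrix adds it to every row:

```python
import numpy as np

a = np.arange(12).reshape(4, 3)   # shape (4, 3)
b = np.array([10, 20, 30])        # shape (3,)  -> treated as (1, 3)

# Trailing dims: 3 == 3; the leading 1 then stretches to 4,
# so b is added to every row of a.
print(a + b)
# [[10 21 32]
#  [13 24 35]
#  [16 27 38]
#  [19 30 41]]
```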

Let's look at a scenario where broadcasting doesn't occur!

- a (4x3)
- b (4,) will be treated as b (1x4)

Now, broadcasting starts from the trailing dimension, but (4x3) & (1x4) are not compatible: 3 vs 4, and neither is 1!

Check this out👇
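A small sketch of that failure (the shapes are from the example above; the array values are my own picks):

```python
import numpy as np

a = np.ones((4, 3))   # shape (4, 3)
b = np.arange(4)      # shape (4,)  -> treated as (1, 4)

# Trailing dims: 3 vs 4, and neither is 1 -> broadcasting fails.
try:
    a + b
except ValueError as err:
    print(err)   # operands could not be broadcast together with shapes (4,3) (4,)
```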
Let's take one more example to make our understanding concrete!

Remember, a 1D array is treated as a row vector while broadcasting!

Check this out👇
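As a plausible stand-in (these shapes and values are my own picks), here is a (4,) vector broadcast across a 3-D array:

```python
import numpy as np

a = np.arange(24).reshape(2, 3, 4)   # shape (2, 3, 4)
b = np.array([0, 10, 20, 30])        # shape (4,)  -> treated as (1, 1, 4)

# Trailing dims: 4 == 4; the remaining 1s stretch to 3 and 2,
# so b is added along the last axis of every row.
print((a + b).shape)   # (2, 3, 4)
print((a + b)[0, 0])   # [ 0 11 22 33]
```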
Here's how arrays of shape (4x1) & (3,) broadcast together!

Check this out👇
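Sketched in code (the shapes come from the thread; the values are my own picks):

```python
import numpy as np

a = np.array([[0], [10], [20], [30]])   # shape (4, 1)
b = np.array([1, 2, 3])                 # shape (3,)  -> treated as (1, 3)

# Trailing dims: 1 vs 3 -> the 1 stretches to 3;
# leading dims:  4 vs 1 -> the 1 stretches to 4. Result shape: (4, 3).
print(a + b)
# [[ 1  2  3]
#  [11 12 13]
#  [21 22 23]
#  [31 32 33]]
```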
Why use broadcasting❓

Broadcasting provides a means of vectorising array operations so that looping occurs in C instead of Python.

It does this without making needless copies of data and usually leads to efficient algorithm implementations.
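A small sketch of both points, assuming a (1000, 1000) matrix and a (1000,) row to add to it: broadcasting replaces the explicit Python loop, and np.broadcast_to shows the "stretched" operand is just a zero-stride view, not a copy.

```python
import numpy as np

a = np.random.rand(1000, 1000)
b = np.random.rand(1000)          # added to every row of a

# Vectorised: the element-wise loop runs in C.
out = a + b

# Equivalent pure-Python loop (much slower):
out_loop = np.empty_like(a)
for i in range(a.shape[0]):
    out_loop[i] = a[i] + b
assert np.allclose(out, out_loop)

# No needless copies: the broadcast view of b has stride 0 along the
# stretched axis and shares memory with the original 1-D array.
b_view = np.broadcast_to(b, (1000, 1000))
print(b_view.strides)               # (0, 8)
print(np.shares_memory(b, b_view))  # True
```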
That's a wrap!

If you're interested in:

- Python 🐍
- Data Science 📈
- Machine Learning 🤖
- Maths for ML 🧮
- MLOps 🛠
- CV/NLP 🗣
- LLMs 🧠

I'm sharing daily content over here; follow me → @akshay_pachaar if you haven't already!!

Cheers!! 🙂
