Sachin Kumar
Apr 29 · 6 tweets
Day 46 of #100dayswithmachinelearning

Topic -- Curse of Dimensionality

🧵
The curse of dimensionality refers to the phenomenon where the performance of ML algorithms deteriorates as the number of dimensions or features of the input data ⬆️

This happens because the volume of the space grows exponentially with the number of dimensions, which causes the data to become sparse and the distances between data points to increase.
Many ML algorithms struggle to find meaningful patterns and relationships in high-dimensional data and may suffer from overfitting or poor generalization. This can lead to longer training times, increased memory requirements, and reduced accuracy and efficiency in predictions.
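A minimal sketch (assumed, not part of the original thread) that illustrates this sparsity effect: for random points in a unit hypercube, the gap between the nearest and farthest pairwise distances shrinks relative to the distances themselves as the number of dimensions grows.

```python
# As dimensionality grows, pairwise distances between random points concentrate,
# so "nearest" and "farthest" neighbours become almost indistinguishable.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(42)

for d in (2, 10, 100, 1000):
    X = rng.uniform(size=(500, d))      # 500 random points in the unit hypercube
    dists = pdist(X)                    # all pairwise Euclidean distances
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"d={d:4d}  mean distance={dists.mean():6.2f}  relative contrast={contrast:6.2f}")
```

The "relative contrast" printed here drops sharply with d, which is exactly why distance-based methods (k-NN, clustering) struggle in high dimensions.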
Why is Dimensionality Reduction necessary?

1⃣ Avoids overfitting
2⃣ Easier computation
3⃣ Improved model performance
4⃣ Lower-dimensional data requires less storage space
5⃣ Lower-dimensional data can be used with algorithms that are unsuitable for higher dimensions
To mitigate the curse of dimensionality, techniques such as feature selection, dimensionality reduction & regularization can be used. These methods aim to reduce the number of features or dimensions in the data.

shiksha.com/online-courses…
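A minimal sketch (assumed, not from the thread) of two of these mitigation techniques with scikit-learn: univariate feature selection via SelectKBest and dimensionality reduction via PCA, applied to synthetic data.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA

# Synthetic data: 1000 samples, 100 features, only 10 of which are informative.
X, y = make_classification(n_samples=1000, n_features=100,
                           n_informative=10, random_state=42)

# Feature selection: keep the 10 features with the highest ANOVA F-score.
X_selected = SelectKBest(score_func=f_classif, k=10).fit_transform(X, y)

# Dimensionality reduction: project onto the top 10 principal components.
X_reduced = PCA(n_components=10).fit_transform(X)

print(X.shape, X_selected.shape, X_reduced.shape)  # (1000, 100) (1000, 10) (1000, 10)
```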
🙏 If this thread was helpful to you:

1. Follow me @Sachintukumar for daily content

2. Connect with me on LinkedIn
linkedin.com/in/sachintukum…

3. RT the tweet below to share it with your friends

More from @Sachintukumar

Apr 30
Day 47 of #100dayswithmachinelearning

Topic -- Principal Component Analysis (PCA) Part 1
PCA is a statistical technique for analyzing all the dimensions of a dataset and reducing them as much as possible while preserving as much of the original information as possible.

You can visualize multi-dimensional data in 2D or 3D on any platform using the Principal Component Method of factor analysis.
Step by step explanation of Principal Component Analysis (a code sketch follows the list):

1. Standardization
2. Covariance matrix computation
3. Feature vector (the eigenvectors of the covariance matrix, sorted by eigenvalue)
4. Recast the data along the principal component axes
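A minimal from-scratch sketch (assumed, not from the thread) that follows the steps above using NumPy only; the toy data, the choice of k = 2 components, and the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))              # toy data: 200 samples, 5 features

# 1. Standardization: zero mean, unit variance per feature
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix computation
cov = np.cov(X_std, rowvar=False)          # shape (5, 5)

# 3. Feature vector: eigenvectors of the covariance matrix,
#    sorted by decreasing eigenvalue, keeping the top k components
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: the covariance matrix is symmetric
order = np.argsort(eigvals)[::-1]
k = 2
feature_vector = eigvecs[:, order[:k]]     # shape (5, 2)

# 4. Recast the data along the principal component axes
X_pca = X_std @ feature_vector             # shape (200, 2)
print(X_pca.shape)
```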
Apr 29
Hello Folks 👨‍💻

If you are someone who is learning SQL, then this list can be helpful to you.

SQL - END-TO-END Learning Resources and Guide 👇 (Must Read)
1. SQL for Data Science

🔗lnkd.in/dw4aAC-q

2. Databases and SQL for Data Science with Python

🔗lnkd.in/d2psKJd9
3. Scripting with Python and SQL for Data Engineering

🔗lnkd.in/dD3cxWAJ

4. Introduction to Structured Query Language (SQL)

🔗lnkd.in/dvB6eA9m
Apr 28
Day 45 of #100dayswithmachinelearning

Topic - Feature Construction & Feature Splitting

A Thread 🧵
Feature construction is a critical aspect of feature engineering: the process of creating new features or transforming existing ones to improve the performance of machine learning models.
The goal of feature construction is to extract meaningful information from raw data and represent it in a way that can be effectively used by machine learning algorithms.
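A minimal sketch (assumed, not from the thread) of both ideas with pandas, using Titanic-style columns as a hypothetical example: feature construction combines raw columns into a more informative one, and feature splitting breaks a compound column apart.

```python
import pandas as pd

df = pd.DataFrame({
    "name": ["Mr. John Smith", "Mrs. Jane Doe"],
    "sibsp": [1, 0],          # siblings/spouses aboard
    "parch": [0, 2],          # parents/children aboard
})

# Feature construction: combine two raw columns into a more informative one.
df["family_size"] = df["sibsp"] + df["parch"] + 1

# Feature splitting: break one compound column into simpler parts.
df[["title", "rest"]] = df["name"].str.split(" ", n=1, expand=True)

print(df)
```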
Apr 27
🏹SQL Interview Questions

🧵
🎯 Are NULL values the same as zero or a blank space❓

🔺 A NULL value is not at all the same as zero or a blank space.
🔺 A NULL value represents a value that is unavailable, unknown, unassigned, or not applicable, whereas zero is a number and a blank space is a character.
🎯 What is the usage of the NVL() function❓

🔹 Answer

🔺 You may use the NVL() function to replace NULL values with a default value.
🔺 The function returns the value of the second parameter if the first parameter is NULL.
🔺 If the first parameter is anything other than NULL, it is left unchanged.
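NVL() is Oracle-specific, so as a hedged illustration of the same idea, here is a sketch using SQLite's equivalent IFNULL() through Python's sqlite3 module; the table and values are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, bonus REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Alice", 500.0), ("Bob", None)])

# Replace NULL bonuses with a default of 0, like NVL(bonus, 0) in Oracle.
for name, bonus in conn.execute("SELECT name, IFNULL(bonus, 0) FROM employees"):
    print(name, bonus)          # Alice 500.0, then Bob 0
```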
Apr 27
Day 44 of #100dayswithmachinelearning

Topic -- Outlier Detection using Percentile Method

A Thread 🧵
Outliers are a crucial aspect of data analysis.

They can be treated in different ways, such as trimming, capping, discretization, or by treating them as missing values.
Percentile Method -

This technique works by setting a particular threshold value, which is decided based on our problem statement.

When the outliers are capped at that threshold rather than removed, the method is known as Winsorization.
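A minimal sketch (assumed, not from the thread) of the percentile method on synthetic data: the 5th and 95th percentiles are an illustrative threshold choice, with trimming and capping (Winsorization) shown side by side.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=50, scale=10, size=1000)
x[:5] = [150, -80, 200, -120, 175]          # inject a few extreme outliers

lower, upper = np.percentile(x, [5, 95])    # thresholds chosen from the problem statement

x_trimmed = x[(x >= lower) & (x <= upper)]  # trimming: drop the outliers
x_capped = np.clip(x, lower, upper)         # capping (Winsorization): clamp them

print(len(x), len(x_trimmed), x_capped.min(), x_capped.max())
```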
