Sachin Kumar
May 10 · 14 tweets · 5 min read
Day 57 of #100dayswithMachinelearning

Topic - Batch Gradient Descent (BGD)

A Thread 🧵
Batch Gradient Descent (BGD) is an optimization algorithm commonly used in ML to minimize a cost function (or maximize an objective function).

It is a type of gradient descent that updates the model parameters using the average gradient computed over the entire training dataset at each iteration.
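In symbols (a standard textbook formulation, not notation from this thread), one BGD step over m training examples averages the per-example gradients and moves against that average, with learning rate η:

```latex
% One batch-gradient-descent step: average the per-example gradients
% over all m training samples, then step against that average.
\theta \;\leftarrow\; \theta \;-\; \eta \cdot \frac{1}{m}\sum_{i=1}^{m} \nabla_{\theta}\, L\!\left(f_{\theta}(x_i),\, y_i\right)
```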
Here's how the BGD algorithm works:

1) Initialize the model parameters: Start by initializing the model parameters, such as weights and biases, with random values.

2) Compute the cost function: Evaluate the cost function on the entire training dataset. The cost function measures the error between the model's predicted values and the actual values in the training dataset.

3) Compute the gradient: Calculate the gradient of the cost function with respect to each model parameter. The gradient gives the direction and magnitude of the steepest ascent of the cost function.

4) Update the parameters: Adjust each parameter by subtracting a fraction of its gradient from the current value. The fraction is set by the learning rate, which controls the step size taken in each iteration. The update for a parameter θ is: θ = θ - learning_rate * gradient

5) Repeat steps 2-4: Iterate until a stopping criterion is met, such as a maximum number of iterations, reaching a certain level of convergence, or other problem-specific conditions. (All five steps are sketched in code below.)
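Here is a minimal sketch of those five steps, assuming a linear-regression model with an MSE cost. The names (batch_gradient_descent, X, y, lr, n_iters) are illustrative, not from the notebook linked below:

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Illustrative BGD for linear regression with an MSE cost."""
    n_samples, n_features = X.shape
    theta = np.zeros(n_features)          # step 1: initialize parameters
    bias = 0.0
    for _ in range(n_iters):              # step 5: repeat until stopping criterion
        y_pred = X @ theta + bias         # predictions on the FULL dataset
        error = y_pred - y
        cost = (error ** 2).mean()        # step 2: MSE cost over all samples (log to watch convergence)
        # step 3: gradient of the cost, averaged over the entire training set
        grad_theta = (2 / n_samples) * (X.T @ error)
        grad_bias = 2 * error.mean()
        # step 4: move each parameter against its gradient, scaled by the learning rate
        theta -= lr * grad_theta
        bias -= lr * grad_bias
    return theta, bias

# Toy usage: recover y = 3x + 2 from noisy data
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)
theta, bias = batch_gradient_descent(X, y)
print(theta, bias)  # approximately [3.0] and 2.0
```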
The key point of BGD is that the gradient is computed over the entire training dataset at each iteration. This makes it computationally expensive for large datasets but gives a more accurate gradient estimate, and for a convex cost function it guarantees convergence to the global minimum (given a suitable learning rate).
Despite its advantages, BGD has some limitations.

It requires the entire training dataset to be loaded into memory, which can be challenging for large datasets.

Additionally, BGD may converge slowly, and when the cost function is non-convex with many local minima it can get stuck in a poor one.
To overcome the memory and computational limitations of BGD, variations such as Stochastic Gradient Descent (SGD) and Mini-Batch Gradient Descent (MBGD) are often used.

SGD updates the parameters using only one random training sample at a time, while MBGD updates them using a small subset (mini-batch) of the training dataset. These variations offer faster updates and can handle larger datasets more efficiently.

Mini-batch gradient descent strikes a balance between the robustness of SGD and the efficiency of BGD. (A sketch covering both follows below.)
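A minimal mini-batch sketch, reusing the same illustrative linear-regression setup as above; batch_size=1 reduces it to SGD, and batch_size=len(X) reduces it to BGD:

```python
import numpy as np

def minibatch_gradient_descent(X, y, lr=0.05, n_epochs=50, batch_size=32):
    """Illustrative MBGD: batch_size=1 -> SGD, batch_size=n_samples -> BGD."""
    n_samples, n_features = X.shape
    theta = np.zeros(n_features)
    bias = 0.0
    rng = np.random.default_rng(0)
    for _ in range(n_epochs):
        idx = rng.permutation(n_samples)       # shuffle sample order each epoch
        for start in range(0, n_samples, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            error = Xb @ theta + bias - yb
            # gradient estimated from the mini-batch only, not the full dataset
            theta -= lr * (2 / len(batch)) * (Xb.T @ error)
            bias -= lr * 2 * error.mean()
    return theta, bias
```

Because each update sees only a noisy estimate of the true gradient, the loss curve fluctuates more than BGD's, most of all at batch_size=1.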

Notebook on @github:

github.com/sachinkumar160…
SGD performs one update per row at a time, so its fluctuations are much higher than batch gradient descent's. More on this from @kdnuggets:

kdnuggets.com/2020/05/5-conc…
If this thread was helpful to you

1. Follow me @Sachintukumar for daily content

2. Connect with me on Linkedin: linkedin.com/in/sachintukum…

3. RT the tweet below to share it with your friends
