I’m not saying you need to be an expert in advanced calculus to do machine learning…
BUT, there is a big difference between someone who does and someone who does NOT have a good foundation in stats when it comes to getting & explaining business results.
My plan back in the day was to build a solid foundation in stats and machine learning at the same time.
So here’s what helped me. I read a ton of books.
Here are the 3 books that helped me learn data science the most...
1. R for Data Science (Wickham & Grolemund) r4ds.had.co.nz
When I was first exposed to the Confusion Matrix, I was lost.
There was a HUGE mistake I was making with False Negatives that took me 5 years to fix.
I'll teach you in 5 minutes. Let's dive in. 🧵
1. The Confusion Matrix
A confusion matrix is a tool often used in machine learning to visualize the performance of a classification model. It's a table that allows you to compare the model's predictions against the actual values.
2. Correct Predictions:
True Positives (TP): These are cases in which the model correctly predicts the positive class.
True Negatives (TN): These are cases in which the model correctly predicts the negative class.
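To make these cells concrete (including the False Positives and False Negatives the table also contains), here's a minimal sketch using scikit-learn's confusion_matrix. The labels below are made up purely for illustration:

```python
# A minimal sketch of computing the four confusion-matrix cells.
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual classes
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model's predictions

# For binary labels [0, 1], ravel() unpacks the table as TN, FP, FN, TP.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")  # TP=3  TN=3  FP=1  FN=1
```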
Understanding P-Values is essential for improving regression models.
In 2 minutes, I'll crush your confusion.
1. The p-value:
A p-value measures the strength of the evidence against a null hypothesis: it's the probability of observing results at least as extreme as yours, assuming the null hypothesis is true. The smaller the p-value, the stronger the evidence against the null.
2. Null Hypothesis (H₀):
The null hypothesis is the default position that there is no relationship between two measured phenomena or no association among groups. For example, under H₀, the regressor does not affect the outcome.
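Here's a minimal sketch (with simulated data, chosen for illustration) of the p-value testing H₀: "the slope is zero" in a simple regression, using scipy's linregress:

```python
# A minimal sketch: the p-value testing H0 (slope = 0) in simple regression.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)  # y genuinely depends on x

result = stats.linregress(x, y)
print(f"slope = {result.slope:.2f}, p-value = {result.pvalue:.2g}")
# A tiny p-value: data this extreme would be very unlikely if H0 were true,
# so we reject the idea that x has no effect on y.
```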
Bayes' Theorem is a fundamental concept in data science.
But it took me 2 years to understand its importance.
In 2 minutes, I'll share my best findings from 2 years of exploring Bayesian Statistics. Let's go.
1. Background:
"An Essay towards solving a Problem in the Doctrine of Chances," was published in 1763, two years after Bayes' death. In this essay, Bayes addressed the problem of inverse probability, which is the basis of what is now known as Bayesian probability.
2. Bayes' Theorem:
Bayes' Theorem provides a mathematical formula to update the probability for a hypothesis as more evidence or information becomes available. It describes how to revise existing predictions or theories in light of new evidence, a process known as Bayesian inference.
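The formula is P(H|E) = P(E|H) × P(H) / P(E): the posterior equals the likelihood times the prior, divided by the evidence. Here's a minimal sketch with made-up numbers for the classic diagnostic-test example:

```python
# A minimal sketch of Bayes' Theorem with made-up diagnostic-test numbers.
# Posterior: P(H|E) = P(E|H) * P(H) / P(E)
p_disease = 0.01             # prior P(H): 1% of people have the disease
p_pos_given_disease = 0.95   # likelihood P(E|H): test sensitivity
p_pos_given_healthy = 0.05   # P(E|not H): false positive rate

# Evidence P(E) by the law of total probability.
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive test) = {p_disease_given_pos:.1%}")  # ~16.1%
```

Even with a 95%-sensitive test, a positive result only lifts the probability to about 16%, because the prior is so low. That update from 1% to 16% is Bayesian inference in action.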
Correlation is the single skill that has benefited me the most in my career.
In 3 minutes, I'll demolish your confusion (and share strengths and weaknesses you might be missing).
Let's go:
1. Correlation:
Correlation is a statistical measure that describes the extent to which two variables change together. It can indicate whether and how strongly pairs of variables are related.
2. Types of correlation:
Several types of correlation are used in statistics to measure the strength and direction of the relationship between variables. The three most common types are Pearson, Spearman Rank, and Kendall's Tau. We'll focus on Pearson since that is what I use 95% of the time.
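Here's a minimal sketch computing Pearson's r with scipy; the data is simulated just to show a clear linear relationship:

```python
# A minimal sketch: Pearson correlation on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 0.8 * x + rng.normal(scale=0.5, size=200)  # linear relationship + noise

r, p_value = stats.pearsonr(x, y)
print(f"Pearson r = {r:.2f} (p = {p_value:.2g})")
# r near +1 or -1 means a strong linear relationship; near 0 means weak
# (or nonlinear -- Pearson only captures linear association).
```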
Principal Component Analysis (PCA) is the gold standard in dimensionality reduction.
But PCA is hard to understand for beginners.
Let me destroy your confusion:
1. What is PCA?
PCA is a statistical technique used in data analysis, mainly for dimensionality reduction. It's especially useful for large datasets with many variables, helping to simplify the data's complexity while retaining as much variability as possible.
2. PCA has 5 steps:
1. Standardization
2. Covariance Matrix Computation
3. Eigenvector Calculation
4. Choosing Principal Components
5. Transforming the Data
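Here's a minimal NumPy sketch of those 5 steps on made-up data (in practice you'd typically reach for scikit-learn's PCA, which wraps all of this):

```python
# A minimal NumPy sketch of the 5 PCA steps.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))  # made-up data: 100 observations, 5 variables

# 1. Standardization: zero mean, unit variance per column.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix computation.
cov = np.cov(X_std, rowvar=False)

# 3. Eigenvector calculation (eigh suits symmetric matrices like covariances).
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# 4. Choosing principal components: sort by eigenvalue, keep the top k.
order = np.argsort(eigenvalues)[::-1]
k = 2
components = eigenvectors[:, order[:k]]

# 5. Transforming the data: project onto the chosen components.
X_pca = X_std @ components
print(X_pca.shape)  # (100, 2) -- 5 variables reduced to 2 components
```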