BIG NEWS: #ChatGPT breaks #Python vs #R Barriers in Data Science!

Data science teams everywhere rejoice.

A mind-blowing thread (with a FULL ChatGPT prompt walkthrough). 🧵

#datascience #rstats
It's NOT R vs Python ANYMORE!

This is one example of how ChatGPT can speed up data science & get R & Python people working together.

(it blew my mind)
This example combines #R, #Python, and #Docker.

I created this example in under 10 minutes from start to finish.
I’m an R guy.

And I prefer doing my business research & analysis in R.

It's awesome. It has:

1. Tidyverse - data wrangling + visualization
2. Tidymodels - Machine Learning
3. Shiny - Apps
But the rest of my team prefers Python.

And they don't like R... it's just weird to them.

So I wanted to see if I could show them how we could work together...
Let’s start with a prompt.

I asked ChatGPT to find a dataset I could use for this example.
...ChatGPT found it...
...and gave me this code to read the data:
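The original code lives in a screenshot, so here's a minimal sketch of the kind of base-R code ChatGPT returns for this step (the file name "customer_churn.csv" is a placeholder, not the actual dataset it found):

# Sketch of the base-R version ChatGPT typically suggests first.
# NOTE: "customer_churn.csv" is a placeholder file name.
data <- read.csv("customer_churn.csv", stringsAsFactors = FALSE)
head(data)
str(data)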
I prefer the tidyverse, so I asked ChatGPT to update the code.

That looks better.
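For reference, the tidyverse version looks roughly like this (same placeholder file name as above):

# Tidyverse version: readr::read_csv() parses column types and returns a tibble
library(tidyverse)
data <- read_csv("customer_churn.csv")  # placeholder file name
glimpse(data)

read_csv() is faster than read.csv() and never silently converts strings to factors, which is a big part of why tidyverse folks prefer it.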
With the data in hand, it’s time for some Data Science.

I asked a simple question.

ChatGPT's response was impressive.
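The prompt and response were screenshots, so here's a hedged sketch of the kind of tidymodels workflow ChatGPT produces for a question like "can you build a model on this data?" (the "churn" outcome and all column names are my assumptions, not the actual dataset):

# A sketch, not the actual response. Assumes a binary "churn" column.
library(tidymodels)

data <- data %>% mutate(churn = as.factor(churn))

set.seed(123)
split <- initial_split(data, prop = 0.8, strata = churn)
train <- training(split)
test  <- testing(split)

# Preprocess: dummy-encode categoricals, normalize numerics
rec <- recipe(churn ~ ., data = train) %>%
  step_dummy(all_nominal_predictors()) %>%
  step_normalize(all_numeric_predictors())

wf <- workflow() %>%
  add_recipe(rec) %>%
  add_model(logistic_reg() %>% set_engine("glm"))

fitted_wf <- fit(wf, data = train)

# Evaluate on the holdout set
predict(fitted_wf, test) %>%
  bind_cols(test) %>%
  accuracy(truth = churn, estimate = .pred_class)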
But, even though I’m an R guy, my team uses Python for Deployment…

In the past, that was a huge problem.

(resulting in days of translating from R to Python with Google and Stack Overflow)
But now, it's 1 minute of effort with ChatGPT.

Can I show you?
I asked ChatGPT to convert the R script to Python...
And in 10 seconds, ChatGPT produced this Python code with pandas and scikit-learn.
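I can't paste the screenshot, but a faithful sketch of that translation looks like this (same placeholder column names as the R version):

# Sketch of the Python translation. Placeholder file and column names.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

data = pd.read_csv("customer_churn.csv")  # placeholder file name

X = data.drop(columns=["churn"])          # assumed outcome column
y = data["churn"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=123
)

num_cols = X.select_dtypes(include="number").columns
cat_cols = X.select_dtypes(exclude="number").columns

# Preprocess: normalize numerics, one-hot encode categoricals
preprocess = ColumnTransformer([
    ("num", StandardScaler(), num_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), cat_cols),
])

pipe = Pipeline([
    ("prep", preprocess),
    ("model", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_train, y_train)

print(accuracy_score(y_test, pipe.predict(X_test)))

Notice the near 1:1 mapping: initial_split → train_test_split, recipe steps → ColumnTransformer, workflow → Pipeline. That's what makes the translation so fast.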
ChatGPT did in 10 seconds something that would have taken me 2 hours.

But let’s continue.

The reason we had to convert to Python was "deployment."

Deployment is just a fancy word for allowing others to access my model so they can use it on-demand.
So I asked ChatGPT to make the model accessible on-demand.
And ChatGPT made me a Python API using FastAPI.
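Here's a minimal sketch of the kind of FastAPI app ChatGPT generates for this (the model file, feature names, and route are all placeholders, assuming the scikit-learn pipeline was saved with joblib):

# app.py -- a sketch, not the actual generated code
import joblib
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("churn_model.joblib")  # placeholder: pipeline saved after training

class Customer(BaseModel):
    # placeholder features -- match these to your real columns
    tenure: int
    monthly_charges: float
    contract: str

@app.post("/predict")
def predict(customer: Customer):
    new_data = pd.DataFrame([customer.model_dump()])  # use .dict() on Pydantic v1
    prediction = model.predict(new_data)[0]
    return {"churn_prediction": str(prediction)}

Run it locally with: uvicorn app:app --reload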
But this code is useless…

… without a Docker environment.

So I asked ChatGPT to make one.
And ChatGPT delivered the Dockerfile for my Docker environment:
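The actual Dockerfile was a screenshot; a representative sketch (the base image, file names, and port are assumptions):

# Dockerfile -- a sketch; pin exact versions for real deployments
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
# (assumes a requirements.txt listing fastapi, uvicorn, scikit-learn, pandas, joblib)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the API and the saved model
COPY app.py churn_model.joblib ./

EXPOSE 8000
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]

Build and run with: docker build -t churn-api . and then docker run -p 8000:8000 churn-api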
So in under 10 minutes, I had ChatGPT:

1. Make my research script in R.

2. Create my production script in Python for my team.

3. Create the API + Dockerfile to deploy it.
But when I showed my Python team, instead of being excited...

...They were worried.

And I said, "Listen. There's nothing to be afraid of."

"ChatGPT is a productivity enhancer."

They didn't believe me.
My Conclusion:

You have a choice. You can rule AI.

Or, you can let AI rule you.

What do you think the better choice is?
If you want help, I'd like you to join me for a free #ChatGPT for #DataScientists workshop on April 26th. And I will help you rule AI.

What's the next step?

👉 Register Here: us02web.zoom.us/webinar/regist…

