Aurélien Geron
Oct 12, 2021 · 15 tweets
My favorite Python 3.5 feature: the matrix multiplication operator @
👇Python features thread👇
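A minimal sketch of what @ does: the operator dispatches to a `__matmul__` method (PEP 465), which is exactly how NumPy arrays support `A @ B`. The toy `Vec` class below is just for illustration.

```python
class Vec:
    """Toy vector showing the hook behind the @ operator."""
    def __init__(self, xs):
        self.xs = xs

    def __matmul__(self, other):
        # dot product, invoked when you write `a @ b`
        return sum(a * b for a, b in zip(self.xs, other.xs))

dot = Vec([1, 2, 3]) @ Vec([4, 5, 6])   # 1*4 + 2*5 + 3*6
```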
Other Python 3.5 features I often use:
- subprocess.run()
- math.isclose()
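Quick sketches of both (note: the `capture_output`/`text` keyword spellings shown here arrived in 3.7; on 3.5–3.6 you would pass `stdout=subprocess.PIPE` and `universal_newlines=True`):

```python
import math
import subprocess
import sys

# subprocess.run() (3.5) replaces the older call/check_call/check_output trio
proc = subprocess.run([sys.executable, "-c", "print(6 * 7)"],
                      capture_output=True, text=True)

# math.isclose() (3.5) compares floats with a relative tolerance (default 1e-09)
close = math.isclose(0.1 + 0.2, 0.3)
```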

Big 3.5 features I don't really use much:
- Coroutines with async and await
- Type hints
👇
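For completeness, here is a tiny sketch combining both: type hints (PEP 484) annotating an async coroutine (PEP 492). `asyncio.run()` itself only arrived in 3.7.

```python
import asyncio

async def greet(name: str) -> str:
    await asyncio.sleep(0)          # hand control back to the event loop
    return "hello, " + name

greeting = asyncio.run(greet("world"))
```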
My favorite 3.6 feature: formatted string literals (f-strings)
👇
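A quick illustration: expressions and format specs go right inside the string.

```python
name = "world"
pi = 3.14159
msg = f"Hello, {name}! pi is roughly {pi:.2f}"   # .2f rounds to two decimals
```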
Other 3.6 features I often use:

- Underscores in numeric literals like 1_000_000
- random.choices()
- math.tau to replace 2 * math.pi (obviously)
👇
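All three in one small sketch (the seed is only there to make the random draw repeatable):

```python
import math
import random

population = 1_000_000                # PEP 515: underscores in numeric literals
quarter_turn = math.tau / 4           # tau == 2 * pi, so this is pi / 2

random.seed(0)                        # seeded only for repeatability
sample = random.choices("abc", weights=[5, 3, 2], k=4)  # weighted, with replacement
```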
My favorite 3.7 feature: dataclasses
👇
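A minimal dataclass, for reference:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float = 0.0      # __init__, __repr__ and __eq__ are generated for us

p = Point(1.0, 2.0)
```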
Another 3.7 feature I really like:

Legacy C locale coercion (PEP 538): locale-aware C extensions and child processes now use UTF-8 by default, rather than ASCII.
👇
My favorite 3.8 feature: self-documenting f-strings
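The = specifier renders both the expression and its value, which makes print-debugging much terser:

```python
value = 6 * 7
debug = f"{value=}"        # expands to the name, an equals sign, and the value
```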
Other 3.8 features I really like:

- from math import prod, dist, comb, perm
- functools.cached_property
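A sketch of all of these together (the `Circle` class is just an illustration):

```python
from functools import cached_property
from math import comb, dist, perm, pi, prod

factorial_5 = prod(range(1, 6))      # 1*2*3*4*5
hypotenuse = dist((0, 0), (3, 4))    # Euclidean distance between two points
pairs = comb(4, 2)                   # 4 choose 2, order ignored
arrangements = perm(3)               # 3! ordered arrangements

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):                  # computed on first access, then cached
        return pi * self.radius ** 2

area = Circle(2.0).area
```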

I might start using the walrus operator as well:
👇
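The walrus operator (PEP 572) assigns and tests in a single expression:

```python
data = [1, 2, 3, 4, 5]
if (n := len(data)) > 3:             # n is bound here and reusable below
    summary = f"{n} items is more than 3"
```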
Not sure I'll often use positional-only arguments, but okay, why not:
👇
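For reference, the / marker (PEP 570) makes everything before it positional-only; this hypothetical `clamp` would raise a TypeError if called with keyword arguments:

```python
def clamp(value, low, high, /):
    # all three parameters are positional-only:
    # clamp(value=5, low=0, high=3) raises a TypeError
    return max(low, min(value, high))

clamped = clamp(5, 0, 3)
```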
My favorite 3.9 feature: removing prefixes and suffixes. I know it sounds silly, but this is needed so often!
And the new syntax to merge dicts is nice too.
👇
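Both 3.9 additions in one sketch:

```python
# str.removeprefix / str.removesuffix only strip when the affix is present
module = "test_parser.py".removeprefix("test_").removesuffix(".py")

defaults = {"color": "blue", "size": 1}
overrides = {"size": 2}
merged = defaults | overrides        # PEP 584: right-hand side wins on conflicts
```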
My favorite 3.10 feature: better error messages, including more precise error line numbers.
👇
I'm not sure I'll use the new match/case feature from 3.10, though:
👇
Pros:
- it's elegant in some cases

Cons:
- more to learn, harder for beginners
- unusual semantics: capture patterns in a case bind names a bit like function arguments, but the bindings outlive the match block
- goes against the "there should be one obvious way to do it" principle
👇
"If you don't like it, just don't use it" is not a valid argument unless you always work alone, and you never read anyone else's code.
👇
So many great improvements, it's nice to see Python continue to improve! 🐍💕
<The End>

