My favorite Python 3.5 feature: the matrix multiplication operator @
👇Python features thread👇
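A minimal sketch of what it looks like (using NumPy, which supports @ via __matmul__; plain lists don't):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A @ B)    # same as np.matmul(A, B)
# [[19 22]
#  [43 50]]
```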
Other Python 3.5 features I often use:
- subprocess.run()
- math.isclose()
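Quick sketches of both (the echo call assumes a POSIX command is available):

```python
import math
import subprocess

# run() replaces most uses of call()/check_output():
result = subprocess.run(["echo", "hello"], stdout=subprocess.PIPE)
print(result.stdout)                   # b'hello\n'

# isclose() avoids fragile float equality checks:
print(0.1 + 0.2 == 0.3)                # False
print(math.isclose(0.1 + 0.2, 0.3))    # True
```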

Big 3.5 features I don't really use much:
- Coroutines with async and await
- Type hints
👇
My favorite 3.6 feature: formatted string literals
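For example:

```python
name = "world"
pi = 3.14159
print(f"Hello, {name}!")    # Hello, world!
print(f"pi ≈ {pi:.2f}")     # format specs work too: pi ≈ 3.14
```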
👇
Other 3.6 features I often use:

- Underscores in numeric literals like 1_000_000
- random.choices()
- math.tau to replace 2 * math.pi (obviously)
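Quick sketches of all three:

```python
import math
import random

print(1_000_000 == 1000000)     # True: the underscores are purely visual

# choices() samples with replacement, optionally weighted:
print(random.choices(["a", "b", "c"], weights=[5, 3, 2], k=10))

print(math.tau == 2 * math.pi)  # True
```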
👇
My favorite 3.7 feature: dataclasses
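A minimal sketch: the decorator generates __init__, __repr__, __eq__, and more from the class attributes:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float = 0.0

p = Point(1.5)
print(p)                 # Point(x=1.5, y=0.0)
print(p == Point(1.5))   # True
```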
👇
Another 3.7 feature I really like:

Legacy C locale coercion (PEP 538): when Python detects the legacy C locale, it coerces it to a UTF-8 based locale, so locale-aware C extensions and child processes use UTF-8 by default rather than ASCII.
👇
My favorite 3.8 feature: self-documenting f-strings
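The f"{expr=}" syntax prints the expression itself along with its value, which is great for quick debugging:

```python
x, y = 3, 4
print(f"{x + y = }")    # x + y = 7
```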
Other 3.8 features I really like:

- from math import prod, dist, comb, perm
- functools.cached_property
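Quick sketches of each:

```python
from functools import cached_property
from math import comb, dist, perm, pi, prod

print(prod([2, 3, 4]))        # 24
print(dist((0, 0), (3, 4)))   # 5.0 (Euclidean distance)
print(comb(5, 2))             # 10 (order doesn't matter)
print(perm(5, 2))             # 20 (order matters)

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):
        print("computing...")   # runs only on first access
        return pi * self.radius ** 2

c = Circle(2)
print(c.area)   # computing... then 12.566...
print(c.area)   # cached: no recomputation
```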

I might start using the walrus operator as well:
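It binds a name as part of an expression, which saves a line here and there:

```python
data = [1, 2, 3, 4]
if (n := len(data)) > 3:
    print(f"too long ({n} elements)")

# Also handy in loop conditions, e.g. (sketch, assuming f is an open file):
# while chunk := f.read(8192):
#     process(chunk)
```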
👇
Not sure I'll often use positional-only arguments, but okay, why not:
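Parameters before the / marker can only be passed by position, never by keyword (pow_mod is a hypothetical example):

```python
def pow_mod(base, exp, /, mod=None):
    return pow(base, exp, mod)

print(pow_mod(2, 10))            # 1024
print(pow_mod(2, 10, mod=100))   # 24
# pow_mod(base=2, exp=10)        # TypeError: positional-only arguments
```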
👇
My favorite 3.9 feature: str.removeprefix() and str.removesuffix(). I know it sounds silly, but removing prefixes and suffixes is needed so often!
And the new syntax to merge dicts is nice too.
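Quick sketches of both:

```python
print("test_user.json".removeprefix("test_").removesuffix(".json"))  # user

defaults = {"host": "localhost", "port": 8080}
overrides = {"port": 80}
print(defaults | overrides)   # {'host': 'localhost', 'port': 80}
```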
👇
My favorite 3.10 feature: better error messages, including more precise error line numbers.
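A quick way to see the difference (a sketch; the exact wording varies by version):

```python
try:
    compile("d = {'a': 1,\nprint(d)", "<demo>", "exec")
except SyntaxError as e:
    print(e.lineno, e.msg)
# On 3.10+ this points at line 1: "'{' was never closed"
# Earlier versions reported a vaguer EOF error on line 2.
```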
👇
I'm not sure I'll use the new match/case feature from 3.10, though:
👇
Pros:
- it's elegant in some cases

Cons:
- more to learn, harder for beginners
- unusual semantics: names captured in case patterns bind a bit like function args, but they outlive the match/case block (see the sketch below)
- goes against the "one obvious way to do it" principle
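Here's a minimal sketch of that scoping quirk: names captured by a case pattern are ordinary assignments, so they survive after the block:

```python
point = (3, 4)
match point:
    case (x, y):
        print(f"matched: x={x}, y={y}")

print(x, y)   # 3 4 — x and y are still bound here
```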
👇
"If you don't like it, just don't use it" is not a valid argument unless you always work alone, and you never read anyone else's code.
👇
So many great features; it's nice to see Python keep getting better! 🐍💕