An under-appreciated feature of our present is how we record almost everything -- far more data than we can analyze. Future historians will be able to reconstruct and understand our time far better than we perceive and understand it right now.
Consider the events of January 6. Future historians will likely know who was there, who said what to whom, who did what, minute by minute. The amount of information you can recover from even a single video is enormous, and we have hundreds of them.
We're recording all of the dots -- our successors will have currently-unimaginable technology to connect them.

• • •

More from @fchollet

14 Feb
There's a pretty strong relationship between one's self-image as a dispassionate, rational thinker and the degree to which one is susceptible to falling for utterly irrational beliefs presented with some sort of scientific veneer.
The belief in recursive intelligence explosion is a good example: only someone who thinks of themselves as a very-high-IQ hyper-rationalist would be susceptible to buying into it.
If you want to fool a nerd, make long, complex, overly abstract arguments, free from the shackles of reality. Throw equations in there. Use physics analogies. Maybe a few Greek words.
14 Feb
An event that only happens once can have a probability (before it happens): this probability represents the uncertainty present in your model of why that event may happen. It's really a property of your model of reality, not a property of the event itself.
Of course, if the event has never happened before, that implies that your model of how it happens has never been validated in practice. You can model the uncertainty present in what you know you don't know, but you'll miss what you don't know you don't know.
But that doesn't mean your model is worthless. Surely we all have had the experience of writing a large piece of code and having it work on the first try.
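To make the point concrete, here's a toy illustration (my own sketch, not from the thread): under a Bayesian treatment, the probability assigned to a never-yet-observed event is determined by the prior you chose. Change the model and the probability changes, even though the event itself hasn't.

```python
# Toy example: probability of an event that has never happened.
# With a Beta(a, b) prior over the event's rate and n observed trials
# (successes = times it happened, failures = times it didn't), the
# posterior mean rate is (a + successes) / (a + b + successes + failures).
# Two different priors yield two different probabilities for the same
# zero-occurrence history -- the probability lives in the model.
def posterior_mean(a, b, successes, failures):
    """Posterior mean of a Beta(a, b) prior after Bernoulli observations."""
    return (a + successes) / (a + b + successes + failures)

# Ten trials, zero occurrences, under two different priors:
uniform_prior = posterior_mean(1, 1, 0, 10)    # Laplace's rule of succession
skeptical_prior = posterior_mean(1, 9, 0, 10)  # prior already leaning "rare"
```

Same data, different models, different probabilities: the uniform prior yields 1/12, the skeptical one 1/20.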
16 Jan
2020 was definitely a step backwards. If you're wondering how great civilizations can end up collapsing: they just have many 2020s in a row over several decades, with exponentially compounding cascade effects at each new development.
Factors of decline are multiplicative. E.g. cultural & educational deterioration leads to an incompetent government. An incompetent government makes a pandemic much worse. A bad pandemic accelerates institutional decline.
For the record, I don't think civilization will collapse in the near future (within the next 400 years). Not even as a consequence of catastrophic climate change over the next two centuries. But we will go through some pretty rough patches
6 Jan
DALL-E is the kind of application that you'd expect deep learning to be able to pull off in theory (people have been building various early prototypes of text-guided image generation since 2015), but that becomes really magical when done at a crazy scale.
As usual with deep learning, scaling up is paying off.
In the future, we'll have applications that generate photorealistic movies from a script, or new video games from a description. It's only a matter of years at this point.
5 Jan
Here's an overview of key adoption metrics for deep learning frameworks over 2020: downloads, developer surveys, job posts, scientific publications, Colab usage, Kaggle notebooks usage, GitHub data.

TensorFlow/Keras = #1 deep learning solution.
Note that we benchmark adoption vs Facebook's PyTorch because it is the only TF alternative that registers on the scale. Another option would have been sklearn, which has massive adoption, but it isn't really a TF alternative. In the future, I hope we can add JAX.
TensorFlow has seen 115M downloads in 2020, which nearly doubles its lifetime downloads. Note that this does *not* include downloads for all TF-adjacent packages, like tf-nightly, the old tensorflow-gpu, etc.
4 Jan
Here's a word-level text generation example with LSTM, starting from raw text files, in less than 50 lines of Keras & TensorFlow. colab.research.google.com/drive/1B9yLXcJ…
Of course, I should point out it's not 50 lines because Keras has some kind of built-in solution for text generation (it doesn't). It's 50 lines because Keras makes it easy to implement anything. It only uses generic features.
It uses a utility to read text files, a text vectorization layer (useful for any NLP), the LSTM layer and the functional API, the callbacks infrastructure, and the default training loop.
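The linked Colab isn't reproduced here, but the components listed above can be sketched in roughly that many lines. This is a minimal, hedged reconstruction — tiny placeholder corpus, made-up hyperparameters, and greedy decoding — not the notebook's actual code:

```python
# Word-level next-token prediction with an LSTM in Keras, using the
# components the thread names: TextVectorization, the LSTM layer, the
# functional API, a callback, and the default training loop.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder corpus (the real example reads raw text files from disk).
corpus = [
    "the quick brown fox jumps over the lazy dog",
    "the lazy dog sleeps while the quick fox runs",
]

seq_len = 8
vectorizer = layers.TextVectorization(
    max_tokens=100, output_sequence_length=seq_len + 1
)
vectorizer.adapt(corpus)

# Shift by one token: predict word t+1 from words 0..t.
tokens = vectorizer(corpus)            # shape: (num_samples, seq_len + 1)
x, y = tokens[:, :-1], tokens[:, 1:]

vocab_size = vectorizer.vocabulary_size()
inputs = keras.Input(shape=(seq_len,), dtype="int64")
h = layers.Embedding(vocab_size, 32)(inputs)
h = layers.LSTM(64, return_sequences=True)(h)
outputs = layers.Dense(vocab_size, activation="softmax")(h)
model = keras.Model(inputs, outputs)

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(
    x, y, epochs=2, verbose=0,
    callbacks=[keras.callbacks.EarlyStopping(monitor="loss", patience=1)],
)

# Greedy decoding: per position, take the most likely next word.
probs = model.predict(x[:1], verbose=0)   # (1, seq_len, vocab_size)
next_ids = np.argmax(probs, axis=-1)
```

With two epochs on two sentences the output is noise, of course; the point is the plumbing, which generalizes unchanged to a real corpus.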
