Discover and read the best of Twitter Threads about #lstm

Most recent (5)

We added a couple of articles on #machinelearning.
Content based #recommender using #python:
alpha-quantum.com/blog/content-b…
Collaborative #filtering approach to #recommender suggestions, again using #python and #matrixfactorization: alpha-quantum.com/blog/collabora…
A few useful links for website categorization: joy.link/websitecategor…
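As a rough sketch of the content-based approach the first article describes: rank items by the similarity of their descriptions to a query item. The item catalog and the bag-of-words/cosine choice below are illustrative assumptions, not taken from the linked posts.

```python
import math
from collections import Counter

# Toy item descriptions standing in for real catalog metadata (assumption).
items = {
    "a": "machine learning with python",
    "b": "deep learning tutorial in python",
    "c": "cooking recipes for beginners",
}

def vectorize(text):
    """Bag-of-words term counts."""
    return Counter(text.split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def most_similar(query_id):
    """Return the catalog item most similar to the query item."""
    q = vectorize(items[query_id])
    scores = {i: cosine(q, vectorize(t)) for i, t in items.items() if i != query_id}
    return max(scores, key=scores.get)

print(most_similar("a"))  # 'b' — shares "learning" and "python" with item "a"
```

Real systems would use TF-IDF or learned embeddings instead of raw counts, but the ranking-by-similarity structure is the same.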
Hi everyone! I am @JuliaSanchezBio, a postdoc @CMBI_Imperial, & today I will be tweeting about our departmental postdocs and fellows day; hope you enjoy the career and science talks, and get to meet new colleagues! #DoIDPFDay
To kick off the event we have @wendybarclay11, our head of department, and @avi_cmbi, our postdoc champion. We, your DoID postdoc reps @kkrohnhuse, @AnatMelamed & @CatherineKibir1, will also be introducing ourselves 😊 #DoIDPFDay
@wendybarclay11 @avi_cmbi @kkrohnhuse @AnatMelamed @CatherineKibir1 @imperialcollege @ImperialMed @ines_perpetuo from @ImperialPFDC is now giving us an overview of how they can support us postdocs & fellows in choosing & achieving our future objectives!
They have always been incredibly helpful, and are happy to be contacted! #DoIDPFDay
1. Hello

#RNN networks of the #LSTM and #GRU type are generally poorly understood

In this thread I will try to explain them clearly

We will start by looking at GRUs (Gated Recurrent Units) in detail.

Ready?

🔽🔽

#datascience #machinelearning
2. The principle of a recurrent network (RNN) is relatively simple

It applies to sequence data, i.e. data points that follow one another in time and whose order matters (time series, speech, text, music, ...)
3. In their simple form, RNNs are classic neural networks

with the difference that the network's output at time t is fed back as input at time t+1

This lets the network keep a memory of what it has computed previously
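The recurrence described above can be sketched in a few lines of numpy. This is a minimal illustration of a vanilla RNN step, not the thread's own code; the dimensions and random weights are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): 3 input features, 4 hidden units.
n_in, n_hidden = 3, 4

# Input-to-hidden weights, hidden-to-hidden (recurrent) weights, bias.
W_x = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_h = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    """One vanilla RNN step: the previous hidden state h_prev is mixed
    with the current input x_t, so the network carries a memory of what
    it computed at earlier time steps."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Run over a short sequence (e.g. 5 time steps of a time series).
sequence = rng.normal(size=(5, n_in))
h = np.zeros(n_hidden)
for x_t in sequence:
    h = rnn_step(x_t, h)  # the state at t becomes the input state at t+1

print(h.shape)  # (4,)
```

GRUs and LSTMs keep this same loop but replace `rnn_step` with gated updates that control how much of `h_prev` is kept or overwritten, which is what the rest of the thread goes on to explain.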
D1 of #50daysofudacity
I finished up to Lesson 2.19
My notes can be found here for quick reference
docs.google.com/document/u/1/d…
D2 of #50daysofUdacity
I finished up to Lesson 2.25
Also completed a lab assignment for a linear regression model to predict taxi fares in New York City
My notes can be found here for quick reference

docs.google.com/document/u/1/d…
D3 of #50daysofudacity
I finished Lesson 2
Also completed the lab assignment for the linear regression model to predict taxi fares in New York City
My notes can be found here for quick reference
docs.google.com/document/u/1/d…
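The taxi-fare lab mentioned above boils down to fitting a linear model on trip features. A minimal scikit-learn sketch of the idea, using synthetic data as a stand-in (the features, coefficients, and noise levels here are invented for illustration, not the course's actual dataset):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for the NYC taxi data: trip distance (miles) and
# trip duration (minutes) as features, fare in dollars as the target.
n = 500
distance = rng.uniform(0.5, 15, n)
duration = distance * 3 + rng.normal(0, 2, n)          # duration roughly tracks distance
fare = 2.5 + 1.8 * distance + 0.3 * duration + rng.normal(0, 1, n)

X = np.column_stack([distance, duration])
X_train, X_test, y_train, y_test = train_test_split(X, fare, random_state=0)

# Fit ordinary least squares and score on held-out trips.
model = LinearRegression().fit(X_train, y_train)
print(round(model.score(X_test, y_test), 3))  # R^2 close to 1 on this low-noise synthetic data
```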
A "worrying analysis":

"18 [#deeplearning] algorithms ... presented at top-level research conferences ... Only 7 of them could be reproduced w/ reasonable effort ... 6 of them can often be outperformed w/ comparably simple heuristic methods."

Paper:
lnkd.in/dTaGCTv

#AI
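To make the quote concrete: in the recommender-system setting the paper studies, one "comparably simple heuristic" baseline is non-personalized popularity ranking. A minimal sketch (the interaction log below is invented for illustration):

```python
from collections import Counter

# Invented user–item interaction log for illustration.
interactions = [
    ("u1", "i1"), ("u1", "i2"),
    ("u2", "i2"), ("u2", "i3"),
    ("u3", "i2"), ("u3", "i1"),
]

def top_popular(interactions, user, k=2):
    """Recommend the k globally most-interacted items the user has not
    seen yet -- no learning involved, just counting."""
    seen = {item for u, item in interactions if u == user}
    counts = Counter(item for _, item in interactions)
    ranked = [item for item, _ in counts.most_common() if item not in seen]
    return ranked[:k]

print(top_popular(interactions, "u2"))  # ['i1'] — u2 already saw i2 and i3
```

Baselines this simple are exactly what the cited analysis found could often match or beat published deep-learning recommenders when both are tuned properly.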
[Updates worth tweeting]

2/
There is much concern about #reproducibility issues and flawed scientific practices in the #ML community in particular & #academia in general.

Neither the issues nor the concerns are new.

Isn't it time to put an end to them?
3/
Several works have exposed these and similar problems over the years.

👏👏 again to @Maurizio_fd et al. for sharing their paper and addressing #DL algorithms for recommender systems (1st tweet of this thread).

But there is more, unfortunately: