Discover and read the best of Twitter Threads about #Transformer

Most recent (4)

1/ Elon Musk has been a significant player in the use of Big Data. His investments in Twitter and Tesla have helped collect and exploit vast amounts of data. #ElonMusk #Tesla #Twitter #BigData
2/ Microsoft is another collector and exploiter of Big Data. It has gathered data through its products, such as Windows, Office and Android, and its services, such as Bing, Azure and LinkedIn. #Microsoft #BigData
3/ OpenAI, which Elon Musk co-founded, focuses on developing artificial general intelligence (AGI). Impressive AI models, such as ChatGPT, have been developed toward that goal. #OpenAI #ChatGPT #AGI
Read 7 tweets
The difference between what russia does - SCORCHED EARTH - and what Ukraine could do - STRATEGICALLY TARGET the transformers supplying the military & industrial capacity that is killing Ukrainians - is self-explanatory.

A 100 km buffer in RU is a start to protect Ukrainians.

tinyurl.com/RuPwrGrid
@CinC_AFU @ua_industrial @itarmyofukraine @ng_ukraine @KpsZSU @JayinKyiv @i_army_org @EuromaidanPR @Ukraine /2
Who remembers the Bryansk Oil Depot fires?

To its SE & only 55 kilometers from Ukraine is the Druzhba crude oil hub in Vysokoe that sends oil to Belarus (Mazyr) & north towards St Petersburg.

Electricity pumps oil, not diesel or gasoline.

How is this civilian or off-limits?
/3
To the southwest of Vysokoe, and only 40 kilometers from Ukraine, is the Novozybkov 110 kV substation.

Its 110 kV transformers power the electric motors that pump crude oil further down the line toward Belarus.

tinyurl.com/RuPwrGrid

Operation "Dark Sky Russia"
Read 22 tweets
1/ Hearing about #BERT and wondering why #radiologists are starting to talk more about this Sesame Street character?

Check out this #tweetorial from TEB member @AliTejaniMD with recommendations to learn more about #NLP for radiology.
2/ #BERT is a language representation model taking the #NLP community by storm, but it’s not new! #BERT has been around since 2018 via @GoogleAI. You may interact with #BERT every day (chatbots, search tools).

But #BERT is relatively new to healthcare. What makes it different?
3/ For starters, #BERT is “bidirectional”: it doesn’t read the input text in a single direction (left-to-right or right-to-left). Instead, its #Transformer-based architecture provides an #attention mechanism that helps the model learn context from ALL surrounding words.
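To make the “bidirectional” point concrete, here is a minimal sketch of masked-token prediction with BERT. It assumes the Hugging Face transformers package and the public bert-base-uncased checkpoint, neither of which is named in the thread:

```python
# A minimal sketch, assuming the `transformers` package and the public
# `bert-base-uncased` checkpoint are available (both are assumptions,
# not references from the thread).
from transformers import pipeline

# BERT is trained with masked-language modelling: it predicts a hidden token
# from BOTH the words before it and the words after it.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The right-hand context ("was unremarkable") is just as visible to the model
# as the left-hand context ("The chest") when it fills in the blank.
for prediction in fill_mask("The chest [MASK] was unremarkable."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

A strictly left-to-right language model could only score the blank from "The chest"; BERT also conditions on the words that follow it.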
Read 15 tweets
The size of #NLP models has increased enormously, growing to millions, or even billions, of parameters, along with a significant increase in financial cost and carbon emissions.
The cost associated with training large models limits the #AIresearch community's ability to innovate, because a research project often needs a lot of experimentation.
Consider training a top-performing #LanguageModel on the Billion Word benchmark. A single experiment would take 384 GPU-days (6 days × 64 V100 #GPUs), or as much as $36,000 using AWS on-demand instances.
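As a back-of-the-envelope check on those figures (the per-GPU-hour price below is an assumption inferred from the quoted total; the thread only gives the headline numbers):

```python
# Rough check of the tweet's figures. The hourly rate is an assumed value
# back-calculated from the quoted ~$36,000 total, not a number from the thread.
days_per_experiment = 6
num_gpus = 64                               # V100s

gpu_days = days_per_experiment * num_gpus   # 6 * 64 = 384 GPU-days
gpu_hours = gpu_days * 24                   # 9,216 GPU-hours
assumed_rate_usd = 3.90                     # assumed AWS on-demand price per V100-hour

print(f"GPU-days:       {gpu_days}")
print(f"GPU-hours:      {gpu_hours:,}")
print(f"Estimated cost: ${gpu_hours * assumed_rate_usd:,.0f}")
```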
Read 21 tweets
