hardmaru
28 Jan, 6 tweets, 3 min read
Great debate with @chamath on CNBC about the deeper structural issues behind $GME and $AMC

Love the comments on YouTube:
“lol this CNBC dude is so concerned about my $200 invested…Fuck man…I never realized how much some people cared about me losing it.”
Apparently, CNBC is trying very hard to remove this full interview and copies of it from YouTube. I wonder why... drive.google.com/file/d/16IV7TI…
Someone called in for a few favors from their broker buddies...
A reminder that Citadel, which earlier attempted in vain to bail out @MelvinCapitalLP, also pays Robinhood to see all of its order flow from retail investors, who in exchange enjoy zero-commission trading.

When something is free, it means you are the product! ft.com/content/4a4393…
When a company deviates from its core mission:
Robinhood’s market manipulation pissed off both @AOC and @TedCruz

• • •


More from @hardmaru

14 Dec 20
I respect Pedro's work, and I also really enjoyed his book. But man, I don't know where I should start with this one…
Maybe we can start with “Facial feature discovery for ethnicity recognition” (published by @WileyInResearch in 2018):
Two years later:

“Huawei tested AI software that could recognize Uighur minorities and alert police. The face-scanning system could trigger a ‘Uighur alarm,’ sparking concerns that the software could help fuel China’s crackdown”

The article cited the paper.
washingtonpost.com/technology/202…
17 Apr 20
Why is it that we can recognize objects from line drawings, even though they don't exist in the real world?

A fascinating paper by Aaron Hertzmann hypothesizes that we perceive line drawings as if they were approximate, realistic renderings of 3D scenes.

arxiv.org/abs/2002.06260
The coolest result in this paper is when they took a depth estimation model (single-image input) trained on natural images (arxiv.org/abs/1907.01341), and showed that the pre-trained model also works on certain types of line drawings, such as drawings of streets and indoor scenes.
This paper reads like an alternative, perhaps complementary, take on @scottmccloud's views about visual abstraction:
19 Mar 20
Neuroevolution of Self-Interpretable Agents

Agents with a self-attention “bottleneck” can not only solve these tasks from pixel inputs with only 4,000 parameters, they also generalize better!

article attentionagent.github.io
pdf arxiv.org/abs/2003.08165

Read on 👇🏼
Work by @yujin_tang, @nt_duong, and me at @googlejapan

The agent receives the full input, but we force it to see its world through the lens of a self-attention bottleneck which picks only 10 patches from the input (middle)

The controller's decision is based only on these patches (right)
The agent has better generalization abilities, simply due to its ability to “not see things” that can confuse it.

Trained in the top-left setting only, it can also perform in unseen settings with higher walls, different floor textures, or when confronted with a distracting sign.
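The patch-selection mechanism described above can be sketched in a few lines of numpy. This is only a toy illustration under stated assumptions, not the authors' implementation (see attentionagent.github.io for that): the patch size, stride, projection width, and random weights here are all arbitrary choices, and patch importance is taken as the column sum of a single softmaxed self-attention matrix, i.e. how many “votes” each patch receives from every other patch.

```python
import numpy as np

def extract_patches(img, patch=7, stride=4):
    """Slide a window over the image and flatten each patch."""
    H, W = img.shape[:2]
    patches, centers = [], []
    for y in range(0, H - patch + 1, stride):
        for x in range(0, W - patch + 1, stride):
            patches.append(img[y:y + patch, x:x + patch].ravel())
            centers.append((y + patch // 2, x + patch // 2))
    return np.array(patches), centers

def top_k_patches(patches, W_k, W_q, k=10):
    """Score patches with one self-attention head; keep the top k.

    A patch's importance is the column sum of the row-softmaxed
    attention matrix: how strongly all patches attend to it.
    """
    K = patches @ W_k                      # keys,    (n, d_proj)
    Q = patches @ W_q                      # queries, (n, d_proj)
    logits = Q @ K.T / np.sqrt(K.shape[1])
    A = np.exp(logits - logits.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)      # softmax over each row
    importance = A.sum(axis=0)             # votes each patch receives
    return np.argsort(-importance)[:k]

rng = np.random.default_rng(0)
img = rng.random((84, 84))                 # stand-in for a pixel observation
patches, centers = extract_patches(img)
d = patches.shape[1]
idx = top_k_patches(patches,
                    rng.normal(size=(d, 16)),
                    rng.normal(size=(d, 16)))
# A downstream controller would act on only these k patch locations,
# never on the full image.
```

In the paper the key/query projections are what evolution optimizes; here they are just random matrices, so the selected patches are meaningless — the point is only the shape of the bottleneck: hundreds of candidate patches in, 10 locations out.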
12 Feb 20
SketchTransfer: A Challenging New Task for Exploring Detail-Invariance and the Abstractions Learned by Deep Networks

If we train a neural net to classify CIFAR10 photos but also give it unlabelled QuickDraw doodles, how well can it classify these doodles? arxiv.org/abs/1912.11570
Answer: Better than we thought!

This recent paper by Alex Lamb, @sherjilozair, Vikas Verma, and me is motivated by the abstractions learned by humans and machines.

Alex trained SOTA domain transfer methods on labelled CIFAR10 data + unlabelled QuickDraw doodles & reported his findings:
Alex found that SOTA transfer methods (labelled CIFAR10 + unlabelled doodles) achieve ~60% accuracy on the doodles. Supervised learning directly on labelled doodles gets ~90%, leaving a ~30% gap for improvement.

Surprisingly, training a model only on CIFAR10 still does quite well on ships, planes & trucks!
18 Dec 19
“Fudan University, a prestigious Chinese university known for its liberal atmosphere, recently deleted ‘freedom of thought’ from its charter and added paragraphs pledging loyalty to the Chinese Communist Party, further eroding academic freedom in China.”

qz.com/1770693/chinas…
Students and academics at Fudan University appear to have protested this change. But the video of the group chanting lines from the old charter, shared on social media (domestic WeChat), has since been removed:
“The changes provoked a substantial reaction on Weibo. A professor at Fudan U’s foreign languages school, said on Weibo that the amendment is against the university’s regulations, as no discussions occurred at staff meetings. That post was later deleted.”

bloomberg.com/news/articles/…
13 Nov 19
Pizza delivery in Hong Kong
Another way to lay bricks for better effectiveness against larger police vehicles is to “Stonehenge” them:
It has also been reported that dishwashing liquid and bags of marbles have been used by protesters to slow down the riot police 💡
