Thread by Melanie Mitchell, 11 tweets
A thought-provoking article by @GaryMarcus . Accordingly, I had some thoughts! Longish thread ahead. /1 wired.com/story/deepmind…
First thought: Why frame this as "DeepMind lost $572 million" rather than "Google made a very large, long-term investment in one of its research arms, DeepMind"? /2
DeepMind's mission is to "solve intelligence" -- a statement I find nonsensical BTW -- but whatever you want to call it, AI is a very hard, long-term research problem and it's great for companies to fund basic, long-term research that doesn't have immediate payoff. /3
Second thought: Marcus claims that deep reinforcement learning is "a kind of turbocharged memorization". That's an interesting but still too vague hypothesis that I think needs to be made more formal and testable. /4
Third thought: Marcus: "DeepMind has yet to find any large-scale commercial application of deep reinforcement learning." Why the focus on this kind of short-term commercial application? Was that kind of short-term payoff what Google had in mind when it acquired DeepMind? /5
Fourth thought: Marcus compares DeepMind with IBM Watson. But IBM made huge promises & put out a lot of hype on how their system would very soon revolutionize healthcare, law, etc. Has DeepMind ever made promises about how it would commercialize Deep RL in the short-term? /6
Fifth thought: Marcus correctly warns of the dangers of "overpromising". But is this fair, with respect to DeepMind? Have they actually overpromised with respect to their technology? (I did see a quote from Shane Legg a while ago personally predicting AGI by 2020s 🙂) /7
Final thought: I completely agree that AI research should pay more attention to cognitive science.... /8
But I'm not sure about the glib comparison Marcus (and a lot of other people) makes: "How do children acquire language and come to understand the world, using less power and data than current AI systems do?" /9
I suspect that it's really hard to assess the amount of "power" and "data" a child uses to acquire language, if such an assessment even makes sense. I'd love to see a very careful discussion of this claim. /10
As I said, a thought-provoking article, which is the best kind! Thanks, @GaryMarcus ! Looking forward to your book.