Ferenc Huszár
Mar 2, 2020
I’m happy to reveal that I will be joining the Cambridge CS Department (@Cambridge_CL) later this year, working with @lawrennd and @carlhenrikek to build a new ML group.

This should be an awesome place to do an ML PhD in the coming years 😉!
I’m looking forward to being part of the Cambridge ML community again (at a different department this time). This of course means I’ll be stepping back from day-to-day work at Twitter, but I’m sure I’ll find ways to continue working with the great team there.
see you all after the #aiwinter

More from @fhuszar

Sep 21, 2020
Over the weekend reports of racial/gender bias in Twitter's AI-based image cropping have started blowing up. I wanted to add some context from my perspective as an ex-employee and as a contributor to the research the product is based on.
First: the community is right to worry and talk about these issues, and to hold Twitter accountable for ensuring its products work for everyone. If indeed its product fails in a racially or gender-biased way, Twitter should learn from that and fix it, and I'm confident they will.
However, I have seen people jump to conclusions about negligence or oversight by the team who worked on this, or jump to generalisations on the basis of a handful of examples, neither of which I think is constructive. I wanted to add some context.
Read 10 tweets
Aug 7, 2020
I enjoyed this podcast with @jack, touching on "THE ALGORITHMS", transparency, political bias and incentives.
This is not to say I agree with all of Jack's answers and views. Some are good, others underwhelming. The good bits are the admission of mistakes (necessary but not sufficient for trust) and the promise of transparency to address political or algorithmic bias.
The notion that more people participating in Twitter will help solve problems narrowly observed on US Twitter, rather than result in exporting US Twitter's problems or creating new ones in countries developers don't understand, strikes me as naive optimism.
Read 5 tweets
Jul 3, 2020
After 4 years at Twitter, and more with the Magic Pony team, the time has come for me to move on and make it official by tweeting a picture of my laptop and badge, COVID-style, in a box.
A ton of stuff happened in these 4 years. We joined weeks before the Brexit referendum, and months before the 2016 US election. Twitter has changed beyond recognition, growing up, embracing our role in serving public discourse.
A little over two years ago @jack retweeted this tweet, sending a not-so-subtle signal that it may be a good idea to move on. And indeed, I will be doing a lot more teaching now at Cambridge, and may even have more headspace to write new posts.
Read 11 tweets
Jun 23, 2020
In my first ever subtweet, let me say this:
If you're well known and tweet something controversial that may send the wrong message or cause upset, you shouldn't just try to clarify your stance. Admit you were wrong and apologise. Get someone to read your tweets before sending them.
Ok, scrap the subtweet, I'm talking of course about Yann LeCun's tweets. As scientists, we are used to arguing and defending viewpoints. But even if there's a technical/philosophical debate to be had about dataset vs model bias, it's not about that anymore.
I think I can see what he was trying to say, and he makes some valid points that, perhaps, could be powerful in a debate. But this is not the way or place to keep pushing this. People are not disappointed because they don't get it; an explanation is not what they need.
Read 7 tweets
Jun 8, 2020
Early career (postdoc, PI) researchers working on applying ML in a variety of scientific disciplines: there are some great 5-year fellowships in Cambridge you should take a good look at:
jobs.cam.ac.uk/job/25959/
These positions are hosted at the CS Department (where some great ML momentum is building) and involve collaboration with outstanding scientists across many departments in Cambridge, so the scope for impact is great.
For those of you working on or interested in climate science or the environment, there's an extra boost: the same department hosts a PhD training centre on AI for Environmental Risks:
ai4er-cdt.esc.cam.ac.uk
Read 4 tweets
Jun 6, 2020
It’s telling of my privilege that I'm only outraged when seeing videos on Twitter of behaviours many people experience first hand, and that my primary concern is what to tweet in reaction so it doesn't come across as empty virtue signalling. So, as I should have done earlier, I'm tweeting.
What this past week exposed about attitudes and behaviours in the US is unbelievable and eye-opening. What we see from police, counter-protestors and politicians is simply barbaric and backwards, so at odds with the US’s self-image as a country that leads the world on modern values.
I never imagined seeing whole groups of police treating citizens like this, even in the context of protests. Or that it would be OK for some counter-protesters to walk around intimidating people with guns, let alone politicians posing with assault weapons in campaign ads. It's not normal.
Read 5 tweets
