They are SO incredibly manipulative it makes my blood boil. We see what you've been hyping up, who you put your $10B in, and what they say. Why do the people who create your product talk about how people are stochastic parrots as well?
And the protégés you're putting $10B into keep making this point, then continue to talk about colonizing the cosmos, utopia, and how in the next 5 years their tool will read legal documents and medical documents and understand them and such?
"In the next five years, computer programs that can think will read legal documents and give medical advice...in the decades after that, they will do almost everything, including making new scientific discoveries that will expand our concept of everything” moores.samaltman.com
They mislead the public INTENTIONALLY and then tell people oh look it's just a screen, not like we've been talking about it as "AGI" or "AI systems that are generally smarter than humans" or a god-like machine that can do anything for anyone everywhere! No dummy, it's just a screen!
And the SAME chairman endorses this cult manifesto full of fantastical things, talking about these systems as so close to something we've never seen that's going to solve all problems and bring utopia.
That's how Microsoft continues to get away with this, because not enough people are paying attention. You have them endorsing these ridiculous things and giving BILLIONS to the org that is pushing such feverish hype, then they have the audacity to be like c'mon, it's just a thing.
I truly can't stand watching him speak. "A lifetime of science fiction." Also YOU. We wrote about this exact thing in our paper. Read section 5. Makes me so angry. So ANGRY. dl.acm.org/doi/abs/10.114…
There is NO CREATURE. OMG I'm losing my mind here.
"We were able to fix the problem in 24 hours." So far everyone here is Microsoft. There's a story about a Microsoft product and everyone so far is a Microsoft executive. The media is supposed to put a check to power.
Cue race with China which Brad Smith brings up. Like clockwork.
This piece is missing @emilymbender's voice who wrote a whole peer reviewed paper on why this exact thing shouldn't be done, i.e., why an LLM based chatbot shouldn't be used as a search engine. dl.acm.org/doi/abs/10.114…
"After a conference on AI at the Pontifical Academy Of Science in Rome, discussing with some friends (among them Aimee van Wynsberghe), we argued that the first and foremost AI bias is its name." by Stefano Quintarelli.
"It induces analogies that have limited adherence to reality and it generates infinite speculations (some of them causing excessive expectations and fears)...Because of this misconception, we proposed we should drop the usage of the term “Artificial Intelligence...”
"and adopt a more appropriate and scoped-limited terminology for these technologies which better describe what these technologies are: Systematic Approaches to Learning Algorithms and Machine Inferences."
These are from a public discord for the LAION project. Take a look at the discussion by professors based out of the esteemed MILA, like @irinarish.
1) What "therapy data" is this LLM you're talking about fine-tuned on?
2) You see this hugely unethical thing and are like yes, we need to do this, and stability.ai needs to help us with PR and legal issues.
3) Ahh yes the "woke" crowd that "attacked Yann LeCun" "triggered by BLM." I am so happy that I don't have to be anywhere near MILA. How can any Black person survive? I've heard from a few who've been telling me how awful it is. No wonder with this shit.
"Musk’s Twitter is simply a further manifestation of how self-regulation by tech companies will never work, & it highlights the need for genuine oversight...Things have to change."
"The Algorithmic Accountability Act, the Platform Accountability & Transparency Act,...,the [EU]'s Digital Services & AI Acts...demonstrate how legislation could create a pathway for external parties to access source code & data to ensure compliance with antibias requirements."
"Companies would have to statistically prove that their algorithms are not harmful, in some cases allowing individuals from outside their companies an unprecedented level of access to conduct source-code audits, similar to the work my team was doing at Twitter."
"Many of those left behind are glad for the November cease fire. But survivors are living surrounded by the dead.
“We want the world to hear what happened,” said a woman who reported losing seven close relatives in the massacre near Adwa. washingtonpost.com/world/2023/02/…
"In Kumro, between 35 and 40 villagers were killed, one woman said. “They were hiding, but the old ones stayed in their houses. They thought they would be safe,” she said. Her 11-year-old son found his grandfather’s body, she said."
The soldiers had burned the thatch that covered the stone houses, the fodder for their livestock, even the beehives, she said.
In Rahiya, Eritrean troops killed a teacher named Letemichael Fisseha Abebe with her 7-year-old son and another aged 20 months, a relative said."
"Google, Microsoft, and other leading AI companies in the US are currently racing to build and launch products built on large language models,...Abbott says Lelapa is not interested in that technology, which requires huge volumes of training data and expensive computing power."
"For most African languages, there isn’t enough training data to take that approach, she says. And AI models have to be nimble enough to deploy from a smartphone or Raspberry Pi with an inconsistent web connection."
"Eritrean asylum-seekers were arrested at a protest against their country’s dictatorship & its supporters here...questions have been raised about whether...British authorities are doing enough to protect...asylum-seekers from the ‘long arm’ of the regime" morningstaronline.co.uk/article/f/i-th…
"“At any point, you can be forcibly removed from your life, everything you know … and forced to become a soldier,” he says,.... Growing up, Aaron saw his friends, neighbours & relatives disappear into Eritrea’s system of indefinite national conscription."
"Many young Eritreans like Aaron risk all to escape. Fears of conscription have caused Eritrea to become one of the world’s biggest creators of refugees per head of population, with more than 10 per cent of the country’s total population thought to be living in exile."