Maxim Raginsky
father, academic, raconteur, aging wannabe hipster
Jan 29, 2021 14 tweets 3 min read
In this thread, I will argue that the conventional wisdom that Shannon's information theory is all about syntax and not about semantics stems from a superficial reading. On the contrary, even his 1948 BSTJ paper is already concerned with syntax, semantics, *and* pragmatics. 1/14

Of course, it does not help matters that Shannon himself states on the first page of his paper that "semantic aspects of communication are irrelevant to the engineering problem." But you have to read the entire paper, not just the first page. 2/14
Jan 27, 2021 4 tweets 1 min read
I am reading about the Piaget-Chomsky debate at Royaumont, and it is about so much more than "just" the faculty of language.

M. Piattelli-Palmarini, "How hard is the 'hard core' of a scientific theory?" (parts 1 and 2):

massimo.sbs.arizona.edu/sites/massimo.…

massimo.sbs.arizona.edu/sites/massimo.…
Oct 9, 2020 8 tweets 3 min read
Currently reading this excellent paper by Brian Cantwell Smith.

Even though it is a masterful takedown of Lenat and Feigenbaum's CYC project, full of acerbic wit and withering criticism, it's as relevant today in the context of all the magical thinking surrounding GPT-3:
Oct 7, 2020 4 tweets 1 min read
I’ll just leave this here: [image]

Here we are: [image]
May 26, 2020 19 tweets 2 min read
late to the game, but what the hell:

1 like = 1 favorite book

go

1. The Feynman Lectures on Physics
2. Red Plenty by Francis Spufford
3. Against the Gods: The Remarkable Story of Risk by Peter Bernstein
4. Beetle in an Anthill (Жук в муравейнике) by the Strugatsky brothers
5. The Time Wanderers (Волны гасят ветер), same
May 27, 2019 5 tweets 1 min read
"Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit" with Belinda Tzen arxiv.org/abs/1905.09883 1/5 Neural SDEs are simply deep latent Gaussian models in the limit of infinitely many layers, with appropriate scaling of the layer-to-layer noise. In our earlier work (to appear in COLT 2019), we have analyzed the expressiveness of such diffusion-based generative models. 2/5