@oscredwin makes the point that this poll conflates two questions: “do most people spend more time deliberating over larger amounts of money than over smaller amounts?” and “do people who have ever allocated >$1M spend less time deliberating over $100k than those who haven’t?”
It’s arguably rational to spend more time deliberating over more money, all else equal, and it’s also arguably rational for your research-time per dollar to decline as your hourly wage increases or as the amount of money you have to allocate increases.
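Here’s a toy model of that argument (my own illustration; the function, parameters, and numbers are all assumptions, not data): suppose research can improve the expected outcome by up to a fixed fraction of the amount at stake, with diminishing returns, and each hour of research costs your hourly wage. Then the optimal number of research hours grows with the stake but shrinks as your wage rises.

```python
import math

def optimal_research_hours(amount, wage, improvement=0.01, k=5.0):
    """Hypothetical toy model: research can improve the expected outcome
    by up to `improvement * amount`, with diminishing returns over a
    timescale of `k` hours, while each hour of research costs `wage`.
    Maximizing improvement*amount*(1 - exp(-h/k)) - wage*h gives
    h* = k * ln(improvement*amount / (wage*k)), floored at zero."""
    h = k * math.log(improvement * amount / (wage * k))
    return max(h, 0.0)

for amount in (1_000, 100_000, 10_000_000):
    for wage in (30, 300):
        h = optimal_research_hours(amount, wage)
        print(f"${amount:>12,} at ${wage}/hr -> {h:5.1f} hours of research")
```

In that toy model, hours spent rise with the amount, hours per dollar fall, and a higher wage pushes everything down, so both patterns could be individually rational.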
What I’d really like to know is whether my subjective impression is correct that as people gain in wealth/authority/seniority, they have a higher “activation energy” to go check object-level facts themselves. More than you’d expect just from their time being valuable.
Jeff Bezos is a famous counterexample, with his rule that, before every meeting, people write up a brief document, share it, and read it.
But the fact that Bezos had to make a rule means that other Amazon managers *were* coming unprepared to meetings.
Pure intuition here, but I suspect that once you start to think “my time is valuable”, you typically stop doing whole categories of things, even when some instances of the category don’t take long. You just develop heuristics like “I don’t have time to read documents.”
It’s hard to distinguish “this person is making decisions fast or relying on social proof because that’s actually optimal for them” vs “this person is making decisions fast or relying on social proof because they’re lazy.”
You’d want some kind of evidence like “executives and investors who do more deliberation/fact-checking/due-diligence get better results.”
This is definitely true on the very low end. Individual stock market investors (e.g. day traders) have been shown to get worse returns than institutional investors, to have more bias, and to gather less information before investing.
But that just tells us “being an inexperienced hobbyist leads to worse investment returns than being a professional who works on investing full time and has access to expensive information resources.” It doesn’t tell us whether *among* professionals, more deliberation is better.
Perhaps more relevant is the evidence from the Good Judgment Project that superforecasters seek more information than non-superforecasters.
That’s evidence that the *best* predictions are made by “infovores.” But there isn’t real money involved, so maybe it doesn’t apply IRL.
I’d really like data on the decision behavior of investors or CEOs, and how it correlates with ROI. Haven’t found any so far.
I *have* found some negative results: hedge fund manager performance doesn’t correlate much with education, which might be a proxy for “infovoreness”.
I know @robinhanson has pulled a lot of evidence that firms are reluctant to pay for information; Robin do you know of evidence that those firms which *do* pay for info perform better?
I think I still believe one of these papers more than I’d believe a “take” in the news or on Twitter, but less than I’d believe the opinion of a 60-year-old who has a lot of management experience.
We’ll see if the data on investors or investment analysts is better.
I did find one bizarre paper that said the rank of an investment analysis firm was *negatively* correlated with how frequently they reported using the company’s library. (Back before the Internet.)
If you’ve ever spent $1000-$10,000 in one place (either your own money or your company’s money), how much time do you usually spend researching the decision?
If you’ve ever spent $10,000-$100,000 in one place (either your own money or your company’s money), how much time do you usually spend researching the decision?
If you’ve ever spent $100,000-$1,000,000 in one place (either your own money or your company’s money), how much time do you usually spend researching the decision?
With an endorsement from Derek Lowe, my opinion is superfluous, but yes, this study looks legit. Semaglutide causes weight loss in a large, rigorous controlled trial. 🧵follows.
I could not find a single research study showing that any of the peptides in the RADVAC white paper inhibit SARS-CoV-2 infection in cells, let alone in animals or humans.
All those peptides come from in silico studies: “the computer said they ought to bind to various viral proteins.” Plus a lot of theory/mechanism argument.
You have the legal, and IMO the moral, right to experiment on yourself. But I don’t think this is very likely to work.
The most irritating thing about smartphones, to me, is that it’s more difficult to switch from one webpage to another on mobile than in a desktop browser. It makes it hard to do something like “take notes including links from a variety of websites.”
The second most annoying thing is that I can’t log into sites on my phone if their passwords are the long forgettable strings of letters and numbers that my laptop password manager remembers for me.
The third most annoying thing is that I’m slightly slower at typing on a phone than on a keyboard.
The trend in deep learning for a lot of applications, for most of the past decade, seems to have been “you get out what you put in” — performance gains are proportional to increases in computing power.
I haven’t found anything I’m confident is an exception to that trend, in the direction of “performance grows faster than compute”. I’d be willing to bet that there aren’t any.
As long as that continues, it seems to me that the main question is how fast the cost of flops drops and how long it will continue to be profitable to keep buying more & better hardware.
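A back-of-envelope sketch of how those two factors combine (the halving/doubling times below are illustrative assumptions, not measured trends):

```python
def effective_compute_multiplier(years, price_halving_years=2.5,
                                 budget_doubling_years=1.5):
    """Factor by which affordable training compute grows over `years`,
    assuming FLOPs-per-dollar and spending both keep compounding.
    Both rate parameters are hypothetical knobs to play with."""
    flops_per_dollar = 2 ** (years / price_halving_years)
    dollars_spent = 2 ** (years / budget_doubling_years)
    return flops_per_dollar * dollars_spent

for years in (1, 5, 10):
    print(f"{years:>2} yr: ~{effective_compute_multiplier(years):,.0f}x compute")
```

If performance really does track compute, then performance follows whatever that product does, and the binding question is how long both exponents can keep running before the spending stops paying for itself.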