Imagine you give a prize for achievement, and you also have some values you care about promoting.

It seems like it wouldn't be optimal to give the prize based exclusively on achievement all the time. The prize builds its reputation by going to the strongest winners,

1/
which gives you latitude to promote values you care about some of the time, even when that "distorts" the allocation relative to what would be chosen by someone who did not want to promote the same values.

2/
For the persuasion story to work, the uncontroversial measure of quality must be better observed by you, the prize-giver, than by the public. Only then can you shift beliefs about some winners' quality relative to the hypothetical "unbiased prize."

3/
So only a credible prize whose committee gets honest advice from knowledgeable experts can be used "politically" at all.

But if those conditions are met, the prize can be used every now and then to give a huge boost to something you care about, and there's a tradeoff

4/
between how often you do that and how much you build up the reputation that gives the boost its fuel.

5/5
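
One back-of-the-envelope way to write down the tradeoff in 4/ (purely my own sketch with made-up notation, not a model from any particular paper): let p be the share of awards given on pure merit, R(p) the prize's reputation (increasing in p), and v the value of the boost to a cause the prize-giver wants to promote. The prize-giver is then roughly solving

\[
  \max_{p \in [0,1]} \ (1 - p)\, v\, R(p), \qquad R'(p) > 0,
\]

where p = 1 never cashes in the reputation and p near 0 burns the credibility that gives the boost its fuel, so the interesting solutions are interior: the "every now and then" of 4/.
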
PS/ Please feel free to reply with your favorite theory papers that analyze this properly.

More from @ben_golub

1 Oct
A few observations to add, in the form of common mistakes.

A. "The audience as computer" aka "important things need to be written to memory once." In fact, nobody learns anything except the simplest things the first time. Important points should persist across a few slides.

1/
E.g. if there is an important, longish Bellman equation, DON'T just put it up once and move on. Have it hanging around on several slides.

Blackboard talks are better because of persistence. Most speakers using slides ignore that things take time to absorb.

2/
B. "Literature details before content."

It's amazing how often you hear, at minute 7, "but our eigenvalue condition will actually be in terms of the Laplacian matrix, rather than the usual adjacency matrix as in XYZ (2014)"
before anyone can follow this.

3/
17 Sep
Hard to convey my excitement at seeing an argument by @ojblanchard1 for a networks perspective on three seemingly distinct kinds of fragility.

This is something that I have worked on for a few years now, and I hope that network theory can really help.

1/
I think it's right that there are commonalities between the fragility of

(i) production when institutions are shocked;
(ii) financial systems when asset values are shocked;
(iii) supply when shipping technology is shocked.

2/
One perspective that network theorists have been especially interested in is that there is something qualitative about some collapses: it's not just a matter of some things working worse, but of the whole system entering a crisis.

3/
2 Sep
A real-world high stakes game of experimentation with externalities:

Last night at 10, my car was at the front of several miles of cars on the Garden State Parkway all stuck behind a segment of road 3-5 feet underwater. You could try to drive through if you wanted,
but most people were dissuaded by the half dozen stalled/flooded cars in the water.

For about three hours, one vehicle every 15 minutes or so would go for it. Whether it succeeded depended on its type, the path it took, and the water depth.
The interesting thing is that a failed experiment (trying and having your car stall in 3 feet of water) has considerable private cost: deeply flooded cars are totaled, and the cost of even a lucky recovery in such a case is more than a few thousand dollars.
29 Aug
Sometimes faculty complain about the stubborn Ph.D. student who seems unaffected by advice. Talent and energy are risk factors for this disease, and, worse, it is closely related to the personality traits of many successful academics.

A few random thoughts.

1/
What "bad stubborn" looks like from the advisor perspective is that you thoughtfully engage with the work, repeatedly say something (that you feel is) REALLY IMPORTANT that should affect the project, and perceive it not to be affecting the project or the student's thought.

2/
A friend wishes they could tell students one cheat code for success. When faculty say, "This seems like a question you can answer in your project and people would really care about the answer," *actually try to do that*, or at least have serious conversations about it.

3/
14 Jun
An applied mathematician I know thinks it's hilarious that economists care about formal rigor so much more than, e.g., applied physicists do.

Rigor, he says, is valuable, but other inputs currently seem to have a much higher return for advancing economic theory.

1/
For example, if our theorizing about long-run outcomes of social learning falls short of its potential, it's not because we forgot to check a subtle condition in applying the martingale convergence theorem in our model of the agents' Bayesian behavior.

2/
"His people" (applied mathematicians, applied physicists) would not worry about that. Instead, they would quickly work through much more "theory," but without great rigor, and use the results to refine the collective decision about how to continue.

3/
10 Jun
A few simple facts that some people find surprising the first time they hear them.

Imagine $100 is behind door A or B and I give you independent hints about which door. Each hint says either A or B but is right only 55% of the time.

First hint is worth $5, second hint is worth... $0!
Why? Because the second hint never makes you *want* to change your decision. (Think about the four possible hint combinations.)
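
A quick brute-force check of those four combinations (a sketch in Python; the 55% accuracy and the $100 prize are the numbers from the example, everything else is just illustration):

from itertools import product

PRIZE = 100.0
Q = 0.55            # each hint points to the correct door with probability 0.55
DOORS = ("A", "B")

def hint_prob(hints, door):
    """Probability of seeing this hint sequence if the prize is behind `door`."""
    p = 1.0
    for h in hints:
        p *= Q if h == door else (1 - Q)
    return p

def expected_winnings(n_hints):
    """Expected prize when you see n_hints independent hints, then pick the
    door with the higher posterior probability (ties broken toward A)."""
    total = 0.0
    for true_door in DOORS:                            # prize equally likely behind A or B
        for hints in product(DOORS, repeat=n_hints):   # every possible hint combination
            p_hints = hint_prob(hints, true_door)
            # posterior that the prize is behind A (Bayes' rule; the 1/2 priors cancel)
            post_a = hint_prob(hints, "A") / (hint_prob(hints, "A") + hint_prob(hints, "B"))
            choice = "A" if post_a >= 0.5 else "B"
            if choice == true_door:
                total += 0.5 * p_hints * PRIZE
    return total

for n in (0, 1, 2):
    print(f"{n} hint(s): ${expected_winnings(n):.2f}")
# 0 hint(s): $50.00
# 1 hint(s): $55.00
# 2 hint(s): $55.00

With zero hints you expect $50, with one hint $55, and with two hints still $55: the second (unbiased) hint never flips the optimal choice.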

This is a key idea behind a beautiful paper by Meg Meyer, here:

2/
If you want the second hint to be useful, you need to make it biased, "favoring" the leading option, so that if it comes back a surprising negative against the leader, you might actually change your decision.

Meyer uses this to derive implications about organizations.

3/
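
To make the "biased second hint" point concrete, here is a continuation of the sketch above with illustrative numbers that are mine, not Meyer's: suppose the second hint confirms the leading door for sure when the leader is right, and confirms it 80% of the time even when it's wrong. A contrary report is then rare but decisive, and the second hint regains value:

# Continuing the sketch above: after one unbiased, 55%-accurate hint, the
# "leader" is the hinted door, which holds the prize with probability 0.55.
# Now make the second hint biased toward the leader (illustrative numbers,
# not from Meyer's paper): it confirms the leader with probability 1 when
# the leader is right, and with probability 0.8 even when the leader is wrong.

PRIZE = 100.0
P_LEADER_RIGHT = 0.55
P_CONFIRM_IF_RIGHT = 1.0
P_CONFIRM_IF_WRONG = 0.8

# Ignore the second hint: just stick with the leader.
value_without = P_LEADER_RIGHT * PRIZE

# Use the biased second hint: stick on "confirm", switch on a contrary report.
p_confirm = (P_LEADER_RIGHT * P_CONFIRM_IF_RIGHT
             + (1 - P_LEADER_RIGHT) * P_CONFIRM_IF_WRONG)
p_right_given_confirm = P_LEADER_RIGHT * P_CONFIRM_IF_RIGHT / p_confirm
# A contrary report can only happen when the leader is wrong, so switching then wins for sure.
value_with = (p_confirm * p_right_given_confirm + (1 - p_confirm) * 1.0) * PRIZE

print(f"stick with the leader:        ${value_without:.2f}")              # $55.00
print(f"listen to biased second hint: ${value_with:.2f}")                 # $64.00
print(f"value of the biased hint:     ${value_with - value_without:.2f}") # $9.00

Sticking with the leader is worth $55; following the biased second hint is worth $64, so this deliberately skewed second signal is worth about $9.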