Suppose you were offered the following opportunity: using highly advanced, but completely safe, psychological methods, your values and personality could be permanently altered.
The changes would be minor enough that you would not simply be overwritten, your mind replaced with a different person's; your parents would still recognize you as you. But they would be big enough that you would make different life choices and have a different life trajectory.
All of the changes would be in the direction generally considered "good": you'd become happier, more diligent, more conscientious, more prosocial, less neurotic.
Your preferences and interests would change somewhat: if you like history, you might come to like math instead. The people you vibe best with would also change somewhat, as would the people you're romantically attracted to (assume you're single). Your sense of humor might change.
All of those changes would come from amplifying the new preferences rather than muting the old ones. It's not that you stop enjoying history; it's more that you get _really_ into math, such that history just doesn't seem as interesting as it used to.
None of these changes would make you "worse off" as assessed on an absolute scale (i.e. for every shift from preferring X to preferring Y, a neutral observer would judge that someone who likes Y is generally better off than, or about as well off as, someone who likes X).
Would you take someone up on this opportunity?
If someone were offering to pay you to do this, how much would you need to be paid to make it worth it?
• • •
I just noticed that one of the things that I get from fiction is a kind of vicarious...pride? ...camaraderie? from competent people trusting each other.
For instance, in urban fantasy, there's something that feels deeply Good about the moments when the wizard and the cop work together to get the job done.
Neither one fully understands the other's work, or the constraints that they work under, but they _do_ trust each other's expertise and each other's moral commitment.
But I think that is basically how human impulse control works: if a person chronically makes "bad" short-term-oriented choices, it may very well be because they _correctly_ don't depend on themselves to be able to execute on a long-term strategy.
A realization that is probably obvious to people savvier than me:
For most people, a lot of behavior is motivated not by the merits of the behavior itself, but by the fact that it provides a template for social engagement.
I'm in Las Vegas for a conference today. I was wandering around the casino in which the conference is being hosted, and watching the people.
I was poking around in the gift shop and saw two women looking through the clothes.
I'm not entirely sure what cognitive sequence led me to that distinction, but I think it might have been (in part) downstream of editing my current date-me page (elityre.com/date.html).
This section felt kind of grammatically weird to me. And I think it was because I was sort of switching back and forth between talking about the kind of relationship and the kind of person.
Looking at it now, it doesn't feel as awkward, though. So dunno.
I think part of it was that I was a little bit more tapped into the STATE of what I want, instead of working with abstracted descriptors.