Not at all how morality, moral arguments, or moral progress *actually* work, though.
...
Soon, I will summarize all the evidence against.
And set aside some “counterarguments” that I think are mostly just kicking up dust, making it harder to see the legitimate, and otherwise straightforward, fallacies embedded in moral realism.
(That’s obviously true. For anything. No?)
(There are *many* problems w/ moral realism, in addition to the lack of justification for the basic axioms. E.g. the logical contortions. And unavoidable inconsistencies.)
(That would be too easy. No. It’s not just *some* irrationality muddying the waters. It is, rather, thick mud, through and through.)
(By “real” I just mean having the logical properties described above. The sense of “real” implicit in the way philosophers discuss morality, and the way lay people debate it.)
-There aren’t actually “truths we find.” Only premises we assert.
-Our morals aren’t and can’t be made consistent. Or justified.
-We don’t actually convince others through reason.
-There is a much more parsimonious explanation for how moral intuitions, moral arguments, and moral progress work.
-That account is *very* distinct from the moral realism account. Even if it has superficial similarities (both “involve” logic).
Do we have *any* a priori reason to expect morality to be logical? To presume moral “truths” exist for discovery?
(Other than the fact that people talk this way. And that our philosophical tradition acts as if it’s so.)
I don’t see any.
We might argue about these things and try to persuade others.
Sure.
No, these concepts are not relevant.
(Other than the logic that describes what we evolved to like? Or where the cards have fallen?)
Why would morality be any different?
You think a mathematician, or an algorithm that checks formal proofs, would find more flaws in their logical arguments than ours? Willing to bet on that?
Of course not. That’s not what logic does.
I mean I am glad we share this premise. But liking something, or being glad others think something, isn’t exactly a logical proof.
And proclaim (thankfully).
And act as if it’s true (thankfully).
(Why? That’s another, interesting and important, discussion).
We think people should be punished based on their behavior, not based on god’s behavior.
But we also think it’s wrong to let a drunk driver off lightly if he ran over a kid. Even if the difference between kid and no kid is entirely up to god.
An existential problem?
Not at all.
(Maybe a legal one. If you have to decide which intuition to apply. But not an existential one.)
That’s not how evolution works.
No. Of course not.
Cause there is no way to justify this logically.
It’s just an intuition.
Yes, we still intuit that we should have punishment (obviously).
Yeah, if you try and reconcile the two, without recourse to evolved intuitions about justice, you are gonna hurt your back.
(Which many have done.)
Yeah, we all intuit that’s true for everyone.
So utilitarianism *seems* like a good call.
Oh, what’s that, we also intuit things like rights? Hmm, ok, let’s just add those in. (Which is what Mill did!)
Those are all conceptual problems.
W/ enough pages and centuries of scholarship, we can sweep these conflicting intuitions under the rug!
Better that than admitting conflicting moral intuitions!
Or just appear to be so?
What about when you justify why you eat meat? Or buy lattes instead of giving that $3 to buy mosquito nets or water filters for those actually in need in Africa? Are you being logical then?
Or bending over backwards?
There is a more parsimonious account for where our moral intuitions come from.
And why they have these illogical features.
And why we nevertheless justify them and try to reason about them. And for what drives moral “progress”.
But no reason evolution (again, biological or cultural) would hand us morals that can be made logically consistent. Or even reasonable.
Is it logical that peahens like long tails? Or that some viruses get their kicks out of destroying your immune system?
(But that’s not what philosophers, or lay people, mean by moral reasoning.)
(Why? Presumably cause then we can sound less selfish and more principled, especially when our justifications are in line with desirable principles, like selflessness and treating others the same as us.)
It’s also, unlike the moral realism story, evolutionarily sensible.
(We would never evolve to have morals dictated by “logic,” we would evolve to have moral intuitions *justified* by (contorted) logic!)
How?
The other story says what matters is *appearance* of logical consistency.
And consistency with what? Not truth. But principles that are *socially desirable.*
That’s where things like plausible deniability come in.
Which, as a matter of actual fact, plays a huge role in moral intuitions, moral arguments, and moral progress.
Unlike *actual* consistency. Plausible consistency.
(As well as prominent explanations for why we evolved to distinguish omissions from commissions. And means from byproducts. See Kurzban and DeScioli.)
But are obvious implications of the evolutionary story.
The evolutionary story, in contrast, says which principle will depend on what’s valued/socially enforced in your community and context.
You tell me.
Or the French and British were just that much smarter or better educated than the Germans and Italians?
(Like whether they are powerful and control others.)
The other story says this shouldn’t matter.
Think about the famous slave ship image that helped end the slave trade in the British Empire. And not-so-coincidentally ended the ability to plausibly deny knowing how bad slave ships were.
The other story says debates will be won when plausible deniability is preempted, and the principle at play is made socially undeniable.
But is Singer compelling because he taught you something you didn’t know? Or because he removed plausible deniability, and left no wiggle room for you to maintain your selfish or discriminatory behavior?
You be the judge.
The other story realizes that’s just a facade. Perhaps a deeply internalized one, but just a facade.
Which do you think fits the facts better?
It doesn’t work the way “reason” dictates.
/eom