Who Needs Morals, Anyway?
The question theists ask most often when debating morality is, “But where do you get your morals?” Of course, if the theist answers, “I get my morality from the Vedas/Quran/Bible/Dianetics”, that doesn’t help, since it just raises the question Matt Dillahunty posed at his debate at UMBC: suppose some being comes along and says, “I am a god. Here’s a book with my moral system.” So what? How do we decide whether the system in the book is any good?
I thought I’d step back for a moment and ask, what if there were no morals?
Maybe there are no rules, or no one to give them. Maybe there are rules, but nobody knows them. Maybe the rules are known, but they’re ignored, and there is no mechanism for enforcing them, not even a twinge of guilt. What then?
I don’t think anyone has any trouble imagining this sort of world: theft and lying are rampant, and people will kill each other over a can of beans without feeling remorse. In fact, there wouldn’t be any cans of beans, because the industry required to produce them couldn’t exist without some kind of stable society and the ability to form long-term associations. It’s a world where you’re constantly looking over your shoulder, lest your own child stab you in the back.
Okay, so this vision may not be accurate. Maybe some combination of game theory and psychology can show that there could be amoral societies where life doesn’t suck as much as in the one I described.
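For a taste of what that kind of game-theoretic modelling looks like, here is a minimal sketch in Python: an iterated prisoner’s dilemma using the standard textbook payoff values. The strategies, payoff numbers, and function names are my own illustrative choices, not results from any particular study; the point is only that you can simulate “societies” with and without reciprocal norms and compare how their members fare.

```python
# Iterated prisoner's dilemma: a toy model of trust vs. betrayal.
# "C" = cooperate, "D" = defect. Payoffs are the conventional
# textbook values (illustrative, not taken from any real data).

PAYOFF = {
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # sucker's payoff vs. temptation to cheat
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # mutual defection: the "no morals" world
}

def always_defect(my_history, their_history):
    """Never trust anyone; always betray."""
    return "D"

def tit_for_tat(my_history, their_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return their_history[-1] if their_history else "C"

def play(strategy_a, strategy_b, rounds=100):
    """Play two strategies against each other and return their total scores."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    print("tit-for-tat vs tit-for-tat:", play(tit_for_tat, tit_for_tat))
    print("defect vs defect:          ", play(always_defect, always_defect))
    print("tit-for-tat vs defect:     ", play(tit_for_tat, always_defect))
```

In this toy setup, two reciprocators rack up 300 points each over 100 rounds, while two habitual defectors limp along at 100 apiece; whether a richer model could make the all-defector world genuinely livable is exactly the kind of question game theorists get to argue about.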
But I think it’s safe to say that the vision of a world without morals that I described above, or the one that you imagined, represents our fear of what would happen without some sense of morality.
If you’re with me so far, then presumably you’ll agree that morality is a way of avoiding certain Bad Things: living in fear, being killed or seeing your loved ones killed, and so on; and also a way of getting certain Good Things: establishing trust, assuring some level of stability from day to day, and so forth.
We may not agree on everything. You might want security cameras on every street corner, to make the risk of being robbed as small as possible, while I might feel that not being watched all the time is worth the occasional mugging. But if we can agree in broad outline that certain outcomes (like being killed) are bad and others (like knowing where our next meal is coming from) are good, then morality reduces to an engineering problem.
That is, it’s simply(!) a matter of figuring out what kind of world we want to live in, what rules will allow us to get along, and how to get there.
Obviously, this is a thorny problem. But nobody said this was going to be easy. Well, nobody who wasn’t trying to sell you something. As is the case with every engineering project ever, not only are there conflicting requirements, but they change over time. Everyone wants to put their two cents in, and everyone thinks their personal pet cause is the most important one of all. Finding a solution requires political and diplomatic negotiation, and convincing people to give up something in order to strike a deal. It’s enough to make your head spin.
But this strikes me as a huge problem, not an intractable one. We can tract this sucker. We have enough history behind us, and enough data-collection methods, that we can see what works and what doesn’t, which sorts of societies are worth living in and which aren’t, and try to figure out how to get where we want to be.
Saying “I get my morals from an old book” is a lazy cop-out. It’s the response of someone who doesn’t want to look at the problem, let alone try to solve some part of it. And if you’re not going to help, the least you can do is stay out of the way of those who are trying to fix things.