The Thing, and the Name of the Thing
Yesterday, during a routine medical examination, I found out that I have a dermatofibroma.
Don’t worry about me. My prognosis is very good. I should still have a few decades left. It means that at some point I got bitten by an insect; a piece of the stinger was probably left behind, and scar tissue formed around it.
But if you thought, if only for a moment, that something with a big scary name like “dermatofibroma” must be a big scary thing, well, that’s what I want to talk about.
I’ve mentioned elsewhere that as far as I can tell, the human mind uses the same machinery to deal with abstract notions and patterns as it does with tangible objects like coins and bricks. That’s why we speak of taking responsibility, of giving life, of sharing our troubles, and so forth. (And I bet there’s research to back me up on this.)
A word is the handle we use to grab hold of an idea (see what I did there?), and sometimes we’re not very good at distinguishing between the word and the idea. I know that it’s a relief to go to the doctor with some collection of symptoms and find out that my condition has a name. Even if I don’t know anything about it, at least it’s a name. It’s something to hold on to. Likewise, I remember that back in the 80s, simply coming up with the name “AIDS” seemed to make the phenomenon more tractable than some unnamed disease.
I think a lot of deepities and other facile slogans work because people tend not to distinguish between a thing and the word for that thing. Philosophers call this a use-mention error. C programmers know that it’s important to distinguish a variable, a pointer to that variable, a pointer to a pointer to the variable, and so forth.[1]
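To make the analogy concrete, here’s a minimal C sketch (the variable names are just illustrative): the int holds the thing itself, a pointer to it holds a name (an address) for the thing, and a pointer to the pointer holds a name for the name. All three routes reach the same value, but the levels aren’t interchangeable, which is the use-mention distinction in machine form.

    #include <stdio.h>

    int main(void)
    {
        int thing = 42;               /* the thing itself */
        int *name = &thing;           /* a name (an address) for the thing */
        int **name_of_name = &name;   /* a name for the name */

        /* All three routes reach the same value... */
        printf("%d %d %d\n", thing, *name, **name_of_name);

        /* ...but the name and the name-of-the-name are distinct objects,
           living at different addresses. */
        printf("%p %p\n", (void *)name, (void *)name_of_name);
        return 0;
    }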
The solution, I’ve found, is to keep a mental model of whatever the discussion is about, kind of like drawing a picture to help you think about a math problem. For instance, if a news report says that “seasonally-adjusted unemployment claims were up 1% in December” and I wonder why the qualifier “seasonally-adjusted” was thrown in there, I can think of department stores hiring lots of people for a few months to handle the Christmas rush.
Richard Feynman describes this process in Surely You’re Joking, Mr. Feynman! In the chapter “Would You Solve the Dirac Equation?”, he writes:
I can’t understand anything in general unless I’m carrying along in my mind a specific example and watching it go. Some people think in the beginning that I’m kind of slow and I don’t understand the problem, because I ask a lot of these “dumb” questions: “Is a cathode plus or minus? Is an an-ion this way, or that way?”
But later, when the guy’s in the middle of a bunch of equations, he’ll say something and I’ll say, “Wait a minute! There’s an error! That can’t be right!”
The guy looks at his equations, and sure enough, after a while, he finds the mistake and wonders, “How the hell did this guy, who hardly understood at the beginning, find that mistake in the mess of all these equations?”
He thinks I’m following the steps mathematically, but that’s not what I’m doing. I have the specific, physical example of what he’s trying to analyze, and I know from instinct and experience the properties of the thing. So when the equation says it should behave so-and-so, and I know that’s the wrong way around, I jump up and say, “Wait! There’s a mistake!”
This sort of thinking is a way to have the analytical and intuitive parts of your mind working in tandem. If you have an intuitive understanding of the system in question — be it computer code or preparing a Thanksgiving meal for twelve — you can apply that intuition toward understanding how everything is supposed to work. At the same time, your analytical mind can work out the numerical and logical parts. Normally, they should give the same result; if they don’t, then there’s probably an error either in your analysis or in your intuition.
The downside of this approach is that I tend to get very frustrated when I read theologians and philosophers — or at least the sorts of philosophers who give philosophy a bad reputation — because they tend to say things like “a lesser entity can never create something greater than itself” without saying how one can tell whether X is greater or lesser than Y, and without giving me anything to hang my intuition on. And if a discussion goes on for too long without some sort of anchor to reality, it becomes hard to get a reality check to correct any mistakes that may have crept in.
Since I started with jargon, I want to close with it as well. Every profession and field has its jargon, because it allows practitioners to refer precisely to specific concepts in that field. For instance, as a system administrator, I care whether an unresponsive machine is hung, wedged, angry, confused, or dead (or, in extreme cases, simply fucked). These all convey shades of meaning that the user who can’t log in and do her work doesn’t see or care about.
But there’s another, less noble purpose to jargon: showing off one’s erudition. This usage seems to be more prevalent in fields with more, let’s say, bullshit. If you don’t have anything to say, or if what you’re saying is trivial, you can paper over that inconvenient fact with five-dollar words.
In particular, I remember an urban geography text I was assigned in college that had a paragraph that went on about “pendular motion” and “central business districts” and so on. I had to read it four or five times before it finally dawned on me that what it was saying was “people commute between suburbs and downtown”.
If you’re trying to, you know, communicate with your audience, then it behooves you to speak or write in such a way that they’ll understand. That is, you have a mental model of whatever it is you’re talking about; and at the end of your explanation, your audience should have the same model in their minds. Effective communication is a process of copying data structures from one mind to another in the least amount of time.
That geography text seemed like a textbook example (if you’ll pardon the expression) of an author who knew that what he was saying was trivial, and wanted to disguise this fact. I imagined at the time that he wanted geography to be scientific, and was jealous of people in hard sciences, like physicists and astronomers, who can set up experiments and get clear results. A more honest approach, it seems to me, would have been to acknowledge from the start that while making geography scientific is a laudable goal, it is inherently a messy field; there are often many variables involved, and it is difficult to tease out each one’s contribution to the final result. Add to this the fact that it’s difficult or impossible to conduct rigorously controlled experiments (you can’t just build a second Tulsa, but without the oil industry, to see how it differs from the original), and each bit of solid data becomes a hard-won nugget of knowledge.
So yes, say that people commute. Acknowledge that it may seem trivial, but that in a field full of uncertainty, it’s a well-established fact because of X and Y and Z. That’s the more honest approach.
1: One of my favorite error messages was in a C compiler that used 16 bits for both integers and pointers. Whenever my code tried to dereference an int or do suspicious arithmetic with a pointer, the compiler would complain of “integer-pointer pun”.
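For the curious, here’s a rough sketch of the kind of mixing that provokes that sort of diagnostic. This is a modern rendering with the casts spelled out (and the round trip is only implementation-defined); on that 16-bit compiler, where int and pointer were the same width, the confusion could slip through with nothing but a grumble.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int n = 42;

        /* A pointer, viewed as an integer, and then viewed as a pointer
           again: the thing and the name of the thing, punned into each
           other. */
        intptr_t as_int = (intptr_t)&n;
        int *back = (int *)as_int;

        printf("%d\n", *back);   /* prints 42 */
        return 0;
    }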
(Update, 11:43: Typo in the Big Scary Word.)