My friend Benny (who produces the Rationally Speaking podcast) really hates the word "skepticism." He understands and appreciates its meaning and long intellectual pedigree (heck, we even did a show on that!), but he also thinks — based on anecdotal evidence — that too many people apply a negative connotation to the term, often confusing it with cynicism. (And notice, to make things even more confusing, that neither modern term has the philosophical connotations that characterized the ancient skeptics and the ancient cynics!)
I, on the contrary, really like the word, and persist in using it in the positive sense adopted by David Hume (and, later, Carl Sagan): skepticism is a critical stance, especially toward notions that are either poorly supported by evidence or based on poor reasoning. As Hume famously put it, "A wise man ... proportions his belief to the evidence" (from which derives Carl Sagan's famous "Extraordinary claims require extraordinary evidence").
Now, why on earth would skeptics be associated with (the modern sense of) cynicism, an entirely negative attitude typical of people who take delight in criticism for the sake of criticism, negativity for the sake of negativity? I blame — at least in part — Francis Bacon. Let me explain.
Bacon was one of the earliest philosophers of science, and his main contribution was a book called The New Organon, in purposeful and daring contrast with Aristotle’s Organon. The latter is the collection of Aristotle’s works on logic, which essentially set down the parameters for science — such as it was — all the way to the onset of the scientific revolution in the 16th century. Bacon, however, would have none of Aristotle’s insistence on the superiority of deductive logic (which is, among other things, the basis of all mathematics). For Bacon, new knowledge is instead the result of reduction (explaining a complex phenomenon in terms of a simpler one) and induction (generalization from known cases). Bacon thought of his inductive method as having two components, which he called the pars destruens (the negative part) and the pars construens (the positive one). The first was concerned with eliminating — as far as possible — error, the second with the business of actually acquiring new knowledge.
It’s a nice idea, as long as one understands that the two partes are logically distinct and need not always come as a package (they did in Bacon’s treatise). Think of it in terms of the concept of division of cognitive labor in science. This is an idea famously discussed by Philip Kitcher, who explored the relevance of the social structure of science to its progress, arguing that such structure — once properly understood — can be improved upon to further the scientific enterprise. The basic idea, however, is familiar enough, even in everyday life: some people are good at X, others at Y, and we don’t ask everyone to be good at both, especially if X and Y are very different kinds of activities.
The same goes, I think, for Bacon’s partes destruens and construens: he may have pulled both off in the New Organon, but the more human knowledge progresses, the more it requires specialization. We have physicists and biologists, geologists and astronomers. Not only that: we have theoretical physicists and experimental ones, and even those are far too broad as categories in the modern academy (e.g., theoretical atmospheric physics requires approaches that are very different from those deployed in, say, theoretical quantum mechanics). Why not, then, happily acknowledge that some people are better at constructing new knowledge (theoretical or empirical) and others at finding problems with what we think we know, or with how we currently proceed in attempting to know (Bacon’s elimination of error)? Indeed, this division of cognitive labor may even reflect different people’s temperaments, just as personal preference and style may lead one to pick a particular musical instrument rather than another when playing in an orchestra (or to become a theoretical or experimental physicist, as the case may be).
What does any of the above have to do with the perception problem from which skepticism (allegedly) suffers? Well, skeptics (and, ahem, philosophers!) are in the criticism business, and nobody likes to be criticized (including skeptics and philosophers). We do tend to cut critics some slack if they also propose ways forward, constructive solutions to the problems they identify. But that expectation, I think, is a mistake. Criticism is valuable per se, as a way to engage our notions, show where they may go wrong, and help (other) people see ways forward. Criticism — pace Bacon — is inherently constructive, even when negative, because it allows us to make progress by identifying our errors and their causes. And it can be highly entertaining: just read a good (negative) movie, book, or art review, or perhaps watch an episode of the (now ended) Bullshit! series.
This under-appreciated role of criticism, incidentally, may also be responsible (in part, that is: egos and turf wars aside) for the continuing diatribes between philosophers and physicists, where too often the latter do not appreciate that the role of philosophy is a critical one, with the discipline making progress by eliminating mistaken notions rather than by discovering new facts (we’ve got science for the latter task, and it’s very good at it!).
So, my dear Benny and other fellow skeptics, let’s reclaim the term skepticism as one that encapsulates a fundamental attitude that all human beings interested in knowledge and truth should embrace: the idea that mistakes can be found and eliminated. It’s not at all a dirty job, and we are able and ready to do it.