
“Treat every gun as if it’s loaded”
(My readers in the States are almost certainly familiar with this most important rule of firearm safety; I hope readers elsewhere are as well.)
This essay is not about firearms, but rather about the relationship between truth and utility, and the degree to which we value each in day-to-day life.
When we say, “treat every gun as if it is loaded”, there is an implicit understanding that this may not actually be the case.
The gun may be loaded, or it may not. For those of us who handle firearms regularly, the longer, more realistically worded version of this principle actually works out to “treat every gun as if it is loaded until you have personally opened the action, removed the source of ammunition, and then visibly and manually checked the chamber for the presence of ammunition. Upon doing so, it is safe to continue administrative handling of the weapon, but rules of politeness still dictate that you do not point the weapon in its assembled form at another human being. If someone hands you a weapon that you have not personally verified to be unloaded, even if they claim it is unloaded, you must still run the complete check process to confirm that this is the case.”
Conversely, the aphorism might be simplified even further when speaking to a child: “Guns are loaded and dangerous.”
As we can see, the truest description of the desired course of action is the least concise, and the most concise and powerful description is the least true. The most commonly used version, “treat every firearm as if it is loaded,” splits the difference by introducing the concept of “as if”.
This “as if” concept calls to mind Schrödinger’s Cat, familiar from discussions of quantum mechanics. To be fair, I don’t have a firm grasp of quantum mechanics (and likely neither do you, but on the odd chance you do, please forgive my admittedly “pop” usage of Schrödinger’s thought experiment). I’ll be discussing it loosely, for the usefulness of the analogy rather than for its direct relevance, which, in an essay about valuing utility over strict truth, seems to me delightfully appropriate.
We might imagine our firearm as being simultaneously loaded and unloaded. As such, our behavior is dictated by the consequences we would experience if we guessed its actual state incorrectly.
If we guess that the firearm is unloaded and it is, in fact, loaded, the consequences could be catastrophic. Conversely, if we guess that the firearm is loaded and it is in fact unloaded, there are no real negative consequences (in this administrative situation, at least; if we actually needed the firearm to defend ourselves, an unloaded weapon would of course be a problem).
Therefore, a pragmatic decision is reached to treat all unexamined firearms as loaded, not because this is necessarily true, but to best avoid undesirable outcomes. We might say that the truth of the firearm’s state is eclipsed by the greater truth that the negligent discharge of a firearm is a potentially disastrous occurrence, to be avoided at all costs. From this we might extrapolate that not all “truths” are created equal, and that some truths may be so compelling that falsehoods, either partial or total, are allowable in service of those “greater” truths.
Our language is rife with other examples of this.
“The customer is always right”
“If you can dream it, you can achieve it”
“What doesn’t kill you makes you stronger”
Any of these statements can easily be attacked as factually incorrect in many or even most cases. Still, we use them because, even while recognizing that they are not necessarily correct, we realize that believing them, or behaving “as if” we do, can motivate culturally or personally desirable behaviors.
I find these examples interesting and feel as though our language doesn’t possess a satisfactory term for describing them. I’d like to humbly propose a word that, while not original to me, sees very little use outside of particular philosophical discussions or certain types of computer programming.
This word is “pragma”.
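As it happens, the programming usage makes the essay’s point rather neatly. In C and C++, `#pragma` introduces a directive to the compiler, and the C standard specifies that a pragma the implementation does not recognize is simply ignored: the line is judged entirely on its utility, never on its truth. A minimal sketch in C (the pragma name below is invented for illustration):

```c
/* The C standard (C99 6.10.6) requires that a #pragma the compiler
   does not recognize be ignored, so this made-up directive changes
   nothing at all; a recognized one, such as "#pragma once" at the
   top of a header, is honored purely because it is useful. */
#pragma treat_every_gun_as_loaded  /* unknown to any compiler: inert */

int loaded(void) {
    /* Behave "as if" the answer is yes, whatever the facts. */
    return 1;
}
```

A compiler may emit a warning about the unknown pragma, but it will compile and run the program unchanged.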
We are all familiar with the words “dogma” and “dogmatic”. We are also familiar with the word “pragmatic”, an adjective that Merriam-Webster defines as “relating to matters of fact or practical affairs, often to the exclusion of intellectual or artistic matters : practical as opposed to idealistic.”
I would propose that “pragma”, as the noun form of this concept, is useful, and would define it as follows:
“Pragma (noun) - a concept whose truth value is superseded by its utility.”
As such, a pragma is fundamentally different from dogma: should circumstances change in such a way that the concept loses its utility, it could no longer be defined as pragma, and would instead have deteriorated into simple falsehood.
Of course, I am in no position to dictate anyone’s speech, and the reader may or may not find my proposition compelling. It’s simply a suggestion, as the English language is something of a bastard tongue, a sort of linguistic sponge, and I see no issue with proposing new ways of speaking and thinking. Many words in our current dictionary earned their place through the humble origins of common usage. So, if “pragma” serves a purpose for you, have at it. If not, it is no skin off of my proverbial ass.
Fully agreed, but that methodology turns up everywhere, Ishmael.
In Engineering, it’s called Independent Verification and Validation (IV&V), and you don’t have quality until it’s completed.
In Management, they say ‘People don’t do what you expect; they do what you inspect.’ Literally, the way you manage changes the way that people perform.
In risk management, you have the Likelihood of an occurrence and its Impact; the risk is the product of the two. If the likelihood is too high you can reduce it through mitigations, but mitigations can’t reduce the impact; that is managed through contingency. After you’ve applied suitable mitigations and made provision for contingencies, what remains is Residual Risk, which you can monitor but have to live with. (By the way, this is part of why Pascal’s Wager is broken and wrong.)
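That likelihood-times-impact arithmetic can be sketched in a few lines of C; the structure and the numbers here are invented purely for illustration:

```c
typedef struct {
    double likelihood;  /* probability of the occurrence, 0.0 to 1.0 */
    double impact;      /* cost if it occurs; contingency provisions
                           for this, but nothing shrinks it */
} Risk;

/* Risk is the product of likelihood and impact. */
double risk_score(Risk r) {
    return r.likelihood * r.impact;
}

/* Mitigation reduces likelihood only. After it is applied, whatever
   risk_score() still returns is the residual risk you live with. */
Risk mitigate(Risk r, double factor) {
    r.likelihood *= factor;
    return r;
}
```

For example, a risk with likelihood 0.5 and impact 100 scores 50; mitigated by a factor of 0.2 its residual risk is about 10, while the impact, should the event still occur, remains 100.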
All of these are pragmatic applications of managing a future where you might believe one thing, but are responsible regardless. And yes, coming from ICT I’d happily call it pragma.
By the way, in professional ethics you are judged not by what you believed but by what you checked and planned for. It’s called Due Diligence and Duty of Care. There are professions where you can be deregistered for not applying due diligence and duty of care, and you can be held civilly or even criminally liable in cases where you’ve failed it — essentially failing empirical rigour and critical thought.
One area where I think this is not applied adequately is executive management: executives often skate on what they claimed not to have known, when they never did the diligence to find out.
So agreed: we absolutely need more of it. That’s not a musty observation; it’s timely and acute.
My contribution here, though, is that it’s hardly new. It doesn’t need a new word so much as better accountability and a better popular understanding (which I think falls to media, education, and civics).
Hope that may interest. 😉