Flavius Belisarius
The first thing to keep in mind is that there are over 5,000 years of recorded military history, starting with the Battle of Kadesh. Second, each generation studying and engaging in the science and art of warfare suffers from amnesia. The third thing to keep in mind is that one can over-think, just as one can over-engineer.
The fear of failure is real and understandable. If we get things wrong in the profession of arms, people die, wars are lost and national interests are compromised. Fear of ignorance is more complex. In the profession of arms, it surfaces in professionals not wanting to admit they don’t know something, how something works or what will happen next. The “whiz kids” who have great ideas and/or workable, actionable theories are often shut down. The reasons are many: fear of being professionally shown up, embarrassment at not coming up with the bright idea or theory sooner, or the sense of being reputationally challenged. There are more, but these illustrate the point.
The Once and Future Reformer
What does reform actually mean in the professional sense? In bureaucratic circles, “reform” can be a nasty word for anything that challenges the status quo or forces less-proficient practitioners to step out of their comfort zones. Seniors sometimes say we need it. Most don’t want to go down that road. Why? Because there usually isn’t enough clarity about what needs to be reformed and why. How many times has someone said, “We need to get back to the basics,” “Think outside of the box,” or “We need to tear down silos”? All are valid points, if you know which basics require attention and the actual dimensions of the box. Reformers whose theories were successfully exploited by our opponents, or otherwise proven effective, are often praised by later generations who ask why their voices weren’t heard before the war or national crisis. The reason is that the profession of arms can be a treacherous beast, especially when it comes to what I mentioned above: fear of the unknown or insecurity over a challenge to “conventional wisdom.”
I said all of that to say this: As long as career reputations, budgets and fears of failure are at play, the contents of all three articles will be rejected by the very people who should be supporting them.
Since you sometimes quote Oddball, “Always with the negative waves, Moriarty,” let me dispense with the negative waves and get down to business.
We can get there from here. The first thing we need to do is stop setting up and sitting in camps and take a good look at our own position. The Ukrainians are doing a great job at innovating because their lives depend on it. We haven’t been as effective in that regard, because we’ve been an uncontested superpower for 30 years. Guess what: those days are over. I was an operator for 20 years and an “int-guy” for the last 20. Since my current professional focus is on intelligence analysis, I will use that to address the main issues of these excellent articles.
The most important thing is not to get locked into what I’m saying as being only intelligence-focused. Think about what I’m saying and apply it to the larger picture of a whole-of-force approach.
“It’s The Economy, Stupid.”
That simple remark during the 1992 election cycle should conceptually resonate. That is, take a step back, evaluate what’s really important and define the problem. If you (you, in general, not you, Dave) can define the problem, your mind won’t go blank trying to over-think things. If you can define the problem, you can apply the correct solution, and that’s where innovation begins. Take the problem in Afghanistan: it was defined as an “insurgency.” Wrong. It was a Pashtun “uprising.” Both are violent internal-instability problems requiring specific sets of solutions. In the end, we lost a 20-year campaign in the previously titled Global War on Terror (The Artist Formerly Known as Prince). You and your JSOTF won a campaign in said war, but I digress. Back to the issue of defining problems: in Afghanistan, we were treating a case of halitosis with a root canal.
I’ll address the question and the content of the three articles by the numbers. Again, I’m facilitating this through the narrow prism of intelligence analysis, as the catalyst, to answer the question from a whole-of-force perspective.
1. In the operational field, we have an adage: “Amateurs talk tactics. Professionals talk logistics.” I say this about intelligence analysis: amateurs talk data; professionals talk judgment. What the “data guys” don’t get yet is that the secret of a successful model is to mimic a living system. They keep trying mathematical models based on data, rather than systems. Human behavior is systematic, based on subjective probabilities. It’s not data driven. Analytical judgment requires structured techniques that engage specific regions and sub-regions of the brain. It’s cognitive. Of the five kinds of analytical judgment, only two are data and fact driven. You can achieve a maximum of 75% judgment accuracy in predicting human behavior using data-driven models (though 75% is rarely achieved). Living systems can boost that to over 90% accuracy, and even higher (that’s proven, by the way).
2. Applying history isn’t about mimicry. It’s about realizing that all human beings borrow from history (many don’t even realize it) and use what worked…with learning. When you define the historical precedent, you can more easily “predict” the next move(s). Remember what I said about amnesia? You can borrow from what others have done well and learn from their mistakes. It’s OK to admit when you (you, in general, not you, Dave) don’t know something. That’s one of the biggest hurdles to overcome.
3. The main issue the “establishment” has with the so-called Futurist Camp is that they often drive off the cliff of speculation. Futurists can be their own worst enemies. Those in the so-called Traditionalist Camp are sometimes akin to Linus from Peanuts: they need their security blankets. That’s why OSINT isn’t always welcomed; the “data guys” say they need precise data and lots of it. Our handicap in the 70+ years of post-World War II intelligence analysis has been an over-reliance on data. Don’t get me wrong. We’re second to none when it comes to data collection. Unfortunately, we swim in data, and that’s where the problem comes in. What does an over-reliance on data have to do with the Futurist vs. Traditionalist dilemma? I’m glad you asked. It all comes down to three words: phenomenology, process and futurity. When you (you, in general, not you, Dave) understand the phenomenology of a problem, the problem can be defined. When you identify the process(es) at work, you can measure and predict the course of the problem, like doctors measuring and plotting the course of a disease. That’s when you can start talking about the futurity of events. When you know the phenomenology and the process, you don’t need loads of data. All living systems are predictable. That’s not theory. That’s fact. That’s one of the things I teach, and it works.
4. That brings me to the issue of reducing surprise. Guess what: surprise is a multiplier of other effects. It can’t be studied without reference to another action (a surprise tornado, a surprise earthquake, a surprise attack). Surprise can’t be avoided, but the potential damage can be, and the multiplier can be negated completely. It’s important to understand that surprise also refers to alertness and readiness conditions, NOT the actions of the threat entity. Again, surprise is not avoidable, but damage is. By focusing on the potential damage (the threat), both the damage and the conditions of surprise can be avoided. What helps in this? I’m glad you asked: understanding the phenomenology at work and the processes that make it work. That’s where “Warning” and readiness come into play.
5 (a). Finally, our defense establishment needs to take the pressure off itself. I will express it first in terms of intelligence analysis, then operationally (from a historical perspective). Without any use of technology, but using the techniques I teach, a large number of diverse analysts achieved between 90 and 95% accuracy in their judgments over a four-year period with minimal all-source information (we’re into the fifth year now). The constituent decision-makers didn’t think it could be done without some software application or big-data platform. The hill most intelligence analysts have to climb is the suspicion of skeptical executive decision-makers who want something, a machine, that doesn’t require any deep thinking by analysts but makes accurate predictions. All the technology of analysis is aimed at reducing the decision-maker’s dependence on people they don’t know. That’s because analysts have gotten it wrong too often, and most of the time when it hurts the most: in a crisis. If decision-makers thought they could get rid of analysts, they would do so with glee. My goal was to find what works and apply that. The cognitive model I teach is built from things that worked, from experience, not from theory.
5 (b). From an operational perspective, our defense establishment needs to give itself a break. “They” often criticize PME as too intellectual and not actionable. Some say that because it takes them out of their comfort zone. If what the critics say is true, why would our greatest competitor seek to emulate and build from our PME? I’ll tell you. PME separates the professionals from the amateurs. There are, unfortunately, many amateurs in the senior ranks, but that’s been the case since Pharaoh led his chariot attack at Kadesh. The reason I said the establishment needs to give itself a break is that in 5,000 years of recorded military history, no one, and I mean no one, has gotten it right. The norm is that the armed force that prevails is usually the one that makes the fewest mistakes in a given battle, campaign or war. We can address the decision-making of national leadership and policymakers another time. The bottom line: it’s all about a willingness to learn. We’re back to the cognitive aspect.