As members of the intelligence community (IC), we are highly trained and work on important, sensitive assignments. Protecting our national assets at home and abroad demands passion, tireless dedication, and extreme attention to detail. Even so, there are patterns that intelligence and imagery analysts can easily fall into, patterns that can compromise the overall effectiveness of our efforts. After all, we are human.
And precisely because we are human, we can always improve our skills and efforts. One of the best ways to do so is to take a hard look at the pitfalls we confront on a daily basis.
- Argument to the Stick – We have all been in forums where someone, perhaps in a position of authority, asks something like, "Nobody here is stupid enough to believe the Steelers will really win the Super Bowl this year, right?" This kind of statement is illogical and stifles debate. When the subject is sports, such a remark can be endearing, even humorous. But in substantive discussions, such as weighing a potential enemy course of action (COA), the stakes are high, and "argument to the stick" becomes dangerous, shutting down the very dialogue that may determine the appropriate COA.
- Missing the Big Picture – Imagery analysts are required to research, assess, integrate, manipulate, exploit, extract, and analyze full-motion video as well as satellite imagery. These tasks demand a keen eye for detail, but precisely because we are detail-oriented, we often miss the "big picture." We may need to step back to fully understand the core of a mission and its goals; once we do, the details fall into place.
- Group Think – When subordinate advisors are in awe of a leader and believe that he or she has already made a decision, they often slip into "group think," automatically agreeing with the leader or with the group at large. For example, when President Kennedy asked his advisors for their counsel on a potential Bay of Pigs invasion, he was well respected and most assumed he already supported the idea. As a result, nobody spoke up to articulate the significant risk of mission failure, or the implications for the United States' image around the world.
- Mirror Imaging – When we consider a COA only from our own point of view, we fail to account for the unorthodox tactics an enemy may employ. For example, when the Japanese attacked Pearl Harbor in 1941, most intelligence professionals were convinced that such an attack simply made no sense: to American eyes, the potential costs to Japan outweighed any short-term advantage it would gain. Despite this logic, the Japanese concluded that a surprise attack was their best COA for knocking the United States out as a potential opponent in the Pacific Theater.
- Personal Biases – Personal biases spring from our culture, education, and upbringing, and we all have them. All too often, intelligence and imagery analysts fall back on their personal biases when making decisions, sometimes without being fully aware of it. History offers many examples of biases shaping decisions and outcomes. General Custer, for one, did not believe the American Indians would stand and fight against him; events at the Little Bighorn proved that bias disastrously wrong.
Working in high-stakes environments where the margin of error is minuscule, imagery and intelligence analysts must take proactive steps to break the daily patterns that can compromise our work. While we are only human, the job often requires us to perform at levels beyond ordinary human abilities. And when it comes to saving lives and protecting our nation, there is no room for error.