
“Ready, Fire, Aim”: Intuition, Analysis and Tacit vs. Explicit Knowledge

Forrest Christian | Change, Knowledge, Theory | 2 Comments

Q: What is the intersection of intuition/analysis, explicit/implicit knowledge and CIP/work levels?

Tom Peters (the management guru so amusingly mocked in Dilbert) first heard this phrase from an executive at Cadbury describing how his company differed from the planning ideal. He uses it a lot, and you can find a mess of Peters’s articles using the phrase (e.g., “For Heaven’s Sake, Try It!”).

I think that this is the secret to most of life, but not all. Most of the time, we really have to shoot from the hip and adjust as we go along. I think this is why David Billis says that he and Ralph Rowbottom had so many problems replicating the time-span of discretion measurements that Jaques did for executives. For regular workers, it’s straightforward. For executives, it seemed to fall apart because they didn’t work on anything that long-term. Some of this may be a matter of definitions, but since Billis and Rowbottom were early Jaques disciples at the Brunel Institute of Organisation and Social Studies (BIOSS), I doubt it. The change in the predictability of the economy from the 1960s, when Jaques did his work, to the 1970s, when Billis and Rowbottom started complaining, explains a great deal, too. But it seems that in the current wildly changing world, “Ready, Fire, Aim, Fire, Aim, Fire, Aim” is probably closer to reality.

Or “Fire, Ready, Aim, Fire, Aim, Fire, Aim…”

Stan Smith talks about this. He refers to the need for real-time feedback from the system to know how you are doing. By investigating solutions while you are still trying to determine the problem, you get farther ahead. If you knew what the problem was, it wouldn’t be much of a problem: you’d just go ahead and fix it. For most of what we do, we don’t truly understand what will work or why. We move around by intuition, using analysis to then determine whether we’re on the right track or how to sharpen our focus.

I’ve talked about the differences between intuition and analysis, something I think Henry Mintzberg understands but Gladwell doesn’t seem to (see his new book, Blink). The two often aren’t separable. Intuition is different from tacit knowledge, although tacit knowledge can inform it; they interrelate but are not the same. That’s a mistake KM people make a lot. I think that intuition vs. analysis is an entirely different function of the brain than tacit vs. explicit, but people confuse the two. I may have a great deal of difficulty taking my tacit knowledge of something I’m an expert in and making it explicit. But I may also have a tremendously difficult time making the patterns behind my intuitive hunch explicit, moving them from right brain to left brain.

At least, I don’t think that tacit knowledge is completely a right-brain activity.

Can we be intuitive about something we haven’t explicitly known?

When we take a “Fire, Aim, Fire, Aim” approach, we mean that we initially use a combination of intuition and tacit knowledge to get the shot in the right general direction. We then adjust more finely using our tacit knowledge and analytical left brain. If we don’t see the result we want, we either veer off in another direction (using intuition to get into the right general area) or we examine our current direction analytically to determine where it may have failed.

“Ready, Aim, Fire” style management is flawed in many cases (but not all) because it assumes that you have to get it right the first time. That is a serious misunderstanding of market forces. Firing while running is less likely to make accurate hits, but with a shotgun instead of a rifle you probably don’t need to worry. (Or an M1 tank: J once told me that he could hit a Soviet top-line tank on the fly going 40 mph over rough terrain, whether or not the Soviet tank was in motion.)

Great tubular bridge. Engraving.

Science is really “fire, aim, fire, aim…” You look at a problem and ask a question. You begin a series of investigations that escalate as the line of inquiry looks fruitful. You begin with a review of the literature: does this question even need answering, or should it be tweaked based on studies done by others? You continue with a small study to get funding and then a large-scale investigation. It looks like “Ready, Aim, Fire” only from a certain angle, since you do study the literature (“ready”), prepare a hypothesis (“aim”) and test (“fire”). But these are done in an internally recursive manner, which means you are really adjusting constantly. And if you look at science from a broader perspective, this is what’s done generally. People ask questions of other people’s answers, look for new happy accidents, build on each other’s work. Even the “paradigm shifts” (which Sokal has said aren’t really there) in science are built off other people’s work, although they may represent an intuitive leap that then requires serious work and study.

It’s not just tacit vs. explicit, because that confuses the issue of intuition. Others had the same information that Charles Darwin had, but they didn’t make the leap to the adaptation of species. Einstein didn’t have information that others lacked: he used the data available to other physicists to construct his thought experiments, which he refined and refined. Intuition and analysis seem intertwined in the great minds.

And what part does complexity of information processing play in this? I don’t think that intuition/analysis splits as a low- vs. high-stratum issue. Again, this is probably an issue of how information changes as viewed from different work levels. A higher-stratum person may intuit things differently, in a bigger way, making larger intuitive leaps. But there are serious differences in how much people access their right and left brains, according to the research. Given the same information, different work levels will intuit different answers.

At Big Insurance Group (BIG), an old client of my previous employer, they had a serious problem with web server slowdowns. This is a pretty normal thing, of course. We in the consulting company all intuited the same answer without knowing all the details about the problem: they were using Microsoft web servers for a throughput that MS servers simply were not made for. They needed to mix up their server mix, with various *nix loads along with several MS server loads. But that seemed like an answer that their tiger team would not be giving. Part of the problem was one of work level: these folks were looking at the problem from a more detailed level than we were. They wanted to know how to get the MS servers to behave. We saw it from a higher level and came up with a different answer. I’m sure that folks above us would have seen the problem as one of organization and management (and I wouldn’t disagree).
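To make the shape of that answer concrete, here is a minimal sketch in Python of what a mixed farm implies. Everything in it is hypothetical: the pool names, hosts, and routing rule are invented for illustration, not BIG’s actual setup. The point is simply that different workloads land on the platform suited to them instead of every request being forced onto one kind of server.

```python
# A minimal, hypothetical sketch (not BIG's actual configuration): route
# requests by workload across a mixed *nix / MS server farm instead of
# pushing every request type through one platform.
import itertools
from dataclasses import dataclass


@dataclass
class ServerPool:
    name: str    # made-up pool name, e.g. "nix-static"
    hosts: list  # host names in this pool

    def __post_init__(self):
        # naive round-robin iterator over the pool's hosts
        self._cycle = itertools.cycle(self.hosts)

    def next_host(self):
        return next(self._cycle)


# Hypothetical pools; in BIG's case everything sat on MS servers.
POOLS = {
    "static": ServerPool("nix-static", ["nix01", "nix02", "nix03"]),
    "dynamic": ServerPool("ms-dotnet", ["ms01", "ms02"]),
}


def route(path):
    """Send static assets to the *nix pool, application pages to the MS pool."""
    static_suffixes = (".gif", ".jpg", ".css", ".js", ".html")
    workload = "static" if path.lower().endswith(static_suffixes) else "dynamic"
    return POOLS[workload].next_host()


if __name__ == "__main__":
    for p in ["/img/logo.gif", "/quote/start.aspx", "/css/site.css"]:
        print(p, "->", route(p))
```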

Was the answer one of tacit knowledge? We certainly knew more about the variations of web services and what had worked in different environments. Some of us had worked in consulting for some time and had seen similar problems before. But we didn’t know as much about their server situation. I came up with the same answer, and I had very little knowledge about their web environment: I was working in project risk management. We had tacit knowledge about the general work area (web server farms) but not as much about the particular problem. They had more tacit knowledge of their situation but lacked our broader knowledge. But I’m not sure that they could have had our tacit knowledge.

Which is part of the problem: all of these discussions seem to take for granted that if someone knew what I know, then they could do what I do. Which isn’t true. It fails to take into account the differences in our abilities to handle complexity, much less our interests and natural inclinations. What is my tacit knowledge? Is it simply explicit knowledge made unconscious? That seems like an illogical position, but it informs most KM discussions. There is more going on than that.

BIG’s technical staff rightly looked at the problem and saw problems with the Microsoft server loads and network configurations. INFOSEC looked at it and rightly saw that the problem was asking a single Microsoft server load to do too much, and to do things that *nix servers could do better if properly configured. I looked at the situation and said that the organization was simply misaligned, which is what permitted the decision to use all MS servers in the first place. Up the chain of levels of abstraction. At each level, the tacit knowledge of the other levels is not that useful. Tacit knowledge works within work levels, which is why Communities of Practice seem to form mostly around a particular work level. Which is why you say that someone doesn’t belong because he or she just “isn’t in our league”.

I tend to be a “ready, ready, ready, ready, ready… FIRE! FIRE! FIRE! Now aim! FIRE! FIRE! Aim again! Now fire! FIRE! FIRE!” type of worker. Man, do I love a crisis. Come in, create the team (or rebuild it), fight the fire, build something that won’t burn down and leave the team to build it further. Of course, my version of “ready” is studying whatever comes to mind rather than putting together a coherent plan for a particular situation. More like getting more tools to respond to more varied situations.

Of course, maybe this right-brain/left-brain split has all been dismissed as nonsense by now. Chime in if you have better information.

[Actually, it’s “Ready, Fire, Aim, Fire, Aim, Fire, Aim, Fire, Aim….” as per Henry Mintzberg in The Rise and Fall of Strategic Planning.]

Image Credit: Devizesbowmen (shooting a recurve bow at archery target). © Jethrothompson (CC BY-SA 3.0). Via Wikimedia Commons.

Comments 2

  1. This really depends on what “ready, fire, aim” is meant to distinguish, and how it is compatible or incompatible with the context and purpose. Certainly one needs to possess some understanding, if only by intuition in the first instance, that there is a problem and how the problem fits in respect of its significance. For example, is this the limiting problem, or is there a bigger one? Also, if it is solved, what problems are created (or could be created) in the resolution of this one, and how do we implement parallel solutions such that the next problem is being worked on before it becomes a critical limitation? (The adage that the solution to the current problem will inevitably become the source of the next problem.) “Intuition” is merely one’s ability to hypothesize based on capability (CMP, skills, knowledge and valuing the problem) such that the problem is conceived in the mental process and a cognitive structure or image is developed. We will wish to have sufficient understanding of the work processes and relevant data to confirm the nature of the problem, and rest assured we will always aim. We can argue about how refined the aim might be; however, as a minimum we are aiming at this problem or that one which we at least intuit to be a problem. I think the distinction, as you have pointed to, is one of moving to action with a sufficient understanding of the problem and one or more potential solutions, versus paralysis by analysis, where one only analyzes and fails to act.

    Bear in mind that there is no substitute for a systematic work process that begins with planning, then scheduling, then execution, then analysis, then redefining the specification for the next time the task is planned, and/or developing new problem statements. There is a distinction between assuming risk by trying new solutions to a problem and simply failing to apply a systematic approach at all: planning and analysis carried out, versus the happenstance execution and firefighting of this “ready, fire, aim” approach.

  2. Pingback: Gordon’s Comments on Professional Firms and Requisite Ordering | Requisite Reading
