One of the things that “Tim” could do was to predict the future. No, he didn’t have a crystal ball. He simply could somehow see more of what was going on in the present, and create viable scenarios for the future. These scenarios were usually rather simple, just taking into account certain things that made certain events more likely. For example, he could work on a project for a couple of weeks and make a series of predictions about its success, including particular risks that were likely to come about and the results. He even predicted certain actions by different individuals.
It was uncanny. He was almost always right.
And totally, absolutely not wanted.
Because, as Douglas Adams observed, “the one thing they really couldn’t stand was a smartass.”
This is a lesson easily learned from the stories of the Israelite prophets. The people killed the prophets not because the prophets had a bad track record but because the people didn’t like the implications of the message. Even when the prophets turned out to be right, people didn’t flock to them.
It turns out that being right is usually the wrong thing to do, because it has a nasty habit of getting you killed, fired, or sued. (There’s nothing quite like litigation to suck the life out of you.)
Think back on the stories of the various whistleblowers, whether in government or industry. They have tended to struggle for a long time, while the people who committed the wrongdoing, even when it resulted in someone’s death, rose in power and prestige.
This is one of the reasons why people follow sociopaths. Yes, they are amazingly charming and hypnotic, and part of this charm is the ability to know what people want to hear and to tell it to them. Telling someone lies about someone else is very effective, and has the side benefit of making the sociopath feel good, since sociopaths love to destroy good people.
We have this problem in business, of course. There are several well-known findings in management science that are totally ignored because they say that most of what we now do in management is just flat-out wrong. Elliott Jaques’s discovery of felt-fair pay and work levels is a good example.
Jerry B. Harvey, the retired GWU professor and creator of the “Abilene Paradox”, said that managers and management experts threw out Jaques’s work because it would create what he called anaclitic depression, the depression caused when something we lean on for emotional support is removed. (Harvey has even talked about how it comes into play in whistleblowing.) Nassim Nicholas Taleb describes something similar in The Black Swan: financiers would sit through his talk, agree with everything he said about the uselessness of bell curves in finance, and then return to their jobs using bell curve-derived instruments.
Sometimes it’s just that people are running faster than we are. Ken Kearns, a retired IBM manager, tells of sitting in a meeting at an IBM research center. Among the attendees was one of their technical geniuses, who provided some initial input. Ken noticed that the man wrote a note down, folded the piece of paper, then put his head down on the table and went to sleep. Now Ken was an extremely accomplished operations manager, and he probably almost blew a gasket at the gall of this guy. But knowing that the “geniuses” at IBM were treated with kid gloves, he bit his tongue. The meeting went on for another four hours as they argued through a series of very complex problems before they finally reached a decision. As everyone was leaving, the “genius” woke up and looked over at Ken.
“Meeting over?” he asked.
“Yeah,” said Ken. I’m sure he had a pretty low opinion of this guy.
The genius unfolded the piece of paper and handed it to Ken. “Is this what we agreed to?”
“Um, yes. Yes it is.”
Then the genius shuffled off.
Notice how this played out. The researcher knew what the decision was going to be, because it was pretty much the only sane decision that could be made. Ken got a bit peeved at the lack of respect. When he saw the guy’s prediction after the fact, he realized that while the researcher could see the solution, the other people in the room had to fight it out for several hours before they could reach the same conclusion he had reached in the first few minutes.
Even though the researcher could predict the outcome, and could therefore have been a resource for the business process, the business managers couldn’t use that skill, nor did they want to.
Apparently, we just can’t give up our lies, including the lies that we are competent or in control. When someone like Tim comes in and suddenly confronts us with the Truth (even when he’s not pushing an agenda), we have to kill him. We all collude in this. We all want our lies. That’s what makes people who want the truth so frightening. Even when it’s a simple idea, like Warren Kinston’s observation that our decision-making process in many ways predetermines our decisions, we react with fear and loathing.
Even preventing problems is looked on with contempt. Think about it: who gets rewarded, the project manager who never has any surprises and never works overtime, or the one who knows how to “fight fires” and drives his staff when the unexpected happens? We say we want a smooth, predictable process, but we reward war heroes, not the diplomats who keep wars from happening.
So if you want to find more success, keep your mouth shut about what you know is going to happen. Figure out a way to exploit it quietly, but that’s going to be harder than you think, and in doing so you will always be an outsider, perhaps even a traitor.
Them’s the breaks of the seer.
Image Credit: “Black Dog being chased from the Admiral Benbow Inn by Captain Billy Bones” by N.C. Wyeth. From Treasure Island, 1911, via Wikimedia Commons. Public domain.