In an information-rich society, too many people are still starving their decisions of enough of the right information.
INTRODUCTION
How do you know if your decision process is well-informed or ill-informed? And even if you could detect the clues of an ill-informed decision process, would you know what to do about it? Here are some ideas for how to get more rigour into your decision process by sliding a little further away from fantasy and a little further toward fact.
CLUES THAT YOU'RE WITNESSING AN ILL-INFORMED DECISION PROCESS
You can spot the hallmarks of an ill-informed decision process simply by listening for all the substitutes that are offered in place of real data, facts and evidence. Usually these substitutes go quietly unnoticed, or are selectively ignored. We either aren't aware that they are indeed poor stand-ins for good and sufficient information, or we remain silenced by our fear of the repercussions of publicly questioning them.
The alternative is actually more frightening. Think for a minute about the consequences of medical researchers making decisions about introducing new drugs on the basis of a handful of test subjects, of civil engineers making decisions about bridge design on the back of professional opinion, or of aircraft manufacturers making decisions about fuel economy without thorough analysis of the impacts of changing the fuel system. It's not always a case of life and death, but if you can imagine the money and time being wasted on account of ill-informed decisions, then you might start imagining how different the world could be if that money and time were available for better use.
If slaying ill-informed decisions is a crusade you're up for, then a skill worth sharpening is your ear for those poor substitutes for good information. Here are some clues for what to listen for, and some linguistic lances to prod with.
VAGUE, NON-SPECIFIC CLAIMS
When people are asked for an update or progress check on how their initiatives or projects or functions or processes are going, and they are ill-equipped to answer with specific data or evidence, you'll probably hear them say things like the following:
"It is working really well."
"We're tracking along fantastically."
"The result was slow to get off the ground, but now it's up to speed."
"Cycle time is too high."
"That project is failing to realize benefits."
Are responses like these really enough to enlighten a decision-making team to the point that they need interrogate no further? Hardly. They are too vague and non-specific, and they tempt everyone to snuggle up together in a false sense of security from which they either ignore what is really going on or make rash, untested decisions. If you hear this genre of performance-update dialogue, have the courage to ask questions that dig for specifics:
"What exactly is working well?"
"How are we tracking, specifically?"
"How slow was it? What speed is it at now?"
"Too high compared to what?"
"What kinds of benefits is it failing to realize?"
OPINIONS AND HEARSAY
When you've been around something for a long time, you get to know the way things work by the patterns that keep recurring. It is super easy to be seduced by the predictive power of those patterns, especially when it saves you effort. When uttered by recognised experts, opinion and hearsay shine like pearls of wisdom:
"Obviously we have the best sales performance."
"Our customers are very satisfied with our responsiveness."
"That project is failing to realize benefits."
"I think we've done a great job this year."
Opinion and hearsay are dangerous when they come clothed in crisp words and confident tones. But they are fact no more than the Emperor's new clothes are fabric. It's a brave soul indeed that asks the dumb questions of those who are certain. Time and again, however, the dumb questions turn out to be excellent questions when they turn attention to concrete evidence:
"How is it obvious?"
"How do you know? How did you find this out?"
"In what ways is the project failing?"
"What leads you to conclude this?"
LOGIC LEAPS
The cause-effect conversation is a mainstay of management decision processes, but its familiarity doesn't guarantee its soundness. "Cause-effect" is a simple form of logic connecting two results in a distinct relationship. It takes a keen ear to hear logic leaps in a cause-effect argument connecting the results of familiar performance attributes:
"We've met our downsizing target and costs are rationalizing now."
"We have improved customer loyalty because we implemented the CRM."
"Several initiatives together have improved revenue."
"Employee turnover has reduced because of our performance planning system."
Leaps in logic of this ilk are a symptom of failure in the planning process to establish sound, clearly articulated hypotheses about which strategies are supposed to impact which results, and failure in the strategy implementation process to validate these hypotheses as early as possible with evidence of the real impact. Armed with common sense, curiosity and a coping strategy for the uncertainty likely to ensue, you can put a stop to long-held, logically flawed beliefs about what really causes what:
"How was the relationship between costs and downsizing determined and verified?"
"Is it possible that customers might not stay loyal, despite the fact we implemented a CRM?"
"Which factors have the most influence on revenue?"
"What size is the impact that the performance planning system has Landon Collins Jersey , compared with other factors that influence turnover?"
THE CLICHÉ