Rose Colored Safety Glasses
- Created: Wednesday, 16 October 2013 11:25
- Published: Wednesday, 16 October 2013 11:25
By Randy K. Logsdon
There are really only two steps that are critical to the success of achieving a goal: Formulate a plan and execute the plan.
There is a plethora of activity around these from the preparatory stages through evaluation stages. Research is conducted, objectives are defined, training may be a part, certain collaborative activities are inserted, budgets are approved, services and equipment are acquired and schedules are arranged.
Actions directed at achieving an injury reduction goal are no exception. While errors may be made in formulating the plan, I’ve found that the most difficult hurdle to success involves execution of the plan. Probably the least predictable variable there is the human element – more specifically, human judgment.
I recently stumbled onto a phenomenon known to neuroscientists as optimism bias. I discovered that there is actually extensive research on the subject and immediately realized that understanding this concept may be useful in managing that human variable – particularly as it affects risk assessment.
Stated simply, optimism bias is a physiologically driven condition that nearly all humans exhibit. It is actually a necessity for our mental survival. We each view our future optimistically; only severely depressed individuals exhibit little or no optimism bias. For example, we know that in 2012 there were 36 fatal mine incidents in the U.S. We can objectively predict that the national experience in 2013 will be similar. In fact, as of this writing, we’ve already experienced 23 mine deaths in 2013.
If each miner in the United States were asked whether he or she expected to die in a mine incident, the response would be an overwhelming “NO.” There will be some deaths, but each expects that others will be the victims.
Humans are unique in their ability to visualize anticipated events and expectations. Our dreams of the future are colored (as the saying goes: “through rose-colored glasses”) by an optimism bias. Aside from the distraction dangers of daydreaming in the workplace, optimism bias can have a profound effect on how we recognize the risks associated with everyday work. When confronted with new or different circumstances, we make judgments concerning the risk of proceeding.
Optimism bias can cloud what should be an objective evaluation, skewing it toward an unreasonably optimistic assessment of the risk. We can find ourselves muttering statements like “it probably won’t move,” “it’s not that high,” “I can lift that,” or “it’ll be OK until shift change.” Such optimism is normal and natural but not objective. The problem is compounded by research indicating that our optimism bias will override objectivity even when we are confronted with objective facts.
Researchers tested individuals within a certain demographic whose average likelihood of developing cancer was 30 percent. They asked each individual to estimate his or her own chances of developing cancer.
Answers ranged from well below the mean to well above it. Then, after being given the objective 30 percent figure, participants were asked to reassess their chances. Those who had previously estimated their chances of developing cancer at 50 percent adjusted their personal assessment to align with the mean (30 percent). Those who had previously rated their chances at 10 percent adjusted their personal assessment only slightly, to 11 percent, even after learning the objective data.
Consider a slippery area in the walkway. You know that there is an increased risk of falling by taking that path; both your experience and a warning sign confirm that fact. However, your bias tells you that if you’re careful and make adjustments, you can walk through that area without slipping. Optimism trumps objectivity. There is a lot to learn about optimism bias, and it appears that simply understanding it is not enough to overcome it. But it is a start.