In 2013, a Japanese whistle-blower exposed the falsification of trial data for Diovan, Novartis’ blockbuster blood pressure drug. The ensuing investigation led to the arrest of one man, Nobuo Shirahashi, who was accused of manipulating the Diovan data and skewing clinical research published by two Japanese universities (Japan Times, 2014).
The scandal sent shock waves through the industry. It raised questions about which industry rules would change, what measures would be taken, and whether the broader pharmaceutical industry would take the action necessary to ensure this never happened again. One industry that has learned from crisis is aviation, to the point where flying is extremely safe. After a disaster, airlines go back and examine what happened in the moments before it. They ask: What was missed? What was the mind-set of the pilots? Most importantly, they ask why. Blame is deferred until clear answers emerge and investigators can determine what needs to change for the future. In contrast, the first questions Novartis appears to have asked were who was responsible and who was to blame, rather than why the event occurred.
Novartis determined that Shirahashi acted alone and claims no knowledge of his deceit. This may be a case of “bad-apple thinking”: it was one rogue employee, and his actions were not indicative of the entire organization. Whether Shirahashi acted alone or was the product of a broader culture is unknown. What is clear is that a failure to learn from mistakes is a huge obstacle to an organization’s success, and an inquisitive mind-set is essential to any firm’s ability to improve. Yet society tends to want to blame someone; it is easier and neater to place blame, declare the problem solved, and move on. Blame, however, is counterproductive and inhibits learning from a crisis.
In the book Black Box Thinking, Matthew Syed (2015) tells the story of United Airlines Flight 173 in December 1978, when the flight crew, coming in to land, became convinced that the landing gear had not locked into place. They spent so long trying to fix the problem that they ran out of fuel and had to crash-land in a residential area. Thanks to the skill of the pilot, and a lot of luck, no one was killed on the ground, and only eight people on the plane lost their lives. The investigators learned that the flight engineer had not been assertive enough in pointing out that fuel was running low. The pilot, obsessively trying to solve the landing-gear problem, lost track of the fuel situation, and time literally ran out. After this crash, protocols were put in place, training methods were changed, and nothing quite like it has happened since (Berkmann, 2015).
What did Novartis do? In order to restore faith in the integrity of its quality assurance and control, Novartis cracked down on its Japanese division. It issued a sweeping pay cut to all executives, around 30 percent, and replaced some of Novartis Pharma KK’s senior management, bringing in senior executives from headquarters or other subsidiaries. Novartis also instituted a series of corrective measures, such as remedial training for staff, and commissioned a third-party review of its testing protocols.
Did these measures really address the problem? Only time will tell. By blaming one person, however, Novartis may find it difficult to learn the necessary lessons. Has it lost a golden opportunity to address its culture by firing executives who may in fact have the answers? Feedback is at the very heart of improvement and growth, yet feedback often taps into our fears. How we embrace or ignore feedback determines whether we learn and improve; indeed, Syed (2015) argues that this need for feedback is a deep human necessity.
It is difficult to believe that executives brought in from the outside will be able to ignite the necessary changes, given Japan’s language and cultural barriers. Only by accepting failure will the industry be able to tap into new reserves of creativity and tenacity. Pharmaceutical companies have been confronted with evidence that challenges the core of Japan’s organizations, and it is natural to want to reframe the evidence rather than alter one’s beliefs. Cognitive dissonance theory suggests that people seek consistency between their expectations and their reality (“Cognitive dissonance,” Wikipedia); organizations and individuals alike will reframe a crisis to reinforce their existing beliefs. Has the industry failed to ask the tough questions?
Failure should be our friend. No one likes to fail, but we should know that failure is only temporary and that one failure should not define an individual or an organization. The only real failure would be not seizing the opportunity to learn and to improve protocols and standard operating procedures.
References

Berkmann, Marcus. (2015, October). The Daily Mail.

Syed, Matthew. (2015). Black Box Thinking: Why Most People Never Learn from Their Mistakes–But Some Don’t.

Cognitive dissonance. In Wikipedia. Retrieved October 13, 2016, from http://en.wikipedia.org/wiki/cognitive_dissonance