In 2013, the Japanese pharmaceutical industry was rocked when a whistle-blower exposed the falsification of trial data for Diovan, the blockbuster blood pressure drug manufactured by pharma giant Novartis. The ensuing investigation led to the arrest of one man, Nobuo Shirahashi, who was accused of manipulating data to skew clinical research published by two Japanese universities.
It was a scandal that sent shockwaves through the industry, with implications felt globally, as Novartis and the Japanese pharmaceutical industry scrambled to recover. But after an investigation that laid the blame solely with one man and raised more questions than it answered, what can we, as an industry, honestly say we've learnt five years on? Have the relevant industry rules been changed? What measures have been taken, across the broader pharmaceutical collective, to ensure this can never happen again?
One industry notably adept at learning from its mistakes is aviation. History offers numerous examples of terrible accidents and disasters associated with air travel, but one thing they tend to have in common is that they only happen once. Airlines go back and conduct thorough investigations, examining everything that happened in the time leading up to the disaster, pulling no punches and making no attempt to save face. They ask the difficult questions, taking evidence from every possible source: what was missed? Was there human error? Was there a malfunction? They examine the evidence until they can say, with all the confidence of a systematic and exhaustive enquiry, why the accident occurred and what can be done to prevent the same thing from happening again. Blame is not a primary concern until a clear direction can be determined and the necessary preventative measures put into place.
In contrast, the first question Novartis appeared to ask was who was responsible, seeking a scapegoat to carry the blame as far from the company as possible. Novartis determined that Shirahashi acted alone, admitting no knowledge of his deceit. It was a conclusion meant to comfort the many people its drugs treat, as well as the industry as a whole, but surely Novartis' hierarchy, like that of any business, dictates some form of executive accountability. If they didn't know, they should have known. Rather than accepting the narrative of 'one rogue employee', classic bad-apple thinking, we might instead question the broader culture of the organization. If anything, Novartis' failure to conduct a proper investigation and learn from its mistakes can only hamper its future success.
A collective, inquisitive mindset is essential for any firm to improve, yet our societal mindset seems hell-bent on blaming someone, anyone, as long as it's not us. It's easier and neater to solve the problem by holding someone else responsible and moving on. Yet blame is deeply counterproductive, not just in business but in everyday life, and it seriously inhibits our ability to take crises and failures for what they are: an opportunity to do better.
In his book 'Black Box Thinking', Matthew Syed recounts the story of United Airlines Flight 173, which crash-landed in December 1978. Coming in to land, the pilot and his crew became convinced that the landing gear had not locked into place and spent so long trying to fix the problem that they ran out of fuel. The passenger jet came down in a nearby residential area, tragically killing eight people on board but, thanks to the skill of the pilot, amazingly killing or injuring no one on the ground. When United Airlines conducted its investigation, it found that the flight engineer had not been assertive enough in pointing out that fuel was running dangerously low, while the pilot, overwhelmed and obsessively trying to fix the landing gear problem, lost focus as time literally ran out. It was a tragedy, but thanks to the ability of the pilot and crew, it was not a greater one. Since then, protocols have been put into place and training methods changed to ensure nothing of its nature would ever happen again. And guess what: it hasn't.
Novartis, on the other hand, cracked down on its entire Japanese division: issuing sweeping pay cuts of about 30% to all executives, replacing some of Novartis Pharma KK's senior management, implementing a series of corrective measures such as remedial training for staff, and commissioning a third-party review of its test protocols. One man might have been held responsible, but the whole Japanese arm felt the repercussions. Only time will tell whether these measures really addressed the problem. By holding one person solely responsible, however, Novartis has fed the age-old culture of stigmatising failure: the real culprit running through history's most memorable disasters.
In January 1986, Allan McDonald, an engineer with Morton Thiokol, was unable to stop the launch of the Challenger, despite repeatedly advising NASA that it was too cold. When McDonald and his team could not prove the mission would fail, NASA refused to postpone the launch for a third time; the space shuttle exploded 73 seconds after launch, killing all seven people on board. We can speculate about why NASA was so reluctant to postpone the mission: arrogance, money, or perhaps it had simply been too successful. But it all boils down to a big corporate organization just not wanting to fail. There is still a huge stigma attached to failure, and despite many hugely successful figures throughout history painting failure as something necessary, even a blessing, to fail still conjures feelings of shame and embarrassment. Feelings so off-putting that NASA would rather send a space shuttle into space without the proper safety assurances than admit it might have a problem.
Cognitive psychologist Gary Klein, author of 'Seeing What Others Don't: The Remarkable Ways We Gain Insights', has spent the last few years trying to solve this problem. In his book he outlines a technique he has dubbed the 'pre-mortem': a process designed to highlight what might go wrong before it goes wrong. Klein takes a group of people, all involved in the same project, and tells them he has seen the future and their project has failed. The group is then given two minutes to write down as many reasons as they can for why it failed. This he calls 'prospective hindsight'. Each participant's list is then combined into a catalogue of potential problems, and measures can be put into place to prevent them without their ever having to occur. Finally, Klein asks each member of the group to suggest one thing they would do to make the project more likely to succeed.
The technique identifies many more potential issues than simply asking people 'do you think the project will succeed?', for a number of reasons. First, the group aspect: people are far more likely to feel comfortable expressing 'negative' opinions when they are amongst peers, all doing the same thing. Second, the question is phrased in a 'show me how smart you are' way, encouraging participants to dig deep into the complexities of the problems that might arise; the process also bypasses any anti-negativity culture rooted in the company they work for. Klein makes sure to finish the exercise positively, so that all the participants leave with an optimistic perspective, both of themselves and of the project's chances of success. The idea can be seen as reaping the benefits of failure without actually having to fail: using feedback, uninhibited by fear, a lack of confidence or company culture, to grow and improve.
And here Novartis has waved goodbye to yet another golden opportunity to come out of the crisis stronger, by firing the very executives who might have held the answers. Their feedback might have proved invaluable in the introspection required to move forward, especially given the language and cultural barriers and the nuances the Japanese market presents. Only by accepting and embracing failure, with all the positives and negatives that come with it, will Novartis, and the wider industry as a whole, be able to tap into the creativity and tenacity required to come out better than they started. But the question remains: are they willing to do it? Are they willing to stop papering over the issues and confront them head-on?
It's a tough call. Cognitive dissonance theory describes the uncomfortable state of mind we suffer when holding two conflicting, contradictory beliefs, and this crisis has created exactly that. We have been confronted with evidence that challenges the very core of Japan's business ethos, and naturally we all want to reframe that evidence to reinforce our own beliefs.
It is in situations like these that we must all take a long, hard look at ourselves and ask the difficult questions, remembering that failure is our friend. Nobody likes to fail, but failure is only as lasting as we choose to make it. Failing is temporary, and one failure, mistake, disappointment or disaster does not define us as failures. What defines us is how we react to those difficult times, and that is far more telling of our characters than how we react to our successes. The only real failure is not seizing the opportunity to learn and improve, changing the way we do things for the better.
With all this in mind, can we honestly say the industry has asked the right questions? And if it has, has Novartis answered them?