The Chief Psychology Officer

Ep35 Martin Bromiley OBE: Overcoming Blame Culture

Dr Amanda Potter CPsychol Season 2 Episode 35


Creating a "Just culture" that has Psychological Safety at its core is the life work of Martin Bromiley OBE since personal tragedy hit his family over 17 years ago. Martin is the founder and Chair, and now Trustee of the Clinical Human Factors Group (CHFG). It is the mission of CHFG to show how a better understanding of the role of human factors can have a significant impact on safety, quality and productivity in healthcare and beyond. It is his aim to to stimulate dialogue and demonstrate through concrete actions, how clinical and high risk organisations can improve their psychological safety, improve accuracy, overcome risk and remove blame culture. In this episode, Martin and Amanda are being interviewed by Tim Hepworth.

The Chief Psychology Officer episodes are available here https://www.thecpo.co.uk/
To follow Zircon on LinkedIn and to be first to hear about podcasts, publications and news, please like and follow us: https://www.linkedin.com/company/zircon-consulting-ltd/
https://www.linkedin.com/company/clinical-human-factors-group/
To access the research white papers mentioned in this and other podcasts, please go to: https://zircon-mc.co.uk/zircon-white-papers.php

Timestamps

Overcoming Blame Culture

·       00:00 – Introduction to Overcoming Blame Culture

·       00:27 – Martin Bromiley

·       01:24 – I want to tell you a story…

·       04:30 – A generation gap

·       05:36 – I’m stressed, let me concentrate!

·       06:33 – Take a step back

·       07:25 – Need an expert opinion?

The Human element

·       08:58 – Clinical Human Factors Group

·       10:14 – Made our mark

·       11:57 – Rising from the ashes

·       12:42 – Respect (Just) cultures

·       13:35 – Seven small errors sow the seeds of disaster

·       15:01 – Is it negligence or inadvertent human error?

·       16:14 – Growth Mindset

·       17:02 – It comes from a good place

To err is human…

·       18:16 – Jeremy Hunt's blunder (got his last name right)

·       19:28 – What does the boss say?

·       20:30 – What can we do to improve Psychological Safety?

·       21:47 – Accountability

·       22:56 – Error is normal; accept it

·       24:34 – The Antithesis to Blame Culture

·       25:49 – You can’t get away with this

·       26:41 – Fight or Flight

·       27:58 – The crowd has spoken

…to forgive is divine.

·       29:23 – All natural or reactionary?

·       30:42 – The impact on control

·       32:39 – Speak up

·       35:09 – Every player is critical

·       36:26 – Neuroscience!!!

·       38:00 – Guidance

·       41:37 – Summary

·       42:47 – The end.


For more information about the BeTalent suite of tools and platform please contact: TheCPO@zircon-mc.co.uk

SPEAKER_00:

Hello and welcome to this episode of the Chief Psychology Officer with Dr. Amanda Potter, Chartered Psychologist and CEO of Zircon. I'm Tim Hepworth, and today we have an esteemed guest, Martin Bromiley OBE, and we'll be looking at how we can help individuals and teams overcome a blame culture. So Martin, could you just give us a brief introduction and tell us a bit about yourself?

SPEAKER_01:

Yeah, thanks very much, Tim. So I'm an airline pilot. I fly for a major UK airline. And in that job, I'm a training captain, which means I get to train our new pilots. I get to train our existing pilots and I get to train our new commanders. So that's what I do day to day. Outside of work, I fly as a passion. I fly aerobatics. But perhaps the more relevant reason why I'm here today is that in 2007, I set up a charity called the Clinical Human Factors Group, which exists to promote the science of human factors to healthcare. And as no doubt we'll get onto, there's a bit of a story about why I do that voluntary role.

SPEAKER_02:

It's lovely to meet you, Martin. I'd heard about you before we invited you on the podcast, and one of my colleagues, Sharon, suggested that we interview you because your story was told in one of Matthew Syed's books, Black Box Thinking. So I feel very privileged to have met you. Thank you.

SPEAKER_01:

Thank you, Amanda. That's very kind.

SPEAKER_00:

So Martin, you mentioned the Clinical Human Factors Group. As you mentioned, there is a story behind that. Would you mind expanding upon that and telling us how you came to actually create this wonderful charity?

SPEAKER_01:

Yeah, so it all goes back to 2005. At the time, I'm married, two young children, and life's pretty good. But my then wife, Elaine, has to go into hospital for routine surgery. Unfortunately, during the surgery, problems occurred. She was anaesthetised, and straight after she was anaesthetised, they had problems getting oxygen to her. What in effect happened was that the situation deteriorated during this emergency. Unfortunately the emergency wasn't perhaps handled as well as it could have been, and sadly Elaine ended up unconscious. She was transferred to intensive care but died 13 days after the original attempted operation, having never regained consciousness. During the time that she was in intensive care, I was just trying to make sense of what was going on and do the best I could for her and the children. But after her passing, I had some conversations with the head of the intensive care unit, just trying to understand what had happened, and he explained that it was just one of those things; they wouldn't investigate, because these things kind of happen. And I was thinking, hang on a minute: with my background in aviation, normally when something goes wrong we try to investigate, not to blame anybody but just to learn, and surely there would be something that could be learned from this. In the end there was an independent review of Elaine's care. And what we learned was that there was an experienced team looking after Elaine, but that when those problems occurred, the team became fixated on a particular solution. They lost awareness of the big picture, of what was going on around them and what the bigger problem was. And what was really significant is that I said the team kind of lost the big picture. That's not quite true. It was the doctors who lost the big picture and became fixated on a particular solution. But it was the very junior members of the team who could see what was happening and what was going wrong. And they tried to speak up, but were unsuccessful in getting the message across. As I looked at the output from the inquest and from the independent review, it was very clear to me that these are exactly the things that we train for and work to avoid, trap or mitigate in my world of aviation. And so when I started talking to doctors, after the report had come out about what had happened to Elaine, there was a general lack of understanding of the science of human factors, as we call it, which is absolutely a fundamental part of what we do in aviation.

SPEAKER_02:

I remember reading about your story before we met, and just so we're clear, this is in Matthew Syed's Black Box Thinking book, and it was a harrowing read, I must say, because of all of the things that you've just said, where the more junior staff were clear about the errors that were being made. But also the perception of time, that was another point that I heard about: that perception of time changing, where the doctors, when interviewed afterwards, thought that it had been a mere matter of a few minutes, but actually it had been a lot longer that they'd been struggling with this single focus, this single issue. And what was happening was this single-minded focus where, like you say, they lost the big picture and they were unable to look at alternatives and other solutions because they were trying to fix this one thing. And it struck me, and I think one of the things you said to me when we met last time, is that people stopped listening. So the more junior colleagues couldn't get heard, even though they were trying desperately to get heard, because they could see the mistakes being made and things going wrong.

SPEAKER_01:

Yeah, and we know that it's absolutely normal under stress that we become fixated. We see a problem and we want to go and fix that problem. And fixation is actually a good thing to have, because it allows us to focus on the biggest thing. So if you're being chased by a sabre-toothed tiger in prehistoric times, you want to focus on the one thing; you're not interested in anything else around you. And that's a very simple example of a threat and a solution. But in the modern world, whether we're talking about healthcare, aviation or just the sort of businesses that you work in, there are very, very few simple problems. The chances are very high that the problem you're working on, you might be able to fix it, but there are unintended consequences of that solution, and there might be many other solutions. And this is something we're very familiar with in aviation, because we spend a lot of time in simulations and we experience this sort of thing. But in healthcare, certainly in 2005, that sort of simulation was extremely rare; it's much more common now, I'm pleased to say. And in a complex world we have to be able to take a step back and try and understand everything that's going on and how things are interacting before we leap into some form of action. And even if we do leap in, the thing that we have to help us, in most organisations, is a team of people, whether they are a direct team or whether they are departments or people who might be involved, who can give a perspective. And that external perspective, even from somebody who perhaps you might not regard as being as expert as you, can be very, very valuable, because they have a clarity of vision that might be able to offer some different and powerful ideas. And the importance of taking a moment to listen and to understand exactly what's being suggested, before you continue diving into your issue, is really critical in my view.

SPEAKER_02:

That's such a good point you make. We often turn to experts or to people who we believe are credible. So if we trust them for their credibility, we are likely to listen to them because of that expertise, because of that credibility. But it's often the people who are coming from a very different perspective, who may not be experts, who have a very different lens, that we should most definitely listen to, because they will challenge our thinking.

SPEAKER_01:

That's especially true, Amanda, I think, not just in a very complex world, but a fast-moving world. And that's exactly what we're seeing now. Just as a very quick example, a healthcare organisation near me has merged three sites. They're currently issuing orders to the staff about how things will be different, and they're saying to the staff, from now on, this is the new policy for this. But the staff who are actually doing the work, the experts, haven't had an input into this. Every time they issue another order and a directive and a policy, the staff come back saying, that's not going to work, have you thought about this? And it creates this real tension, where the leaders think they're being challenged and almost, you know, having the mickey taken out of them, and the poor staff at the front line are saying, we're just trying to get the job done. As the world gets more complex, as we start to enter new ventures in whatever sphere we're in, the ability to listen to the people who are perhaps not expert, but are expert in their own world or expert in their perspective, becomes ever more important in business.

SPEAKER_02:

And so what is the role of your organisation now, the Clinical Human Factors Group?

SPEAKER_01:

So the Clinical Human Factors Group is a charity. We exist purely to promote an understanding of the science of what I've been talking about: the science of safety and the science of human factors and ergonomics. And just to simplify, because I've used the term human factors a lot: human factors and ergonomics is the science, and it's really a collection of sciences, and there are two sides to it. One is human factors engineering. This is engineering systems, designing systems, that make it easy to get it right. So systems could be the physical architecture of a building, having two departments that have to work together next door to each other; it could be the design of IT systems; it could be the design of processes and procedures as well, which often doesn't require an engineering degree, but it's still about how you design the system to make it easy to get it right. The other science is much more perhaps your side of the science, which is the psychology side: how can we help people behave and perform in a certain way, and think in a certain way. So the science has two sides, and we try and promote an understanding of both of those.

SPEAKER_00:

Martin, what do you think your biggest achievement has been since focusing on human factors? Where has it made the most impact?

SPEAKER_01:

Probably the biggest success is that now we can see that human factors, and indeed safety itself, is a major part of policy documents in the health service. And you might say policy doesn't change anything, but you have to start somewhere in a big bureaucratic organisation. Human factors is a big part of the thinking of senior clinicians when they're talking about making changes in practice. We have actually changed practice in many cases, and we've seen some really good stuff come out of that. We've taken human factors science all the way to the very top, all the way to the policy and political level, including some systemic changes, like the setting up of the Healthcare Safety Investigation Branch, which is a body equivalent to the air accident investigators that now works in healthcare. That's been set up under primary legislation now, and that was quite a bit of work to achieve. We've certainly inspired clinicians, and I know we have directly saved lives. But actually what's been the surprising thing is that the work the charity has done, although it's had this influence in the National Health Service, and I should say in other countries as well in their healthcare services, some of the stuff we've produced has been used by firefighters in Alaska; it's been used by nuclear power plant operatives in South Carolina; explosive ordnance disposal, bomb disposal, teams used some of our stuff pre-deployment, prior to Afghanistan and Iraq; and we've seen Australian military helicopter engineers using some of our material as well. And that's because the science around making it easy to get it right is not domain specific. It's about the human. It's about psychology, for example, and good engineering.

SPEAKER_02:

That's a complete goosebump moment for me, actually, just understanding the impact that you've had. And whilst the story has such a tragic start, you've made such a significant difference in so many people's lives. I'd heard, through reading about you, that you'd been getting so many letters from doctors around the world saying that the work you had done had made a difference to their practice and, as a result, had saved lives. Phenomenal.

SPEAKER_00:

It is an amazing list of achievements. Normally when you ask a question like that, somebody will come up with one thing that stood out, but that list is absolutely phenomenal, culminating in the actual saving of lives. What could be better?

SPEAKER_02:

I completely agree. And one of the things that we talked about when we were preparing for this podcast, Martin, is culture: the impact of all of this, of the science of clinical human factors, on the culture of the NHS and other clinical and health organisations, but more broadly too. And you talked about a just culture. Would you mind just articulating what a just culture is, please?

SPEAKER_01:

A just culture is really about providing justice, if you like, for those within the culture. So if I just take you back: we all make mistakes, but we have this idea that when somebody gets expert, when somebody gets experienced, they no longer make mistakes; that if they're well trained, they won't make mistakes. We know that's not true in aviation. Typically, and there's some good data around this in both aviation and surgery, funnily enough, and I suspect in other places too, when you have an accident or an incident, you tend not to see one big mistake causing one big accident. It's normally a small number of errors, what I call micro errors; it's about seven on average, actually, we found in aviation and surgery. Seven small errors lined up can deliver a disaster of some sort. So a just organisation, when something goes wrong, has a look and tries to understand the nature of what happened, and it tries to look at the system, not the individual, to understand why it made sense at the time. So when something has gone wrong, and think about this in any organisation, whether it's Zircon or whether it's one of your clients, just take a moment to think: right, okay, was this inadvertent human error? Was the person doing their best? And if it was inadvertent human error, then I would argue that that person probably needs support. Once you've rectified whatever the issue is, that person then probably needs help to learn from what happened, and you as an organisation need to learn from what has happened. Don't get me wrong: if that individual behaved in a way which really was grossly negligent, or even in some way deliberate, in other words some form of sabotage, that's a completely different thing altogether, and at that point you're going to get rid of the person or you're going to discipline them or whatever. So there's a continuum, a line if you like, from inadvertent human error, freely admitted, on the one side, to gross negligence on the other. And whenever anything goes wrong in an organisation, the managers have to decide where to draw the line between those two things in that particular case. Now, I would suggest that of the errors that happen in businesses today, 99% of them are actually inadvertent human error, and we should be learning from them. There are very, very few cases where it genuinely is gross negligence or where it genuinely is some form of sabotage. So once you've decided that, hey, we're going to treat inadvertent human error with compassion, with support, with learning, with investigation, then you can start to learn and to redesign your systems. Now, it might be that the systems include training, for the individuals involved but also for other people, because the question is: if it had been somebody else, could the same error have happened? And if the answer is yes, then you need to change the system. It could be subtle changes in processes; it could be redesign of IT systems. But a just culture enables you to learn, basically, because it says to people: you know what, if you make a mistake, own up to it, be honest, report it, and let's learn from it and see how we can move forward. And you mentioned Matthew Syed, and this feeds directly into a growth mindset, because we talk about growth mindset as if it's an individual that has a growth mindset. And sure enough, an individual can have a growth mindset, but we need to have an organisation with a growth mindset.
And that requires an organisation that can look at something and say: you know what, let's suspend judgment. Let's just try and learn from this. As an organisation, we can get better if we learn. And hey, staff, it's okay if you make mistakes. Obviously, try not to, but we want to learn from you when mistakes happen. We want to support you, and we want to really change how we do business in the future. So that's a just culture: it makes it okay to make mistakes inadvertently, as long as you admit them and report them, and we won't sanction you. But we retain the right to sanction.

SPEAKER_00:

So Amanda, have you ever heard of that term before? It is a new one on me, just culture.

SPEAKER_02:

I hadn't. And now that I've done the research, I do understand the concept, and of course Martin has very eloquently described it for us. It just strikes me, the relationship between just culture and psychological safety. Martin is talking about systems and processes, and with psychological safety we're talking about people and climate. So it's the difference between culture and climate: just culture is very much about systems, processes and how people need to interact and operate as a team in order to create a safe environment, whereas the climate is how people feel. There is a striking overlap between just culture and psychological safety for me, which is very interesting indeed. But stepping back and reflecting on the two, both of them come from a good place. In most situations, the intention is good. And most times, I think, with poor psychological safety in teams, the behaviour comes from a good place; unfortunately, the impact of that behaviour, and the outcome, is often negative. For example, wanting to please, deferring to leadership, being cautious, and so on: all of those things come from a good place, the intention is good, but the impact can be negative. So lots of similarities.

SPEAKER_01:

When an organisation talks about psychological safety and says we're going to be psychologically safe, they also need to think about their policies around just culture, around error. And that's always the first thing I say: when something goes wrong, how are you going to deal with that? There was a story told to me, and I won't give too many details, but it was a major NHS trust that had had a major series of disasters. In the end, it ended up with the government intervening significantly, and they parachuted in a new chief exec. And the chief exec started talking to the board about just culture, the importance of just culture, and how it might support psychological safety, even though at the time I don't think that term was in common use. And the board sat around and nodded and said, yes, we totally agree. And then the next item on the agenda was a particular error that had occurred in one particular department. And when it was talked about, the first thing that a board member said was: who was to blame? And the moment that comment came out, the chief exec said: no, no, that's exactly what I don't want. We're not interested in who's to blame. We're interested in what is to blame. How was that possible? Why did it make sense at the time? So as an organisation, you can't have psychological safety, in my view, unless you have a just culture. And a just culture isn't just about policies at the top. It's about your immediate boss. So, you know, you're working in an organisation, doing whatever job, and you make a silly mistake. What does your boss say? Does your boss say, what on earth were you thinking when you did that? Or does your boss say, okay, that's interesting, just talk me through how that happened and what we might learn from it? That is the way you start to build that psychological safety, because it's about trusting, trusting how somebody is going to respond to you when you tell them. And I think one of the biggest things that I see, certainly in aviation and particularly in some senior doctors and senior nurses, is a real sense of humility. So I'll be sitting with a colleague on the flight deck, for example, and my colleague will make one of these micro errors, and one of the first things I'll usually say is: don't worry, I've done that myself. Which is inevitably true, but it's so important to say: you know what, that happened, so great, well done, we caught it; what can we take away from that? And that humility matters, because we all make mistakes, even the very best.

SPEAKER_02:

There are two points that you've made which are great, Martin, and I would love to come back to one of them in a moment, which is around blame culture, because I feel like that is the opposite of a just culture. But another point that you've made is one that our clients really struggle with. Once we've conducted our psychological safety audits, identified some of the risks within the organisation, and started to understand from a behavioural perspective what changes they can make to help improve the feeling of psychological safety within teams, one of the things they often say to us is: what are some tangible things that we can do as an organisation to create a greater sense of psychological safety? And for me, it's really helping to understand that a just culture is the answer, because it is around policy. So having clearer policies, clearer guidelines around error reporting, around mistakes, and actually celebrating the fact that people are stepping up and stepping forward to say that something has gone wrong, and almost sharing the learning. If we could make it around policy, I think that could be one of the actions coming out of a psychological safety behavioural audit that would help with the culture change that's required to create that momentum that's so necessary.

SPEAKER_00:

In previous discussions that we've had about these things, whenever the word blame comes up, you always counter it with the concept of accountability rather than blame.

SPEAKER_01:

You are still accountable. That doesn't change, but it's how you enforce that accountability, I suppose. Now, obviously, some errors become so serious that the accountability even goes outside the organisation and ends up in matters of law. But as a general rule, it's holding up your hand and saying, yes, I was responsible for that error, and then the organisation handling that in a way that is appropriate. As I say, it's very, very rare in an organisation that one error on its own delivers disaster. Inevitably, there's a whole host of them. So you have a whole set of moments, if you like, to catch the errors while they are small, before they create the big disaster. So individual tiny micro errors on their own are not the big deal, if you like, but catching them is. And once you can be open about error, then you have a much greater chance of catching them. And this is the kind of paradox, I suppose, about aviation, because we have an incredible safety record. If you look at 1960, in commercial aircraft there were 30 accidents per million departures. If you look at the stats now, it's 0.3 accidents per million departures, with the biggest improvement actually coming over the last 20 years. And that's really hard data. So it works, but it works by accepting that error is normal. It works by learning from it. And it also works by having multiple lines of defence, because we've learnt about error. It's recognising that we can stop the big error by catching the small errors, not by trying to force people to be perfect. Because if you fire people who aren't perfect, you're not going to have any staff left in any organisation. There's been a real kind of recognition that this stuff works.

SPEAKER_02:

One of the things from our research is that if we manage to drive personal accountability and ownership for outcomes within teams or within organisations, plus we create an environment where people are able to speak up, to challenge and to admit mistakes, that's when we see our clients having the highest level of performance. Because, to Martin's point, they will spot the small errors and they will stem the tide before the bigger errors really start to take hold. And what happens is you see those organisations slowing down the process of problem solving, and therefore recognising and understanding mistakes before they actually happen, and learning from one another. So it's truly important that we create that sense of ownership and accountability, so that everybody who is in that team has the responsibility, has the accountability, for outcomes.

SPEAKER_00:

You mentioned a little earlier that you were going to ask Martin about this, so: do you see a just culture as being the opposite of a blame culture?

SPEAKER_01:

That's an interesting one. So in aviation, and now in healthcare, what we talk about is no-blame investigation. A no-blame investigation is when you forensically investigate a disaster, if you like, where you're just trying to identify what happened and the learning, but you are not holding anybody to account; you are not identifying who might be to blame. You produce a report, and that is shared openly, and that allows organisations and individuals to learn. Now, as a result of that report, an organisation might then decide to take a route that holds somebody to account, or the law might take a route that holds somebody to account. But the no-blame approach is designed just to learn. You then have to have an organisational just culture that manages, you know, when things have genuinely been done inadvertently versus when things have been grossly negligent, because you can't just have no blame in an organisation; it means people get away with really bad things sometimes, and it actually cheapens the profession. I would not want a no-blame approach in aviation, and I would not want it in healthcare, because it would say: you can be pretty rubbish, grossly negligent, but we don't care. And that's absolutely the wrong thing. So yes, the opposite of a blame culture is a no-blame culture, but no blame is only about learning when you investigate. A just culture is an appropriate way of conducting yourself as an organisation and as a profession. It allows for the fact that people make inadvertent errors. It accepts the fact that that usually results in learning about the system, but on the rare occasion when we do have to take action against an individual, we have the ability to do that.

SPEAKER_00:

Would you agree with that, Amanda? Is that how you would characterise a blame culture?

SPEAKER_02:

I think we really need to be clear about what a blame culture is. When I was doing the research, what I noticed is that it's really related to the concepts of accountability and ownership. When we get into a situation of people wanting to blame one another, of course, we are likely to see that they're in an environment where they don't feel psychologically safe. So they are having that fight or flight experience: they're experiencing high levels of stress and anxiety, worrying about the situation, their role and their place in that team. And what happens when they are having that fight or flight response, and experiencing feelings of anxiety as a result of the release of those neuromodulators, neurotransmitters and hormones, is that they'll be releasing cortisol and adrenaline, and as a result they will feel like they need to avoid. It's that desire and need to avoid that results in them wanting to swerve the blame and not admit to mistakes, which creates the feeling of blame and of wanting to point to someone else. So there is a very strong relationship, I think, Tim, between blame, accountability, anxiety, resilience, and psychological safety.

SPEAKER_00:

Martin, do you see organisations just lunging headfirst into having a blame culture? Is a blame culture the natural state of affairs, something you've got to work your way out of? Or is it something that some companies do and some companies don't, depending on the character of the upper management? In your experience, what are the things that lead to a company ending up with a blame culture?

SPEAKER_01:

You know, earlier on Amanda talked about how it's often driven by people just trying to do their best, and I think that's true. There's an element also of saying, perhaps slightly cynically, that if you want to see a blame culture, just look at social media when something goes wrong. When a footballer misses a penalty, what is the response of the crowd? The response of the crowd is usually to blame that individual, and shout their name, and call them names, and all those awful things that happen, just as one example. So there is an element of burdening an individual with blame sometimes. And I think, you know, when you're leading an organisation, or you're part of an organisation, and you're working on a project and somebody makes a mistake which results in a problem, it would be so easy to say: oh, that idiot, why did they do that? They've just cost us money; they've cost us energy and all that work we put in. It's almost an easy thing to do. The harder thing is to take a step back and say: hang on a minute, I've made mistakes, I'm human. They made a mistake today; what can we learn from it? I don't know if blame is the default for human beings or not. I think sometimes we like to defend ourselves. And so you have to ask yourself, in an organisation, when something has gone wrong, do people become very defensive? Do they try to duck the issue? Because that starts to suggest fear, and fear, if you like, is something that suggests you have a blame culture. A colleague of mine who speaks very passionately about this topic was asked a question on another podcast about why it is that NHS managers do this, do that, do the other. It was a very long question, and her response was: fear. That was her one-word answer. And we can't afford to have an organisation that has fear. The other thing, by the way, in case you want a definition of a blame culture: for me, a blame culture is one where people avoid owning up, and where the organisation tries to make people feel bad when they make a mistake.

SPEAKER_02:

I think fear is a very strong point that you've made there. And also that point that if someone is trying to make someone else feel a certain way, in particular to feel bad, that would most definitely be an indication of a blame culture. I also wanted to ask you about the risk to psychological safety around control. When we see a very top-down culture, where there's a very high command and control approach, and that's often related to organisations that do have safety requirements, safety risks, what's the impact of having a very high-control or deference-to-leadership culture on the way in which people interact, and potentially on the risk of a blame culture, in your mind, Martin?

SPEAKER_01:

I think the experience of safety-critical industries is very telling when you have that kind of strong hierarchy. We talk about a hierarchical gradient, basically: when it's a very steep gradient, you see more accidents. I don't think there's any doubt about that. Now, you can relate that to business. You might work in a business where accidents are extremely rare; what you're more worried about is profit and loss. I would guess that you're going to see more loss as a result of a strong hierarchy. The only time you can probably get away with it is when the leadership are absolutely clear and know that they are completely right all the time, and I can't think of too many cases in the modern, changing, complex world we live in now where that is the case. There will be some things that happen that are relatively simple, that as an experienced leader you've seen time and time again, a bit like a technical issue. If you're talking about how to fix the plumbing system in a household, say, and you're an experienced plumber, you probably know a lot more than people who aren't, and can probably go straight in with a fix. That's a fairly simple technical thing, if you like. But most of the time in the modern world, as I say, it's very complex. And for me, when you look at a lot of the times that big disasters happen, you're talking about a lot of complex factors interacting in ways that perhaps weren't expected, and you identify a need for people to understand who didn't understand, just as happened in my late wife's case. So having that very steep hierarchical gradient isn't appropriate a lot of the time, especially when things are unusual or different. And even when they aren't unusual and aren't different, having that humility as a leader, that confident humility, as the American psychologist Adam Grant calls it, is really important. That ability to say: look, we need to work our way through this carefully; I'm not sure I fully understand; so team, just give me your thoughts, okay? That's confident, but it's still seeking a lot of input. It's saying: I need to hear from you, because you may have some good ideas. And you know, what happened in my late wife's case, when the junior staff didn't speak up, or tried to speak up but were unsuccessful, wasn't necessarily because of the immediate culture in that room. It was probably because of years of working with the culture and hierarchy in healthcare, and with some of the individuals. How people behave in a crisis is exactly how they've behaved in normal times. So you can't suddenly expect people to speak up when things are going wrong if they won't speak up the rest of the time. And we have a lot of debate in healthcare now about how to train people to speak up and the importance of speaking up. And actually, I say: you know what, I'm not interested. We want leaders who listen up. We want leaders who want people to talk, who want to listen because they genuinely value their perspective. And, you know, when I'm flying, for example, we might face some form of abnormal situation, maybe a minor emergency or something like that. And as a leader, what I'm training my commanders to do, and what I would do myself, is this: the first thing I'll do is have an idea in my head about how to deal with it, from the experience I've got, but I'm not going to share it. What I'm going to do is turn to the team and say: any thoughts about this? Have you seen this before? How would you deal with it? What would you do next?
Now, it might be that they come up with an idea that's the same as mine, and most of the time, hopefully, they will do. But even then, they may have some nuanced thoughts, which means that I can take my idea and make it even better; I can optimise my decision, so it's not just an okay decision but actually a really good decision. And sometimes they'll see something that I don't. They'll say: yeah, Martin, but when you look at this, X, Y and Z. And I'm thinking: hang on, it's not X, Y and Z, it's A, B and C; what are they seeing? And then I realise they've seen something that I haven't, and I've got the wrong idea. So it can actually help change your perspective. And that's absolutely what we need in the future in business, mostly.

SPEAKER_02:

I completely agree. And I was just reflecting, as you were talking, on what a high-control, defer-to-leadership culture would mean for me. It would mean frustration; I'd feel a bit helpless; I wouldn't feel like I had anything worth saying or worth listening to; I wouldn't feel like my skills were being utilised; I would probably actively withdraw; and I wouldn't feel like I needed to try quite so hard. So I can see very clearly some of the feelings and emotions I would experience.

SPEAKER_01:

Yeah, and it was Steve Jobs who said: we don't employ people to tell them what to do, we employ people for them to tell us what to do. And there's a large element of truth in that. Why would you employ experienced people otherwise? And by experienced, by the way, again, let me just talk about the NHS. We sometimes talk about, say, receptionists and say: oh, the receptionists, they're not very experienced, they're not really important. Absolute rubbish. They are critical. They often have safety-critical roles in what they do. They might not be an expert in healthcare, but they are an expert in the individual they're dealing with across the desk, or on the phone, at that particular moment. They have seen situations before. And the last thing you want is exactly the sort of feelings that you just talked about, Amanda, which was spot on, I think.

SPEAKER_00:

Amanda, it's neuroscience time. I think we need to inject a little bit of chemistry into the job. So can you tell us what's happening here when there's a blame culture from a neuroscience perspective?

SPEAKER_02:

It's pretty simple, really. We know that when we are feeling threatened or anxious, we are preparing ourselves for fight or flight, and that's via the amygdala. The amygdala is the emotion centre, and it sends signals to the body to prepare for that fight or flight response: it sends messages to the adrenal gland to produce cortisol and also adrenaline, or epinephrine, which will help us with the energy and the courage. The reality, however, of those hormones and transmitters is that they create a feeling in our bodies, and that feeling or frustration can result in an emotion, and the emotion gets interpreted. And so what happens is that emotion then results in us behaving in a way that isn't particularly productive, like avoiding the threat and distancing ourselves in order to protect ourselves from it, or dissociating ourselves from the situation and saying, I'm not going to be part of this. And so we very much remove ourselves so that we can stop the discomfort that we experience. So the neuroscience of a blame culture is fundamentally the same as the neuroscience of low psychological safety.

SPEAKER_00:

Thank you very much. Right, well, I think we've talked quite a bit about blame culture and low psychological safety. Martin, how do we go about overcoming it?

SPEAKER_01:

You've got to start at the top, first of all. You've got to get a commitment. If an organisation wants to develop a just culture, a psychologically safe culture, it needs to be written into a policy; it needs to start there. But the really important thing is that the leaders need to role model it. For example, when something goes wrong, and it doesn't matter whether we're talking about board level or supervisory level, it's about trying to work out why it made sense at the time, and what we can learn from that.

SPEAKER_02:

Those lessons-learned debriefs are so important, aren't they? You are truly listening and understanding, but also it's a real privilege to have the opportunity to learn from people and understand exactly what happened.

SPEAKER_01:

And it is about, when you debrief, for example, also talking about where you think you might have contributed to it, whether you're right or wrong; again, it helps people to open up. It's about offering support. It's about checking up on people later, saying: you okay after that this morning, or whatever it was? The other thing for me is that most of the time, if you're being honest, you will find systems that have made it easier for people to get it wrong. So what was it that was doing that? And that requires a bit of thought. I think sometimes organisations like to do a knee-jerk reaction and say: okay, this happened, we think that was the problem, so we're going to change the policy tomorrow. Well, there can be unintended consequences, so I'd usually say a bit of time for reflection before you make a change is often very, very valuable. And I think the final thing for me is about those leadership behaviours when things are going well. It's about the ability of leaders to ask open questions and not fill in the blanks. The situation is: somebody comes into the boss's office and says, hey boss, I've got a bit of a problem here, I'm not sure what to do. And the boss says, okay, well, tell me your thoughts on what to do. And the individual goes quiet for a moment. And then the boss says, well, I mean, what you could do is we could do this, couldn't we? Or we could do that. What do you think about that? "Oh, I was actually thinking about..." and you've never actually asked the individual at all. What you need to do is be really good at asking open questions and listening, because that sets up that trust. It says: I genuinely want to hear from people, I genuinely want to hear from you. And then it's thanking people for what they've said. What they've said might be wrong; it might not be helpful at all. But you can always say: okay, interesting thought; it doesn't work for me, actually, because of this, this and this, but I really appreciate you sharing it. So it's very important that you thank people for their input and really encourage those honest and open conversations. Again, a quick example: an NHS chief exec had a bit of a reputation. A doctor went into this person's office and said, I just want to talk to you about a problem we've got, what I need support in, and how we might solve it. And he managed to talk for about two minutes. The other 58 minutes of the hour they had was the boss telling him all the wonderful things that were going on and what was happening, trying to make him feel better. To summarise, for me it's about role modelling those things; it's about being able to ask open questions and listen; it's about focusing on what's right, not who's right; it's about trying to be objective, but also trying to be compassionate when things go wrong.

SPEAKER_02:

And one of the things you said before: it's not just about speaking up, it's much more about listening up, and making sure everybody takes the time to truly listen.

SPEAKER_00:

So, Amanda, time as ever is catching up with us, so I think we need to round things up. What are your takeaways?

SPEAKER_02:

I've learned that the very best people can still get it wrong, and that you need to create an environment where people are prepared to share and prepared to admit mistakes, which means that we need to have a just culture and a climate of psychological safety. I've also learned that a blame culture is not the opposite of a just culture, and that we should not be striving for a no-blame culture, because actually that means we would stop people from taking accountability, and then we'd have a whole load of mavericks on our hands who are just doing whatever they want, when they want, because they can. So we don't want blame, we want justice. That for me has been quite profound: that we should be doing the very best we can at all times for people. I'm also feeling incredibly humbled, Martin, by the fact that you had such a significant life event and you channelled it into doing so much good for so many people around the world, not just in healthcare but also in aviation, and you continue to do so. So I commend and applaud your energy and your complete passion for this.

SPEAKER_00:

Thank you very much. That's very kind. Thank you, Martin, and thank you, Amanda. If you'd like to hear more about our research on psychological safety, you can download our white paper on that subject, and many others, from the website www.zircon-mc.co.uk, or feel free to contact us at hello@btalent.com. And Martin's charity, the Clinical Human Factors Group: if you want to know more about them, just give them a search on Google and find their website; it's all very interesting. So all that remains for me is to say, once again, thank you, Martin, and thank you, Amanda.

SPEAKER_02:

Thank you.

SPEAKER_00:

And we'll be back with our next podcast in a couple of weeks.

SPEAKER_02:

Thank you, everyone, for listening. Hope you have a wonderful and successful day.
