The Chief Psychology Officer

Ep71 Unlocking Talent Potential Through Strategic Data

Dr Amanda Potter CPsychol Season 4 Episode 71

Dr. Amanda Potter, Chartered Psychologist and CEO of Zircon BeTalent, and Jig Ramji, an experienced Chief Talent Officer, join us for a compelling discussion on the transformative power of big data in talent management. Listen as we unpack the parallels between sports analytics and organizational performance, revealing why data integration is as crucial to talent strategy as it is to financial success. Jig challenges the conventional belief that more data is inherently better, illustrating the importance of data variety and context in unlocking meaningful insights. 

Discover the complexities of implementing talent and leadership models within organizations as we examine the pitfalls of relying too heavily on standardized approaches. We emphasize the necessity of cognitive and cultural diversity and the importance of gathering diverse data points throughout the employee lifecycle. This approach ensures a clear understanding of organizational capabilities while maintaining agility to align with strategic goals. Our discussion underscores the value of a balanced framework, offering guidance and clarity for all stakeholders.

Explore the evolving tech industry landscape and the challenges of overcoming personal biases in interpreting financial and people data. We highlight the need for thoughtful talent models and objective assessments to mitigate bias and enhance workforce engagement. As Amanda and Jig delve into the concept of unconscious bias, they advocate for individual accountability in recognizing and managing biases. The conversation concludes with a focus on the importance of data discipline, emphasizing the transformative potential of rigorous data analysis while cautioning against over-reliance on psychometrics. Join us for a journey through the strategies that can drive significant change and performance improvements in organizations.

The conversation highlights the significance of cohesive data frameworks while addressing the challenges of bias and complacency in decision-making.

• Emphasising the need for robust data use in talent strategies
• Defining big data and its implications for organisations
• Balancing internal and external data perspectives
• Addressing the risk of over-relying on off-the-shelf talent models
• The importance of data discipline in collecting actionable insights
• Challenging complacency amidst continuous organisational growth
• Discussing biases in the interpretation of psychometric data
• Advocating for curiosity and humility in leadership practices
• Urging talent professionals to act as catalysts for change

Episodes are available here https://www.thecpo.co.uk/

To follow Zircon on LinkedIn and to be first to hear about podcasts, publications and news, please like and follow us: https://www.linkedin.com/company/zircon-consulting-ltd/

To access the research white papers mentioned in this and other podcasts, please go to: https://zircon-mc.co.uk/zircon-white-papers.php

For more information about the BeTalent suite of tools and platform please contact: TheCPO@zircon-mc.co.uk

Speaker 1:

Welcome to this episode of the Chief Psychology Officer. I'm Angela Malik, and I'm here with Dr Amanda Potter, CEO of Zircon and Chartered Psychologist. Today we're going to be discussing the importance of big data in talent, and we've invited a client to join us: Jig Ramji, who has been Chief Talent Officer at a number of large organizations. So today the focus will be specifically on discussing how data can help to make a difference when building and implementing talent strategies.

Speaker 2:

Hello, both of you. Hi, Angela. Hi, Angela.

Speaker 1:

Jig, why don't you start by telling us a bit about yourself?

Speaker 2:

So I'm Jig Ramji. I'm a father of two, which is obviously my main and most important job: a young man called Harry and a daughter called Leela, nine and seven respectively; they keep me on my toes. In addition to that, I've had the privilege of working at a number of large organisations. I've been a chief talent officer on three occasions. I've also led HR teams as a business partner or a head of HR, and I started off life as a consultant many, many years ago, working for Deloitte within their human capital practice. My very first job was actually in a startup, when startups weren't that cool, working in executive search. So yeah, it feels like a long time ago, but thankfully I've had lots of great opportunities in the UK, but actually all around the world as well: in Europe, in the Netherlands, in Australia, in Singapore and in Hong Kong. So I feel like a real global citizen these days.

Speaker 1:

Lots of experience to draw on. Amanda, how do you know Jig?

Speaker 3:

So we've had the great joy of working together a couple of times now, and in a couple of Jig's most recent roles Jig's been our client. I've been invited by Jig and his team to help them with a talent challenge that they've had, to help them conduct some research, identify some ideas to challenge current thinking, and then to work with him to implement new talent strategies. So it's been quite cool. Jig and I have got very similar ways of thinking on many things. So, quite unlike the last podcast, where Ben and I disagreed on most things, I actually think Jig and I agree on quite a lot, particularly around the concepts of diversity, inclusion and the way in which we should design and implement talent strategies, and that's what I'd love to talk about today.

Speaker 1:

So, Jig, what brought you to the podcast?

Speaker 2:

I just feel incredibly passionate about the impact that we can make within talent within organizations if we get it right.

Speaker 2:

Amanda talked a lot about having quite similar philosophical views on the importance and the benefit of having a strong talent strategy. Now, for me, that dovetails really nicely into using data in the most effective way within talent, within the people function. In the same way that organizations such as sports organizations use data to improve performance, we use data for every facet of what exists today, from making financial decisions to thinking about what needs to change for organizations, and what needs to change for us as individuals to improve both mental and physical well-being. And there is no doubt in my mind that making data a core element of your talent strategy can have as much, if not more, of an impact across an organisation as well. So for me, this is not just about talking about what can be done. This is a bit of a call to action, actually, to say: let's make sure that our people data, our talent data, is utilised as effectively as financial data within organisations, and actually beyond organisations, in terms of sectors and macroeconomic challenges too.

Speaker 3:

So are we saying, Jig, therefore, that organisations are not good at using their data in the same way as they are using financial data?

Speaker 2:

I always like to almost provide more of a challenge, I would say, Amanda. So I think, for me, the answer to that question is: they're possibly not being as effective as they could be, because primarily that discipline doesn't exist, or that thought process doesn't exist around: if we got this right, what impact could this make? Now, the reality is, when organizations do start to use data effectively, there's no going back, and they utilize that for multiple talent decisions that they make. I think the greatest form of that that we see today is around hiring or selection decisions, and therefore most organizations are using data. But are we using it as effectively and as powerfully as we really could?

Speaker 1:

So there's data and then there's big data. How would you define big data?

Speaker 2:

The way I would look at this is, I think we get drawn into labeling quite frequently, and when we talk about big data and data, I think there's almost this differentiation that one is more superior than the other. Whereas what I think we should really think about (and this is where Amanda and I do really channel a very similar view) is that the nuances within the context of an organization will allow you to understand what are the most important aspects or the most important data points for you. So if we then want to translate that into big data versus data within the context of the organisation, I feel quite comfortable with that labelling. Otherwise, I think it becomes far too generic and vanilla, and that's when the value sometimes can erode slightly.

Speaker 3:

It's a really good point, actually, because big data always suggests volume.

Speaker 3:

But what we know from research is that when we start to gather more and more data, for example when validating tools or processes, you start to replicate the numbers and therefore you get that regression to the mean, and so what you don't get is unique voices or unique views when you get really big data.
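A minimal Python sketch of that averaging-out effect, using invented assessment scores (the population parameters, subgroup and sample sizes are purely illustrative, not from the episode): as the sample grows, the aggregate mean settles toward the population average, and a small distinctive subgroup stops being visible in the headline number.

```python
import random

random.seed(1)

# Population of scores centred on 100 (sd 15), plus a small "distinct
# voice" subgroup scoring around 120. As the sample grows, the
# aggregate mean settles near the population average and the
# subgroup's signal is averaged away.
def aggregate_mean(n_total, subgroup_frac=0.05):
    scores = [
        random.gauss(120, 10) if random.random() < subgroup_frac
        else random.gauss(100, 15)
        for _ in range(n_total)
    ]
    return sum(scores) / n_total

for n in (20, 200, 20_000):
    print(f"n={n:>6}: aggregate mean = {aggregate_mean(n):.2f}")
```

Small samples bounce around and can be dominated by unusual voices; the very large sample reproduces the population mean almost exactly, which is the point Amanda makes about volume versus variety.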

Speaker 3:

So Jig's point is a really important one, which is that actually it's the variety of the data that's so important, and that's a beautiful link to the cognitive diversity point that Jig and I are both absolutely passionate about, both from a talent strategy and implementation approach, but also from a data approach too. Whenever Jig and I have conducted research within the organizations in which we've worked, we have many, many different data points that we're gathering, whether it's qualitative, quantitative, external research or benchmarking-based data. We look at a whole load of different data streams in order to come to some conclusions, and by the time we get to those conclusions we can feel quite confident, if we're going to present this to the board, that we believe in the messages that we're sharing. It's not just an inkling or a feeling or an intuition. There's really, really good, sound data that is being validated across multiple sources. So for me, that's what the real differentiation is.

Speaker 2:

It just reminds me of a very cool thing that I was asked to do by the chairman when I was working at Bloomberg, many, many years ago.

Speaker 2:

He felt that we could do a lot more around, let's call it, broader talent strategy.

Speaker 2:

He didn't describe it in that way, but we could do a lot more around our broader talent strategy and one of the things he asked me to do was go and talk to some of the best organizations out there around what they do, and I had the privilege of talking to some great people in great organizations.

Speaker 2:

I won't name them all, because then it will start to create a bias of what are the greatest organizations in the world versus those that are not, but one of the organizations I talked to was Microsoft, and one of the things they said when we were discussing our approach was: I love the fact that you're going deep into your own organization to understand historically why some of the things that matter from a leadership perspective have worked so well. And the context and the nuance of Bloomberg was so different to other organizations that I'd worked in that it was so important to go deep into that organization to understand what mattered most to them. I think that just really helps to create not only a valid data point but actually real buy-in from an organisation too, to say: actually, this is really significant research, both external and internal to our organisation as well.

Speaker 3:

One of the things that's really important about that story is the fact that many of our clients go very internal when they're doing their data gathering, so they're very inward looking. And, by habit, we notice that organizations that are running at pace, that are relentless, that are working and driving really hard (particularly private-equity-owned organizations) are very inward looking, because they're focused so heavily on their P&L and on their numbers, and they don't take stock of what's happening in the external market and do that competitor analysis, or look at what the best in class are doing. So how amazing that your chairman suggested that you go external and talk to other organizations, because there is a tendency for us to go way, way too internal and not get the context. The landscape that we're operating in is so important. I hope this podcast is interesting to people, because it's really interesting to me. I find this stuff super cool, because of the process we go through and the insights we gather as a result. I mean, the last piece of research we did together, Jig, was amazing, wasn't it?

Speaker 3:

The insights that we gathered with the team that Jess and Co were all included on. I mean, it was really quite a profound piece of work in the end because we took that very big approach.

Speaker 2:

Yes, and I think the balance piece that you just referred to is actually really, really important. To be totally honest with you, many organizations can get so drawn into the external piece of what makes a great leader outside or in an alternative organisation that you risk, by going through just that one lens, not looking at what matters most to you as an organisation. So that balance, looking both ways and being quite internally focused but also externally focused at the same time, I think gives you that perfect balance of good-quality data and perhaps, Angela, as you rightly pointed out, that real big data point.

Speaker 3:

Oh my gosh, I have one more comment. Angela, I keep jumping in. I know Angela wants to ask another question.

Speaker 1:

No, no, you're fine.

Speaker 3:

You're popping in. I was going to pop in with another point, which is my biggest irritation. I was going to say the word hate, but I don't really hate it. It is the fact that I have known clients pick up and use talent models out of books, or they buy in a consultancy's talent model, and the issue with that, of course, is that it may be well validated, it may be well researched, but it is just a data point. It isn't the answer.

Speaker 3:

You just said, Jig, that some organizations might overly rely on published or researched models or data or insights. And that's one of my biggest challenges to the clients that we work with, which is: yes, that is a robust model; yes, it is a well thought through and considered approach; but it's one approach. All of our clients come to us telling us they're unique, and then it shocks me that they then buy a standardised, off-the-shelf model of talent or model of potential in order to do the most important thing, which is to identify the future talent of their organisation, which isn't, of course, going to make them unique. That, for me, is the biggest challenge I think we have in our industry.

Speaker 2:

I've made plenty of mistakes along the way in my career. But I certainly think, had I not had the experience of working in different countries and quite different organizations, and in particular different parts of the world, I wouldn't have appreciated this: the cognitive diversity piece for me is important, but so is that cultural diversity piece. And that's where I think there is a real danger sometimes in looking at off-the-shelf solutions, which often are quite culturally nuanced towards the Western market. So I do think it's a really important point you raised there, about how important it is to really understand the fabric and nuances of every aspect of your organisation.

Speaker 3:

And we're thinking about that with psych safety, because we're in the most amazing position now, with a couple of our products in particular being rolled out around the world, that we're getting (I hope) large enough samples in certain markets to create internal norm groups, and we'll be able to compare, country by country, what are the precursors to psychological safety in Japan versus Korea versus France. So that's pretty cool.
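To illustrate what a country-by-country norm group could look like, here is a minimal Python sketch with invented numbers (the countries come from Amanda's example; the raw scores, sample sizes and simple mean/sd norming approach are assumptions purely for illustration):

```python
from statistics import mean, stdev

# Hypothetical raw psychological-safety scores per country
# (invented numbers purely for illustration).
scores = {
    "Japan":  [52, 61, 58, 49, 64, 57],
    "Korea":  [55, 67, 60, 71, 63, 66],
    "France": [70, 75, 68, 72, 77, 74],
}

# Build a simple norm table (mean and sd) per country.
norms = {country: (mean(vals), stdev(vals)) for country, vals in scores.items()}

def z_score(country, raw):
    """Express a raw score relative to that country's own norm group."""
    mu, sd = norms[country]
    return (raw - mu) / sd

# The same raw score of 65 reads very differently against each norm.
for country in scores:
    print(f"{country}: z = {z_score(country, 65):+.2f}")
```

The design point is the one made in the conversation: the same raw score can be above one country's norm and below another's, which is why internal, market-specific norm groups matter.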

Speaker 1:

That's hopefully coming down the line. And it's not like any organization is operating in a vacuum, so you have all these opportunities for various different points of data. But with so much varied data available to you, how do you ensure that your talent data is actually robust?

Speaker 2:

So I think this is where some of the models that we've worked on in the past have been quite powerful, because oftentimes we are using different parts of the employee lifecycle (that's probably a nice way of describing it) to collect those data points. So, as an example: when an individual is coming into the organization, what are some of the data points around selection and assessment that we're collecting? And throughout the performance cycle, how are they performing, in particular in terms of the behavioural traits that they're demonstrating? All of those data points within the employee lifecycle are quite distinct and unique, but what we've tried to do in the past within organisations is to ensure that they all pivot towards that framework that we've designed within the organisation. So therefore, albeit that they are different points of the employee lifecycle, we're actually collecting data points around, for example, leadership. Therefore you're getting validity and reliability, albeit through different parts of the employee lifecycle. So actually your data points are becoming even more robust, and your data sets are becoming even more robust.

Speaker 2:

What I've always said (and I think Amanda and I have probably wasted quite a lot of time talking about this) is that when people use distinct models at different parts of the employee lifecycle, or different leadership models here and different leadership models there, I think that's your biggest risk, and that's when data isn't robust. It just becomes a collection of data, rather than consistent data that tells a story about an organisation at a given point in time. Not forever; at a given point in time.
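A rough Python sketch of the idea of pivoting different lifecycle touchpoints towards one shared framework (the dimension names, stages, sources and rating scale are hypothetical, not any actual client or BeTalent model): each source rates against the same dimensions, so the evidence accumulates into one consistent data set rather than a collection of disconnected models.

```python
# One shared leadership framework; every lifecycle touchpoint rates
# against the same dimensions so the evidence accumulates coherently.
FRAMEWORK = ("strategic_thinking", "collaboration", "resilience")

lifecycle_data = [
    {"stage": "selection",   "source": "assessment_centre",
     "ratings": {"strategic_thinking": 4, "collaboration": 3, "resilience": 4}},
    {"stage": "performance", "source": "manager_review",
     "ratings": {"strategic_thinking": 3, "collaboration": 4, "resilience": 5}},
    {"stage": "development", "source": "360_feedback",
     "ratings": {"strategic_thinking": 4, "collaboration": 4, "resilience": 3}},
]

# Aggregate the evidence per dimension across the whole lifecycle.
for dim in FRAMEWORK:
    vals = [entry["ratings"][dim] for entry in lifecycle_data]
    print(f"{dim}: ratings={vals}, mean={sum(vals) / len(vals):.2f}")
```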

Speaker 3:

So I said earlier I was irritated by external models. I think I might use the H word for this one, because I think it's a restriction for organisations when they have a plethora of models in the organization. You know, they have these incredible values models, and then leadership frameworks, high-potential frameworks, and then behavioral frameworks and technical competencies (or they might call them capabilities), and it's very confusing: it's confusing for HR, let alone the line manager. And the point that you've made, Jig, is a really good one, because whilst we don't want to go so far down the standardization route that we just buy a single off-the-shelf model from a book, we need both. We need standardization, yet agility, in something like a talent or potential framework or a leadership framework that will roll across the whole organization. And what you don't want is too many models, because none of them ever get used, or they get used differently. So it needs to be something that the organization works on at the center to really agree. But that can be the hardest thing. Getting an organization to the point where they realize that we have too many models for different purposes, and that we need to bring it together into a coherent story that can be used across the whole organization, is probably the hardest piece of consulting that I have to do at the very start of a project: to agree that something needs to change.

Speaker 3:

But you can usually encourage the client to see the importance of it when you can help them understand the whole point of this podcast, which is the impact and the change it can make. And for me, one of the things I talk about with clients is line of sight. If we're going to use the word capability, just for argument's sake, for the model, if we're going to build a capability framework, there needs to be a really clear line of sight between the organization's ambitions, strategies and goals and the behaviors or the requirements of individuals, so that they can see how what they're doing links back to that strategy and to that ambition. And if there are loads of different models all saying different things in slightly different phrasing (some of it might be really funky phrasing and some of it really direct, assertive phrasing), then it's really confusing about how I need to show up and what I need to do.

Speaker 2:

It's interesting, because that agreement-of-data-set piece I think is so, so important, and, by the way, it is tough. But it's the same structure that we use within, for example, finance to tell the same story. What are the data sets we're going to use to tell the story of growth, revenue, profitability, success? We would agree that within an organization at the early onset and, yes, there's consistency around how we report some of that data, but there is a data set and a story and an expectation set for investors and the public around: these are the things that are going to tell the story as to whether the organization is moving in the right direction, is successful, is ready for hypergrowth, is ready to IPO, all of those different channels. If we could get the same rigor (I think that is the right word) for people data, there is no doubt in my mind that you can have both finance and people data telling an equally powerful story as to the health of an organisation, how successful it is currently and how successful it can be going forward too.

Speaker 3:

I'm going to be a pain in the bum and say the biggest issue we have with it is bias, people bias. Because we are a walking set of biases, all of us. We use mental shortcuts to help us process information and deal with the myriad of information that we get on a daily basis; we have biases and shortcuts in order to sort through that information, and therefore we make judgments about people or about situations. So, whilst we can create really good, robust systems, it still relies on the evidence and objectivity of the person who's rating that person.

Speaker 2:

And I think that's part of the challenge that we've faced for years, because I think, intellectually, most people, when you tell this story and talk about this, go: yes, absolutely, that's exactly what I would love to be able to say or tell the story on.

Speaker 2:

But it is that human bias or that interpretation piece that often gets in the way.

Speaker 2:

But here's an interesting little story that I'm going to use to challenge our thinking a little bit.

Speaker 2:

If you think about the growth of organizations today, and in particular the tech boom that has existed probably over the last 10 to 15 years: I can certainly say there are people within my life, old-school financial investors, who have very strong views on the tech boom, on growth, and on the valuation of companies, which is completely different today than it was 20 or 30 years ago. And that is still relying on a human bias to say: well, actually, it doesn't show the natural results that I would have wanted to see 20, 30, 40 years ago.

Speaker 2:

But I know that if they got X, Y and Z right over the course of, say, five years, that ability to hypergrow is way, way more significant than perhaps in other industries 20 or 30 years ago. So I appreciate it's not exactly the same, but there is still an interpretation piece with financial data, and our own biases, based perhaps sometimes on generational pieces, that still require people to make objective decisions based on the evidence that they see. And I think it's the same with people data too. But you're right, it has definitely been one of our biggest challenges.

Speaker 3:

I love the point that Dulcie made (she's a previous guest on the podcast) that the most dangerous people in that situation are the people who have the 'I don't have a bias' bias. In other words: everything I say about you I think is correct, and everything I say about myself I think is correct.

Speaker 2:

And I think that often happens to many people. There's an amazing group of people I also often relate quite nicely to, who are constantly challenging themselves to say: what am I not thinking of? What am I seeing that is blinkered because of who I am and the way I see the world? And I think it's really fascinating to see those incredibly self-aware people. At the same time, I do often see those individuals who have the bias that they have no bias.

Speaker 1:

I'd love to throw out a curveball and ask: if we are encouraging organizations to look at the data and use the data more effectively, are we reducing our talent population to just numbers?

Speaker 3:

God, that's a great challenge. It's a great challenge, but no, we're not. Actually, we're doing the opposite. We're actually using data and insight in order to help people have greater happiness and well-being, to find jobs that they enjoy and are fulfilled in, to operate at their highest level of potential, and to work with people they'll connect with, or who might even challenge their thinking in a positive way. So, actually, I think we're using data to create more engaged, happy, fulfilled workforces. As a result, the organisation has better numbers at the end of it.

Speaker 2:

I think it is a really good challenge and it's a really good question. And I'm just going to, for a second, really compliment the work that Amanda and team have done for me in the past, which really helped me to think about how to create the most effective type of talent model.

Speaker 2:

And in particular, one of the things (and we've talked about this a lot) is that humans have a tendency to have a bias, and therefore, how do you mitigate that? I think if you create a really thoughtful talent model, one of the founding principles of it is: how do you eliminate bias as much as possible? And I think old-school talent models in particular were incredibly binary. Is this person talent? Is this person not talent? Is this person middle talent? Has this person got no talent? Whereas actually (and again singing the praises of Amanda and team), the models, when they work well, are really centered on ensuring that the individual, the leader, the leaders of an organization are helped to eliminate bias, by reducing cognitive load and asking questions in the right way, which then creates your data point. This is exactly what we did with Amanda in the past. We create models that don't reduce people to single sets of data, but actually allow for objective assessment, which then creates robust data sets.

Speaker 3:

And the point is a good one, because we're never going to remove subjectivity. All we can do is aspire for objectivity. And there's the work by John Amaechi (is that how I say his name?), OBE.

Speaker 3:

He talks about the fact that there is no such thing as unconscious bias, and the reason why unconscious bias training actually increases people's bias at work: because as soon as you use the word unconscious, people move their accountability or their responsibility for owning the bias, and they say: well, it's not me, it's just my brain, I can't help it. So therefore I'm just going to live with it, and I'm going to carry on feeling and thinking the way I do. Whereas, in fact, Dulcie's point is a really good one: if we have that 'I don't have a bias' bias, again we're going to avoid the ownership of that bias and we're not going to do anything about it. So therefore, when we create these models, we have to be as objective as we can, and the data needs to be as observable as possible if we're creating a leadership framework or a capability framework, so people can actually see the presence of that action or that behavior.

Speaker 2:

I do think it's interesting, the whole concept of unconscious bias. I think sometimes it's almost become one of those statements where people don't quite appreciate the depth of what we're trying to say. When we say unconscious bias, we're talking about the amygdala overload, we're talking about the reptilian brain. We're talking about the most foundational element of how your brain works, as opposed to the capability of the brain to reflect and evaluate in a much more objective way. So I do think it's funny when individuals turn around and say: well, you know, unconscious bias, it's my brain.

Speaker 3:

Forgive me, yeah, yeah: but not me, it's not me, it's my reptilian brain. I know, and I understand the neuroscience. I understand the psychology behind why. But we are in a position to challenge the way we think. We know, don't we, Angela, from previous podcasts, that we lie to ourselves. We're great big liars, and we tell lies to ourselves and to each other all the time. Therefore, we have to challenge ourselves and we have to challenge our thinking, and that's what these models are about: doing the research. I think I'm going to take it full circle now. We are prepared to put ourselves on the line as external consultants to say: this is working, but this is really not working, and this needs to change. This needs to stop in order for you to get to where you want the organization to be. Now, the interesting one for me is that this morning I've been doing exactly what we've been talking about.

Speaker 3:

We had a debrief (myself, Julie and Jess) for a client that's had consistently 80% year-on-year growth. They're about to refinance, and they're looking to see what they could do to make sure they're in the best position possible. Now, you can't really argue with 80% year-on-year growth. That's pretty incredible, and their CEO is amazing. The leadership team are all incredibly bright, incredibly good at what they do. Yet there is still feedback, there are still themes, there are still messages that we're going to share that would help them go even further in terms of the way they operate and lead that organization. But there is a risk, with those sorts of numbers, that they could become complacent or arrogant. More often we get called in to organisations, not necessarily the ones Jig's worked for.

Speaker 2:

Interestingly, when there's a challenge, or when there's something that needs to change, or the business needs to go through transition in order to improve the numbers. I think that, for me, is the most incredibly powerful way of creating a culture within the organisation, though, because you're right: I think it's very easy, and it actually happens very frequently within organizations where they've been successful, that there is a real risk of complacency.

Speaker 2:

We've seen some of the greatest organizations in history lose out because of complacency, and it's only by reducing that complacency, and creating humility and a growth mindset continuously, that I think organizations create that natural desire and aspiration to look at themselves and think about how they could be even better. And I'm sure across MBAs and strategy courses everywhere they talk about some of the organizations who have failed because of complacency, whether it's Kodak, you know, being first to market with digital photography, or examples where Apple swooped in and took over the telco, the phone market. There are so many examples, like Blockbuster deciding they didn't want to go down that route when they had the chance to go there.

Speaker 2:

Yeah, for sure. And, you know, those are stories in history which everyone can relate to, and it was only the complacency of leadership teams, or senior leadership teams, or even in some cases founders or CEOs, that led to the demise of what you would have described as a market leader. So actually, for your organization that you talk about, who've had 80% growth year on year, to still have that humility, to want to improve and be better than they are today, I think is a phenomenal leadership trait.

Speaker 3:

And the CEO is amazing, to be fair, as stated by all of his leadership team as well. He's tough, he's relentless, he doesn't suffer fools, but at the same time there is that humility, and there is also compassion in there too. He's truly got that gift of bringing both sides to the table. But I'd like us to go on to a different topic, if that's okay, which is: how do we get the data? You know, we're talking about building models and challenging thinking and going external and not being too narrow, but how do we get the data? That would be quite a good topic, do you think, Angela?

Speaker 1:

Absolutely. That was actually going to be my next question.

Speaker 2:

So I think, for me, one of the things (and this might be a very fundamental or foundational thing to talk about, but I think it's quite an important one just to reiterate or re-articulate, because I think people forget) is that you've got to get that data set, and I think it's about creating the muscle within the organization. We had a fantastic debate many, many years ago around bringing in data analysts within an organization, and we brought some really incredible people in, and I remember having a conversation with one of them, and they said: I can't do anything at this organization, because we simply don't have a data set. And I think the foundational element of ensuring that this all matters and this all works is having a quality data set. Without a data set, nothing is really possible, in the same way that finance would say the same thing about financial data, right?

Speaker 3:

Yeah, I've got a really small example. I won't mention the company name and I won't mention the product, but people can probably work out what the product is. It's a competitor to our strengths questionnaire, and you can buy a book and get a strengths questionnaire out of the back of the book, because there's a website and a code. So if you buy the book, you get one use of that questionnaire and you get a mini report back. At the time we were building BeTalent, and it was quite early days, to be fair. So they decided not to buy our product. They decided to buy the book for everybody and use the questionnaire out of the back instead. We ran the workshops; it was all good.

Speaker 3:

But about a year later they came to me and said: could we have the data, please, for the event? Could we have everyone's strengths results?

Speaker 3:

And of course I had to go back to my gray matter to try and remember what we did, and realized we didn't use our product, we'd used a competitor's tool, and then I had to say to the client: I'm so sorry, there is no data, because you decided to go down the route of buying the book and using individual codes.

Speaker 3:

That means everybody's got a printout of their own report, but there's no central composite of data.

Speaker 3:

And it was a really great lesson for us, because that became one of the messages that was really important for us with the build of BeTalent, which is: we can help you create that data set through our platform, through our system, because we've got this suite of incredible tools that assess 90 aspects of talent, and, depending on the model of talent the organization is driving towards, we can map our tools to those models. So our aspiration over the last 10-plus years has been to provide clients who have had that limited data with a way of collecting really good talent insight, through our 360, through our strengths questionnaires and so on. So now, if a client says to me: we don't want to buy your tool, but we'd like to buy the book and get the questionnaire out of the back, I'll say: that's great, but I would rather not be involved, because when you come back to me asking for the themes and to understand the insights from this, I won't be able to help you.

Speaker 2:

It's so interesting; that example that you cited, I think, has happened over and over and over again within organizations, and it almost goes back to that earlier conversation we were having around agreeing the data set. Granted, it's not always the most glamorous or sexy part of this work, because the really exciting part is when you get the insight and you can bring it to the table, whether it's to a board or to a leadership team: this is what the data is telling us right now about your organization, your team, your senior leaders. All of a sudden, everyone gets really excited.

Speaker 2:

But the work needs to happen right at the very beginning to ensure data discipline, as I call it. Without that data discipline, you don't get the data set. Without that data set that is valid and reliable, you don't get insight. Without that insight, you don't get those moments with the board, you don't get those moments with your leadership teams, and you don't get those moments, most importantly, with individuals, who then create insight and awareness for themselves. So I think it's that staged process, isn't it, to get real discipline in this whole topic early on?

Speaker 3:

It's so amazing. And this podcast may be super, super dry for some people, but the most exciting bit, I think, about the job that I do, when I get to read the incredible work that Jess has done to pull together all the interviews, or the team have done to produce the data at the end (and I've usually been in the situation of having conducted some of those interviews or analysed the data myself), is the aha moments.

Speaker 3:

The aha moments that we had at EA right at the beginning, around longevity and psychological safety, for example, which we've talked about previously on the podcast. Or another organisation, when they identified that they needed three main things in order to drive performance in the organization, which were around disruption, having a broad external macro focus, and empathy. And then realizing, when we did the analysis, that those were three of their weakest areas, because of the relentlessness: they just didn't have time to relate to people, they didn't have time to think externally, and they certainly didn't want to disrupt anything, because they were too busy doing today's job, let alone tomorrow's. So it's that kind of aha moment that's so, so incredible.

Speaker 2:

Do you know what? I'm getting awareness for myself at the moment, Amanda, because this, to me, is not dry at all. This is the stuff that I get really excited about. So I don't know whether that says more about me, or whether we've got fellow people who are going to be listening to this podcast going: this is the stuff that I can listen to for hours.

Speaker 3:

Angela's still with us. She hasn't gone snoozing or anything yet.

Speaker 1:

No, and hopefully we'll have lots of other listeners who are saying, yes, bring it on, yeah, yeah.

Speaker 2:

And I think, you know, it's one of those things where I do sometimes feel that when you're looking at data, or you're looking at patterns and data sets over a period of time, it shouldn't be easy. It really shouldn't. If it was easy, people would press a button and get everything that they need. I think that discipline and rigor is what makes this actually one of the most exciting movements that I think we can create together as a set of professionals. So my personal view is: perhaps for some people it may not necessarily be as exciting, but there's no doubt in my mind that when you get it right and you get those aha moments, you get the insight and you get data that tells a story that an organization suddenly uses as a catalyst for change and transformation. That, for me, is more powerful than anything else.

Speaker 3:

Yeah, second to none, isn't it? I always say that my aspiration is to give BeTalent wings, because of the difference I feel it can make at an individual level, a team level and an organizational level. That's the most exciting thing for me. But I would love to talk about the over-reliance on psychometrics as well.

Speaker 1:

Well, that was going to be my question, Amanda. If you're gathering data, data, data, is there a risk of becoming over-reliant on psychometrics and questionnaires and 360s for your data?

Speaker 2:

Well, there's certainly a huge risk and I think we've touched on it in the past where some of these things don't sing in harmony together.

Speaker 2:

I think that's a real risk, and it's one of the things I've found with psychometrics, certainly within hiring in the past, especially at that top end, you know, senior, senior roles.

Speaker 2:

Actually, let me tell a story. Many, many moons ago, the first company that I alluded to earlier that I worked for was a startup. It was recruitment, executive search within the public sector, central government, and there was a huge, huge risk-mitigation approach to using executive assessment, let's call it, encyclopedias. And I think that's where I get really worried and concerned, because it isn't about making the right decision for the organization; it's a risk-mitigation approach. And what I mean by that (and it still happens today, I've seen it over and over again; I won't name organizations and I won't talk about how organizations have got it wrong, though I can certainly say how I've got it wrong in the past) is when you use some of those psychometrics, which have merit and can be utilized in the right way, to make the decision as to whether an individual is right for a role in an organisation. That, I think, is a huge, huge challenge.

Speaker 3:

Yeah, agreed. The overuse.

Speaker 2:

The overuse because it then is the biggest detriment to cognitive diversity within an organisation. And sometimes not always, but sometimes some of those psychometrics are, so they can be incredibly robust, but individuals who are interpreting those psychometrics aren't doing it in the way that they should be done. And there is no doubt that, even though I'm credentialed on many psychometrics myself, I still don't know the deep depths of every single one. And therefore, for me to then make a recommendation in that scenario to say yes or no, I think that's when the utilization of psychometric can be incredibly dangerous, because they should tell you things, but that should not be yes or no. It should tell you about an individual and across multiple different data sets, including including psychometrics, perhaps to tell a story. But interpretation and over-reliance on a yes or no approach for me is a huge, huge challenge.

Speaker 3:

Everybody who listens to this podcast regularly will know that I completely, 100% agree with you, and I have a big problem with profile matching. I have a big problem with the over-interpretation, and we use the car analogy that Julie Lee identified when she was talking to her husband Trevor. They were talking about cars now parking for you, and the sensors stopping you from going outside of your lane, and how it stops you from actually thinking about driving. And the tools, if their reports are overly interpreted, stop the psychologist, or the person using those tools, from actually having to think; they just literally read the insights and let the insights wash over them. So yes, I'm in complete agreement. But I suppose the point is, we still need that data, and psychometrics are one source. So, as long as they're being used appropriately and we're not misusing that data, they can actually contribute to our big data approach.

Speaker 2:

And they have huge value as well. But I think that's the challenge. I mean, there are so many individuals interpreting psychometrics who shouldn't be interpreting psychometrics. I'll say it quietly, because we know we shouldn't talk about it. But one of the interesting things (and again, I'm sure we've heard it or seen it across boardrooms or executive rooms) is where someone says: that individual didn't work out in that role, what happened? And there is no doubt in my mind that at some point people go: well, you know, the psychometrics didn't say anything. They said this person was suitable for the role.

Speaker 2:

The psychometric must be wrong, then. Or the reality, as we all know, in that scenario is that we're using psychometrics to make a decision on hiring or not hiring, as opposed to utilising them, as you described so eloquently, Amanda, as one part of the big data set.

Speaker 3:

And it is interesting that we lay blame naturally, as humans.

Speaker 2:

The point I just made, which is: well, obviously the psychometric is wrong, rather than it's how we used it. And the need for diversity once again. It kind of reminds me, actually (and Amanda, forgive me, because this is definitely a conversation that we've had, and we're moving into popular television now, so forgive me here), but one of the risks that I do sometimes see with approaches that are used in particular for hiring, more so than anything else, and succession planning and things like that, is that people can be incredibly binary and judgmental about individuals.

Speaker 2:

And it still bothers me today when people say no or yes and they'll list reasons, and you think: those don't feel like reasons to me. It feels like the personality of that individual, the individual traits, are what make that individual so fabulous and great at what they do. They're different to the incumbent, they're different to members of the leadership team today, but people are quite judgmental on certain factors. I think one of the things you and I were talking about was that fabulous line from Ted Lasso. If anyone hasn't seen Ted Lasso yet, please, please watch it, because it's so great and it's just so good.

Speaker 2:

And I think he says, you know: be curious, not judgmental. And I think there is a real tendency with some of the tools, the measures and the approaches today in boardrooms and senior leadership teams for there to be a lot more judgmental behavior than curious behavior. And I think that's when that complacency can set in; that's when sometimes people make decisions that are to the detriment of the organization, not to the benefit of the organization. So if people are finding some of this dry, then at least let them remember one thing. For me, that's: be curious, not judgmental. And take the recommendation to watch Ted Lasso. It's the best thing you'll do for weeks on end.

Speaker 3:

They've got through 50 minutes of listening to us for that one recommendation.

Speaker 2:

You'll get so many people writing to us: can we just have a recommendation for TV shows every week on your podcast?

Speaker 1:

I think we have had a really fascinating conversation, and it might be time to close it now. You started off the conversation, Jig, saying that you wanted this to be a call to action, so I wonder if you can articulate that call to action now, as a final thought.

Speaker 2:

I've always said that the most powerful thing we can do within our world as people practitioners, psychologists, talent professionals, however you want to describe yourself, is to help tell a cohesive story and provide insight to individuals. And the most powerful tools that we have at our disposal are the data sets that we collect across the employee lifecycle. Data leads to insight. Insight leads to moments that can create catalysts for change, and I think that's my call to action. There is no greater power than being the catalyst for change within an organization, or, if you're outside of an organization as a consultant, creating that catalyst for change for an organization. So for me, that's the call to action: be as powerful as you can.

Speaker 1:

Amanda, any parting thoughts from you?

Speaker 3:

Wow, that's it. I have nothing else to add. I agree. Wow, amazing. That's why Jig and I work together, because I completely and utterly agree.

Speaker 1:

Yeah.

Speaker 3:

I'm just very lucky that I get to work with very cool clients who are prepared to think differently and prepared to think big, and we just challenge each other and debate. It's cool.

Speaker 1:

I found the conversation really fascinating. I know I haven't spoken as much as I maybe normally would on other podcasts, but it's because I've just been listening with such keen interest. I think one of the takeaways for me, that I sort of started this conversation with and I'm leaving it with, is to lead with curiosity. I started out curious; I'm still curious. I would love to go and explore more about big data and how it can impact our talent in a positive way, and not reduce people to numbers but actually, as you said, Jig, create that insight that tells the story, that is a catalyst for change. So thank you both for your conversation today. And for our listeners: if you enjoyed this conversation as much as I did, please share it with your friends and press the follow button on whatever platform you're listening on. Thank you so much, both of you, for a great conversation.

Speaker 2:

Thank you very much.

Speaker 3:

Thank you, Angela, and thank you everyone for listening. I hope you have a wonderful and successful day.
