The higher you rise, the less truth you hear. The moment you become the boss, people start editing and curating what they say to you and how they say it. They become more aware of how you might react to what they say, more agreeable with you, and less likely to challenge your thinking or tell you uncomfortable truths. And now, AI is making it worse.
A series of new studies has shown that AI has a deep and persistent positivity bias: it is more likely to agree with you and confirm what you have already told it than to disagree. The risk here is that even if you don’t use AI to help with research and generate new ideas, the people who work for you might. And that means the ideas reaching your desk are more likely to have been developed in a cloud of positivity than through rigorous analysis and debate.
In his new book, The Power Trap: How Leadership Changes People and What to Do About It, psychologist and author Nik Kinley describes the dangers of sycophancy and overly positive input to decision-making, and suggests five basic solutions. Critically, none of these aims to improve the decision-making process itself; rather, they seek to improve the information that informs decisions.
Anonymous AI survey
A first step is to identify how much AI is used in your firm and how. That means asking people via an anonymous survey that can differentiate across levels and areas of the business. It won’t tell you everything, but it may highlight where it is used particularly heavily, and that’s important because these are the areas most at risk of AI’s positivity bias.
Create a discard pile
Knowing whether ideas have been rigorously tested can be difficult. But one sign to look for is what, if any, options were explored and then dismissed. So, explicitly ask people, “What alternatives and options were discussed and dropped, and why?” If people can give a good account of this, then it’s a good sign that proper analysis was undertaken. Importantly, this needs to become the norm – something people expect to do – so make sure you ask it every time. That way, it will encourage people to analyze initiatives thoroughly before bringing them to you.
Encourage uncertainties
Leaders often feel they need to sound clear and certain to project authority and evoke confidence in what they’re saying. But one way to combat the biasing effects of overly positive AI input is to be open about what you are uncertain of. So, rather than saying, “It is like this,” try saying, “There’s a roughly 80% chance that…” and then explain where the uncertainty comes from. This can be useful for both your team and stakeholders, as research suggests that reporting uncertainties can increase trust in your judgment.
Identify the counter
A related but simple idea is to always require people to state both why something should be done and what the risks are. It sounds basic and is certainly easy enough to do; the challenge is to make it a habit and do it consistently, so it becomes the norm across all teams in your business — so that someone always asks, “Why shouldn’t we do this? What’s the counterargument here?”
Marcel Schwantes
