Chemistry and human connection are at the heart of coaching, but that doesn’t exclude AI from the conversation. There is an expanding role for AI, but there is a long way to go, and developments should be followed with a healthy dose of suspicion.
Artificial Intelligence (AI) has attracted plenty of publicity in the past year, principally through the rollout of ChatGPT and its ever-expanding list of workplace capabilities, from drafting cover letters to providing feedback on reports and proposals. Above all, AI offers the opportunity to improve the quality of business decisions by drawing on a bigger, broader dataset than any human leader could manage. It can also help to ensure that personal preferences (which may be discriminatory) do not bias decisions. Given people’s unconscious biases, decision-making can be vastly improved by allowing computers to do the initial analysis of all the relevant data sources.
Business leadership is one thing. Coaching seems to offer a greater challenge to AI. An experienced human coach has the knowledge and empathy to explore emotions, make associations, and use intuition and imagination to help their client progress – all deeply human qualities. Given that these qualities are currently impossible to replicate, where does AI’s potential in coaching lie?
In general, AI has worked best when tackling complex but well-defined problems, like diagnosing rare sarcomas or identifying fraudulent financial transactions. AI works best if it can start with a ‘clean’ dataset where, e.g., all facts are guaranteed to be about the same individual or all statements are evidence-based and not made up. AI can add immense value in an ecosystem of truth (such as in medicine or accountancy) where data are factually correct and reliable. This is a risk for AI applications on the internet and inside organisations, where there is a lot of opinion and falsehood that AI may well base unhelpful conclusions on. Developers and users of AI in coaching have therefore focused on models and functions that work with data from a single client and where data can translate effectively into a digital format.
Personal chemistry can’t be automated
There’s potentially a role for AI at the outset of a coaching assignment, during the initial scoping exercise. AI can ask questions of each prospective coachee and reach conclusions about best next steps.
While AI can effortlessly organize coachee data, selecting the right human coach is a matter of psychology, not data. “At Hult EF, selecting a human coach is done by the coachee. An experienced specialist provides a shortlist of potential coaches, but the coach is selected by the coachee based on personal chemistry,” says my colleague Naysan Firoozmand, global head of coaching. “Chemistry is one of those things that you can’t easily quantify; it can’t be diagnosed through an assessment or a survey questionnaire or AI.”
Chemistry needs to be a meeting of equals and a process: a search for understanding of what is needed here and now, why, and what the potential outcomes and risks may be. We don’t need an answer delivered at lightning speed; rather, we need a tentative question and a slow, careful appreciation of that question from various angles. The best chemistry is already like coaching, so this meeting of equals continues as a slow, reflective process. The faster AI becomes, the further it stands from actual coaching!
Once the human connection has been established, it cannot be substituted. At the heart of coaching is the unique, exclusive relationship between coach and coachee. Once it has been formed, it cannot be replaced by another human or by technology. That would create what we would call a rupture: a break that needlessly damages or even wipes out the carefully established connection.
The AI coach can’t help with unexpected goals
In order to think about how AI might help, it is important first to understand what coaching does and what it achieves. During every coaching conversation I tend to focus on the content journey and the relational journey. The former is the step-by-step process of providing answers to queries, getting closer to stated goals, and even moving from a degree of anxiety towards more confidence and determination with regard to an issue or problem. The latter, the relational journey, is all about how content is mirrored in the relationship in the room, how partners get on, and how they mutually co-regulate. If this second journey is successful, we tend to see more agreement and more affinity in the room, as well as expressions of gratitude, confidence, and determination. Coach and coachee can learn from both journeys, and usually one does not move without the other. Therefore, an explicit assessment of the relational ‘mood’ in the room can help in evaluating the achievement of goals. This is called relational coaching and requires an ability to understand both content and relationship, and a degree of courage to speak honestly to both (De Haan, 2008).
This unique relationship in the room is the central part of relational coaching, which has an emotional and psychological focus. However, one can see a role for AI on the content journey, in what we call ‘goal-directed coaching’, because this is more of a linear process and can more easily be put into facts and words. The individual sets a goal, develops a plan, takes action, monitors and evaluates their performance, before adapting their actions to improve performance and attain that goal. “In this domain, an AI coachbot could coach as well as any basic human coach,” says Firoozmand. “It’s able to pick up on a much broader spectrum of potential questions and solutions associated with addressing a goal.”
An AI coach can ask questions about what needs to happen next, monitor progress, and respond to any input at any time. So, it could help members of staff who need a sounding board – by conducting a dialog to help them think through a business problem, such as setting up a new project. “There is hope that AI can complement the human coach in advancing the further democratization of coaching – reaching as many individuals as possible, which is one of our key commitments, because we believe that coaching can be so positive for so many people,” says Hult’s global head of coaching.
But coaching is not all about reaching measurable goals. Together with Nicky Terblanche from Stellenbosch University I was involved in a randomized controlled trial of AI in coaching (Terblanche et al., 2022). This longitudinal study tested the efficacy of a chatbot AI coach called Vici. An experimental group (n=75) used Vici for six months. Eight measurements on goal attainment, resilience, psychological wellbeing, and perceived stress were collected from the experimental group and a control group (n=94). Data were collected at baseline, after each of the six chatbot usage months, and three months later. We found that the experimental group showed a statistically significant increase in goal attainment, while all other, more psychological measures yielded non-significant results; in other words, the chatbot was no better than no coaching at all on the second, relational journey.
In other words, in moving toward your goals, the AI application was as good as a human coach. But when it comes to establishing empathy or promoting well-being, on the psychological measures, the AI didn’t move the dial. This provides some indirect support for the idea that human coaches are aware of the importance of maintaining a strong coach-coachee relationship and that they do more than facilitating a content journey. Their generally supportive and illuminating relational dimension could positively influence aspects such as wellbeing.
AI might be very good with goals, but this can also come at a price. In order for AI applications such as Vici to help, certain key assumptions need to be fulfilled which are not always true: in order for AI to contribute, coaching goals need to be stable and represented well by simple, factual words and discrete steps. However, complex goals often vary from session to session (and even from moment to moment!) and cannot necessarily be broken down into subgoals and/or simpler steps. Moreover, there may be an important ‘analogue’ or ‘somatic’ understanding in the room, where goals are not just communicated by words but also by the body and gestures such as responding to one’s gaze and barely visible nods or shrugs.
Another, more human, attribute is the ability to respond to the unexpected. What if the goal the person wants to achieve has not been formulated? In those cases, the coaching may not be about doing something differently; it may be about being someone different, or having a different mindset or a fresh attitude, which happens in the here and now and is not goal-oriented.
Acting as another ear in the room?
There are a few more extreme, potentially useful applications of AI in the coaching room, which, however, could also have more serious ethical ramifications. I have always thought that the moment-by-moment ‘material’ in a coaching conversation is a worthy and promising area of study (see, e.g., my book about moments of coaching; De Haan, 2019). In the study of this material AI could also make powerful contributions, although it would mean letting AI into the ‘sacred’, confidential space of coaching conversations.
Although my colleague Naysan Firoozmand does not see AI taking over from the human coach (yet), he can envisage an AI application supporting the coach and coachee by listening in on coaching sessions and logging the common themes that emerge. “It’s like a third-party ear in the room,” he says, “unbiasedly capturing the nuances or trends that are happening in the dialog, so you start to tap into things you might not have considered or might have dismissed as noise.” See Bridgeman & Giraldez-Hayes (2023) for a first application of this idea.
By analysing these trends, the ‘AI listener’ could also help to train new coaches, enabling them to improve their coaching style. “If you spoke for, say, 60% of the time, consider speaking less,” says Firoozmand, “so that provides a narrative for future conversations.” But any such assistance should be used safely and transparently. Confidentiality is vital and anything that undermines the exclusivity of the relationship is problematic.
This brings us to the ethical issues around the use of AI. Bringing AI into the coaching room may seem similar to bringing in a simple recording device (such as a notebook!) that can produce a permanent ‘audit trail’ of sessional themes outside of the coaching room. However, the risk of feeding into a huge database that can be put to unethical uses (e.g., through hacking), or of linking personal data to the recording, is much greater here precisely because of AI’s ability to handle vast amounts of data at lightning speed. In the current trend towards opening up information for all sorts of bots and AI to use freely, it is very hard to see how AI can ever earn a place inside a confidential coaching room.
Another ethical consideration comes from the popular idea of ‘democratization’ of coaching, especially in Silicon Valley. If we are not careful, this may really mean that software engineering companies back the building of platforms aiming to take massive market share, whilst paying coaches peanuts and offering cookie-cutter coaching (based on simple, stable goals, like Vici) with the sole aim of getting warm and fuzzy ‘likes’ on social media. By contrast, independent coaches can democratize coaching right now by reducing fees for underserved groups and by offering pro bono coaching.
There is a long way to go for AI, and grounds enough to follow developments with a healthy dose of suspicion.
References
Bridgeman, J., & Giraldez-Hayes, A. (2023). Using artificial intelligence-enhanced video feedback for reflective practice in coach development: benefits and potential drawbacks. Coaching: An International Journal of Theory, Research and Practice, 1-18.
De Haan, E. (2008). Relational coaching: journeys towards mastering one-to-one learning. Chichester: Wiley.
De Haan, E. (2019). Critical Moments in Executive Coaching: Understanding coaching process through research and evidence-based theory. London & New York: Routledge.
Terblanche, N., Molyn, J., De Haan, E., & Nilsson, V. O. (2022). Coaching at Scale: Investigating the Efficacy of Artificial Intelligence Coaching. International Journal of Evidence Based Coaching & Mentoring, 20(2).
Erik de Haan is Director of the Hult Ashridge Center for Executive Coaching and Professor of Organisational Development and Coaching at VU University Amsterdam. He has published nearly 200 technical and research articles and 16 books covering his major areas of expertise as a leadership and organization consultant, facilitator, supervisor, and coach.