Be More Human?

Written by

Simon Stoker

"Be more human."

Scroll any talent acquisition feed and you will find it, in posts about AI and in conference themes. The instruction is well-intentioned, and the diagnosis is likely right.

But as design guidance, it stops too soon.

Being human is an orientation, an attitude. It does not describe how a team is structured or where the emotional load of hiring sits.

That is a design question. And here is my hypothesis.

Most hiring systems have no design for emotional load. Emotional efficiency is the ability of a hiring system to carry emotional load cleanly and convert it into decisions, rather than letting it accumulate as drag. Most TA functions have never been asked to design around it, and the cost of that absence is higher than most organisations realise.

Where does the load go when the system cannot carry it? Into requirements that keep shifting because the real constraints were never surfaced. Into the gap between what a hiring manager says they want and what they will recognise when they see it. None of this appears in a capacity model or is assigned to a role. It accumulates, and it finds its own level.

Consider what that means for AI.

The working assumption is to automate the transactional and keep humans in the meaningful moments. That is a good place to start. The problem is that "meaningful" is carrying all the weight, and nobody has defined it clearly enough to build from.

A more useful question: what is the emotional weight of this interaction, and what do the people involved experience when technology handles it?

AI video interviewing removes reciprocity. Reciprocity is what makes an interview feel like a conversation rather than an audition. That is not an argument against the technology. It is an argument for understanding what you are trading when you use it and whether that trade-off was ever made explicitly or simply defaulted into.

Most automation decisions are made on volume and cost. The emotional stakes of each interaction rarely come into it until later, when candidate experience deteriorates while process metrics look fine, and the load the system was never designed to carry ends up concentrated on the people least protected from it.

Most organisations are likely to keep making automation decisions on volume and cost. The emotional load will keep going somewhere. Emotional efficiency as a design dimension does not solve that automatically, but it at least makes the choice conscious.