
November 6, 2025

where does the value of the human-in-the-loop fit in the product strategy process, truly?

I’m struggling with something that feels rather existential at the moment. Everything about building products, from strategy to UX to design to development, is at an AI-instigated crossroads, and we’re sprinting towards the end of work as we know it in the product space. How we did it a year ago is already dead, but how far will it go? What will the required skills look like, teams look like, timelines look like, in 12 months? Things are moving so fast that I can’t say with any level of certainty that more than one human will even need to be involved in product in a year. I am not confident in that, but I also wouldn’t be surprised.

Article as originally posted:

As I’m sure everyone has been in their own right, I’ve been really focused on how to speed up, streamline, and in some cases completely reimagine how we work at tension: where can we offload to AI, what can we remove, what current processes can be deprecated, and where do we really need the human in the loop to ensure we’re delivering the same quality, craftsmanship, innovation and strategy as we always have?

This is exactly the question I am wrestling with:

Is it enough to have humans review and validate the outputs from an AI, editing, massaging and augmenting where needed,

or,

is the process itself, the chaos, the time it takes, the pauses between activities, the brief silences in a workshop or meeting, the banging your head against the wall, perhaps NOT an inefficiency, but a truly necessary and active part of innovation and ingenuity itself? What I am asking myself is this: is being in the mess a required catalyst to solving the mess?

I don’t know if I am overvaluing my own creativity, or putting too much weight on how I personally get to those Aha! moments that are keystones in how we conceive something that we’re proud of, and that will drive results for our clients.

Is confusion the real context? When I’m talking things out with clients or with their customers and I don’t understand something, that’s when the brain kicks it up a gear. Questioning, digging, proposing. Sometimes correctly, often incorrectly. But it’s through this back and forth that clarity emerges and the seeds of meaningful ideas are sown.

Is an idea just that? Something that’s planted, its soil fertilized, watered and tended to, and given enough sun until it blossoms? Is the process, its duration, its starts and stops, its ups and downs, simply thought-nurturing?

In most engagements, you never know where those keystone realizations might occur, though they always do. It could be in a stakeholder or customer interview, or in a workshop, or a prototype testing session. It could be when the first hit of caffeine from your morning coffee reaches your brain two days after you heard something. Or while in the shower. It could be a Costanza moment, happening 15 minutes after the call ends (the absolute worst).

I don’t know, not really, whether those realizations require only a single moment or input, or the gradual and continuous ratcheting up of smaller understandings and realizations that one gets over a period of time while going through the linear, human-only, arduous process of 1:1 interviews, workshops, and research. Is it the relatively slow process, or perhaps time itself, that allows that marinating of thought to occur, or is this my mind simply trying to justify how we used to do things? I don’t think it is.

If we were to feed a well-architected multi-agent system with the right contexts, guardrails, and the same questions we ask ourselves, plus the ability to recognize that it doesn’t yet understand enough and thus to dig deeper, to make leaps of faith in proposing conceptual understandings or ideas and to learn from them whether they were right or wrong, and the ability to combine and compound ideas and insights, would it not be able to get exactly where we as humans would get, just in a fraction of the time?
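To make that question concrete for myself, here’s a rough, purely hypothetical sketch of the loop I’m describing, written in Python. Nothing in it is a real API: ask_model, self_assess, the confidence threshold and the round limit are all stand-ins for whatever model calls and guardrails such a system would actually be built on.

```python
# A minimal sketch, under the assumption that "understanding" can be scored
# and that digging deeper means asking and answering follow-up questions.
# Every function here is a placeholder, not a real library call.

from dataclasses import dataclass, field

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for "I understand enough"
MAX_ROUNDS = 5              # guardrail: the ratchet has to stop somewhere


@dataclass
class WorkingContext:
    brief: str
    insights: list[str] = field(default_factory=list)
    open_questions: list[str] = field(default_factory=list)


def ask_model(prompt: str) -> str:
    """Placeholder for a call to whatever model or agent you would actually use."""
    return f"(model response to: {prompt})"


def self_assess(ctx: WorkingContext) -> float:
    """Placeholder: the system scores how well it understands the problem so far."""
    return min(1.0, 0.3 + 0.15 * len(ctx.insights))


def dig_deeper(ctx: WorkingContext) -> None:
    """Ask a follow-up question, attempt an answer, and keep it even if it is wrong."""
    question = ask_model(f"What do we not yet understand about: {ctx.brief}?")
    ctx.open_questions.append(question)
    answer = ask_model(f"Propose an answer, right or wrong: {question}")
    ctx.insights.append(answer)  # wrong answers still move the ratchet


def propose(ctx: WorkingContext) -> str:
    """Attempt the leap: combine the accumulated insights into a concept."""
    return ask_model(f"Given {ctx.insights}, propose a concept for: {ctx.brief}")


def run(brief: str) -> str:
    ctx = WorkingContext(brief=brief)
    for _ in range(MAX_ROUNDS):
        if self_assess(ctx) >= CONFIDENCE_THRESHOLD:
            break
        dig_deeper(ctx)
    return propose(ctx)


if __name__ == "__main__":
    print(run("Rethink onboarding for a B2B analytics product"))
```

The open question the rest of this piece circles is whether the leap inside propose() is something a loop like this can actually make, or whether it only falls out of having lived through every round of dig_deeper() yourself.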

For most, probably. For us, I doubt AI could, at least today. Yes, I obviously think quite highly of myself and my team. But tomorrow? At the speed things are changing, it could literally be tomorrow. Who knows.

Now, let’s say we had a system in place that did exactly what we suggested above, and it did it really well, only it just couldn’t make those leaps of creativity and arm’s-length connections. Is the human in the loop able to make those leaps when reviewing the output alone, or is being knee-deep in it the whole time a requirement for jumping creatively?

Would we be denying ourselves the pain, the awkwardness, the uncertainty, the exploration, and especially the being wrong that, I feel in my heart of hearts, are core ingredients to truly and creatively solving messy problems?

Is my thinking that it can’t make creative leaps just my affection for the flawed reality of the human condition, in its drive for creativity, speaking? Or is that flawed reality a requirement for true creativity, regardless of human or AI? More importantly, do the same rules apply to AI, or am I merely projecting my own flaws onto it? Can AI have those ratchets, just in fractions of a second, where our human minds, distracted by all the other things, can take anywhere from moments to multiple days?

Are we in fact perceiving this wrong? Am I simply trying to attribute traditional human thinking models and behaviours to AI, when it’s more akin to quantum mechanics?

As I write this, I am starting to wonder: is AI output reviewed by a human who hasn’t gone through the process themselves the equivalent, in terms of the quality of the final output, of a human giving an AI a partial, incompletely contextualized prompt?

Perhaps the question at its true core, I now think, is not ‘where do humans fit in the process’ at all. The question may very well be: in the pursuit of creativity via the symbiotic relationship between humans and AI, is AI the problem, or are we?

Exacerbating the problem: will AI just make us creatively lazier, significantly weakening our thinking and questioning muscles, until we no longer bother to challenge, no longer strive for the different, the new or the exceptional?

I believe it will. Which will make us even more of a problem than we are today.

Fuck.
