We still don't understand one another

Photo credit: Vardan Papikyan via Unsplash

In 1864, Charles Babbage wrote, ‘On two occasions I have been asked, — "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" . . . I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.’

It is worth looking at that quotation twice. The first time we see the stupidity of people asking a blatantly ludicrous question. How could anyone imagine that the machine knows what the right numbers are supposed to be? The second time we see the complacency of the technical expert assuming an unrealistic level of understanding in his audience. How could anyone imagine that the audience knows what a brand new machine is capable of?

Technical people and non-technical people have been misunderstanding each other since before computers even existed, and we keep on misunderstanding each other today. Despite years of multi-disciplinary teams, digital literacy programmes, standups, backlogs and other tools and rituals, it is common to find groups of people who don’t understand how each other work and think, and who routinely frustrate each other.

Technical people often wish that non-technical people would express themselves clearly and precisely, say what they want, show some curiosity and interest in what is being built, and understand the constraints and unpredictability of software development.

Non-technical people often wish that technical people would explain themselves in simple language, pay attention to what is needed, stop talking about the obscure details of technology, and say when the thing will be finished.

This lack of understanding and empathy has been going on for so long that it might seem that we will never find a remedy: we will just exist as sullen, separate groups. But we haven’t failed until we give up . . .

Perhaps AI can help. Not in the sense that we should ask AI what the answer is: that just results in the usual bland synthesis of received wisdom. But maybe the experience of using AI to build solutions can foster empathy between technical and non-technical teams.

A non-technical team, encouraged by coverage of phenomena such as vibe coding, may try using an AI tool to build a solution from scratch, without relying on a technical team. That is likely to lead to some fast and hard lessons: that precision matters, that technical expertise matters, that you don’t really know what you want until you start building it, that software development is not just coding, and that technology gets very complex very quickly. The fast turnaround between prompt and result, and the ability of tools to show their chain of activity, give non-technical people a glimpse into the work that technical people do - and, I hope, a new respect for it.

A technical team using new AI development tools has to get to grips with the concept of prompting rather than coding. They are likely to have an experience in which the tool keeps getting it wrong, even though they think that they have explained their needs clearly, in which the tool does weird things that they didn’t ask it to, and in which the time taken to get to a result is elastic and unpredictable. In short, they experience something of what it is like for a non-technical person to get a technical person to do what they ask. AI tools are often compared to an enthusiastic intern: they can also sometimes resemble a recalcitrant and obstinate developer.

Using AI is not, on its own, enough to get technical and non-technical teams to understand and respect each other. But the adoption of these tools creates a moment where our relationship with technology is changing - as is our relationship with each other. Everyone is talking about AI: why not use AI as a prompt to talk about relationships between teams?
