It is impossible not to notice that the pace of change in “AI” development and deployment has accelerated. Here I am talking about AI in a very broad sense, to include weak AI and any forms of machine learning that support the automation of human activities; the breadth of this definition, and any gaps in my own knowledge, should neither distract from nor affect my core argument. Most of the concerns about AI that I read focus on either
- Short-term negative impacts, such as algorithmic bias; or
- Long-term negative potential, such as the problem of AI alignment.
I take concerns in both of these timeframes seriously, and I believe they need far quicker and more flexible policy responses than we currently see. However, my personal concern is with the medium term, and it’s something that I don’t see much discussion of, or even acknowledgement of: the possibility that AI deployment in inappropriate fields will degrade the fabric of human society to such an extent as to render it untenable.
Human society – every human society – is woven from threads of human interaction. Everything that we value – and everything that we despise – is woven from these threads, tying us together in relationships wide and deep. Our institutions are woven from these threads, and at the same time they produce the conditions that enable that weaving to take place in more constructive ways.
Technology – in the narrow machine sense – tends to reconfigure these human relations in odd and unexpected ways. In general technology amplifies human capability, and the core human capability is connection with other humans – the social skill that enables collective endeavour, for better and for worse. At the same time, however, technology also produces action at a distance, which tends to push people apart – again, for better and for worse.
An example: self-checkout using contactless payment cards is a logical step in retail, driving down costs while producing a frictionless experience; but the friction it removes is not just the need to fumble for notes and coins, but also the need to interact with a cashier. Maybe the cashier is not somebody you care to interact with, and vice versa; but the weave is made up of interactions great and small, good and bad. Remove one, and you weaken the weave.
Many areas of human endeavour are based primarily on such interactions, to the extent that the more you remove the component of interaction, the more you degrade the endeavour itself. That degradation might not be noticeable at first, especially if it is deliberately obscured by a superficially more pleasant experience – such as making it easier and quicker to pay for things – but it is happening, and by the time you notice, it may be too late.
Most often the degradation is obscured by the language of efficiency, which is another way of describing the removal of friction from our interactions. This friction costs money, which governments and corporations generally want to avoid, and so they tell us that we also want to avoid it, that our lives will be better if they are more efficient, that we will achieve more in less time, and so be more productive and have more free time for what we really desire.
What we really desire, of course, is to connect with other humans. It’s the desire that facilitates all the other desires: food and shelter, for example, are only sustainably possible in cooperation with other humans; emotional and sexual desire can only be satisfied by other humans, whether we want that or not; and our higher-order desires such as status-seeking and self-expression emerge entirely from our social milieu.
I believe, then, that the introduction of AI into any activity that is predicated on human interaction degrades and eventually destroys the very activity it is supposed to facilitate, and I suspect that this is inherent to the nature of the technology itself. I further believe that widespread deployment of this technology will degrade such a wide range of human activity in this specific way that it will become impossible to sustain what we regard as “society”.
Let me talk about something very simple: online interactions of the type that almost all of us engage in every day – shadows of real-life interaction, but interaction nonetheless. Some people struggle with these interactions, such as those with poor literacy, and their lives might be made better by a bot that can formulate their calls and responses for them, facilitating their personal lives or their business success; and we’d agree that this would be good.
Yet such technology is unlikely to be used only by those people; it will inevitably be made available more widely. On some of the messaging apps I use, I am already given a choice of automated responses, such as “I’ll call you later”; these choices will be widened to let me respond with generic answers to generic questions – perhaps because I don’t want to talk with my parents at that moment, or because I want to avoid a friend for a day or two.
My parents and friends will also deploy similar bots of their own, and these bots will become more sophisticated, to the point where it is difficult to distinguish between human and machine in online interactions; and so we will find ourselves in a situation where, at any given moment, we will not know whether we are interacting with a human or their machine – essentially, we will not know whether we can trust whoever or whatever it is we are interacting with.
When we consider that artificial intelligence can already generate audio and video simulacra of people, as well as text, images and sounds that mimic human creativity, and that all these are likely to be deployed at scale in a similar way, we can start to see the shape of the problem. We will begin to assume – we will have to assume – that all of our online interactions are with ghosts, even those that are actually with humans.
This suspicion will be replicated across all of our online interactions, in all our endeavours. Was this piece of art made by a human or generated by a machine? Was this story written by a human or generated by a machine? Is this online class being taught by a human or generated by a machine? Is this online consultation being conducted by a human or generated by a machine?
This in turn will affect our real-life relationships. How will we begin to feel about our nearest and dearest in real life, as our online interactions with them are increasingly mediated by machines? How will we feel looking at art in a gallery once we have been conditioned to think about art differently – not as the inspired product of human talent, but as the mediated product of machine generation? At the very least, we will feel differently than we did before.
What will be missing is trust: we will no longer know whether our respondents – whether individuals or institutions – are being honest with us, or whether they are using machines to obscure and deflect. Think about how you feel when you call a company and are placed in a queue while a pre-recorded voice massages your expectations. You no longer believe that company really cares about you; now imagine feeling that way about your family, your friends, everybody.
And so I believe that these developments will unpick the weave that constitutes human society, thread by thread. It won’t happen suddenly, and it won’t happen all at once; it will be a slow and subtle process which we will barely notice until we suddenly realise that the carpet we’re walking on has worn through, that the moths of mechanical intelligence have eaten clear through it in places, and that it’s no longer fit for purpose.
Broadly, then, I’m against introducing machine intelligences into the activities which constitute the threads of human interaction, but unfortunately they have already been released into a range of such activities, including games, art and teaching. Our society emerges from the interaction between artist and audience, between teacher and student, between friends at play; degrade those interactions, and that society no longer emerges.
Surely there is another, different society on the other side of this revolution; the question is whether you or I will want to live there.