Intentionally Co-Shaping Society

AI is a technology cluster, but it is also a cultural practice of building trust in autonomous systems. It is unique because it responds to and impacts people in an active process of co-shaping: machines learn from us and we learn from them. So long as there are gaps in our understanding of how to intentionally wield that process, AI is more likely to shape us in unexpected and uncontrolled ways.

How do we build AI worthy of our trust, and what do we risk when we trust AI?

This interaction is the crux of AI. To master it, we need to deeply understand the machinations of the machines that we build as they interact with societal structures. But more importantly, we have a social responsibility to control that process in defined, challengeable and open ways to avoid encouraging negative patterns of behaviour.

Trustworthy AI

Over time, our users develop patterns of reliance through interplay with our system. Concurrently, our system learns a 'better' internal policy based on a pre-selected objective that we have encoded. These two actors feed information and respond to each other in an active process of co-shaping that converges towards... something.

The pace and direction of that convergence are intrinsically linked to our users' trust in our system. Consequently, the degree of control that we can exert over that process is directly connected to the degree of trust that a person has in our technology.

Consider a chatbot for at-home pain assessment that asks:

💬 On a scale of 1-10, describe your pain. If the pain is a 9 or 10, contact emergency services.

If our users trust that system, they might:

  • self-censor at the clinic by saying that their pain is no more than 8 (or else they would have called an ambulance)
  • avoid going to the clinic entirely, depending on the chatbot's assessment of their pain

Our chatbot is simultaneously adapting itself towards whatever we have encoded as 'better'. So, how do we assess the pace and direction of co-shaping between our chatbot and our users? Where will it be in three months after tens of thousands of people have interacted with it?
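
One way to make that question concrete is to simulate the feedback loop. The sketch below is illustrative only: the Python code, the uniform pain distribution, the 50% self-censoring rate and the escalation threshold are assumptions, not a description of any real system. It shows how users who trust the escalation rule distort the very signal we would use to judge, and retune, the chatbot.

    import random

    random.seed(0)

    ESCALATION_THRESHOLD = 9  # hypothetical rule: a 9 or 10 means "contact emergency services"
    SELF_CENSOR_RATE = 0.5    # hypothetical: half of trusting users cap their report below the threshold

    def reported_pain(true_pain, trusts_system):
        # A trusting user who did not call an ambulance may cap their report at 8.
        if trusts_system and true_pain >= ESCALATION_THRESHOLD and random.random() < SELF_CENSOR_RATE:
            return ESCALATION_THRESHOLD - 1
        return true_pain

    def escalation_rate(reports):
        # The signal our chatbot would use to judge and retune its own behaviour.
        return sum(r >= ESCALATION_THRESHOLD for r in reports) / len(reports)

    true_pains = [random.randint(1, 10) for _ in range(10_000)]

    naive = [reported_pain(p, trusts_system=False) for p in true_pains]
    trusting = [reported_pain(p, trusts_system=True) for p in true_pains]

    print(f"escalation rate, users ignore the rule: {escalation_rate(naive):.2%}")
    print(f"escalation rate, users trust the rule:  {escalation_rate(trusting):.2%}")

The second number comes out at roughly half the first: the data we would feed back into making the chatbot 'better' has already been reshaped by our users' reliance on it, which is exactly why the pace and direction of co-shaping are so hard to read off the system alone.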

The pinnacle of trust is reliance. When a person truly trusts something, they will rely on it to do something that genuinely matters to them. Trust matters because it is deeply connected with adoption.

Trust is critical in determining how quickly, to what extent and in what direction that co-shaping leads us. Yet trust is difficult to engineer because it is an intensely personal experience of interrogation and imagination rather than a measurable characteristic of a system.

Elements of Trustworthiness

Trust is the extent to which a person commits to a narrative of reliance. Trustworthiness refers to all of the things that we do to make that narrative easy to construct and tell.

What makes a system worthy of trust?

We can take a limited view of trustworthiness through three modes of socio-technical analysis:

  1. Trustworthy by Knowledge of Limitation
  2. Trustworthy by Proxy of Authority
  3. Trustworthy by Resistance to Corruption
