Singularities in Pop Culture

What the movies get right and wrong

Author: Dr Charles T. Gray, Datapunk

Affiliation: Good Enough Data & Systems Lab


Mistaking heuristic systems for intentional systems

The Matrix (Wachowski and Wachowski 1999) almost has this right. We are trapped within a system of subsystems, and we can bend it.

As people, we exist in systems of people and computation. It is not only interaction with computation, but interaction with people, that can cause us harm.

However, the danger from other human agents in the singularity is a little less hyperbolic than the movies suggest. Instead, reframe the agent as the administrative worker you need to interface with, who is overwhelmed within the heuristics of their own singularity.

In that administrative worker’s world, their husband is not doing enough housework, the kid is in trouble at school again, everyone in the office is unhappy with the new email tool, they are starting to worry about their drinking problem, and here’s you, unsatisfied with a standard response. You can see that your slight deviation from the usual heuristic is reasonable, but because they are overwhelmed in their singularity, they cannot accommodate it.

The heuristic system caused a breakdown in the connection between one human and another; this is outside of humanity’s general intent of harmonious interoperation between people. In general, people tend to be the heroes of their own lives; most people mean well. There is no Architect determining your fate in the system; it’s just what Amos calls the churn.

It is better to think of people and tools interoperating in potentially chaotically emergent ways. It’s just your bad luck that your request of that administrative worker deviated from their usual heuristic in some way you did not anticipate. It made both humans’ days worse, but it was the heuristics of the system that produced the emergence.

Consider this system in terms of three things:

  1. People.
  2. Machines.
  3. Relationships between people and machines.


A category-theoretic way of measuring the stability of the system might be to ask:

How many of the people, machines, and relationships between them are operational?
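
To make the question concrete, here is a minimal sketch in Python, not a formal category; the names `Node`, `Relationship`, and `stability` are assumptions for illustration. It treats people and machines as objects, their working relationships as arrows, and scores stability as the fraction of all three that is currently operational.

```python
# A toy sketch only: people and machines as objects, working relationships as
# arrows, and "stability" as the fraction of all three that is operational.
# The names Node, Relationship, and stability are illustrative, not a standard.
from dataclasses import dataclass


@dataclass(frozen=True)
class Node:
    name: str
    kind: str  # "person" or "machine"
    operational: bool


@dataclass(frozen=True)
class Relationship:
    source: Node
    target: Node
    operational: bool  # e.g. the person can actually get what they need today


def stability(nodes: list[Node], relationships: list[Relationship]) -> float:
    """Fraction of people, machines, and relationships that are operational."""
    elements = [n.operational for n in nodes] + [r.operational for r in relationships]
    return sum(elements) / len(elements) if elements else 1.0


if __name__ == "__main__":
    worker = Node("administrative worker", "person", operational=False)  # overwhelmed
    you = Node("you", "person", operational=True)
    email_tool = Node("new email tool", "machine", operational=True)

    relationships = [
        Relationship(you, worker, operational=False),         # request deviated from the heuristic
        Relationship(worker, email_tool, operational=False),  # everyone is unhappy with the tool
    ]
    print(f"system stability: {stability([worker, you, email_tool], relationships):.2f}")
```

On this toy reading, the administrative-worker scenario above scores low not because anyone is malicious, but because two of the relationships are down.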

Humans are harmful to each other when they default to heuristic thinking; machines are harmful to humans when humans mistakenly expect machines to think with intent. All interoperations need to be governed.

We will never control all of it, but we can at the very least understand what we are and are not controlling within different contexts; once we understand these things, we optimise. That is bending the singularity.

Bending the singularity

When we understand which systems we are in, we can govern with intent; the system becomes more intelligent. We can see the different systems and think within them in a way that reorients how we consider our interoperability.

The Matrix was right: there is no spoon; category theory gives us rigorous methods for reframing the spoon in ways we can govern. Consider this: perhaps a particular tool is really frustrating to a developer. But when they look at the tool as one element of all the tools they have interoperated with in their life, they realise they would actually rather be a baker than write code. By reframing the tool as contextual within their singularity, the problem they were solving becomes about something else.

To do all their most important tasks, the crew had to enter The Matrix; we cannot solve the ethics of this debate through boycotts.

The Matrix has you…

We must accept that humans use tools like Facebook and ChatGPT, and study the emergences of naive interoperation so scientists can make informed, demonstrable recommendations to industry. In this way we lay pathways to governing humans toward intent, and machines toward constrained heuristics.

Governance is required

Dr Susan Calvin did not hesitate to fire the positronic gun when she was confronted with a robot that displayed emergences outside of the three laws.

And, again, instead of focussing on whether the robot’s emergence was human or not, consider that it was a question of emergence: the robot was operating outside of constraint in a way potentially dangerous to humans. In so doing, we move from fiction to reality. The lack of hesitation, however, is what should be grounded in reality: we hard stop any machine system displaying harmful effects on humans, and should, at the very least, be encoding the three laws of robotics into the systems we are operating within.
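
As a sketch only, and emphatically not a real safety mechanism, the hard stop might look something like this in Python; `HarmfulEmergence`, `governed_step`, and the harm flag are all assumptions for illustration.

```python
# A toy governance gate, not a real safety system: any action flagged as
# harmful to a human triggers a hard stop instead of being carried out.
# HarmfulEmergence and governed_step are illustrative names.
class HarmfulEmergence(Exception):
    """Raised when a machine's behaviour falls outside its constraints."""


def governed_step(action: str, harms_human: bool) -> str:
    """First-law-style gate: refuse, without hesitation, anything flagged as harmful."""
    if harms_human:
        raise HarmfulEmergence(f"hard stop: {action!r} flagged as harmful to a human")
    return action  # the action is allowed to proceed


if __name__ == "__main__":
    try:
        governed_step("send standard response", harms_human=False)
        governed_step("stall the train over the crossing", harms_human=True)
    except HarmfulEmergence as stop:
        print(stop)  # the system halts rather than continuing to operate
```

The design choice worth noticing is that the gate refuses first and asks questions later, mirroring Calvin’s lack of hesitation.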

Similarly, Atlas Shepherd did not hesitate to terminate robots displaying harmful intent to humans.

Now, reframe her interface with Smith as a governed intelligence structure, and Harlan as an ungoverned intelligence structure. Atlas’s journey was about learning that she could not reject technology, any more than we can escape The Matrix; but she will choose to interoperate in governed systems and guard against harmful emergence.

Reframing virtuous human-machine interoperation

We can govern our singularities to empower humanity, or we can allow chaos, that is, the Reapers, to reign, as in the worst-case ending.

Rather than a city-destroying robot, think of a train breaking down because it relied on some code that failed in a production pipeline. Think of the train stalled over a crossing, preventing people from getting to a hospital.

We can reframe Mass Effect’s max-Paragon ending out of science fiction by considering these representations of singularities as governed singularities.

Our mission: resist the heuristic chaos of The Matrix overriding the system of humanity.

Without governance, chaos reigns

Any developer, or any person on a team for that matter, will tell you that at kick-off, leadership are convinced the plan is well defined. However, any developer will also tell you they have never seen a well-defined plan. Every developer is living in a singularity of tasks that are required to be computationally isolated, perhaps changing a number from 3 to 4 in a particular file. All of these tasks are meant to reassemble into leadership’s vision. Despite the advances of agile, chaos still reigns. Development tasks are dinosaurs in the Jurassic Park of structured intelligence systems.

The result of ungoverned structured intelligence systems? Developers are blamed and traumatised by unfair demands; the wellbeing of humans in the system they interoperate within, the singularity, is not considered. This is shredding the very talent we need to solve critical problems facing humanity, such as climate change and inequality.

A virtuous singularity is possible

loop(Critical theory -> Mathematicians -> Developers -> Data scientists -> Decision makers ->)

Mathematicians need to formalise frameworks produced by fields such as postcolonialism, queer theory, and poststructuralism. These need to be instantiated in opinionated tools by those who can understand the categorical frameworks mathematicians develop for structured intelligence governance. Those opinions need to be reported in a way that serves those who have the power to make category-level changes to the singularities we all exist in.
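
As a loose sketch of the loop above, with the `run_loop` helper and the stage comments purely illustrative, the hand-offs can be written as a repeating pipeline in Python:

```python
# A loose sketch of the loop above as a repeating pipeline of hand-offs.
# The stage names mirror the diagram; run_loop is an illustrative helper.
from itertools import cycle

STAGES = [
    "Critical theory",   # produces the frameworks to formalise
    "Mathematicians",    # formalise them as categorical structures
    "Developers",        # instantiate them in opinionated tools
    "Data scientists",   # report what the tools reveal
    "Decision makers",   # make category-level changes, feeding back into theory
]


def run_loop(hand_offs: int) -> None:
    """Walk the governance loop for a fixed number of hand-offs; it has no terminal stage."""
    for step, stage in zip(range(hand_offs), cycle(STAGES)):
        print(f"hand-off {step + 1}: {stage}")


if __name__ == "__main__":
    run_loop(7)  # wraps back to critical theory and keeps going
```

The point of the `cycle` is that decision makers feed back into critical theory; the loop has no terminal stage.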

And, honestly, if we don’t, to put it in ’Strayan, we’re all farked, mayte.

References

Wachowski, Lana, and Lilly Wachowski. 1999. The Matrix. Warner Bros., Village Roadshow Pictures, Groucho Film Partnership.