On Being Curious About Artificial Intelligence

By Ax de Klerk | 17 Dec 2025

Introduction

What draws me to AI is not novelty, nor the illusion of intelligence, but the way it quietly interferes with how expertise has traditionally been understood. It challenges long-held assumptions about learning, mastery, and legitimacy, not by breaking them outright, but by rerouting them. The discomfort this creates is not a flaw in the technology. It is an ethical signal, asking where understanding now lives, and who is responsible for it.


1. The Moment of Unease

There is a moment most people experience when they first engage seriously with AI, a flicker of surprise followed by a quieter unease. Not fear, exactly, but a sense that something familiar has shifted. The interaction feels less mechanical and more conversational, as though the system is reflecting something back rather than simply responding. That moment is where the fascination begins for me.


2. Learning Turned Backwards

Public discussion around AI tends to fixate on speed, automation, or replacement. Faster outputs. Fewer barriers. Less time spent learning the hard way. But those framings miss the deeper change taking place. AI does not simply alter what can be produced; it alters how learning itself is oriented.

For centuries, expertise followed a largely linear path. Process and theory came first. Foundations were built patiently. Mastery was something you moved towards rather than something you started from. Whether in philosophy, carpentry, music, or the kitchen, the assumption was the same. Learn the method, and the outcome will follow.

AI quietly inverts that logic. It is now possible to begin with the outcome and work backwards. This shift introduces an ethical tension that is often misunderstood. It is not that people no longer need expertise. It is that expertise is migrating. Less emphasis is placed on memorising process and theory, and more on recognising what good looks like when it appears.


3. Cooking, Judgement, and Outcome Knowledge

Cooking makes this easier to understand, particularly through lived experience. As a trained and experienced chef, my confidence in the kitchen never came from memorising recipes. It came from knowing the dish before it existed. I knew how it should taste, how it should smell, how the texture should feel, and what the final plate needed to communicate. Early on, recipes were essential. Over time, they became scaffolding rather than instruction.

Eventually, the process reversed. I stopped cooking from the beginning forwards and started cooking from the imagined end backwards. Ingredients were adjusted instinctively. Techniques were chosen deliberately. The route mattered less than the destination, because the destination was clear. That ability was not a shortcut. It was the result of deep, embodied learning, internalised to the point where the process could be reconstructed when needed.

AI invites the same reversal. It allows people to specify the destination first and explore routes later. Used responsibly, this can accelerate learning by exposing gaps, testing assumptions, and encouraging iteration. Used carelessly, it produces convincing surfaces with no depth beneath them. This is where ethics enters, not as a brake on technology, but as a demand for discernment.


4. Dark Country, Echoes, and the Ethics of Meaning

The same tension appears in music, particularly in darker genres. Take ‘Dark Country’, a style built on loss, fatalism, moral decay, and haunted interiors. Traditionally, its weight comes from lived experience. The voice carries history. The songs feel earned. When a human sings about damnation, it resonates because something has been risked.

AI-generated Dark Country complicates this. The system does not suffer. It does not remember. It does not fear. What it does is learn how suffering has been described. It recombines the language, imagery, and atmosphere that humans have historically used to express despair and produces something that sounds like experience without having one.

And yet, the listener still feels something. That is the unsettling part. The emotion does not originate in the machine. It completes itself in the human. Meaning is not generated by the system; it is projected onto it. In that sense, AI is not a storyteller with scars. It is an archive that learned how scars are described.

Dark Country almost welcomes this eeriness. The hollowness, the disembodied voice, the sense that something is not quite right: these qualities already belong to the genre. An emotionless system singing about emptiness becomes part of the aesthetic. A digital ghost, echoing human grief back to us.


5. Responsibility Over Authenticity

This is why the ethical debate cannot stop at authenticity. The real question is responsibility. When outputs are easy to generate, judgement becomes the scarce skill. When form is abundant, meaning must be curated. AI does not ask whether something should exist or whether it carries weight. It simply produces. The ethical burden remains human: deciding what to accept, what to reject, and what to treat with care.

Seen this way, AI does not erase expertise. It demands a different kind. One rooted in recognising quality without relying on process, understanding outcomes deeply enough to work backwards, and knowing when something sounds right but means nothing.


6. Conclusion: What Actually Matters

The fascination with AI is not about intelligence. It is about what happens when humans are given tools capable of reproducing form without experience and are forced to decide what meaning still requires. AI can echo a voice, shape a sound, or assemble an answer. Only humans decide whether it is worth listening to.


7. Footnote: Reading This Alongside the Next

This essay focuses on how Artificial Intelligence reshapes learning, expertise, and responsibility by rerouting how outcomes are reached. It asks where understanding now lives when process is no longer the primary gatekeeper of mastery.

The following essay turns toward meaning. It explores how AI-generated art, particularly within darker musical forms, functions less as a creator and more as a mirror, reflecting human experience back to the listener without having lived it.

Read together, these pieces are not arguments about technology, but reflections on judgement, projection, and the increasingly human task of deciding where meaning still resides.