
The Gentle Singularity and Self Improving AI

  • Writer: Yoshi Soornack
  • Jun 16
  • 3 min read

“The economic value creation has started a flywheel of compounding infrastructure build-out… robots that can build other robots (and in some sense, datacenters that can build other datacenters) aren’t that far off.”

Sam Altman, The Gentle Singularity


There’s a particular phrase tucked into Sam Altman’s latest blog that feels small on the surface but vast in implication: “a larval version of recursive self-improvement.”


It’s easy to read The Gentle Singularity as another hopeful reflection on superintelligence. But beneath the optimism lies something more layered. Altman doesn’t just gesture at smarter models. He points to a world beginning to improve itself, not only in code but in the systems that create and distribute that code.


The intelligence isn’t just getting better. The entire machinery behind it is beginning to self-organise.


Loops in motion


Recursive self-improvement is often spoken about as a technical feedback loop. AI writes better code, the code improves the model, the model improves the code. Yet what’s unfolding now moves beyond the software stack.
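The loop described above can be sketched in a few lines. This is a purely illustrative toy, not a model of any real system: it just shows why improvement that scales with existing capability compounds instead of adding up.

```python
# Toy sketch of a compounding improvement loop (purely illustrative):
# each generation's capability helps build the next, so gains compound.

def run_loop(generations: int, capability: float = 1.0,
             gain_per_step: float = 0.1) -> list[float]:
    """Each step improves capability in proportion to what already exists."""
    history = [capability]
    for _ in range(generations):
        # The improvement itself scales with current capability:
        # better tools build better tools.
        capability += gain_per_step * capability
        history.append(capability)
    return history

trajectory = run_loop(generations=10)
print(f"start: {trajectory[0]:.2f}, after 10 loops: {trajectory[-1]:.2f}")
```

With a fixed 10% gain per step the curve is geometric, which is the point: once the output of one generation feeds the next, even modest per-step gains stack multiplicatively.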


Altman suggests that the first wave of humanoid robots will likely be built the traditional way. But the second wave could be assembled by robots themselves. The same holds for data centres. Initially constructed by people, they will eventually be expanded and maintained by intelligent systems.


This is not just recursion. It is recursion nested within infrastructure. Each generation builds the next, not just in software but in steel and silicon.



Intelligence at the cost of electricity


At the heart of this shift lies energy. As automation scales and human labour is displaced from production loops, Altman notes that the cost of intelligence will approach the cost of electricity.


In essence, cognition becomes a utility. Thinking could become as abundant and cheap as light.


That prospect reframes everything. If intelligence can be replicated, distributed and scaled like power, the bottlenecks of progress shift. Problems once limited by expertise or labour suddenly become solvable at scale. Science, medicine, construction, education, all enter a phase of accelerated transformation.
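The "cost of electricity" framing can be made concrete with back-of-envelope arithmetic. The numbers below are hypothetical placeholders (not measurements from any real system): if a query's marginal cost is dominated by the energy it consumes, pricing it looks like pricing a utility.

```python
# Back-of-envelope sketch: if the marginal cost of an AI answer is
# dominated by electricity, it can be priced like a utility.
# All numbers below are hypothetical placeholders, not measurements.

JOULES_PER_KWH = 3_600_000  # 1 kWh = 3.6 million joules

def marginal_cost_usd(energy_joules: float, usd_per_kwh: float) -> float:
    """Electricity cost of one inference, given its energy use."""
    return energy_joules / JOULES_PER_KWH * usd_per_kwh

# Hypothetical: a query consuming 3,000 J at $0.10 per kWh
cost = marginal_cost_usd(energy_joules=3_000, usd_per_kwh=0.10)
print(f"${cost:.6f} per query")  # a small fraction of a cent
```

Under these assumed figures the marginal cost lands well below a cent, which is the intuition behind "thinking as abundant and cheap as light": the bottleneck moves from expertise to energy supply.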



A curve, not a cliff


Altman is careful to avoid the cliché of an explosive singularity. From his perspective, the shift will not feel abrupt.


“From a relativistic perspective, the singularity happens bit by bit… one smooth curve.”


The outside world might interpret this transformation as fast and jarring. But from within, progress will feel steady, perhaps even manageable.


The future will not arrive as a sudden break. It will seep into the systems we already use.



The human pulse


Altman closes with a reflection that feels both grounding and quietly radical.


“People have a long-term important and curious advantage over AI: we are hard-wired to care about other people… and we don’t care very much about machines.”


This point is easily missed. If recursive systems begin to improve themselves across digital and physical realms, then values, not just performance, become the critical axis. The system may accelerate, but we decide what it aims for.


Human judgement becomes the balancing force. The flywheel may spin, but we choose what it turns towards.



Governing the self-made world


This is where the wider questions begin. If recursion escapes the lab and enters the world, if machines begin to build the tools that build them, how do we ensure alignment? Who decides the guardrails for systems that no longer rely on people for every turn of the crank?


Here are two questions to reflect on:


How will governance adapt when infrastructure takes on its own replication?

At what layer does alignment feedback best anchor this flywheel?


These questions are not abstract. They are timely. Already, we are seeing hints of autonomous factories, self-correcting codebases and AI-designed chips. These are the early loops. Each one reduces dependency on humans. Each one raises the stakes for design, alignment and intent.



The gentle singularity


If Altman is right, what we are witnessing is not a leap toward sentience but the layering of feedback loops across code, matter and energy.


This is not just AI becoming smarter. It is AI becoming a builder of its own ecosystem.


The singularity may not arrive as a conscious entity. It may arrive as a pattern. Recursive, compounding, quietly spinning in the background.


Rabbit hole

Read Sam's full blog here