George Gilder just reached a huge audience with an idea that might sound familiar to you.
In a recent Wall Street Journal essay, he argued that the age of the microchip — the very technology that built Silicon Valley — is coming to an end.
Now, if you don’t know George like I do, this might sound like utter nonsense.
But for decades, he’s been ahead of the curve on calls like this.
George predicted the rise of the internet long before Wall Street did. He warned Bill Gates that web browsers would upend Microsoft’s software monopoly. He even foresaw a new computing universe based not on faster chips but on endless bandwidth, long before most people thought it possible.
Now he’s doing it again. And this time, millions of Wall Street Journal readers got a glimpse of what we’ve been talking about for months…
What just might be the next big leap in computing.
A Computer the Size of a Dinner Plate
In his WSJ essay, George argued that the microchip is still extremely important to the U.S.
The U.S. government considers chips vital and strategic. The 2022 Chips Act authorized more than $200 billion to support chip fabrication in the U.S. and keep it away from China. Microchips shape U.S. foreign policy from the Netherlands, home of ASML, the No. 1 maker of chip-fabrication tools, to Taiwan and its prodigious Taiwan Semiconductor Manufacturing Co.
But he also notes that the microchip’s design hasn’t changed much since the 1970s.
Engineers still carve a silicon wafer into hundreds of smaller chips, package them individually and wire them together inside data centers.
That system has worked for half a century. But it’s hitting its limits.
That’s why George and I are so excited about wafer-scale chips.
Image: Cerebras
These revolutionary single-wafer computers flip the old microchip model on its head. Instead of slicing the wafer, the whole disk becomes one massive processor. Every transistor stays connected on a single surface, letting data move at lightning speed.
It’s like a computer without borders…
One giant piece of silicon where memory, logic and communication all live together.
That’s the vision behind companies like Cerebras Systems, which builds 12-inch wafer-scale processors holding 2.6 trillion transistors and 850,000 AI cores. The Department of Energy has been using them for nuclear fusion research and advanced physics simulations.
And as George and I discussed recently, it’s also what Tesla implemented with its Dojo supercomputer, a custom-built AI training system using wafer-scale tiles to train autonomous-driving models.
That concept lives on in Tesla’s upcoming AI6 unified AI chip.
And George believes this kind of architecture will eventually replace the microchips that dominate AI computing today.
I agree with him. At least in the long run. But for now, the reality is that wafer-scale chips have limits too.
They can handle AI models with up to about 100 billion parameters. That’s impressive, but far smaller than something like ChatGPT, which reportedly runs on 1.8 trillion parameters. That’s because wafer-scale chips can’t yet pack enough memory close to the processor.
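To see why memory is the bottleneck, here’s a rough back-of-envelope sketch (my illustration, not from George’s essay), assuming each parameter is stored as a 16-bit number, i.e. 2 bytes per parameter:

```python
# Back-of-envelope: memory needed just to hold a model's weights,
# assuming 16-bit (2-byte) parameters. Activations, optimizer state
# and training overhead would add much more on top of this.
BYTES_PER_PARAM = 2  # fp16/bf16 weights (assumption)

def weight_memory_gb(num_params):
    """Gigabytes of memory required to store the raw weights alone."""
    return num_params * BYTES_PER_PARAM / 1e9

print(weight_memory_gb(100e9))   # ~100-billion-parameter model -> 200 GB
print(weight_memory_gb(1.8e12))  # ~1.8-trillion-parameter model -> 3,600 GB
```

Even the smaller model needs hundreds of gigabytes for its weights alone, orders of magnitude more than the fast memory that fits on a single wafer today. That gap is the scaling wall wafer-scale designs still have to climb.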
There’s also the challenge of scale.
Traditional GPUs are made in batches. If one chip is defective, you toss it and move on.
But a wafer-scale processor is one enormous piece of silicon. One tiny flaw can ruin the entire device.
That’s why these systems are mostly being used in specialized research environments for now.
As I told my team last week, you can absolutely use wafer-scale chips for specific, high-performance workloads today. But not for full-scale cloud operations.
Not yet, at least.
But George has a way of spotting where the puck is going before anyone else sees it. And if you look at history, most of his “too early” calls end up being right on time a few years later.
I also agree with George that the U.S. needs to lead the way in what he calls “the post-microchip era.”
But as he warns in the WSJ piece:
By cutting off the Chinese chip market, which contains the majority of semiconductor engineers, U.S. industrial policies have hampered American producers of wafer-fabrication equipment—essential for making chips—without slowing China’s ascent. In the wake of these protectionist policies, launched around 2020, Chinese semiconductor capital equipment production has risen by 30% to 40% annually, compared with annual growth of about 10% in the U.S.
The paradox George is pointing to is what concerns both of us. America invented the microchip, yet we risk falling behind in the race to build what comes after it.
Because wafer-scale computing isn’t just another generation of hardware. It represents a deeper shift in how intelligence and industry will connect in the future.
That’s what George and I mean when we talk about “Convergence X.”
It’s the moment when AI, advanced manufacturing and energy systems stop evolving in separate lanes and start merging into one unified ecosystem.
And wafer-scale architecture is a path that will make this future possible.
These new processors blur the line between chip and computer. They move data almost instantly across a single surface. And they can train models locally without relying on cloud data centers halfway around the world.
In other words, they bring intelligence closer to where things are made.
That’s a big part of Convergence X: putting the “brain” of the digital world inside the machines, factories and power systems that drive the physical world.
And you can already see it taking shape across the U.S.
Whether it’s Intel’s new “Silicon Heartland” factories in Ohio, TSMC’s advanced facility rising from the Arizona desert, or Tesla’s Dojo supercomputer, built to train millions of autonomous vehicles simultaneously.
Each one is part of a larger pattern.
It’s about bringing intelligence home, embedding it directly into production and reducing America’s dependence on foreign supply chains.
Here’s My Take
Wafer-scale integration isn’t ready to replace the data centers that power today’s AI quite yet.
George might be slightly early, but he’s not wrong.
When wafer-scale systems finally overcome their manufacturing limits, entire server farms could shrink to the size of a single disk.
Meaning, the future he’s describing could be just around the corner.
Regards,
Ian King
Chief Strategist, Banyan Hill Publishing
Editor’s Note: We’d love to hear from you!
If you want to share your thoughts or suggestions about the Daily Disruptor, or if there are any specific topics you’d like us to cover, just send an email to [email protected].
Don’t worry, we won’t reveal your full name in the event we publish a response. So feel free to comment away!