
OpenAI co-founder and CEO Sam Altman
OpenAI co-founder and CEO Sam Altman today took to his blog to state unequivocally that “growth in the use of AI services has been astonishing.” Yet he also noted recently that “when bubbles happen, smart people get overexcited about a kernel of truth.” Confused by the apparent disconnect?
Well, so were we, so we decided to ask ChatGPT to explain what appears to be a basic contradiction. The answer it threw back was both illuminating and disconcerting. While the chatbot was gung-ho about user growth and adoption rates, it also highlighted the massive investments required to sustain that growth, as well as the need for global-level policies and regulation.
On the risks front, the chatbot listed job displacement, misinformation, bias and discrimination, privacy and surveillance, data security, and concentration of power among Big Tech as causes for worry. What was disconcerting was that none of these found a mention in Altman’s latest post, which appeared to be fuelling the bubble.
Here is what he said in a nutshell: as AI gets smarter and access to it becomes a fundamental driver of the economy, delivering what the world needs will require more groundwork in the form of AI infrastructure “for inference compute to run these models, and for training compute to keep making them better and better.”
There was no mention of the challenges that have sprung up of late, with businesses claiming they are not seeing enough benefit in the AI of the day to pay for it. An article in The Information (paywalled) indicated that Microsoft was pushing customers to use Copilot in Office 365 in order to bolster software sales.
Instead, Altman goes on blithely to suggest that if AI stays the course, we could soon be using 10 gigawatts of compute to figure out how to cure cancer or to provide customized tutoring to every student in the world. Then comes the gobsmacker: “If we are limited by compute, we’ll have to choose which one to prioritize; no one wants to make that choice, so let’s go build,” he says. That is an obvious reference to OpenAI’s recent deal with Nvidia.

RBI Deputy Governor Rajeshwar Rao
There was another article, published by the Financial Times, that discussed how companies in the US kept talking about AI without ever explaining the upsides. Closer to home, RBI Deputy Governor Rajeshwar Rao warned that the euphoria over AI must never overshadow prudent risk management.
He made this point in a keynote speech on September 16, the transcript of which the RBI website published earlier this week. “From the risk perspective, the long-term implications of AI adoption on the financial system remain uncertain but carry potentially far-reaching consequences,” he noted.
Rao made the point that it was imperative for the financial sector to approach AI adoption with foresight, investing not just in innovation but also in resilience: building strong governance structures, diversifying dependencies, continually assessing emerging risks, and ensuring that AI strategies align with the long-term safety and sustainability of the financial system.
These words of wisdom appear to have made no impact on Altman and his ilk. In the blog, he ideates on creating a factory that can produce a gigawatt of new AI infrastructure every week. Accepting that the challenge would be huge, the OpenAI boss says that over the next couple of months he will reveal plans, and the partners, to make this a reality.
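To put that gigawatt-a-week ambition in perspective, here is a back-of-envelope sketch. The per-rack power figure is our own illustrative assumption (roughly in line with current high-density AI racks), not a number from Altman's post:

```python
# Rough scale of "a gigawatt of new AI infrastructure every week".
# KW_PER_RACK is an assumed, illustrative figure, not from Altman's blog.
GW_PER_WEEK = 1
KW_PER_RACK = 100          # assumed draw of one high-density AI rack, in kW
WEEKS_PER_YEAR = 52

racks_per_week = GW_PER_WEEK * 1_000_000 / KW_PER_RACK  # 1 GW = 1,000,000 kW
gw_per_year = GW_PER_WEEK * WEEKS_PER_YEAR

print(f"Racks per week: {racks_per_week:,.0f}")    # ~10,000 racks every week
print(f"GW added per year: {gw_per_year}")         # 52 GW of new load annually
```

Under these assumptions, the factory would have to turn out roughly ten thousand racks a week and add tens of gigawatts of electrical load a year, which is why the power question matters so much.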
“Later this year, we’ll talk about how we are financing it; given how increasing compute is the literal key to increasing revenue, we have some interesting new ideas,” he concludes. Another disconcerting aspect of the blog was its total silence on sustainability and on the massive power demand of the AI hardware that runs this compute.
A technology report published by Bain & Co yesterday provides critical evidence that business spending on AI is nowhere close to justifying the data centre upgrades required to meet the “insatiable demand” from large AI companies for computing capacity. It also notes that there is little evidence to suggest that consumers are willing to spend more.
The irony is that Altman’s argument is based not on demand growth but on the fear of missing out (FOMO). “As AI gets smarter, access to AI will be a fundamental driver of the economy and maybe eventually something we consider a fundamental human right,” is his lofty claim. It reminds us of Big Tech companies making a similar claim that the Internet is a fundamental right, alongside food, water, shelter, clean air and health.

Mark Rushworth, founder and CEO of the all-optical network switch company Finchetto
Which brings us to the moot question: are Altman and his tribe actually tackling the problem or merely the symptom? Mark Rushworth, founder and CEO of the all-optical network switch company Finchetto, believes it is the latter. “The real culprit at the heart of this crisis is the inefficiency of the hardware itself,” he says.
In an article published by SDX Central, Rushworth notes that data centres face a double whammy. There isn’t enough clean energy being generated fast enough to meet the growing demand, and where renewable power is available, delivering it where it is required remains a challenge.
To cope, operators are seeking creative solutions such as immersion cooling, district heating systems and underwater data centres. But the real issue is that the hardware powering AI and cloud workloads today is power-hungry. “Nvidia now projects racks consuming up to 99 kW each, with this set to rise to as much as 250 kW each, but most existing data centres weren’t designed for this,” he writes.
“Instead of asking how we feed the beast, we should be questioning how we can shrink it. What we need is a new generation of energy-efficient computing and networking infrastructure. One promising solution is all-optical networking, which slashes energy use by tackling the fundamental inefficiency in data transmission,” says Rushworth.
Innovation is the name of the game, and upstream at that. We had reported how Microsoft is seeking to harness smartwatch display technology to reduce the high power consumption of AI infrastructure. Maybe this is where the large cache of funds that AI enterprises are generating should be spent.
So, instead of spending billions on companies like OpenAI to push the gravy train forward, Nvidia and others selling AI chips could also fund research that helps reduce the power demand of existing AI hardware. Maybe the two companies should join hands and discuss how to reduce power demand before deciding on their approach to the next data centre.

Torsten Slok, Partner and Chief Economist at Apollo Global Management
Already there are rumblings within. We came across a commentary from Torsten Slok, Partner and Chief Economist at Apollo Global Management, in which he called out that equity investors were dramatically overexposed to AI and warned that people could get hurt badly when this merry-go-round stops. The bigger irony is that Sam Altman himself made a similar comment about the AI bubble bursting three months ago.
If you have an interesting article, report or case study to share, please get in touch with us at editors@roymediative.com / roy@roymediative.com, or on 9811346846 / 9625243429.
