The $100 billion partnership between Nvidia and OpenAI, announced on Monday, is only the latest mega-deal reshaping the AI infrastructure landscape. The agreement pairs non-voting shares with large-scale chip purchases and enough computing power for more than 5 million American homes, deepening the relationship between two of AI's most powerful players.
Meanwhile, Google Cloud is placing an entirely different bet. While the industry's biggest players cement exclusive partnerships, Google is hell-bent on capturing the next generation of AI companies before they become too big to court.
Francis deSouza, Google Cloud's COO, has seen the AI revolution from many vantage points. As the former CEO of genomics giant Illumina, he watched machine learning transform drug discovery. As a co-founder of a two-year-old AI startup, Synthes Labs, he has grappled firsthand with the safety challenges of powerful models. Now, after joining Google Cloud's C-suite in January, he is betting on the second wave of AI at scale.
It's a story deSouza likes to tell in numbers. In a conversation with this editor earlier this week, he noted several times that nine of the top 10 AI labs use Google's infrastructure. He also said that nearly every generative AI unicorn runs on Google Cloud, that 60% of all generative AI startups worldwide have chosen Google as their cloud provider, and that the company has booked $58 billion in new revenue commitments over the next two years, double its current annual run rate.
Asked what percentage of Google Cloud's revenue comes from AI companies, he instead offered that "AI is resetting the cloud market, and Google Cloud is pulling ahead in this shift, especially with startups."
The Nvidia-OpenAI deal exemplifies the scale of the infrastructure consolidation. Microsoft's original $1 billion OpenAI investment has grown to roughly $14 billion, fundamentally reshaping the cloud market. Amazon, with its $8 billion Anthropic investment, secured deep hardware integration, essentially tailoring AI training to work better with Amazon's infrastructure. Oracle has emerged as a surprise winner, too, first landing a $30 billion cloud deal with OpenAI and then a jaw-dropping five-year, $300 billion commitment starting in 2027.
Even Meta, despite building its own infrastructure, signed a $10 billion deal with Google Cloud while planning $600 billion in U.S. infrastructure spending through 2028. The Trump administration's $500 billion "Stargate" project, which includes SoftBank, OpenAI, and Oracle, adds another layer to these interlocking partnerships.
These mega-deals could pose a threat to Google, given that companies like OpenAI and Nvidia are seen cementing ties elsewhere. Indeed, Google appears to be left out of some of the frenzied dealmaking.

But the corporate behemoth is hardly sitting on its hands. Instead, Google Cloud is signing smaller companies like Lovable and Windsurf, which deSouza calls "next-generation companies," as "primary compute partners" without major upfront investments.
The approach reflects both opportunity and necessity. In a market where "a startup can become a multibillion-dollar company in a very short time," as deSouza put it, capturing future unicorns before they mature may prove more valuable than fighting over today's incumbents.
The strategy extends beyond simple customer acquisition. Google offers AI startups $350,000 in cloud credits, access to its technical teams, and go-to-market support through its marketplace. Google Cloud also provides what deSouza describes as a "no compromise" AI stack, from chips to models to applications, an "open ecosystem" that gives customers choice at every layer.
"Companies love the fact that they can get access to our AI stack, and they can reach our teams to understand where our technologies are going," deSouza said during our interview. "They also like that they're gaining access to enterprise-grade, Google-class infrastructure."
That infrastructure advantage became more pronounced this month, when reporting revealed Google's plans to expand its custom AI chip business. According to The Information, Google has for the first time struck deals to place its tensor processing units (TPUs) in other cloud providers' data centers, including an agreement with London-based Fluidstack that includes up to $3.2 billion in financial backing for a New York facility.
Google competes directly with the very AI companies that need its infrastructure. Google Cloud supplies TPU chips to OpenAI and hosts Anthropic's models through its Vertex AI platform, even as it battles both head-to-head with its own Gemini models. (According to New York Times court documents, Google Cloud's parent company, Alphabet, owns a 14% stake in Anthropic; when asked about Google's financial ties to Anthropic earlier this year, deSouza called the relationship a "multifaceted partnership," then quickly redirected to how customers can access a variety of foundation models.)
But if Google is trying to play Switzerland while furthering its own agenda, it has plenty of practice. The approach has roots in Google's open-source contributions, from Kubernetes to the foundational "Attention Is All You Need" paper that enabled the transformer architecture behind most modern AI. More recently, Google published an open-source protocol called Agent2Agent (A2A) for inter-agent communication, in an effort to showcase its continued commitment to openness even in competitive areas.
"We have made a clear choice for years to stay open at every layer of the stack, and we know that companies can take our technology and use it to build a competitor at the next layer," deSouza acknowledged. "That has been happening for decades. It's something we're fine with."
Google Cloud's startup courtship comes at a particularly interesting moment. This month, federal judge Amit Mehta issued his remedies ruling in the government's five-year-old search monopoly case, seeking to curb Google's dominance without derailing its AI ambitions.
While Google avoided the Justice Department's most serious proposed penalties, including a forced divestiture of its Chrome browser, the ruling underscored regulatory concerns that the company could leverage its search monopoly to dominate AI. Critics worry, reasonably, that Google's vast resources give it an unfair advantage in developing AI systems, and that the company could deploy the same monopolistic strategies that secured its search dominance.
In conversation, deSouza focused on far more positive outcomes. "I think we have the opportunity to fundamentally understand some major diseases that we don't understand well today," he said, outlining a vision in which Google Cloud helps power research into Alzheimer's, Parkinson's, and climate technologies. "We want to work hard to ensure that we are building the technologies that will enable that work."
Critics may not be so easily assuaged. Still, by positioning itself as an open platform that empowers the next generation of AI companies, Google Cloud can show regulators that it fosters competition rather than forecloses it, and the relationships it builds with startups now could bolster Google's case if regulatory scrutiny intensifies.
For our full conversation with deSouza, check out this week's StrictlyVC Download podcast; a new episode comes out every Tuesday.

