LLM Watch

Google Cloud’s AI Stack Challenges Rivals with a Full-Platform Play

By: Daniel Brooks

Thursday, September 25, 2025

6 min read

Google Cloud is making an aggressive push with its integrated AI tech stack, leveraging a $58 billion infrastructure bet to attract startups. Photo Credit: Google Cloud

Key Takeaways

  • Aggressive Full-Stack Strategy: Google Cloud is executing an aggressive strategy to challenge AI market leaders by offering an integrated, end-to-end platform, positioning itself as a one-stop destination for AI development.

  • Three-Layer Value Proposition: The company’s core offering is a three-layer AI stack comprising its custom infrastructure (TPUs), the Vertex AI platform for MLOps, and its proprietary Gemini foundation models.

  • Massive Infrastructure Investment: The strategy is backed by a significant $58 billion investment in AI infrastructure, signaling a long-term commitment to compete on scale with AWS, Microsoft, and Nvidia.

  • Focus on Startups: A central part of Google’s go-to-market strategy is targeting AI startups with credits, specialized programs, and an integrated platform designed to win long-term customers and capture the next wave of innovation.

  • A Market Counterweight: While currently third in cloud market share, Google’s full-stack approach offers a direct alternative to the concentrated power of rivals, aiming to simplify development, reduce vendor complexity, and provide a credible challenger in the AI infrastructure race.

Over the past week, headlines have been dominated by the $100 billion partnership between Nvidia and OpenAI, a deal so large it has already drawn antitrust scrutiny from regulators. Yet while rivals make splashy moves, Google Cloud has quietly taken the underdog's path. Instead of betting on a single high-profile alliance, Google has been building patiently, investing in infrastructure and engineering to create a full-stack AI platform that could help it reclaim its place at the center of the AI economy.

Google Cloud’s aggressive play for the AI market

Google’s strategy is to offer an integrated AI platform that eliminates the need for organizations to piece together solutions from multiple vendors. At recent events such as Google Cloud Next ’25 and the Google Builders Forum, the company unveiled product updates, customer partnerships, and startup programs, all reinforcing one message: Google is not just a cloud provider, but a one-stop destination for AI development [3][5].

At the core of this push is a $58 billion commitment, backed by a broader $106 billion backlog of contracts that will convert into revenue over time [5]. This scale of investment underscores Google’s belief that owning the infrastructure layer is essential to competing against AWS, Microsoft, and Nvidia.

The three layers of Google Cloud’s AI stack

1. Optimized infrastructure

The foundation is Google’s custom Tensor Processing Units (TPUs), processors designed specifically for artificial intelligence workloads. The fourth-generation TPU v4 delivers over twice the performance of the previous version and 2.7 times greater efficiency per watt [2]. Combined with GPUs, high-speed networking, and advanced cooling systems, Google’s infrastructure is optimized to handle the largest AI training jobs.

What are TPUs? 

Tensor Processing Units are custom-designed processors built by Google specifically for artificial intelligence tasks such as training and running machine learning models. Unlike general-purpose chips, TPUs are optimized for the types of mathematical operations required in deep learning.

What are GPUs? 

Graphics Processing Units were originally designed for rendering images in video games. Their ability to perform many calculations at once also makes them ideal for training large AI models. Nvidia is the market leader in producing GPUs for AI workloads.

2. Vertex AI platform

On top of the hardware sits Vertex AI, Google’s managed platform for machine learning. It provides tools for training, deployment, monitoring, and governance, and gives developers access to more than 130 models, including both Google’s own offerings and popular open-source alternatives [3]. By centralizing these functions, Vertex reduces the complexity of building production-ready AI systems.

What is Vertex AI? 

Vertex AI is Google Cloud’s managed platform for machine learning. It provides tools to train, deploy, and monitor AI models, as well as access to a library of pre-built models. It is designed to simplify the process of moving from prototype to production.
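To make the "prototype to production" idea concrete, here is a minimal sketch of how a developer might call a Gemini model through Vertex AI's Python SDK. This is an illustration, not an official recipe: the project ID, region, and model name below are placeholder assumptions, and running it requires a Google Cloud account with the `google-cloud-aiplatform` package installed and credentials configured.

```python
# Hedged sketch: querying a Gemini model hosted on Vertex AI.
# Assumes the google-cloud-aiplatform SDK is installed and that
# GCP credentials are available in the environment. The project ID,
# region, and model name are illustrative placeholders.

def ask_gemini(prompt: str, project: str = "my-gcp-project") -> str:
    """Send a prompt to a Gemini model on Vertex AI and return its reply."""
    # Imported lazily so this sketch can be read/loaded without the SDK.
    import vertexai
    from vertexai.generative_models import GenerativeModel

    vertexai.init(project=project, location="us-central1")
    model = GenerativeModel("gemini-2.5-pro")  # model name is an assumption
    response = model.generate_content(prompt)
    return response.text
```

In a real deployment, the same platform handles the later steps the box describes: monitoring the deployed model, managing versions, and applying governance policies.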

3. Gemini foundation models

The stack’s top layer is the Gemini family, Google’s most advanced multimodal models capable of handling text, code, and images. Tight integration with Vertex AI means Gemini can be deployed with lower latency and stronger performance guarantees. Startups such as Lovable and Windsurf are already using Gemini 2.5 Pro to power new AI applications [4].

What are foundation models? 

Foundation models are large, pre-trained AI models that can perform a wide variety of tasks with minimal fine-tuning. Google’s Gemini family is an example, capable of handling text, images, and code in a single model.

The companies building on Google's stack

Recent coverage highlights developer-first AI startups such as Lovable and Windsurf, which are building on Google Cloud and using Gemini 2.5 Pro. Google has also showcased a broader cohort of generative AI teams and model providers through Next ’25 announcements and Model Garden updates, underscoring momentum among early-stage builders choosing Vertex AI and Gemini to scale. [4][3]

Competing in a crowded field

The competition is formidable. In Q2 2025, Amazon Web Services (AWS) held 31 percent of the global cloud infrastructure market, Microsoft Azure held 24 percent, and Google Cloud remained third at 13 percent [1]. Microsoft leverages its exclusive partnership with OpenAI to integrate advanced models directly into its cloud and productivity software. AWS continues to dominate through its scale and modular offerings, supported by its in-house chip, Trainium, designed for AI training workloads. Nvidia, the leading GPU supplier, powers much of the global AI ecosystem.


Cloud Infrastructure Market in Q2 2025 [1]

Meanwhile, the Nvidia–OpenAI partnership has raised concerns about concentration of power and limited access to critical hardware, highlighting the importance of credible alternatives. Google’s counter is its full-stack strategy: unlike competitors that dominate only one layer, Google offers a platform spanning infrastructure, orchestration, and models. This integrated approach is meant to appeal to enterprises and startups that want speed and simplicity without juggling multiple providers.

Startups at the center

Google is betting that today’s startups are tomorrow’s industry leaders. To win them over, it has launched founder-focused initiatives, offering cloud credits, technical support, and specialized programs to make it easier for young companies to grow without switching platforms [5]. This not only lowers the barrier to entry for small teams, but also seeds long-term customer relationships. Reports indicate that AI startups are already driving a significant share of Google Cloud’s recent growth, reinforcing the importance of this strategy [4].

Why it matters

Google’s full-stack approach has implications for developers, enterprises, and the broader market. For developers, it promises less integration complexity and a faster path from experimentation to production, since hardware, platform, and models are designed to work together. For enterprises, it provides the convenience of a single provider with unified service guarantees, even if that introduces some risk of vendor lock-in. And for the market as a whole, Google’s emergence as a stronger challenger ensures that the AI infrastructure race is not limited to AWS, Azure, and Nvidia. More viable competitors mean better pricing, faster innovation, and a healthier, more dynamic ecosystem. By investing heavily and moving deliberately, Google positions itself as a counterweight to concentrated power — a challenger that could shift the balance of the AI industry.

What is vendor lock-in? 

Vendor lock-in happens when switching from one provider to another becomes costly or impractical due to proprietary technology or integration dependencies. In cloud services, this is a common concern when adopting tightly integrated platforms.

Challenges ahead

Despite its ambition, Google faces obstacles. It remains far behind AWS and Microsoft in market share, making enterprise adoption a steep climb [1]. Many organizations are already deeply invested in other ecosystems, and switching providers carries both technical and political costs. Maintaining a competitive edge will require continuous improvements to both hardware and models, which demands heavy capital expenditure. Open-source frameworks that allow developers to run models across providers could also weaken the appeal of Google’s integrated stack. Finally, while Google has avoided the regulatory spotlight so far, the broader antitrust scrutiny of large AI alliances means customers may remain wary of vendor concentration.

Sources

  1. Synergy Research Group. “Q2 Cloud Market Nears $100 Billion Milestone — and it’s Still Growing by 25% Year Over Year,” Aug 2025.
    https://www.srgresearch.com/articles/q2-cloud-market-nears-100-billion-milestone-and-its-still-growing-by-25-year-over-year

  2. Google Cloud Blog. “TPU v4 enables performance, energy and CO₂e efficiency gains,” Feb 2023.
    https://cloud.google.com/blog/topics/systems/tpu-v4-enables-performance-energy-and-co2e-efficiency-gains

  3. Google Cloud Blog. “Google Cloud Next 2025 Wrap Up: 229 things we announced,” Apr 12, 2025.
    https://cloud.google.com/blog/topics/google-cloud-next/google-cloud-next-2025-wrap-up

  4. TechCrunch. “How AI startups are fueling Google’s booming cloud business,” Sept 18, 2025.
    https://techcrunch.com/2025/09/18/how-ai-startups-are-fueling-googles-booming-cloud-business/

  5. TechCrunch. “It isn’t your imagination, Google Cloud is flooding the zone,” Sept 24, 2025.
    https://techcrunch.com/2025/09/24/it-isnt-your-imagination-google-cloud-is-flooding-the-zone/


Subscribe to PromptWire

Don't just follow the AI revolution—lead it. We cover everything that matters, from strategic shifts in search to the AI tools that actually deliver results. We distill the noise into pure signal and send actionable intelligence right to your inbox.

We don't spam, promise. Only two emails every month, and you can opt out anytime with just one click.

Copyright © 2025. All Rights Reserved.
