
Hybrid Cloud Infrastructures Set to Dominate AI Market

Liam Yu
Senior Product Solutions Marketing Manager, Integrated Systems

February 11, 2025


About 100 years ago, Austrian economist Joseph Schumpeter introduced the idea of “creative destruction,” the process by which new innovations and technologies displace older ones and render them obsolete. It is an ongoing process that constantly reshapes the economy: innovation-driven growth dismantles or transforms old industries and economic structures while creating new ones.

Sometimes these changes are slow-moving, such as the transition from rail to auto to air travel. Other times they arrive in a flurry, all at once. Such was the case with ChatGPT, and now DeepSeek, a disruptor shaking up global AI markets, kicking off a vigorous debate about the future of large language models, AI costs and adoption, and once again accelerating the creative destruction process at breakneck speed.

Photo credit: DeepSeek Chief’s Journey from Math Geek to Global Disruptor © Rachel Mendelson/WSJ, Getty Images


In every instance, however, the success of the creative destruction process relies on another, more essential and foundational set of “general-purpose” technologies. This is the case for every transformative innovation that emerges once in a generation and profoundly alters society. Examples run the gamut from smartphones, computers and the Internet to, going back a bit further, electricity and steam power.

Artificial Intelligence (AI) is now poised to become the next such technology, with far-reaching implications that are already influencing our daily lives. But to succeed, it will need to count on a number of these general-purpose technologies; in this case, on agile and resilient data infrastructure.

Then of course there is the human aspect to contend with. Those of us who work in IT aren’t always great at thinking proactively about the implications of such change. We often rely on past experiences and skills and have difficulty imagining things changing so rapidly, especially when risk and uncertainty are involved. This is why it’s so hard for many of us to wrap our heads around the pace of artificial intelligence, and even more so around the data infrastructure required to power the workloads it will immediately demand.

Thankfully we can hedge our bet.

Setting the Table for AI with Hybrid Cloud

By employing a hybrid cloud approach to infrastructure, we can “fail fast” and fail forward as we correct our business models, assess costs and decide how AI technologies will power our competitive edge.

So, let’s take a deeper (pun intended) look at how a hybrid cloud infrastructure approach provides numerous strategic advantages for businesses using AI:

  1. Scalability and Flexibility: Hybrid cloud environments empower businesses to efficiently scale AI workloads. Public cloud services can manage resource-intensive tasks such as training large datasets, while private clouds handle sensitive data and applications requiring strict compliance, especially across geographic regions and borders.
  2. Cost Efficiency: Establishing and maintaining infrastructure for extensive AI projects can be costly. Hybrid cloud models enable businesses to optimize expenditures by using public cloud resources on a pay-as-you-go basis, thus avoiding substantial upfront investments. This is particularly important as businesses experiment and shift directions.
  3. Enhanced Data Management: Robust data storage and management is essential for AI success. Hybrid cloud solutions provide comprehensive data management capabilities, ensuring data is stored, processed and accessed efficiently across various environments. But you must have the flexibility to migrate workloads in real time while maintaining data sovereignty.
  4. Security and Compliance: Hybrid clouds offer a secure framework for managing sensitive data. Private clouds can be used to meet regulatory compliance requirements and can employ advanced security, disaster recovery (DR) and business continuity options. While public clouds provide the flexibility to handle less sensitive workloads, both leverage access to large language models (LLMs).
  5. Resource Optimization: AI projects often necessitate specialized hardware like GPUs or TPUs. Hybrid cloud allows businesses to access these resources as needed, optimizing their utilization and reducing the costs of purchasing and maintaining such hardware. This is particularly important as businesses experiment and shift among new innovations in AI architectures to gain an economic edge, e.g., reinforcement learning, mixture-of-experts (MoE) models, multi-head latent attention, distillation techniques, etc.
  6. Innovation and Agility: With hybrid cloud, businesses can swiftly experiment with new AI technologies and solutions without being limited by existing infrastructure. This fosters innovation and enables rapid adaptation to evolving market demands. It’s one of many reasons why, at Hitachi Vantara, we’ve seen businesses deploy multiple hybrid cloud environments across VMware, Azure, Red Hat OpenShift, Google and so forth to power their AI-driven customer experience while reducing cost of ownership.

For example, at the Red Hat Summit 2024 (and likely again in 2025), we saw a huge emphasis on democratizing AI/ML models, as well as on training and development tools, with open-source innovation driving hybrid cloud (80% of OpenShift AI customers), automation and the introduction of new tools, e.g., Red Hat’s recent acquisition of Neural Magic (another MIT-born technology) to help design next-gen AI much faster and at lower cost. Indeed, many customers are switching from VMware to OpenShift virtualization not only for cost, but because it gives them freer rein than VMware to customize their environments. Classic creative destruction at work, folks!

Delivering Workloads to Scale AI in Hybrid Cloud, with Your Own Flexibility and at Your Own Pace

It’s an important reason why we offer Hitachi Vantara’s solution for Red Hat OpenShift and Ansible integration: so customers can optimize performance and lower operating costs with automation as they experiment and scale out their AI hybrid cloud infrastructure. This gives businesses the flexibility and confidence to leverage the advantages of hybrid cloud infrastructure listed above.

Red Hat OpenShift Virtualization with Hitachi Converged, Storage Operations & Data Services for Modern Apps


In the case of DeepSeek’s cost model (built on an open-source, MIT-licensed MoE architecture), they’re demonstrating that large-scale inference models are economically accessible at dramatically lower costs than those of the big tech giants. This allows smaller developers of AI applications to enter the market quickly and competitively. Once again, creative destruction in action.
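To make the economics a little more concrete, here is a toy sketch of the sparse-routing idea behind MoE models (in Python with NumPy; all names and sizes are illustrative, not DeepSeek’s actual implementation). The point is simply that only a few experts run per token, so inference compute grows far more slowly than total parameter count:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer: 8 experts, but only the top-2 are
# activated per token, so most expert parameters sit idle each step.
NUM_EXPERTS, TOP_K, D = 8, 2, 16

experts = [rng.standard_normal((D, D)) for _ in range(NUM_EXPERTS)]  # expert weights
router = rng.standard_normal((D, NUM_EXPERTS))                       # gating weights

def moe_forward(x):
    """Route a token through its top-k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-TOP_K:]                        # chosen expert indices
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(D)
out = moe_forward(token)

# Fraction of expert compute actually used per token:
active_fraction = TOP_K / NUM_EXPERTS  # 2/8 = 25%
```

In a real deployment the experts are large feed-forward networks and the routing is learned, but the cost intuition is the same: a model can hold many experts’ worth of parameters while paying, per token, for only the few it activates.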

Some experts, such as Dr. Yonggang Wen at Nanyang Technological University (NTU) in Singapore, also predict that AI-driven data centers (AIDC) are becoming pivotal to the digital economy. Historically, the training phase of AI models has demanded more resources than the inference phase. This trend is now shifting, with large-scale inference applications, such as recommendation systems and generative AI, exceeding training in computational and energy requirements.

This will only accelerate projections that the distribution of training and inference workloads will evolve from an 80:20 split in 2023 to 20:80 by 2028, with a 50:50 balance anticipated during 2025.

Of course, DeepSeek’s cost model still needs to be thoroughly evaluated. But as competitors across North America and Europe begin to replicate its methods, things can change dramatically as global privacy and security concerns subside. And then, just like that, we’re off to the races in 2025, with general-purpose AI technology powering it all.

The Power of Global, Cross-Industry Experience

One of the wonderful things about working at Hitachi is that we have access to a wide range of information across every major industry worldwide. This gives us a perfect perch to help our customers take advantage of this trend early.

This includes over 570 different companies within the Hitachi family, such as Hitachi Energy, Rail, Automotive Systems, R&D labs and more. It also taps into Fortune 500, 2000 and mid-size organizations, giving us insights into a broad range of data and infrastructure needs to deploy the right fit to keep pace with our customers’ growth in AI.

In fact, through Hitachi iQ, we employ AI within our products to provide intelligent infrastructure deployment crafted for industrial enterprises and forward-thinking businesses. It empowers organizations to streamline processes, expedite insights and drive innovation, positioning them at the forefront of AI advancement.

As we all work quickly to stay a step ahead of the pace of AI-driven creative destruction, it’s vital to work with partners like Hitachi Vantara that not only understand the evolving AI landscape but also have broad experience applying hybrid cloud infrastructures across your industry, geographic footprint and pace of growth, no matter your size.

We’ll be talking a lot more about the importance of hybrid cloud to success with AI, especially in support of multicloud environments (Red Hat OpenShift, Microsoft Azure, Google Anthos, etc.). Until then, if you’d like to learn more about how we can help you navigate your hybrid cloud AI journey, connect with your Hitachi Vantara representative.
