Chipmakers race to acquire advanced software capabilities to win the AI chip war.
The AI landscape is undergoing a transformative shift as chipmakers, traditionally focused on hardware innovation, are increasingly recognizing the pivotal role of software.
This strategic shift is redefining the AI race, where software expertise is becoming as crucial as hardware prowess.
AMD’s recent acquisitions: a case study
AMD’s recent acquisition of Silo AI, Europe’s largest private AI lab, exemplifies this trend. Silo AI brings to the table a wealth of experience in developing and deploying AI models, particularly large language models (LLMs), a key area of focus for AMD.
This acquisition not only enhances AMD’s AI software capabilities but also strengthens its presence in the European market, where Silo AI has a strong reputation for developing culturally relevant AI solutions.
“Silo AI plugs important capability gap [for AMD] from software tools (Silo OS) to services (MLOps) to helping tailor sovereign and open source LLMs and at the same time expanding its footprint in the important European market,” said Neil Shah, partner & co-founder at Counterpoint Research.
AMD’s move follows its previous acquisitions of Mipsology and Nod.ai, further solidifying its commitment to building a robust AI software ecosystem. Mipsology’s expertise in AI model optimization and compiler technology, coupled with Nod.ai’s contributions to open-source AI software development, provides AMD with a comprehensive suite of tools and expertise to accelerate its AI strategy.
“These strategic moves strengthen AMD’s ability to offer open-source solutions tailored for enterprises seeking flexibility and interoperability across platforms,” said Prabhu Ram, VP of industry research group at Cybermedia Research. “By integrating Silo AI’s capabilities, AMD aims to provide a comprehensive suite for developing, deploying, and managing AI systems, appealing broadly to diverse customer needs. This aligns with AMD’s evolving market position as a provider of accessible and open AI solutions, capitalizing on industry trends towards openness and interoperability.”
Beyond AMD: A broader industry trend
This strategic shift towards software is not limited to AMD. Other chip giants like Nvidia and Intel are also actively investing in software companies and developing their own software stacks.
“If you look at the success of Nvidia, it is driven not by silicon but by software (CUDA) and services (NGC with MLOps, TAO, etc.) it offers on top of its compute platform,” Shah said. “AMD realizes this and has been investing in building software (ROCm, Ryzen AI, etc.) and services (Vitis) capabilities to offer an end-to-end solution for its customers to accelerate AI solution development and deployment.”
Nvidia’s recent acquisitions of Run:ai and Shoreline.io, which specialize in AI workload management and infrastructure optimization, likewise underscore the importance of software in maximizing the performance and efficiency of AI systems.
But this doesn’t mean all chipmakers are following the same trajectory. Manish Rawat, semiconductor analyst at TechInsights, pointed out that Nvidia’s AI ecosystem has largely been built on proprietary technologies and a robust developer community, giving it a strong foothold in AI-driven industries.
“AMD’s approach with Silo AI signifies a focused effort to expand its capabilities in AI software, positioning itself competitively against Nvidia in the evolving AI landscape,” Rawat added.
Another relevant example is Intel’s acquisition of Granulate Cloud Solutions, a provider of real-time continuous optimization software. Granulate helps cloud and data center clients optimize compute workload performance while lowering infrastructure and cloud expenses.
Software to drive differentiation
The convergence of chip and software expertise is not just about catching up with competitors. It’s about driving innovation and differentiation in the AI space.
Software plays a crucial role in optimizing AI models for specific hardware architectures, improving performance, and reducing costs. Eventually, software could decide who rules the AI chip market.
“The bigger picture here is that AMD is obviously competing with NVIDIA for supremacy in the AI world,” said Hyoun Park, CEO and chief analyst at Amalgam Insights. “Ultimately, this is not just a question of who makes the better hardware, but who can actually back the deployment of enterprise-grade solutions that are high-performance, well-governed, and easy to support over time. And although Lisa Su and Jensen Huang are both among the absolute brightest executives in tech, only one of them can ultimately win this war as the market leader for AI hardware.”
The rise of full-stack AI solutions
The integration of software expertise into chip companies’ offerings is leading to the emergence of full-stack AI solutions. These solutions encompass everything from hardware accelerators and software frameworks to development tools and services.
By offering a comprehensive suite of AI capabilities, chipmakers can cater to a wider range of customers and use cases, from cloud-based AI services to edge AI applications.
For instance, Silo AI first and foremost brings AMD an experienced talent pool, particularly one versed in optimizing AI models and building tailored LLMs, according to Shah. Its SiloOS platform is an especially powerful addition to AMD’s offerings, letting customers use advanced tools and modular software components to tailor AI solutions to their needs, an area where AMD previously had a significant gap.
“Thirdly, Silo AI also brings in MLOps capabilities which are a critical capability for a platform player to help its enterprise customers deploy, refine and operate AI models in a scalable way,” Shah added. “This will help AMD develop a service layer on top of the software and silicon infrastructure.”
Implications for enterprise tech
The shift by chipmakers from selling hardware alone to also providing software toolkits and services has significant ramifications for enterprise tech companies.
Shah stressed that these developments are crucial for enabling enterprises and AI developers to fine-tune their AI models for better performance on specific chips, in both the training and inference phases.
This not only speeds up time-to-market but also helps partners, whether hyperscalers or operators of on-premises infrastructure, boost operational efficiency and reduce total cost of ownership (TCO) through better energy usage and optimized code.
“Also, it’s a great way for chipmakers to lock these developers within their platform and ecosystem as well as monetize the software toolkits and services on top of it. This also drives recurring revenue, which chipmakers can reinvest and boost the bottom line, and investors love that model,” Shah said.
The future of AI: a software-driven landscape
As the AI race continues to evolve, the focus on software is set to intensify. Chipmakers will continue to invest in software companies, develop their own software stacks, and collaborate with the broader AI community to create a vibrant and innovative AI ecosystem.
The future of AI is not just about faster chips — it’s about smarter software that can unlock the full potential of AI and transform the way we live and work.