News
Intel has entered a multi-year strategic partnership with SambaNova to jointly develop AI inference solutions based on Intel Xeon processors.
AI workloads are becoming increasingly diverse and complex, and enterprises and institutions are increasingly choosing solutions flexibly to match their own needs. This has driven demand for heterogeneous infrastructure, which must deliver consistency across compute, memory, networking, and software to support large-scale inference deployments at the data-center level.
SambaNova recently announced a multi-year partnership with Intel to deliver high-performance, cost-effective AI inference solutions built on Intel® Xeon® platform infrastructure for AI-native companies, model providers, enterprise customers, and institutions worldwide. Intel Capital will also participate in SambaNova's Series E financing round.
For customers whose AI workloads are a strong fit for SambaNova's technology, the combination of Intel CPUs and SambaNova's AI platform offers an attractive rack-scale inference solution until Intel's GPU-based solution is ready.
This collaboration not only complements Intel's current plans for data-center GPUs but also reinforces Intel's established strategic direction in AI. Intel will continue to invest in core GPU IP, architecture, products, software, and system capabilities, and will keep advancing its product roadmap across the AI landscape from the edge to the cloud.
Looking ahead, Intel and SambaNova will work together to advance next-generation heterogeneous AI data centers. By deeply integrating Intel's Xeon processors, GPUs, networking, and storage solutions with SambaNova's system-level capabilities, the two companies aim to capture opportunities in the multibillion-dollar AI inference market.