Building data centre networks for AI workloads: key requirements and considerations for CSPs
11 October 2023 | Research
Strategy report | PPTX and PDF (6 slides) | Cloud Infrastructure Strategies
As communications service providers (CSPs) train ever-larger AI models (especially generative AI models), they must train these models on an increasingly large number of graphics processing units (GPUs). CSPs that want to train such models in their own data centres will need to upgrade their back-end networks, that is, the networks that connect GPUs within a data centre.
Information included in this report
- Insights into the back-end data centre networking requirements of AI workloads
- Analysis of the approaches to AI networking that CSPs can adopt in their data centres
Author
Joseph Attwood
Analyst

Related items
- Article: SUSECON 2024: SUSE is advancing its telecoms portfolio to improve its position in the CaaS market
- Perspective: Open Network Index: evaluating operators’ progress and attitudes to ‘openness’ across core, RAN and edge
- Perspective: Evaluating private versus public cloud models for CSPs’ cloud-native mobile core deployments