Telecoms operators and other enterprises will increasingly opt for private cloud AI infrastructure

17 April 2025 | Research

Joseph Attwood

Article | PDF (3 pages) | Cloud and AI Infrastructure


"Using an AI model hosted by an external developer comes with privacy/security concerns, but running models on infrastructure provided by public cloud providers is also not without risk."


Telecoms operators and other enterprises have a range of options for deploying AI workloads on either public or private infrastructure. The least privacy-conscious enterprises can use large language models (LLMs) hosted by generative AI (GenAI) model developers such as OpenAI and DeepSeek. Privacy concerns arise here because these developers may store user prompts, which can then be used to train future LLMs or for other purposes outside the enterprise's control. Enterprises can instead deploy AI models on public cloud infrastructure, but this option also carries data privacy and security risks. The most privacy-conscious enterprises may therefore choose to deploy AI models on private cloud AI infrastructure.
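To illustrate what the private deployment option can look like in practice, the sketch below assumes an open-weight LLM served from an OpenAI-compatible inference server (for example vLLM or Ollama) running on an enterprise's own private cloud infrastructure; the endpoint URL, API key and model name are illustrative placeholders, not details from the report. The client code mirrors a call to a hosted service, but prompts never leave the enterprise's network.

```python
# Minimal sketch: querying a self-hosted LLM behind an OpenAI-compatible
# endpoint on private infrastructure. The base_url, api_key and model name
# below are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.internal.example.com/v1",  # private endpoint, not a public API
    api_key="not-used",  # many self-hosted servers do not check the key
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # an open-weight model hosted in-house
    messages=[{"role": "user", "content": "Summarise this customer contract."}],
)
print(response.choices[0].message.content)
```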

This article is based on Analysys Mason’s Private cloud AI infrastructure: requirements and strategies for telecoms operators and other enterprises.


Author: Joseph Attwood, Analyst