Open Access
TelecomGPT: A Framework to Build Telecom-Specific Large Language Models
Author(s) -
Hang Zou,
Qiyang Zhao,
Yu Tian,
Lina Bariah,
Faouzi Bader,
Thierry Lestable,
Merouane Debbah
Publication year - 2025
Publication title -
IEEE Transactions on Machine Learning in Communications and Networking
Language(s) - English
Resource type - Magazines
eISSN - 2831-316X
DOI - 10.1109/tmlcn.2025.3593184
Subject(s) - Computing and Processing, Communication, Networking and Broadcast Technologies
The emergent field of Large Language Models (LLMs) has significant potential to revolutionize how future telecom networks are designed and operated. However, mainstream LLMs lack the specialized knowledge required to understand and operate within the highly technical telecom domain. In this paper, we introduce TelecomGPT, the first telecom-specific LLM, built through a systematic adaptation pipeline designed to enhance general-purpose LLMs for telecom applications. To achieve this, we curate comprehensive telecom-specific datasets, including pre-training datasets, instruction datasets, and preference datasets. These datasets are leveraged for continual pre-training, instruction tuning, and alignment tuning, respectively. Additionally, due to the lack of widely accepted evaluation benchmarks tailored to the telecom domain, we propose three novel telecom-specific LLM evaluation benchmarks, namely Telecom Math Modeling, Telecom Open QnA, and Telecom Code Tasks. These new benchmarks provide a holistic evaluation of LLM capabilities in telecom math modeling, open-ended question answering, and code generation, infilling, summarization, and analysis. Using the curated datasets, our fine-tuned LLM, TelecomGPT, significantly outperforms general-purpose state-of-the-art (SOTA) LLMs, including GPT-4, Llama-3, and Mistral, particularly on the Telecom Math Modeling benchmark. Additionally, it achieves comparable performance across various evaluation benchmarks, such as TeleQnA, 3GPP technical document classification, and telecom code summarization, generation, and infilling. This work establishes a new foundation for integrating LLMs into telecom systems, paving the way for AI-powered advancements in network operations.
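The abstract describes a three-stage adaptation pipeline: continual pre-training on a raw telecom corpus, instruction tuning on telecom instruction data, and alignment tuning on telecom preference data. The following is a minimal sketch of the first stage of such a pipeline, assuming a Hugging Face `transformers` workflow; the base model, dataset path, and hyperparameters are illustrative placeholders, not the configuration used in the paper.

```python
# Hypothetical sketch of stage 1 (continual pre-training) of a telecom
# domain-adaptation pipeline. All names and values below are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "mistralai/Mistral-7B-v0.1"  # any general-purpose base LLM

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Continual pre-training corpus: raw telecom text (placeholder file path).
corpus = load_dataset("text", data_files={"train": "telecom_pretrain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="telecomgpt-cpt",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=2e-5,
    ),
    train_dataset=tokenized,
    # Causal-LM collator: pads batches and copies inputs to labels (mlm=False).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Stage 2 (instruction tuning on telecom instruction-response pairs) and
# stage 3 (alignment tuning on telecom preference data) would follow the same
# pattern, e.g. with TRL's SFTTrainer and DPOTrainer, swapping in the
# corresponding curated datasets.
```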
