LLM4Telecom & Telecom4LLM

Exploring the intersection of Large Language Models and Telecommunications. From applying LLMs to solve telecom problems, to leveraging telecom domain knowledge for better AI systems.
RF-GPT architecture: from RF waveform to spectrogram to language model

RF-GPT: Teaching Language Models to See the Invisible Spectrum

RF-GPT is the first radio-frequency language model: it converts RF signals into spectrograms and feeds them through a vision-language model, achieving 99.6% wireless technology recognition across six standards, while general-purpose VLMs score near chance.

February 17, 2026 · 14 min · Hang Zou
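The RF-to-spectrogram front end described above can be sketched with a plain STFT log-power spectrogram. This is a minimal illustration only: the synthetic tone, sample rate, and STFT parameters are hypothetical placeholders, not RF-GPT's actual preprocessing.

```python
import numpy as np
from scipy.signal import spectrogram

# Illustrative parameters (not from the RF-GPT paper)
fs = 1_000_000                      # 1 MHz sample rate
t = np.arange(4096) / fs
iq = np.exp(2j * np.pi * 100_000 * t)  # synthetic 100 kHz complex baseband tone

# Two-sided STFT spectrogram of the complex IQ samples
f, ts, Sxx = spectrogram(iq, fs=fs, nperseg=256, noverlap=128,
                         return_onesided=False)

# Log-power image, the kind of 2-D input a vision-language model would consume
img = 10 * np.log10(np.abs(Sxx) + 1e-12)
print(img.shape)  # (freq bins, time frames)
```

In a setup like this, the spectrogram image would then be rendered or normalized and passed to the VLM's vision encoder in place of a natural photograph.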
TelecomGPT training pipeline

TelecomGPT: A Framework to Build Telecom-Specific Large Language Models

Paper: TelecomGPT: A Framework to Build Telecom-Specific Large Language Models
Authors: Hang Zou, Qiyang Zhao, Yu Tian, Lina Bariah, Faouzi Bader, Thierry Lestable, Merouane Debbah

TL;DR: We built a three-stage pipeline (continual pre-training → instruction tuning → alignment) to adapt open-source LLMs for telecom. Key results:

- 75.3% on 3GPP document classification, nearly 2x GPT-4o's 38.9%
- Outperforms GPT-4 on telecom math equation reconstruction (49.45 vs. 49.38 MathBERT score)
- 4x improvement on telecom code infilling over the base Llama3-8B-Instruct
- All built on 7-8B parameter models, a fraction of GPT-4's size and cost

Motivation: Large Language Models like GPT-4 and Llama-3 are impressive generalists, but they struggle in the telecom domain. Ask GPT-4 to classify a 3GPP technical specification into the correct working group, and it gets it right less than 40% of the time. Ask it to infill a missing equation in a wireless communications paper, and it barely outperforms a coin flip. ...

July 12, 2024 · 8 min · Hang Zou