by farbodtavakkoli
OTel LLM 32B IT is a language model specialized for the telecommunications sector, designed to meet the industry's technical and operational needs. Trained on high-quality data drawn from recognized standards and specifications such as those from GSMA, 3GPP, and O-RAN, it excels at understanding and analyzing complex technical documents. Its core capabilities include question answering over telecom standards and integration into retrieval-augmented generation (RAG) applications that streamline access to specialized information. The model's industry-specific expertise was developed in collaboration with more than 200 experts and major organizations in the field. It is aimed at telecom professionals, developers, and researchers seeking to automate or optimize tasks related to infrastructure, protocols, and technological innovation in the sector.
OTel-LLM-32B-IT is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.
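The RAG integration mentioned above can be sketched minimally. In this illustrative example, retrieval is naive keyword-overlap scoring over a toy corpus of telecom snippets, and the final generation step (feeding `prompt` to the model) is deliberately omitted; the corpus contents, function names, and prompt format are assumptions for illustration, not part of the OTel release.

```python
# Toy RAG sketch around a telecom QA model.
# Retrieval here is simple keyword-overlap scoring; a real deployment would
# use a vector store and then pass `prompt` to the model for generation.

def score(query: str, doc: str) -> int:
    """Count lowercase word tokens shared between query and document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents ranked by keyword overlap with the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Assemble a grounded prompt for the model to answer from."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Hypothetical corpus of telecom standard snippets.
corpus = [
    "3GPP TS 23.501 defines the 5G System architecture.",
    "O-RAN specifications split the RAN into O-CU, O-DU, and O-RU components.",
    "GSMA publishes guidelines for eSIM remote provisioning.",
]

query = "Which 3GPP spec defines the 5G System architecture?"
passages = retrieve(query, corpus, k=1)
prompt = build_prompt(query, passages)
print(prompt)
```

The assembled `prompt` is what would be sent to the model; grounding the answer in retrieved standard text is what makes the RAG pattern useful for telecom QA.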
| Attribute | Value |
|---|---|
| Base Model | allenai/OLMo-3-32B |
| Parameters | 32B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |
The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.
Data Sources:
This model is optimized for:
```bibtex
@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}
```
If you have any technical questions, please feel free to reach out to [email protected] or [email protected].