by farbodtavakkoli
OTel LLM 27B IT is a language model specialized for the telecommunications sector, designed to meet the industry's technical and regulatory needs. Fine-tuned on data from recognized sources such as 3GPP specifications, GSMA documents, and RFCs, it excels at analyzing and understanding complex technical documents covering networks, security, APIs, and standards such as O-RAN. Its core capabilities include precise question answering on telecom standards and support for retrieval-augmented generation (RAG) applications for professionals in the field. What sets it apart is its deep sector-specific expertise, developed in collaboration with over 200 experts and major academic and industrial institutions. Ideal for engineers, researchers, and decision-makers, it streamlines access to critical information while reducing misinterpretation of technical standards.
OTel-LLM-27B-IT is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.
| Attribute | Value |
|---|---|
| Base Model | google/gemma-3-27b-it |
| Parameters | 27B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |
The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.
Data Sources: 3GPP specifications, GSMA documents, and RFCs.
This model is optimized for: question answering on telecom standards and retrieval-augmented generation (RAG) over telecom documents.
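As a minimal sketch of the RAG use case above, the snippet below assembles retrieved spec excerpts into chat messages that could be passed to a chat-tuned model's `apply_chat_template()`. The excerpt data, chat roles, and prompt wording are illustrative assumptions, not part of this model card; adapt them to your retrieval pipeline and the model's actual chat format.

```python
# Hypothetical sketch: building a RAG prompt for telecom standards QA.
# Nothing here is specific to OTel-LLM-27B-IT; the roles and excerpt
# fields are assumptions for illustration.

def build_rag_prompt(question: str, excerpts: list[dict]) -> list[dict]:
    """Format retrieved excerpts and a user question as chat messages.

    Each excerpt is a dict with a "source" identifier (e.g. a spec
    clause) and a "text" passage retrieved from a document index.
    """
    context = "\n\n".join(f"[{e['source']}] {e['text']}" for e in excerpts)
    system = (
        "You are a telecom standards assistant. Answer strictly from the "
        "provided excerpts and cite the source identifier for each claim."
    )
    user = f"Excerpts:\n{context}\n\nQuestion: {question}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# Example with a single (illustrative) retrieved passage:
messages = build_rag_prompt(
    "Which NAS message initiates 5G registration?",
    [{"source": "3GPP TS 24.501", "text": "Example retrieved passage."}],
)
print(messages[1]["content"])
```

The returned message list can then be tokenized with the model's chat template and passed to generation; keeping the source identifiers inline makes it easy to ask the model to cite which excerpt supports each statement.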
@misc{otel2026,
title={OTel: Open Telco AI Models},
author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
year={2026},
url={https://huggingface.co/farbodtavakkoli}
}
If you have any technical questions, please reach out to [email protected] or [email protected].