by farbodtavakkoli
OTel-LLM-1.7B-IT is a language model specialized for the telecommunications sector, designed to meet the industry's technical and regulatory needs. Trained on data from recognized sources such as 3GPP specifications, GSMA documents, and RFCs, it excels at understanding and analyzing technical content related to networks, security, APIs, and domain standards. Its key capabilities include answering precise questions about telecom standards and supporting retrieval-augmented generation (RAG) applications tailored to this field. The model's deep adaptation to the technical challenges of telecommunications is the result of collaboration between industry and academic experts. It is aimed at professionals who want to automate the extraction and interpretation of technical knowledge in contexts where precision and compliance with standards are critical.
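The RAG use case mentioned above can be sketched with a toy retriever. This is illustrative only: the keyword-overlap scorer and the three-document corpus are stand-ins invented for this sketch; a real deployment would use an embedding model and a vector store in front of the LLM.

```python
def score(query: str, doc: str) -> int:
    """Naive relevance score: count of shared lowercase words (illustrative only)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble a grounded prompt for the model from the retrieved passages."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."

# Tiny stand-in corpus; real systems would index full specification text.
corpus = [
    "3GPP TS 23.501 defines the system architecture for the 5G System.",
    "GSMA PRD NG.116 describes the Generic Network Slice Template.",
    "RFC 8446 specifies TLS 1.3.",
]
prompt = build_prompt("Which 3GPP spec defines the 5G system architecture?", corpus)
```

The assembled `prompt` string would then be sent to the model; constraining the answer to the retrieved context is what makes the pattern useful for standards-compliance work.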
OTel-LLM-1.7B-IT is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.
| Attribute | Value |
|---|---|
| Base Model | Qwen/Qwen3-1.7B |
| Parameters | 1.7B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |
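Since the base model is Qwen3-1.7B, the checkpoint should load with the standard `transformers` causal-LM API and its chat template. A minimal usage sketch follows; the repo id `farbodtavakkoli/OTel-LLM-1.7B-IT` is an assumption inferred from the model name, so adjust it to the actual Hub path.

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a telecom question in the chat-message format the tokenizer expects."""
    return [
        {"role": "system", "content": "You are a telecom standards assistant."},
        {"role": "user", "content": question},
    ]

def generate(question: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the helper above has no heavyweight dependencies.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "farbodtavakkoli/OTel-LLM-1.7B-IT"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Render the chat messages with the model's own template before tokenizing.
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate("What does 3GPP TS 23.501 specify?")` would return the model's answer as a string.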
The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.
Data Sources:
This model is optimized for:
```bibtex
@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}
```
If you have any technical questions, please feel free to reach out to [email protected] or [email protected].