by farbodtavakkoli
OTel LLM 0.6B IT is a language model specialized for the telecommunications sector, designed to meet the industry's technical and regulatory needs. Fine-tuned on data validated by over 200 domain experts, it excels at analyzing and generating precise answers on topics such as 3GPP specifications, O-RAN standards, and roaming protocols. Its primary applications are retrieval-augmented generation (RAG) systems and conversational assistants for interpreting technical documents. What sets it apart is its ability to understand and contextualize complex telecom-specific concepts while drawing on reliable sources recognized by major industry players, making it well suited for professionals who want to automate the extraction and synthesis of technical information.
OTel-LLM-0.6B-IT is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.
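A model like this would typically sit behind a retriever in a RAG pipeline. The sketch below is a minimal illustration, not the project's actual stack: the telecom snippets are invented stand-ins, and the keyword-overlap scorer is a placeholder for a real embedding-based retriever; the assembled prompt would then be passed to the model.

```python
def retrieve(query, corpus, k=2):
    """Rank corpus snippets by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Assemble a grounded prompt from the top-ranked snippets."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Illustrative snippets only -- not taken from the model's training data.
corpus = [
    "3GPP TS 23.501 defines the 5G system architecture.",
    "O-RAN splits the RAN into CU, DU, and RU components.",
    "Roaming relies on inter-operator signaling agreements.",
]
prompt = build_prompt("Which spec defines the 5G system architecture?", corpus)
```

In a real deployment the overlap scorer would be replaced by dense retrieval over the telecom document corpus, with `prompt` sent to the model for generation.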
| Attribute | Value |
|---|---|
| Base Model | Qwen/Qwen3-0.6B |
| Parameters | 0.6B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |
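Since the base model is Qwen/Qwen3-0.6B, prompts presumably follow the ChatML-style template used by Qwen-family models. A hand-rolled sketch is shown below for illustration; in practice you should rely on the tokenizer's `apply_chat_template` rather than hard-coding tags, as the exact template here is an assumption.

```python
def build_chatml_prompt(system, user):
    # ChatML-style tags as used by Qwen-family models (assumed here;
    # prefer tokenizer.apply_chat_template in real code).
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a telecom standards assistant.",
    "Summarize the role of the AMF in the 5G core.",
)
```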
The model was trained on high-quality telecom-focused data curated by 200+ domain experts from organizations including AT&T, RelationalAI, AMD, GSMA, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.
Data Sources:
This model is optimized for:
```bibtex
@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}
```
If you have any technical questions, please feel free to reach out to [email protected] or [email protected].