AI Explorer


Find and compare the best artificial intelligence tools for your projects.




BERT Tiny L 2 H 128 A 2

by nreimers

Open source · 21k downloads · 3 likes

Rating: 0.8 (3 reviews) · Embedding · API & Local

About
About

BERT Tiny L 2 H 128 A 2 is a compact, optimized version of the BERT model, designed for natural language processing tasks under tight resource constraints. With just two layers, a hidden size of 128, and two attention heads, it balances performance against efficiency, making it well suited to applications with limited memory or compute budgets. The model handles text comprehension, classification, question answering, and semantic analysis while remaining usable on less powerful hardware. Its lightweight footprint makes it a good fit for embedded environments or projects where fast inference is critical. Despite its reduced size, it retains some of BERT's contextual modeling capability, delivering reliable results where moderate precision suffices.
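In practice, an encoder like this is typically used for embeddings: text is run through the model and the per-token hidden states are pooled into a single fixed-size vector. As a minimal framework-free sketch of the pooling step (mean pooling over non-padding tokens is a common strategy for this author's models, but this page does not specify it, so treat it as an assumption):

```python
def mean_pool(last_hidden_state, attention_mask):
    """Average the hidden vectors of non-padding tokens.

    last_hidden_state: list of per-token vectors, each of length H
                       (H = 128 for this model).
    attention_mask:    list of 0/1 flags, 1 for real tokens, 0 for padding.
    Returns one H-dimensional sentence embedding.
    """
    dim = len(last_hidden_state[0])
    sums = [0.0] * dim
    count = 0
    for vec, mask in zip(last_hidden_state, attention_mask):
        if mask:  # skip padding positions entirely
            count += 1
            for i, value in enumerate(vec):
                sums[i] += value
    return [s / count for s in sums]


# Toy example: two real tokens plus one padded position that is ignored.
hidden = [[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]
mask = [1, 1, 0]
print(mean_pool(hidden, mask))  # [2.0, 3.0]
```

With a real checkpoint, `last_hidden_state` would come from running the tokenized input through the model (e.g. via the `transformers` library, loading it by its Hugging Face id, which this page implies is under the `nreimers` namespace).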

Documentation

This model corresponds to the BERT-Tiny configuration from Google's BERT repository (https://github.com/google-research/bert#bert): a BERT model with 2 layers, a hidden size of 128, and 2 attention heads.
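To make the "compact" claim concrete, the parameter count implied by this configuration can be worked out by hand. A back-of-the-envelope sketch, assuming the standard BERT hyperparameters that this page does not state (vocab size 30522, 512 positions, 2 token types, FFN width 4×H):

```python
# Approximate parameter count for the BERT-Tiny configuration:
# L = 2 layers, H = 128 hidden size, A = 2 attention heads.
L, H, I = 2, 128, 4 * 128          # I: feed-forward intermediate size (4*H)
VOCAB, MAX_POS, TYPES = 30522, 512, 2

# Embedding tables (word + position + token type) plus their LayerNorm.
embeddings = (VOCAB + MAX_POS + TYPES) * H + 2 * H

attention = 4 * (H * H + H)        # Q, K, V and output projections (weights + biases)
ffn = (H * I + I) + (I * H + H)    # up-projection and down-projection
per_layer = attention + ffn + 2 * (2 * H)  # plus two LayerNorms per layer

pooler = H * H + H                 # final [CLS] pooling dense layer

total = embeddings + L * per_layer + pooler
print(f"{total:,} parameters (~{total / 1e6:.1f}M)")  # 4,385,920 (~4.4M)
```

Roughly 4.4M parameters, and about 90% of them sit in the embedding tables, which is why shrinking layers and hidden size alone yields such a small encoder compared with BERT-Base's ~110M.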

Capabilities & Tags
transformers · pytorch · jax · bert · feature-extraction · endpoints_compatible
Specifications

  • Category: Embedding
  • Access: API & Local
  • License: Open Source
  • Pricing: Open Source
  • Rating: 0.8
