AI Explorer

Find and compare the best artificial intelligence tools for your projects.


© 2026 AI Explorer · All rights reserved.


tiny random llama 2

by stas

Open source · 309k downloads · 41 likes

2.0 (41 reviews) · Chat · API & Local
About

The *tiny random llama 2* model is a lightweight, randomized derivative of the well-known Llama 2, built from the original 7-billion-parameter model. It is intended for functional testing only: its weights are randomly generated and its vocabulary is reduced to 3,000 tokens, so it does not produce coherent or relevant output. What it offers is quick validation of a pipeline or application integration without significant computational resources, confirming that interactions with a language model work end to end. Its simplicity and low footprint make it well suited to preliminary testing and basic demonstrations.

Documentation

This is a tiny random Llama model derived from "meta-llama/Llama-2-7b-hf".

See make_tiny_model.py for how this was done.

This is useful for functional testing, but not for quality generation, since its weights are random and the tokenizer has been shrunk to 3k items.
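Since the model exists only to confirm that a pipeline runs end to end, the check can be sketched as a small smoke-test harness. This is a sketch: `smoke_test_llm` and the stand-in generator below are hypothetical names, and in practice `generate_fn` would wrap a call to the tiny model (for example, via a transformers text-generation pipeline):

```python
def smoke_test_llm(generate_fn, prompts, max_len=64):
    """Run each prompt through generate_fn and check basic invariants.

    generate_fn: callable str -> str. With this model it would wrap
    a text-generation call; any callable works for wiring tests.
    Returns a list of (prompt, output) pairs; raises on failure.
    """
    results = []
    for prompt in prompts:
        out = generate_fn(prompt)
        # Output quality is irrelevant here; only shape and type matter.
        assert isinstance(out, str), "model must return text"
        assert len(out) <= max_len * 8, "output unexpectedly long"
        results.append((prompt, out))
    return results

# Usage with a stand-in generator (a real run would call the model):
pairs = smoke_test_llm(lambda p: p[::-1], ["hello", "world"])
```

The point of injecting `generate_fn` rather than hard-coding the model is that the same harness can be pointed first at a cheap stand-in, then at the tiny random model, and finally at a full model, without changing the test.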

Capabilities & Tags
transformers · safetensors · llama · text-generation · text-generation-inference · endpoints_compatible
Specifications

Category: Chat
Access: API & Local
License: Open Source
Pricing: Open Source
Rating: 2.0
