by mradermacher
*Zen Musician GGUF* is a quantized build of *zenlm/zen-musician*, a model for generating music and musical compositions from text prompts. It is aimed at producing coherent melodies, harmonies, and musical structures, with the flexibility to adapt style and mood as needed. Use cases include AI-assisted composition, creative exploration, and integration into music-production tools. What sets it apart is its lightweight, efficient quantized format, which allows smooth use even on modest hardware while preserving output quality.
static quants of https://huggingface.co/zenlm/zen-musician
For a convenient overview and download list, visit our model page.
weighted/imatrix quants are available at https://huggingface.co/mradermacher/zen-musician-i1-GGUF
If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files.
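Multi-part uploads in the older split style are raw byte splits, so they can be rejoined with a plain `cat` before loading. A minimal sketch using dummy stand-in files (the `*.part1of2` names are an assumption for illustration; real part names follow the repo's file listing):

```shell
# Dummy stand-in parts; real parts are named like model.gguf.part1of2 (assumption).
printf 'part1-' > model.gguf.part1of2
printf 'part2'  > model.gguf.part2of2

# Concatenate the parts in order to rebuild the single GGUF file:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf
```

The order of the arguments matters: parts must be listed first-to-last, since `cat` simply appends bytes.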
(Sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants.)
| Link | Type | Size/GB | Notes |
|---|---|---|---|
| GGUF | Q2_K | 2.5 | |
| GGUF | Q3_K_S | 2.9 | |
| GGUF | Q3_K_M | 3.2 | lower quality |
| GGUF | Q3_K_L | 3.4 | |
| GGUF | IQ4_XS | 3.5 | |
| GGUF | Q4_K_S | 3.7 | fast, recommended |
| GGUF | Q4_K_M | 3.9 | fast, recommended |
| GGUF | Q5_K_S | 4.4 | |
| GGUF | Q5_K_M | 4.5 | |
| GGUF | Q6_K | 5.2 | very good quality |
| GGUF | Q8_0 | 6.7 | fast, best quality |
| GGUF | f16 | 12.6 | 16 bpw, overkill |
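After downloading one of the quants above, a cheap sanity check that the file is not truncated or mis-concatenated is to look at the GGUF magic bytes at offset 0. A minimal sketch in Python; the dummy file here stands in for a real multi-gigabyte download:

```python
import struct

def is_gguf(path: str) -> bool:
    """GGUF files begin with the magic bytes b'GGUF'; anything else is not a valid file."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Demo with a dummy header (magic + format version as a little-endian uint32):
with open("dummy.gguf", "wb") as f:
    f.write(b"GGUF" + struct.pack("<I", 3))

print(is_gguf("dummy.gguf"))  # prints True
```

This only checks the header, not the whole file, but it catches the common failure mode of an interrupted or wrongly ordered download.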
Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
See https://huggingface.co/mradermacher/model_requests for answers to common questions and to request quantization of other models.
I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.