by kaitchup
Open source · 1M downloads · 2 likes
Phi-3-mini-4k-instruct quantized to 4-bit with GPTQ is a lightweight, optimized version of the original model, designed to deliver strong performance while minimizing resource consumption. It handles text comprehension and generation well and follows complex instructions accurately and consistently. Its primary use cases include conversational assistance, content generation, textual data analysis, and automating language-related tasks. What sets it apart is the balance between efficiency and capability achieved through 4-bit GPTQ quantization, which makes it particularly well suited to environments with limited memory and compute.
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed before further recommendations can be made.
Use the code below to get started with the model.
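As a minimal sketch (the repository id below is an assumption — check the model page for the exact name), the quantized checkpoint can be loaded with 🤗 Transformers. GPTQ inference additionally requires the `optimum` and `auto-gptq` packages to be installed.

```python
# Sketch of loading a GPTQ 4-bit checkpoint with Hugging Face Transformers.
# Assumptions: the repository id is illustrative (verify it on the model page),
# and `optimum` plus `auto-gptq` are installed so Transformers can run GPTQ weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kaitchup/Phi-3-mini-4k-instruct-gptq-4bit"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the 4-bit weights on the available GPU(s)
)

# Phi-3 instruct models expect the chat template stored in the tokenizer.
messages = [
    {"role": "user", "content": "Summarize GPTQ quantization in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the weights are already quantized, no quantization config needs to be passed at load time; Transformers detects the GPTQ metadata in the repository.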
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).