Model: ddidacus/Mol-MoE-6x1b
Tags: Text Generation · Transformers · Safetensors · mixtral · text-generation-inference · Inference Endpoints
arXiv: 2502.05633, 1910.09700
File: Mol-MoE-6x1b/generation_config.json (branch: main)
Commit: "Upload MixtralForCausalLM" by ddidacus (0f93c66, verified, about 2 months ago)
File size: 120 Bytes
{
  "do_sample": true,
  "max_new_tokens": 64,
  "min_length": -1,
  "top_k": 0.0,
  "transformers_version": "4.47.1"
}
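For reference, the file above is plain JSON and can be inspected with nothing but the Python standard library. The sketch below simply parses the config shown above and checks its fields; in practice, `transformers` loads this same file via `GenerationConfig.from_pretrained("ddidacus/Mol-MoE-6x1b")`, and a `top_k` of 0 effectively disables top-k filtering during sampling.

```python
import json

# Verbatim contents of generation_config.json from this repo.
raw = """{
  "do_sample": true,
  "max_new_tokens": 64,
  "min_length": -1,
  "top_k": 0.0,
  "transformers_version": "4.47.1"
}"""

cfg = json.loads(raw)

# Sampling is enabled and capped at 64 newly generated tokens;
# top_k of 0 means no top-k truncation is applied.
assert cfg["do_sample"] is True
assert cfg["max_new_tokens"] == 64
assert cfg["top_k"] == 0.0
print(cfg)
```

Note that `min_length: -1` leaves the minimum output length unconstrained, so generation length is governed only by `max_new_tokens` and the model's EOS token.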