shanginn/Luminum-v0.1-123B-mlx-quantized-4bit is a 4-bit quantized MLX conversion of FluffyKaeloky/Luminum-v0.1-123B, intended for local inference on Apple-silicon machines.
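A minimal usage sketch, assuming the `mlx-lm` package is installed (`pip install mlx-lm`); the exact API and parameter names may vary between versions:

```python
# Sketch: load the 4-bit quantized weights from the Hub and run a short generation.
from mlx_lm import load, generate

# Downloads the repository and builds the model and tokenizer.
model, tokenizer = load("shanginn/Luminum-v0.1-123B-mlx-quantized-4bit")

# Generate a short completion to verify the model loads and runs.
prompt = "Write a haiku about quantization."
text = generate(model, tokenizer, prompt=prompt, max_tokens=100)
print(text)
```

Note that the full 123B model, even at 4-bit, requires a machine with a large amount of unified memory.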
Base model: FluffyKaeloky/Luminum-v0.1-123B