# v0a

This is a merge of pre-trained language models created using mergekitty.

## Merge Details

### Merge Method

This model was merged using the SCE merge method, with unsloth/Mistral-Small-24B-Base-2501 as the base model.
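Roughly, SCE builds a task vector (delta from the base) for each source model, keeps only the highest-variance fraction of each delta's elements (the per-model `select_topk` values in the configuration below), weights the surviving elements, and drops sign-conflicting contributions before adding the fused delta back onto the base. The snippet below is a simplified single-tensor sketch of that idea, not mergekitty's actual implementation; the function name, weighting, and tie-breaking details are illustrative only.

```python
# Simplified single-tensor sketch of the SCE idea (illustrative, not mergekitty's code).
import torch

def sce_merge_sketch(base: torch.Tensor,
                     tuned: list[torch.Tensor],
                     select_topk: list[float]) -> torch.Tensor:
    # Task vectors: each fine-tune expressed as a delta from the base weights.
    deltas = torch.stack([t - base for t in tuned])

    # Select: per-element variance across models marks the salient positions;
    # each model keeps only its select_topk fraction of highest-variance elements.
    variance = deltas.var(dim=0)
    selected = []
    for delta, k in zip(deltas, select_topk):
        n_keep = max(1, int(k * variance.numel()))
        threshold = variance.flatten().topk(n_keep).values.min()
        selected.append(torch.where(variance >= threshold, delta, torch.zeros_like(delta)))
    selected = torch.stack(selected)

    # Calculate: weight each model's contribution by the magnitude of what it kept.
    weights = selected.pow(2)
    weights = weights / weights.sum(dim=0).clamp(min=1e-12)

    # Erase: zero out elements whose sign disagrees with the elementwise majority.
    majority_sign = torch.sign(selected.sum(dim=0))
    agree = (torch.sign(selected) == majority_sign).to(selected.dtype)

    merged_delta = (weights * agree * selected).sum(dim=0)
    return base + merged_delta
```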

### Models Merged

The following models were included in the merge:

* allura-org/Mistral-Small-24b-Sertraline-0304
* lars1234/Mistral-Small-24B-Instruct-2501-writer
* PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
* trashpanda-org/Llama3-24B-Mullein-v1
* arcee-ai/Arcee-Blitz
* mistralai/Mistral-Small-24B-Instruct-2501

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: unsloth/Mistral-Small-24B-Base-2501
merge_method: sce
dtype: float32
out_dtype: bfloat16
models:
  - model: allura-org/Mistral-Small-24b-Sertraline-0304
    parameters:
      select_topk: 0.50
  - model: lars1234/Mistral-Small-24B-Instruct-2501-writer
    parameters:
      select_topk: 0.20
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      select_topk: 0.20
  - model: trashpanda-org/Llama3-24B-Mullein-v1
    parameters:
      select_topk: 0.175
  - model: arcee-ai/Arcee-Blitz
    parameters:
      select_topk: 0.15
  - model: mistralai/Mistral-Small-24B-Instruct-2501
    parameters:
      select_topk: 0.15

# apt install git nano -y
# uv tool install mergekitty --with hf_transfer
# uv tool install https://github.com/aphrodite-engine/aphrodite-engine/releases/download/v0.6.7/aphrodite_engine-0.6.7-cp38-abi3-manylinux1_x86_64.whl --with aphrodite-engine --with setuptools --with hf_transfer
# uv tool install huggingface_hub
# huggingface-cli login
# nano merge.yml
# mergekitty-yaml --cuda --lazy-unpickle merge.yml v0a
```
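
The merged weights are exported in bfloat16 (`out_dtype` above). Below is a minimal sketch of loading and sampling from the result with transformers, assuming the repo id from this card; swap in a local path if you ran the merge yourself.

```python
# Minimal loading/sampling example with transformers (repo id from this card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "estrogen/ms24b-exprmerge-v0a"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Write a short scene set in a rain-soaked harbor town."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```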