smollm2-135M_pretrained_200k_fineweb_uncovai_selected

This model is a version of HuggingFaceTB/SmolLM2-135M further pre-trained on the first 200k samples of the FineWeb dataset dump CC-MAIN-2024-18, filtered with the UncovAI model for text to remove samples labeled as synthetic. Our model flagged more than 16% of the data as AI-generated.
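
The card does not include the data-loading code; as a minimal sketch, assuming the public HuggingFaceFW/fineweb dataset and its CC-MAIN-2024-18 config, loading this checkpoint and selecting the first 200k samples could look like the following (the UncovAI filter itself is not public, so it appears only as a placeholder comment):

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load this checkpoint and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained(
    "FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected"
)
tokenizer = AutoTokenizer.from_pretrained(
    "FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected"
)

# Stream the CC-MAIN-2024-18 dump of FineWeb and keep the first 200k samples.
fineweb = load_dataset(
    "HuggingFaceFW/fineweb",
    name="CC-MAIN-2024-18",
    split="train",
    streaming=True,
)
samples = fineweb.take(200_000)
# The UncovAI model for text was then used to drop samples labeled as
# synthetic (more than 16% of the data); that filter is not reproduced here.
```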

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0005
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
  • mixed_precision_training: Native AMP
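
The training script itself is not included in the card; as a minimal sketch, these hyperparameters map onto transformers.TrainingArguments roughly as follows (argument names follow the standard Trainer API, output_dir is a placeholder, and fp16 stands in for Native AMP):

```python
from transformers import TrainingArguments

# Mirrors the reported hyperparameters. On a single device, the effective
# batch size is 16 (per device) x 8 (accumulation) = 128, matching
# total_train_batch_size.
training_args = TrainingArguments(
    output_dir="smollm2-135M_pretrained_200k_fineweb_uncovai_selected",
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,  # "Native AMP" mixed-precision training
)
```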

Training results

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.19.1
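
As a quick reproducibility check, the reported versions can be compared against the local environment (a sketch; it assumes the four packages are installed under these import names):

```python
# Compare installed package versions against those reported in the card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": ("4.44.2", transformers.__version__),
    "torch": ("2.4.1+cu121", torch.__version__),
    "datasets": ("3.2.0", datasets.__version__),
    "tokenizers": ("0.19.1", tokenizers.__version__),
}
for name, (want, got) in expected.items():
    status = "ok" if got == want else f"mismatch (got {got})"
    print(f"{name}: expected {want} -> {status}")
```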