ruAntiToxic_v1

NeuroSpaceX/ruAntiToxic_v1 is a model for classifying Russian text as toxic or non-toxic.

Installation

pip install transformers torch

Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("NeuroSpaceX/ruAntiToxic_v1")
model = AutoModelForSequenceClassification.from_pretrained("NeuroSpaceX/ruAntiToxic_v1")
model.eval()  # inference mode: disables dropout

text = "Ваш тестовый текст"
inputs = tokenizer(text, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():  # no gradients needed for inference
    logits = model(**inputs).logits.squeeze()
score = torch.sigmoid(logits).item()  # toxicity probability in [0, 1]
label = "toxic" if score > 0.5 else "non-toxic"
print(label, score)
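
The same pattern extends to scoring several texts at once. The sketch below is a minimal batch example, assuming (as the snippet above implies) that the model emits a single logit per text with a sigmoid head, and that 0.5 is a reasonable decision threshold; neither is documented elsewhere on this card, and the example texts are placeholders.

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("NeuroSpaceX/ruAntiToxic_v1")
model = AutoModelForSequenceClassification.from_pretrained("NeuroSpaceX/ruAntiToxic_v1")
model.eval()

# Placeholder inputs; any list of Russian strings works here.
texts = ["Первый пример", "Второй пример"]
inputs = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    # Assumed single-logit head: (batch, 1) -> (batch,)
    logits = model(**inputs).logits.squeeze(-1)
scores = torch.sigmoid(logits)
for text, score in zip(texts, scores.tolist()):
    print(text, "->", "toxic" if score > 0.5 else "non-toxic", round(score, 3))

Padding is required here because batched texts of different lengths must be packed into one tensor; truncation guards against inputs longer than the model's maximum sequence length.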
Model size: 29.2M parameters, F32 tensors, stored in Safetensors format.