austinpatrickm
committed on
Commit c56f79d · 1 Parent(s): 203794a
Upload 12 files
- 1_Pooling/config.json +7 -0
- README.md +126 -0
- config.json +32 -0
- config_sentence_transformers.json +7 -0
- eval/Information-Retrieval_evaluation_results.csv +29 -0
- modules.json +14 -0
- pytorch_model.bin +3 -0
- sentence_bert_config.json +4 -0
- special_tokens_map.json +7 -0
- tokenizer.json +0 -0
- tokenizer_config.json +15 -0
- vocab.txt +0 -0
1_Pooling/config.json
ADDED
@@ -0,0 +1,7 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
README.md
ADDED
@@ -0,0 +1,126 @@
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers

---

# {MODEL_NAME}

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

<!--- Describe your model here -->

## Usage (Sentence-Transformers)

Using this model is straightforward once you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
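
For semantic search, the embeddings above can be compared directly with cosine similarity. As a minimal, non-authoritative sketch (the query and passages below are made-up examples, and `{MODEL_NAME}` is the placeholder used throughout this card):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('{MODEL_NAME}')

# Hypothetical query and passages, purely for illustration
query = "How do I reset my password?"
passages = [
    "To reset your password, open Settings and choose 'Reset password'.",
    "Our office is closed on public holidays.",
]

# Encode, then rank passages by cosine similarity to the query
query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)
scores = util.cos_sim(query_emb, passage_embs)[0]

for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {passage}")
```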

## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.

```python
from transformers import AutoTokenizer, AutoModel
import torch


def cls_pooling(model_output, attention_mask):
    # CLS pooling: take the hidden state of the first ([CLS]) token
    return model_output[0][:, 0]


# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, CLS pooling.
sentence_embeddings = cls_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```
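
The CLS embeddings above are raw model outputs. For cosine-similarity or dot-product comparisons it is common with BGE-style models (the `_name_or_path` in this commit's `config.json` points at a cached copy of `BAAI/bge-base-en`) to L2-normalize them first; an optional sketch continuing the snippet above:

```python
import torch.nn.functional as F

# Optional: L2-normalize so the dot product equals cosine similarity
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)

# Pairwise cosine similarities between the example sentences
cosine_scores = sentence_embeddings @ sentence_embeddings.T
print(cosine_scores)
```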

## Evaluation Results

<!--- Describe how your model was evaluated -->

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})


## Training
The model was trained with the parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 698 with parameters:
```
{'batch_size': 10, 'sampler': 'torch.utils.data.sampler.SequentialSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```

Parameters of the fit() method:
```
{
    "epochs": 2,
    "evaluation_steps": 50,
    "evaluator": "sentence_transformers.evaluation.InformationRetrievalEvaluator.InformationRetrievalEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 139,
    "weight_decay": 0.01
}
```
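
Taken together, these settings could be reproduced roughly as follows. This is a non-authoritative sketch against the sentence-transformers 2.2.x API; the base model name comes from `config.json` in this commit, while the query/passage pairs and the evaluator inputs are placeholders, not the data actually used to train this model:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer('BAAI/bge-base-en')

# Placeholder (query, positive passage) pairs; the real training data is not part of this commit
train_examples = [
    InputExample(texts=["example query", "a passage that answers it"]),
    InputExample(texts=["another query", "its relevant passage"]),
]
train_dataloader = DataLoader(train_examples, batch_size=10)
train_loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

# Placeholder dev split for the Information-Retrieval evaluator
queries = {"q1": "example query"}
corpus = {"d1": "a passage that answers it", "d2": "an unrelated passage"}
relevant_docs = {"q1": {"d1"}}
evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=evaluator,
    epochs=2,
    evaluation_steps=50,
    warmup_steps=139,
    scheduler="WarmupLinear",
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
    output_path="output/finetuned-bge-base",
)
```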

## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
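
For reference, the same two-module stack can be assembled by hand with `sentence_transformers.models`; a minimal sketch that mirrors `modules.json`, `sentence_bert_config.json`, and `1_Pooling/config.json` from this commit (it rebuilds the architecture on top of the assumed `BAAI/bge-base-en` base weights, not this fine-tuned checkpoint):

```python
from sentence_transformers import SentenceTransformer, models

# Module 0: BERT encoder with the settings from sentence_bert_config.json
word_embedding_model = models.Transformer('BAAI/bge-base-en', max_seq_length=512, do_lower_case=True)

# Module 1: CLS-token pooling, matching 1_Pooling/config.json
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 768
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
    pooling_mode_max_tokens=False,
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```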

## Citing & Authors

<!--- Describe where people can find more information -->
config.json
ADDED
@@ -0,0 +1,32 @@
{
  "_name_or_path": "/root/.cache/torch/sentence_transformers/BAAI_bge-base-en/",
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.33.1",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
config_sentence_transformers.json
ADDED
@@ -0,0 +1,7 @@
{
  "__version__": {
    "sentence_transformers": "2.2.2",
    "transformers": "4.28.1",
    "pytorch": "1.13.0+cu117"
  }
}
eval/Information-Retrieval_evaluation_results.csv
ADDED
@@ -0,0 +1,29 @@
epoch,steps,cos_sim-Accuracy@1,cos_sim-Accuracy@3,cos_sim-Accuracy@5,cos_sim-Accuracy@10,cos_sim-Precision@1,cos_sim-Recall@1,cos_sim-Precision@3,cos_sim-Recall@3,cos_sim-Precision@5,cos_sim-Recall@5,cos_sim-Precision@10,cos_sim-Recall@10,cos_sim-MRR@10,cos_sim-NDCG@10,cos_sim-MAP@100,dot_score-Accuracy@1,dot_score-Accuracy@3,dot_score-Accuracy@5,dot_score-Accuracy@10,dot_score-Precision@1,dot_score-Recall@1,dot_score-Precision@3,dot_score-Recall@3,dot_score-Precision@5,dot_score-Recall@5,dot_score-Precision@10,dot_score-Recall@10,dot_score-MRR@10,dot_score-NDCG@10,dot_score-MAP@100
0,50,0.5954356846473029,0.7991009681881052,0.8627247579529738,0.9163208852005532,0.5954356846473029,0.5954356846473029,0.26636698939603504,0.7991009681881052,0.17254495159059471,0.8627247579529738,0.09163208852005532,0.9163208852005532,0.7084682155480899,0.7592512347307195,0.7124668377887484,0.5525587828492393,0.7669432918395575,0.8333333333333334,0.901106500691563,0.5525587828492393,0.5525587828492393,0.2556477639465191,0.7669432918395575,0.16666666666666663,0.8333333333333334,0.09011065006915628,0.901106500691563,0.6732448846297382,0.7286825302103023,0.6775401174557549
0,100,0.6047717842323651,0.8042876901798064,0.8630705394190872,0.9180497925311203,0.6047717842323651,0.6047717842323651,0.2680958967266021,0.8042876901798064,0.17261410788381742,0.8630705394190872,0.09180497925311201,0.9180497925311203,0.7156846198599306,0.7651243979075643,0.7196377706406355,0.5715767634854771,0.7762793914246197,0.8423236514522822,0.9007607192254495,0.5715767634854771,0.5715767634854771,0.2587597971415399,0.7762793914246197,0.16846473029045642,0.8423236514522822,0.09007607192254494,0.9007607192254495,0.6863132286109453,0.7386296743128387,0.6909179253677203
0,150,0.5916320885200553,0.7876901798063624,0.8433609958506224,0.9107883817427386,0.5916320885200553,0.5916320885200553,0.2625633932687874,0.7876901798063624,0.16867219917012446,0.8433609958506224,0.09107883817427385,0.9107883817427386,0.7018950745351151,0.7526861728474171,0.7060674181435297,0.5774550484094052,0.7797372060857538,0.8378284923928078,0.9031811894882434,0.5774550484094052,0.5774550484094052,0.25991240202858457,0.7797372060857538,0.1675656984785615,0.8378284923928078,0.09031811894882431,0.9031811894882434,0.6910861105622507,0.7426940231657718,0.6956044246034082
0,200,0.6123789764868603,0.8053250345781466,0.8675656984785616,0.9266943291839558,0.6123789764868603,0.6123789764868603,0.2684416781927155,0.8053250345781466,0.17351313969571228,0.8675656984785616,0.09266943291839558,0.9266943291839558,0.7218046773804032,0.771744040109879,0.7252810656599419,0.6037344398340249,0.7994467496542186,0.8644536652835408,0.9242738589211619,0.6037344398340249,0.6037344398340249,0.26648224988473956,0.7994467496542186,0.17289073305670813,0.8644536652835408,0.09242738589211617,0.9242738589211619,0.714399328196008,0.7655112724502717,0.7180057724380137
0,250,0.6141078838174274,0.8170816044260027,0.8793222683264177,0.9329183955739973,0.6141078838174274,0.6141078838174274,0.2723605348086676,0.8170816044260027,0.1758644536652835,0.8793222683264177,0.09329183955739971,0.9329183955739973,0.7264561241739657,0.7769278282803395,0.7297410366086686,0.6040802213001383,0.8094744121715076,0.8724066390041494,0.9304979253112033,0.6040802213001383,0.6040802213001383,0.26982480405716924,0.8094744121715076,0.17448132780082984,0.8724066390041494,0.0930497925311203,0.9304979253112033,0.7185692880194948,0.7703709249640216,0.7220011221004472
0,300,0.5957814661134163,0.8084370677731674,0.867911479944675,0.9239280774550485,0.5957814661134163,0.5957814661134163,0.2694790225910558,0.8084370677731674,0.173582295988935,0.867911479944675,0.09239280774550482,0.9239280774550485,0.7124780456211978,0.7642023170368565,0.716110449865697,0.5878284923928078,0.7970262793914247,0.8644536652835408,0.923582295988935,0.5878284923928078,0.5878284923928078,0.2656754264638082,0.7970262793914247,0.17289073305670813,0.8644536652835408,0.09235822959889349,0.923582295988935,0.7052628762431652,0.7585687434255987,0.7088684714616608
0,350,0.5757261410788381,0.8025587828492393,0.8661825726141079,0.9249654218533887,0.5757261410788381,0.5757261410788381,0.26751959428307975,0.8025587828492393,0.17323651452282154,0.8661825726141079,0.09249654218533887,0.9249654218533887,0.7004920525148723,0.7554206651425398,0.703993707043986,0.5701936376210235,0.7918395573997233,0.8606500691562933,0.9246196403872753,0.5701936376210235,0.5701936376210235,0.2639465191332411,0.7918395573997233,0.17213001383125862,0.8606500691562933,0.09246196403872751,0.9246196403872753,0.6947947540011843,0.7508540120235336,0.6983021894214881
0,400,0.5968188105117566,0.8108575380359613,0.8748271092669433,0.9343015214384509,0.5968188105117566,0.5968188105117566,0.27028584601198713,0.8108575380359613,0.17496542185338862,0.8748271092669433,0.09343015214384506,0.9343015214384509,0.7152812081494642,0.7687680757601261,0.7181334552452423,0.5899031811894883,0.8060165975103735,0.8658367911479945,0.9308437067773168,0.5899031811894883,0.5899031811894883,0.26867219917012447,0.8060165975103735,0.1731673582295989,0.8658367911479945,0.09308437067773165,0.9308437067773168,0.7093244363213232,0.763371675522544,0.7124376695143495
0,450,0.5992392807745505,0.8077455048409405,0.8647994467496543,0.9242738589211619,0.5992392807745505,0.5992392807745505,0.2692485016136469,0.8077455048409405,0.1729598893499308,0.8647994467496543,0.09242738589211617,0.9242738589211619,0.7136236525499996,0.7651042594678189,0.7171990441947274,0.5926694329183956,0.8015214384508991,0.8592669432918395,0.9239280774550485,0.5926694329183956,0.5926694329183956,0.2671738128169664,0.8015214384508991,0.1718533886583679,0.8592669432918395,0.09239280774550483,0.9239280774550485,0.7085502700388576,0.7610775093433033,0.7121266215136859
0,500,0.5926694329183956,0.809820193637621,0.8730982019363762,0.9339557399723375,0.5926694329183956,0.5926694329183956,0.2699400645458737,0.809820193637621,0.17461964038727523,0.8730982019363762,0.09339557399723375,0.9339557399723375,0.7129853289863647,0.7669573081857943,0.7160457814171952,0.5850622406639004,0.8004840940525588,0.8675656984785616,0.9304979253112033,0.5850622406639004,0.5850622406639004,0.2668280313508529,0.8004840940525588,0.1735131396957123,0.8675656984785616,0.09304979253112032,0.9304979253112033,0.7064455311861942,0.7611153634308084,0.7097135429731951
0,550,0.5947441217150761,0.8115491009681881,0.867911479944675,0.9315352697095436,0.5947441217150761,0.5947441217150761,0.270516366989396,0.8115491009681881,0.173582295988935,0.867911479944675,0.09315352697095435,0.9315352697095436,0.7138389426771165,0.7670163048815,0.7170064307727477,0.5860995850622407,0.8022130013831259,0.8634163208852006,0.9291147994467497,0.5860995850622407,0.5860995850622407,0.2674043337943753,0.8022130013831259,0.17268326417704005,0.8634163208852006,0.09291147994467494,0.9291147994467497,0.7061306230652695,0.7605021198137504,0.7094413067682568
0,600,0.6186030428769018,0.8253803596127247,0.8852005532503457,0.9370677731673582,0.6186030428769018,0.6186030428769018,0.2751267865375749,0.8253803596127247,0.1770401106500691,0.8852005532503457,0.09370677731673582,0.9370677731673582,0.7312788151221751,0.7816200590350365,0.7343471531989427,0.6092669432918395,0.8129322268326418,0.8789764868603043,0.9367219917012448,0.6092669432918395,0.6092669432918395,0.27097740894421396,0.8129322268326418,0.17579529737206084,0.8789764868603043,0.09367219917012447,0.9367219917012448,0.7237808459022145,0.7757473117722125,0.7268108599827174
0,650,0.6092669432918395,0.8233056708160442,0.8845089903181189,0.9381051175656985,0.6092669432918395,0.6092669432918395,0.2744352236053481,0.8233056708160442,0.17690179806362377,0.8845089903181189,0.09381051175656983,0.9381051175656985,0.725853888120484,0.7777807261298915,0.7288485960232514,0.6044260027662517,0.8132780082987552,0.8807053941908713,0.936030428769018,0.6044260027662517,0.6044260027662517,0.2710926694329184,0.8132780082987552,0.17614107883817426,0.8807053941908713,0.09360304287690177,0.936030428769018,0.7215099398450008,0.7739689090330896,0.7245733666952261
0,-1,0.6033886583679114,0.8101659751037344,0.8689488243430152,0.9287690179806363,0.6033886583679114,0.6033886583679114,0.27005532503457813,0.8101659751037344,0.173789764868603,0.8689488243430152,0.09287690179806361,0.9287690179806363,0.716792492700167,0.7684775250545536,0.7201993139678575,0.5982019363762102,0.8001383125864454,0.8672199170124482,0.927731673582296,0.5982019363762102,0.5982019363762102,0.26671277086214845,0.8001383125864454,0.1734439834024896,0.8672199170124482,0.09277316735822959,0.927731673582296,0.7126115556872807,0.7650591718804082,0.7160549494519313
1,50,0.609612724757953,0.8188105117565698,0.8772475795297372,0.936030428769018,0.609612724757953,0.609612724757953,0.27293683725219,0.8188105117565698,0.17544951590594743,0.8772475795297372,0.0936030428769018,0.936030428769018,0.7246912665481117,0.7763469321209491,0.7276933741632036,0.6016597510373444,0.8146611341632088,0.8769017980636238,0.9343015214384509,0.6016597510373444,0.6016597510373444,0.27155371138773626,0.8146611341632088,0.17538035961272475,0.8769017980636238,0.09343015214384506,0.9343015214384509,0.7190129036861386,0.771647493277243,0.7221280484536906
1,100,0.6026970954356846,0.8160442600276625,0.8748271092669433,0.9343015214384509,0.6026970954356846,0.6026970954356846,0.2720147533425542,0.8160442600276625,0.17496542185338865,0.8748271092669433,0.09343015214384506,0.9343015214384509,0.7210387439899875,0.7732003549041732,0.7240947333699587,0.5985477178423236,0.8146611341632088,0.8758644536652835,0.9343015214384509,0.5985477178423236,0.5985477178423236,0.2715537113877363,0.8146611341632088,0.17517289073305667,0.8758644536652835,0.09343015214384509,0.9343015214384509,0.7176573580100534,0.7706277205419523,0.7206918835210201
1,150,0.5975103734439834,0.80567081604426,0.8599585062240664,0.923582295988935,0.5975103734439834,0.5975103734439834,0.26855693868142,0.80567081604426,0.17199170124481328,0.8599585062240664,0.09235822959889349,0.923582295988935,0.7124169850051577,0.7639307606136736,0.7160894951714374,0.593015214384509,0.8042876901798064,0.8599585062240664,0.9218533886583679,0.593015214384509,0.593015214384509,0.2680958967266021,0.8042876901798064,0.17199170124481328,0.8599585062240664,0.09218533886583678,0.9218533886583679,0.7099493128279425,0.7617209254312846,0.7136849724814708
1,200,0.5992392807745505,0.8029045643153527,0.8641078838174274,0.9294605809128631,0.5992392807745505,0.5992392807745505,0.2676348547717842,0.8029045643153527,0.17282157676348545,0.8641078838174274,0.0929460580912863,0.9294605809128631,0.7141924081758095,0.7665892553359721,0.7174219402702303,0.5912863070539419,0.8011756569847857,0.8630705394190872,0.9263485477178424,0.5912863070539419,0.5912863070539419,0.26705855232826187,0.8011756569847857,0.1726141078838174,0.8630705394190872,0.09263485477178422,0.9263485477178424,0.7092534962348226,0.7622098564677001,0.7126870321942367
1,250,0.6026970954356846,0.8073997233748271,0.8651452282157677,0.9266943291839558,0.6026970954356846,0.6026970954356846,0.26913324112494236,0.8073997233748271,0.17302904564315352,0.8651452282157677,0.09266943291839558,0.9266943291839558,0.7165043414784066,0.7677789659845837,0.7200102240000958,0.5975103734439834,0.8067081604426003,0.8620331950207469,0.927731673582296,0.5975103734439834,0.5975103734439834,0.26890272014753347,0.8067081604426003,0.17240663900414935,0.8620331950207469,0.09277316735822959,0.927731673582296,0.7131788019495473,0.7654889305810164,0.7165294163422276
1,300,0.5982019363762102,0.8077455048409405,0.8613416320885201,0.9270401106500692,0.5982019363762102,0.5982019363762102,0.2692485016136469,0.8077455048409405,0.17226832641770398,0.8613416320885201,0.0927040110650069,0.9270401106500692,0.7133545741948215,0.7654290784301596,0.7167392877262367,0.5975103734439834,0.8029045643153527,0.8616874135546335,0.9246196403872753,0.5975103734439834,0.5975103734439834,0.2676348547717842,0.8029045643153527,0.17233748271092666,0.8616874135546335,0.09246196403872751,0.9246196403872753,0.7114173746953815,0.7633975946453793,0.7149779597728564
1,350,0.5999308437067773,0.8112033195020747,0.8675656984785616,0.9301521438450899,0.5999308437067773,0.5999308437067773,0.27040110650069155,0.8112033195020747,0.1735131396957123,0.8675656984785616,0.09301521438450899,0.9301521438450899,0.7163132011679715,0.7684903995990144,0.7194991401712434,0.5961272475795297,0.8046334716459198,0.8668741355463347,0.9273858921161826,0.5961272475795297,0.5961272475795297,0.26821115721530664,0.8046334716459198,0.17337482710926694,0.8668741355463347,0.09273858921161825,0.9273858921161826,0.7127756646688165,0.7651533163997125,0.7161747308051355
1,400,0.5999308437067773,0.8067081604426003,0.867911479944675,0.9308437067773168,0.5999308437067773,0.5999308437067773,0.2689027201475334,0.8067081604426003,0.173582295988935,0.867911479944675,0.09308437067773165,0.9308437067773168,0.7158544918659014,0.7682615953816291,0.7189882232156225,0.5909405255878285,0.8015214384508991,0.8641078838174274,0.9280774550484094,0.5909405255878285,0.5909405255878285,0.26717381281696634,0.8015214384508991,0.17282157676348547,0.8641078838174274,0.09280774550484093,0.9280774550484094,0.7099424520845672,0.7631853195513307,0.7132918249777777
1,450,0.5964730290456431,0.8053250345781466,0.8668741355463347,0.9270401106500692,0.5964730290456431,0.5964730290456431,0.2684416781927155,0.8053250345781466,0.17337482710926694,0.8668741355463347,0.0927040110650069,0.9270401106500692,0.712132401369952,0.764546230563512,0.7155187572713126,0.5926694329183956,0.7980636237897649,0.8616874135546335,0.9263485477178424,0.5926694329183956,0.5926694329183956,0.2660212079299216,0.7980636237897649,0.17233748271092666,0.8616874135546335,0.09263485477178422,0.9263485477178424,0.7090303848602594,0.7619848285449365,0.7124578049101015
1,500,0.5926694329183956,0.8067081604426003,0.8686030428769018,0.9284232365145229,0.5926694329183956,0.5926694329183956,0.26890272014753347,0.8067081604426003,0.17372060857538033,0.8686030428769018,0.09284232365145226,0.9284232365145229,0.7112383092932874,0.7642796619644506,0.7145727407191483,0.5909405255878285,0.8015214384508991,0.8644536652835408,0.9284232365145229,0.5909405255878285,0.5909405255878285,0.2671738128169664,0.8015214384508991,0.17289073305670813,0.8644536652835408,0.09284232365145226,0.9284232365145229,0.7089428417747908,0.7624311292555161,0.7122053927757861
1,550,0.5895573997233748,0.8053250345781466,0.8661825726141079,0.926002766251729,0.5895573997233748,0.5895573997233748,0.2684416781927155,0.8053250345781466,0.17323651452282154,0.8661825726141079,0.09260027662517288,0.926002766251729,0.7087514270346208,0.7618368558587995,0.712227226748592,0.5881742738589212,0.7991009681881052,0.8623789764868603,0.9263485477178424,0.5881742738589212,0.5881742738589212,0.26636698939603504,0.7991009681881052,0.17247579529737206,0.8623789764868603,0.09263485477178422,0.9263485477178424,0.7068634876726151,0.7604001359192012,0.7102599980577198
1,600,0.5937067773167358,0.8049792531120332,0.867911479944675,0.9284232365145229,0.5937067773167358,0.5937067773167358,0.26832641770401106,0.8049792531120332,0.173582295988935,0.867911479944675,0.09284232365145229,0.9284232365145229,0.7115296164570009,0.7644629649263047,0.7148435105106903,0.5874827109266944,0.8001383125864454,0.863762102351314,0.9270401106500692,0.5874827109266944,0.5874827109266944,0.26671277086214845,0.8001383125864454,0.17275242047026276,0.863762102351314,0.09270401106500688,0.9270401106500692,0.70718004237195,0.7608454331136686,0.710557185868797
1,650,0.5954356846473029,0.8049792531120332,0.8686030428769018,0.9298063623789765,0.5954356846473029,0.5954356846473029,0.26832641770401106,0.8049792531120332,0.17372060857538033,0.8686030428769018,0.09298063623789764,0.9298063623789765,0.7125159169246292,0.765501467337404,0.7157060073722703,0.5892116182572614,0.8011756569847857,0.863762102351314,0.926002766251729,0.5892116182572614,0.5892116182572614,0.26705855232826187,0.8011756569847857,0.17275242047026276,0.863762102351314,0.09260027662517288,0.926002766251729,0.7083713418516314,0.7615305855765044,0.7118335793601137
1,-1,0.5954356846473029,0.80567081604426,0.8672199170124482,0.9301521438450899,0.5954356846473029,0.5954356846473029,0.26855693868142005,0.80567081604426,0.1734439834024896,0.8672199170124482,0.09301521438450899,0.9301521438450899,0.7126222584469458,0.7656577040284513,0.7157852241346846,0.588865836791148,0.8015214384508991,0.8641078838174274,0.9266943291839558,0.588865836791148,0.588865836791148,0.2671738128169664,0.8015214384508991,0.17282157676348547,0.8641078838174274,0.09266943291839558,0.9266943291839558,0.7082637653955071,0.7615980383902604,0.7116618700389964
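
A quick way to inspect these results (assuming pandas is installed) is to load the CSV and follow a single metric, such as cos_sim-NDCG@10, across evaluation steps:

```python
import pandas as pd

# Load the evaluator output written during training
df = pd.read_csv("eval/Information-Retrieval_evaluation_results.csv")

# NDCG@10 under cosine similarity, per epoch and step
print(df[["epoch", "steps", "cos_sim-NDCG@10"]].to_string(index=False))

# Best-scoring evaluation point for this metric
best = df.loc[df["cos_sim-NDCG@10"].idxmax()]
print(f"Best cos_sim-NDCG@10: {best['cos_sim-NDCG@10']:.4f} (epoch {best['epoch']}, step {best['steps']})")
```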
modules.json
ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:38a70d998ebb3696a18ecb277935feda88ac073ae12e7af2a1ed84149933cdf8
size 437995689
sentence_bert_config.json
ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": true
}
special_tokens_map.json
ADDED
@@ -0,0 +1,7 @@
{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
tokenizer.json
ADDED
The diff for this file is too large to render.
tokenizer_config.json
ADDED
@@ -0,0 +1,15 @@
{
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "do_basic_tokenize": true,
  "do_lower_case": true,
  "mask_token": "[MASK]",
  "model_max_length": 512,
  "never_split": null,
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "BertTokenizer",
  "unk_token": "[UNK]"
}
vocab.txt
ADDED
The diff for this file is too large to render.