betteib committed on
Commit 62d057b · verified · 1 Parent(s): 3ae628e

Update README.md

Files changed (1)
  1. README.md +8 -37
README.md CHANGED
@@ -1,6 +1,10 @@
 ---
 library_name: transformers
-tags: []
+license: mit
+language:
+- ar
+base_model:
+- openai-community/gpt2
 ---

 # Model Card for Model ID
@@ -18,11 +22,11 @@ tags: []
 This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

 - **Developed by:** Bacem ETTEIB
-- **Funded by [optional]:** University of Luxembourg
+- **Funded by:** University of Luxembourg
 - **Model type:** Encoder only model
 - **Language(s) (NLP):** Tunisian dialect
 - **License:** MIT
-- **Finetuned from model [optional]:** GPT2
+- **Finetuned from model:** GPT2



@@ -36,37 +40,4 @@ Fine tune on different downstream tasks such as sentiment analysis or dialect id
 ### Training Procedure

 <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-Continual pretraining.
-
-
-
-
-## Citation [optional]
-
-<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
-**BibTeX:**
-
-[More Information Needed]
-
-**APA:**
-
-[More Information Needed]
-
-## Glossary [optional]
-
-<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
-[More Information Needed]
-
-## More Information [optional]
-
-[More Information Needed]
-
-## Model Card Authors [optional]
-
-[More Information Needed]
-
-## Model Card Contact
-
-[More Information Needed]
+Continual pretraining.
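
The updated card names GPT-2 as the base checkpoint and "continual pretraining" on Tunisian-dialect text as the training procedure. As a rough illustration of that setup (not taken from this repository), the sketch below continues causal-LM pretraining of `openai-community/gpt2` with 🤗 Transformers; the corpus file `tunisian_corpus.txt` and all hyperparameters are placeholders.

```python
# Illustrative continual-pretraining sketch, assuming a plain-text corpus
# of Tunisian-dialect sentences (one per line). Paths and hyperparameters
# are placeholders, not values used for the released checkpoint.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

# Hypothetical corpus: one Tunisian-dialect sentence per line.
dataset = load_dataset("text", data_files={"train": "tunisian_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling (mlm=False), i.e. the same objective as GPT-2 pretraining.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-tunisian-continual",
    per_device_train_batch_size=8,
    num_train_epochs=1,
    learning_rate=5e-5,
    logging_steps=100,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

A downstream head (e.g. sequence classification for sentiment analysis or dialect identification, the intended uses mentioned in the card) would then be fine-tuned from the resulting checkpoint.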