mtyrrell committed on
Commit 62b2b8f · 1 Parent(s): bdaddd7

update model card README.md

Files changed (1):
  1. README.md (+3, -9)
README.md CHANGED
@@ -7,20 +7,14 @@ metrics:
  model-index:
  - name: IKT_classifier_economywide_best
    results: []
-
- widget:
- - text: "Unconditional Contribution In the unconditional scenario, GHG emissions would be reduced by 27.56 Mt CO2e (6.73%) below BAU in 2030 in the respective sectors. 26.3 Mt CO2e (95.4%) of this emission reduction will be from the Energy sector while 0.64 (2.3%) and 0.6 (2.2%) Mt CO2e reduction will be from AFOLU (agriculture) and waste sector respectively. There will be no reduction in the IPPU sector. Conditional Contribution In the conditional scenario, GHG emissions would be reduced by 61.9 Mt CO2e (15.12%) below BAU in 2030 in the respective sectors"
-   example_title: "ECONOMY-WIDE"
- - text: "System for railway Electrification of railway system and double track construction Improved and enhanced Inland Water Transport Implementation of solar irrigation pumps Installation of prepaid gas meter Phasing out HCFCs AFOLU Sector The cost estimate for the implementation of Key mitigations measures in the AFOLU sector under the unconditional and conditional scenario is outlined in table 7."
-   example_title: "NEGATIVE"
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # IKT_classifier_econ_best
+ # IKT_classifier_economywide_best

- This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) using the QA dataset from [GIZ/policy_qa_v0_1](https://huggingface.co/datasets/GIZ/policy_qa_v0_1).
+ This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) on the None dataset.
  It achieves the following results on the evaluation set:
  - Loss: 0.1819
  - Precision Weighted: 0.9628
@@ -49,7 +43,7 @@ More information needed
  The following hyperparameters were used during training:
  - learning_rate: 4.427532456702983e-05
  - train_batch_size: 3
- - eval_batch_size: 8
+ - eval_batch_size: 3
  - seed: 42
  - gradient_accumulation_steps: 2
  - total_train_batch_size: 6
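For anyone reproducing the run, here is a minimal sketch of how the hyperparameters listed above could map onto `transformers.TrainingArguments`. It is not taken from the training script: `output_dir` is a placeholder, and anything not listed on the card (epochs, scheduler, warmup, etc.) is left at library defaults rather than guessed.

```python
# Minimal sketch only: maps the card's listed hyperparameters onto
# transformers.TrainingArguments. Values not shown on the card are left
# at library defaults rather than guessed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="IKT_classifier_economywide_best",  # placeholder, not from the card
    learning_rate=4.427532456702983e-05,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,   # this commit corrects the card from 8 to 3
    gradient_accumulation_steps=2,  # effective train batch size: 3 * 2 = 6
    seed=42,
)
```

Note that `total_train_batch_size: 6` is derived, not set directly: it is `train_batch_size * gradient_accumulation_steps`.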
 
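Since this commit removes the widget examples from the card metadata, a short inference sketch may still be useful. It assumes the checkpoint is published as a text-classification model; the repo id below is hypothetical, so replace `<org>` with the actual Hub namespace.

```python
# Minimal inference sketch, assuming a published text-classification
# checkpoint; the repo id is hypothetical.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="<org>/IKT_classifier_economywide_best",  # replace with the real Hub id
)

# One of the widget examples removed by this commit, reused as a test input.
text = (
    "Unconditional Contribution In the unconditional scenario, GHG emissions "
    "would be reduced by 27.56 Mt CO2e (6.73%) below BAU in 2030 in the "
    "respective sectors."
)
print(classifier(text))  # expected label per the removed widget: ECONOMY-WIDE
```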