cedricbonhomme committed
Commit 5a10e1a · verified · 1 Parent(s): b7d62c7

End of training

Files changed (5):
  1. README.md +13 -26
  2. emissions.csv +1 -2
  3. model.safetensors +1 -1
  4. tokenizer.json +2 -16
  5. training_args.bin +2 -2
README.md CHANGED
@@ -5,39 +5,26 @@ base_model: gpt2
 tags:
 - generated_from_trainer
 model-index:
-- name: vulnerability
+- name: vulnerability-description-generation-gpt2
   results: []
-datasets:
-- CIRCL/vulnerability
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# vulnerability
-
-This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the dataset [CIRCL/vulnerability](https://huggingface.co/datasets/CIRCL/vulnerability).
+# vulnerability-description-generation-gpt2
+
+This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.8112
+- Loss: 2.9055
 
 ## Model description
 
-
-## How to get started with the model
-
-```python
-from transformers import pipeline
-pipe = pipeline("text-generation", model="CIRCL/vulnerability-description-generation-gpt2")
-
->>> print(pipe("A new vulnerability in OpenSSL allows", max_length=300))
-[{'generated_text': 'A new vulnerability in OpenSSL allows remote attackers to create insecure connections. The impact of this vulnerability is that one or more TLS connections will be created under one username or one username/logon in a session for which another username or logon is valid. An attacker that can control the username or logon string of an openSSL host can effectively manipulate the OpenSSL host in a way that enables the attacker to create arbitrary openSSL connections by calling `http-server-create` in a non-secure sequence across other hosts. The vulnerability may be used to perform a man-in-the-middle attack, making the attacker completely different to the attacker. An exploitation may include MITM attacks and man-in-the-middle attacks. NOTE: the vendor states that "SUSE OpenSSL\'s implementation of \'openSSL_connect`, is not vulnerable to MITM attacks. If the attack vector is a MITM attack, OpenSSL will work under any circumstances." The CVE has been assigned for tracking purposes. In no way does the vendor\'s position change that an OpenSSL client should not use openSSL in the context of another OpenSSL server, but an attacker must choose the vulnerability according to their configuration if they are to exploit their attack. NOTE: the vendor indicates that it has considered the impact of this vulnerability "moderate". If by any measure, an OpenSSL client is susceptible to MITM attacks, that vulnerability would be considered low because it would be difficult to exploit a vulnerability that'}]
-```
-
+More information needed
 
 ## Intended uses & limitations
 
-
+More information needed
 
 ## Training and evaluation data
 
@@ -59,16 +46,16 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step  | Validation Loss |
-|:-------------:|:-----:|:-----:|:---------------:|
-| 1.0023        | 1.0   | 24029 | 1.9373          |
-| 0.937         | 2.0   | 48058 | 1.8403          |
-| 0.9287        | 3.0   | 72087 | 1.8112          |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:----:|:---------------:|
+| No log        | 1.0   | 225  | 3.2806          |
+| No log        | 2.0   | 450  | 2.9980          |
+| 1.7017        | 3.0   | 675  | 2.9055          |
 
 
 ### Framework versions
 
 - Transformers 4.49.0
 - Pytorch 2.6.0+cu124
-- Datasets 3.3.1
-- Tokenizers 0.21.0
+- Datasets 3.3.2
+- Tokenizers 0.21.0
emissions.csv CHANGED
@@ -1,3 +1,2 @@
 timestamp,project_name,run_id,experiment_id,duration,emissions,emissions_rate,cpu_power,gpu_power,ram_power,cpu_energy,gpu_energy,ram_energy,energy_consumed,country_name,country_iso_code,region,cloud_provider,cloud_region,os,python_version,codecarbon_version,cpu_count,cpu_model,gpu_count,gpu_model,longitude,latitude,ram_total_size,tracking_mode,on_cloud,pue
-2025-02-20T10:52:12,codecarbon,d351206f-c536-487e-9b9d-6dd923f68ba6,5b0fa12a-3dd7-45bb-9766-cc326314d9f1,11060.430779200047,0.1966975604750237,1.778389688446204e-05,42.5,660.8094434377647,94.34470081329346,0.1304955346793106,1.4484598773780704,0.2896742153997869,1.8686296274571668,Luxembourg,LUX,luxembourg,,,Linux-6.8.0-48-generic-x86_64-with-glibc2.39,3.12.3,2.8.3,64,AMD EPYC 9124 16-Core Processor,2,2 x NVIDIA L40S,6.1294,49.6113,251.5858688354492,machine,N,1.0
-2025-02-24T12:30:18,codecarbon,cbc49903-c6b4-4c11-a5f0-19d2bfd1ba03,5b0fa12a-3dd7-45bb-9766-cc326314d9f1,21351.801025332883,0.3785538638987247,1.7729364536958205e-05,42.5,181.56609208587705,94.34470081329346,0.2519147019968381,2.7851517931196383,0.5592006063220578,3.5962671014385363,Luxembourg,LUX,luxembourg,,,Linux-6.8.0-48-generic-x86_64-with-glibc2.39,3.12.3,2.8.3,64,AMD EPYC 9124 16-Core Processor,2,2 x NVIDIA L40S,6.1294,49.6113,251.58586883544922,machine,N,1.0
+2025-02-25T06:09:09,codecarbon,73cbee70-7925-41ec-83e8-7f642fd618fc,5b0fa12a-3dd7-45bb-9766-cc326314d9f1,212.86531715653837,0.003588920371916175,1.6860052261481988e-05,42.5,191.8382004803663,94.34470081329346,0.002511619702152287,0.026008019973062346,0.005575155000181051,0.034094794675395675,Luxembourg,LUX,luxembourg,,,Linux-6.8.0-48-generic-x86_64-with-glibc2.39,3.12.3,2.8.3,64,AMD EPYC 9124 16-Core Processor,2,2 x NVIDIA L40S,6.1294,49.6113,251.58586883544922,machine,N,1.0
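
The rows above follow codecarbon's CSV schema, one measurement per tracked run. As a minimal sketch (stdlib only, not part of this repo), the `energy_consumed` column can be totaled like this; the sample row reuses the duration and energy values from the new line above:

```python
import csv
import io

# Header and row values taken from the emissions.csv diff above (truncated
# to the two columns this sketch needs).
SAMPLE = """timestamp,duration,energy_consumed
2025-02-25T06:09:09,212.86531715653837,0.034094794675395675
"""

def total_energy(csv_text: str) -> float:
    """Sum the energy_consumed column (kWh) across all logged runs."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(float(row["energy_consumed"]) for row in reader)

print(total_energy(SAMPLE))  # total kWh for the run recorded above
```

Summing across all historical rows (rather than the single replacement row kept by this commit) would give the cumulative energy use of the experiment.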
 
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:571d7484c95c00bf2a1e9c5ba1179df552c2de3c2628508b767ec2977bc2c756
+oid sha256:9427cdd3624a203ac07e0baf04281588e0b5597497c826671807faad3e556926
 size 497774208
tokenizer.json CHANGED
@@ -1,21 +1,7 @@
 {
   "version": "1.0",
-  "truncation": {
-    "direction": "Right",
-    "max_length": 512,
-    "strategy": "LongestFirst",
-    "stride": 0
-  },
-  "padding": {
-    "strategy": {
-      "Fixed": 512
-    },
-    "direction": "Right",
-    "pad_to_multiple_of": null,
-    "pad_id": 50256,
-    "pad_type_id": 0,
-    "pad_token": "<|endoftext|>"
-  },
+  "truncation": null,
+  "padding": null,
   "added_tokens": [
     {
       "id": 50256,
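
The change above removes the fixed 512-token truncation and padding that had been baked into tokenizer.json, leaving both to the caller. A minimal sketch, using only the stdlib json module, of checking which behavior a given config encodes (fragment values copied from the diff above; `has_baked_in_padding` is a hypothetical helper, not part of the tokenizers library):

```python
import json

# Old fragment: truncation and padding fixed at 512 tokens (abridged).
OLD = json.loads('{"truncation": {"max_length": 512}, "padding": {"strategy": {"Fixed": 512}}}')
# New fragment: both settings cleared, so the caller decides at runtime.
NEW = json.loads('{"truncation": null, "padding": null}')

def has_baked_in_padding(cfg: dict) -> bool:
    """True if the tokenizer config forces truncation or padding on its own."""
    return cfg.get("truncation") is not None or cfg.get("padding") is not None

print(has_baked_in_padding(OLD), has_baked_in_padding(NEW))  # True False
```

With the new config, callers that need fixed-length batches must request truncation/padding explicitly when tokenizing.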
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8936963c9e1d8ad9629ed527b2028deedfedbf64e3ac8569ec3693cc4c6efaa5
-size 5304
+oid sha256:6f57248cd68a5287b1c079bb669b90fa07c2ff8f7a44299a1911daefa2da3cbd
+size 5368