CHZY-1/sqlcoder-7b-2_FineTuned_PEFT_QLORA_adapter_alpha_r_32
Tags: PEFT · TensorBoard · Safetensors · English · trl · sft · Generated from Trainer · QLoRA · SQL · causal-lm
License: cc-by-sa-4.0
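To use the adapter, load it on top of its base model with the peft library. Below is a minimal sketch; it assumes the base model is defog/sqlcoder-7b-2 (inferred from the repository name, so confirm against base_model_name_or_path in adapter_config.json), and the prompt is hypothetical:

```python
# Minimal sketch: apply this QLoRA adapter to its base model for inference.
# BASE_ID is an assumption inferred from the repo name, not stated on this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "defog/sqlcoder-7b-2"  # assumption; check adapter_config.json
ADAPTER_ID = "CHZY-1/sqlcoder-7b-2_FineTuned_PEFT_QLORA_adapter_alpha_r_32"

tokenizer = AutoTokenizer.from_pretrained(ADAPTER_ID)  # repo ships tokenizer files
base = AutoModelForCausalLM.from_pretrained(
    BASE_ID, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)

prompt = "-- List the five most recent orders\nSELECT"  # hypothetical prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

For deployment without the PEFT wrapper, model.merge_and_unload() folds the LoRA deltas into the base weights.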
Files and versions
1 contributor · History: 2 commits
Latest commit by CHZY-1, f204fbe (verified), 5 months ago:
"Re-trained QLoRA adapter on 260 examples (5 epochs), r=32, alpha=32, dropout=0.1"
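For reference, the hyperparameters named in the commit message map onto a peft LoraConfig as sketched below. This is a hedged reconstruction: the target modules are an assumption, and the authoritative values live in adapter_config.json.

```python
# Hedged reconstruction of the adapter's LoRA hyperparameters from the
# commit message: r=32, alpha=32, dropout=0.1.
from peft import LoraConfig

lora_config = LoraConfig(
    r=32,                                 # rank, from the commit message
    lora_alpha=32,                        # scaling factor, from the commit message
    lora_dropout=0.1,                     # from the commit message
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],  # assumption; not stated on this page
)
```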
Every file except .gitattributes (initial commit) was last touched by commit f204fbe; all files were last updated 5 months ago.

File                         Size        LFS
runs/                        -           -
.gitattributes               1.52 kB     -
README.md                    1.32 kB     -
adapter_config.json          644 Bytes   -
adapter_model.safetensors    67.1 MB     yes
special_tokens_map.json      602 Bytes   -
tokenizer.json               1.84 MB     -
tokenizer.model              500 kB      yes
tokenizer_config.json        1.9 kB      -
training_args.bin            5.5 kB      yes

training_args.bin is a Python pickle rather than a safetensors file; Hugging Face's pickle scanner detected nine imports:
transformers.training_args.OptimizerNames, transformers.trainer_utils.SchedulerType, accelerate.state.PartialState, transformers.trainer_utils.IntervalStrategy, accelerate.utils.dataclasses.DistributedType, torch.device, transformers.trainer_pt_utils.AcceleratorConfig, transformers.trainer_utils.HubStrategy, trl.trainer.sft_config.SFTConfig.
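training_args.bin can be inspected locally, but because it is a pickle it should only be loaded from a source you trust, and unpickling requires transformers, trl, and accelerate to be installed so the imports above resolve. A minimal sketch:

```python
# Minimal sketch: inspect training_args.bin after downloading it. It is a
# Python pickle (an SFTConfig, per the detected imports), so load it only
# if you trust the repo; pickles can execute arbitrary code on load.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args))             # expected: trl.trainer.sft_config.SFTConfig
print(args.num_train_epochs)  # should match the 5 epochs in the commit message
print(args.learning_rate)
```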